Amazon AppFlow supports Amazon Honeycode as a destination. Using AppFlow, you can import data from Amazon S3, Salesforce, or other supported SaaS sources into Honeycode with just a few clicks! However, when running multiple AppFlow imports into Honeycode (we will call these "flows"), you may want to consider the best practices below. They will help you improve performance and avoid stuck imports. Finally, we will tell you how to get help if you experience problems.
While flows can share a connection to Honeycode, we recommend creating a unique connection for each flow. For example, when importing a Customers table and an Orders table from S3, we recommend creating Honeycode_Customers_Import_connection and Honeycode_Orders_Import_connection for the two import flows, Customer_Flow and Orders_Flow, respectively. It does not matter whether Customers and Orders come from the same data source or from different ones; either way, create a separate Honeycode connection for each flow.
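As a sketch, the one-connection-per-flow naming convention above can be captured in a small helper. The `Honeycode_<Table>_Import_connection` pattern is just the example convention from this article, not a requirement of AppFlow:

```python
def connection_name(table: str) -> str:
    """Build a dedicated Honeycode connection name for one import flow,
    following the Honeycode_<Table>_Import_connection example convention."""
    return f"Honeycode_{table}_Import_connection"

# One dedicated connection per flow:
print(connection_name("Customers"))  # Honeycode_Customers_Import_connection
print(connection_name("Orders"))     # Honeycode_Orders_Import_connection
```

A consistent naming scheme like this makes it easy to see, at a glance, which connection belongs to which flow when troubleshooting.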
This is particularly helpful if you see "Authentication Error with status code 403" errors in your flow executions.
If you need to run multiple scheduled flows, we recommend that you do not run them at the exact same time; instead, separate their start times by 15 minutes or more. Running flows one after the other, rather than in parallel, avoids timeouts. If you are seeing 504 errors, schedule each flow to start at least 15 minutes after the previous one finishes, or leave ample time between start times (for example, one flow at 8 AM and another at 10 AM, or one in the morning and one in the evening).
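One way to stagger scheduled flows is to generate schedule expressions whose start times are offset by 15 minutes or more. A minimal sketch, assuming daily flows and AppFlow's `cron(...)` schedule-expression format (the flow names are the examples from this article):

```python
from datetime import datetime, timedelta

def staggered_schedules(flow_names, start="08:00", gap_minutes=15):
    """Return one AppFlow-style daily cron schedule expression per flow,
    spacing start times gap_minutes apart so the flows never run in parallel."""
    t = datetime.strptime(start, "%H:%M")
    schedules = {}
    for name in flow_names:
        # cron(minute hour day-of-month month day-of-week year)
        schedules[name] = f"cron({t.minute} {t.hour} * * ? *)"
        t += timedelta(minutes=gap_minutes)
    return schedules

print(staggered_schedules(["Customer_Flow", "Orders_Flow"]))
# {'Customer_Flow': 'cron(0 8 * * ? *)', 'Orders_Flow': 'cron(15 8 * * ? *)'}
```

If one of your flows regularly takes longer than the gap to finish, widen the gap rather than letting the runs overlap.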
When creating an AppFlow connection to Honeycode, use a user that has the Owner or Collaborator role in the workbook you are importing data into. The simplest way to do this is to create the connection while logged into Honeycode as a user who is an Owner or Collaborator of that workbook.
Please note that this is not an Administrator for the whole team, but specifically an Owner or Collaborator on the given workbook. If you create the workbook yourself, you are automatically an Owner; if you are invited to the workbook (not just the app), you are given the Collaborator role. The only difference between the two roles is that a Collaborator cannot delete the workbook, whereas an Owner can. You can read more about roles and workbook/app sharing in this article.
We recommend limiting the number of rows AppFlow imports into Honeycode to 500 or fewer at a time. You can do this by setting filters or using smaller time ranges. This helps because it avoids overrunning the write buffer allocated to each application in Honeycode.
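If your source data set is larger than 500 rows and you handle any of the import yourself (for example, pre-filtering data before AppFlow picks it up), one approach is to split the rows into chunks of at most 500. A minimal sketch:

```python
from typing import Iterator, List

MAX_ROWS_PER_IMPORT = 500  # recommended upper bound from this article

def batched(rows: List[dict], size: int = MAX_ROWS_PER_IMPORT) -> Iterator[List[dict]]:
    """Yield successive chunks of at most `size` rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

rows = [{"id": n} for n in range(1200)]
for chunk in batched(rows):
    # Each chunk stays within the 500-row recommendation, e.g. one
    # filtered flow run (or one write batch) per chunk.
    print(len(chunk))
# prints 500, 500, 200
```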
While it is possible to run an automation immediately after an imported row has been added to a table, we recommend running "processing" automations later, after the import has completed. For example, let the import job write all 300 rows into an import table before running an automation that processes the import and updates the related tables. To do this, you can add an ImportRowProcessed (date and time) column to your ImportTemp table; this column remains blank during the import. Later in the day, another automation updates the related tables and fills in the ImportRowProcessed field. You could also add an automation that runs once a week and deletes all records in the ImportTemp table whose ImportRowProcessed date is more than 7 days in the past, to keep the import table clean.
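The weekly cleanup step described above amounts to a simple date filter. Here is a minimal sketch of that logic in plain Python; the row shape is hypothetical, and in Honeycode itself you would express the same condition as a filter formula in the automation:

```python
from datetime import datetime, timedelta

def rows_to_delete(rows, now, max_age_days=7):
    """Return ImportTemp rows whose ImportRowProcessed timestamp is set
    and is more than max_age_days in the past."""
    cutoff = now - timedelta(days=max_age_days)
    return [
        r for r in rows
        if r["ImportRowProcessed"] is not None  # blank = not yet processed
        and r["ImportRowProcessed"] < cutoff
    ]

now = datetime(2021, 6, 15)
rows = [
    {"id": 1, "ImportRowProcessed": datetime(2021, 6, 1)},   # stale -> delete
    {"id": 2, "ImportRowProcessed": datetime(2021, 6, 14)},  # recent -> keep
    {"id": 3, "ImportRowProcessed": None},                   # unprocessed -> keep
]
print([r["id"] for r in rows_to_delete(rows, now)])  # [1]
```

Note that rows with a blank ImportRowProcessed are deliberately kept: they have not been processed yet, so deleting them would lose data.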
Imports, just like regular data entry by humans, make API calls against the Honeycode database. Therefore, we recommend scheduling imports at night, or whenever regular users are not active in the application. When imports don't compete with your users for API calls, your apps will be more responsive.
Following these recommendations will help reduce errors and help your applications perform better.
To get help, free tier users can ask questions in the Community Forum, while paid customers can raise issues from within the product. If you are an AWS Support customer, you can also raise a case in the AWS Support Console.