Handling rate limits when sending data (chunking / batching)

When sending batches of data to create or update records, be aware that services often limit the number of records that can be processed in a single call.

For example, the Salesforce 'batch create' operation only allows you to create 200 records at a time.

So if we try to batch create 1000 records in a single call, the service will reject the request with an error.

Instead, we can use the List Helpers 'Chunk' operation to divide the 1000 records into batches of 200.
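In plain Python terms, the chunking step looks something like this - a minimal sketch, where the record list and field names are purely illustrative:

```python
def chunk(items, size):
    """Split a list into successive sublists of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

records = [{"Name": f"Account {n}"} for n in range(1000)]  # sample records
batches = chunk(records, 200)

print(len(batches))     # 5
print(len(batches[0]))  # 200
```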

We can then loop through the resulting 5 batches of 200 records.

Each batch of 200 can then be transformed to meet the input schema requirements and passed to the Salesforce 'batch create' operation.
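Continuing the sketch above, the loop might look like this. Note that `batch_create` is only a placeholder for the real Salesforce connector call, and `transform` stands in for whatever mapping your input schema actually requires:

```python
def batch_create(object_name, records):
    """Placeholder for the Salesforce 'batch create' step; in a real
    workflow this would be the connector call."""
    print(f"Created {len(records)} {object_name} records")

def transform(record):
    # Map internal fields onto the target input schema (illustrative).
    return {"Name": record["Name"], "Type": "Customer"}

for batch in batches:
    payload = [transform(record) for record in batch]
    batch_create("Account", payload)  # never more than 200 records per call
```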

To avoid receiving a 429 rate-limiting response from the service, make sure your batch size is not so small that you end up sending requests too rapidly - processing 1000 records in batches of 5, for example, would mean 200 calls in quick succession.
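If a 429 does slip through despite sensible batch sizing, a common defensive pattern is to retry with exponential backoff. A minimal sketch, where `RateLimitError` is a stand-in for however your client surfaces a 429 response:

```python
import random
import time

class RateLimitError(Exception):
    """Stand-in for however your HTTP client signals a 429 response."""

def send_with_backoff(send, payload, max_retries=5):
    """Retry `send(payload)` with exponential backoff on rate limiting."""
    for attempt in range(max_retries):
        try:
            return send(payload)
        except RateLimitError:
            # Wait 1s, 2s, 4s, ... (plus jitter) before retrying.
            time.sleep(2 ** attempt + random.random())
    raise RuntimeError("Still rate limited after retries")

# e.g. send_with_backoff(lambda p: batch_create("Account", p), payload)
```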

Likewise, if you are processing thousands of records, then however large your batches are, you may need to use the delay connector to add e.g. a 10 second delay between batches.
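Extending the earlier sketch, the equivalent of the delay connector in plain Python is simply a pause between iterations (the 10 second figure is illustrative):

```python
import time

for batch in batches:
    payload = [transform(record) for record in batch]
    batch_create("Account", payload)
    time.sleep(10)  # throttle: wait 10 seconds before the next batch
```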

If your processing requirements are complex and/or you are dealing with large volumes of data, best practice is to send the chunked batches to a callable workflow, enabling efficient parallel processing.
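As a rough analogue, here is what fanning the batches out to parallel workers looks like in plain Python; in a workflow builder, the worker function would be the callable workflow rather than a thread:

```python
from concurrent.futures import ThreadPoolExecutor

def process_batch(batch):
    """One unit of work - the analogue of a single callable-workflow run."""
    payload = [transform(record) for record in batch]
    batch_create("Account", payload)

with ThreadPoolExecutor(max_workers=5) as pool:
    # list() forces evaluation so any worker exceptions are raised here.
    list(pool.map(process_batch, batches))
```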