It illustrates the key steps involved in basic pagination:
At the start of the loop you retrieve the offset token using the Data Storage 'Get value' operation (Scope - Workflow), with a default of an empty string for the first run
You retrieve the records using a 'List records'-type operation, passing the offset token where appropriate along with the batch / page size (stored in your Project Config as a best practice)
You then process the records as required (in this example, each record is added to a Google Sheet)
At the end of each loop iteration you check whether the service (Airtable) has returned an offset token, indicating there are more records to be pulled
If so, you store the offset token using the Data Storage 'Set value' operation (Scope - Workflow) so it can be retrieved at the start of the next run
If not, you break the forever loop, as there are no more records to process (the sketch below walks through the same logic in plain code)
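To make the loop concrete, here is a minimal Python sketch of the same offset-token pattern, calling the Airtable REST API directly rather than going through the Tray connector and Data Storage steps. The base ID, table name, access token and append_to_sheet function are placeholders, and in the workflow the offset is held in Data Storage rather than a local variable:

```python
import requests

AIRTABLE_URL = "https://api.airtable.com/v0/{base_id}/{table_name}"
HEADERS = {"Authorization": "Bearer YOUR_AIRTABLE_TOKEN"}  # placeholder personal access token
PAGE_SIZE = 100  # batch / page size, analogous to the value held in Project Config


def append_to_sheet(record):
    # Placeholder for the 'process the records' step (e.g. adding a row to a Google Sheet).
    print(record["id"], record.get("fields"))


def fetch_all_records(base_id, table_name):
    offset = ""  # equivalent of Data Storage 'Get value' defaulting to an empty string
    while True:  # equivalent of the forever loop
        params = {"pageSize": PAGE_SIZE}
        if offset:
            params["offset"] = offset  # pass the stored offset token where appropriate

        response = requests.get(
            AIRTABLE_URL.format(base_id=base_id, table_name=table_name),
            headers=HEADERS,
            params=params,
        )
        response.raise_for_status()
        payload = response.json()

        for record in payload.get("records", []):
            append_to_sheet(record)  # process each record as required

        # Airtable only includes 'offset' in the response when more records remain.
        offset = payload.get("offset", "")
        if not offset:
            break  # no offset token returned, so break out of the loop


fetch_all_records("appXXXXXXXXXXXXXX", "Contacts")  # placeholder base ID and table name
```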
When building and testing in Tray, you can inspect the input and output logs of connector operations to look for key fields such as 'has_more' or 'offset_token'.
To save time when scoping out your project requirements, you can also use our Ops Explorer (beta) dev tool to explore sample input and output payloads for different operations and get a quick idea of any pagination requirements:
Aside from the Airtable example above, some other connector operations which require pagination are:
The Stripe connector has a 'List customers' operation which returns a has_more property, so you know whether to make another request. It also lets you pass a 'Starting After' value containing the ID of the last customer in the previous page of 100, so Stripe knows to start the next batch of 100 from the customer that comes after this ID (see the first sketch after this list).
The Salesforce connector has a 'Find Records' operation with a Limit parameter, as well as a Page offset parameter which allows you to set the record to start from (i.e. after the last record from the previous batch); the second sketch after this list illustrates this limit / offset pattern.
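For illustration, here is a minimal sketch of the Stripe cursor pattern using the official stripe Python library. The API key is a placeholder, and printing each customer stands in for whatever processing your workflow does:

```python
import stripe

stripe.api_key = "sk_test_..."  # placeholder secret key
PAGE_SIZE = 100


def list_all_customers():
    starting_after = None  # no cursor on the first request
    while True:
        params = {"limit": PAGE_SIZE}
        if starting_after:
            params["starting_after"] = starting_after  # ID of the last customer in the previous page
        page = stripe.Customer.list(**params)

        for customer in page.data:
            print(customer.id, customer.email)  # process each customer as required

        if not page.has_more:
            break  # Stripe reports there are no further pages
        starting_after = page.data[-1].id  # cursor for the next request


list_all_customers()
```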
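And a sketch of the limit / offset pattern using the simple_salesforce library with a SOQL query. The credentials and the Contact query are placeholders, the Tray 'Find Records' operation handles this paging for you, and note that SOQL OFFSET is capped at 2,000 rows:

```python
from simple_salesforce import Salesforce

# Placeholder credentials; in Tray the Salesforce connector's authentication handles this.
sf = Salesforce(username="user@example.com", password="password", security_token="token")
PAGE_SIZE = 200  # equivalent of the 'Limit' parameter


def fetch_all_contacts():
    offset = 0  # equivalent of the 'Page offset' parameter
    while True:
        soql = (
            "SELECT Id, LastName, Email FROM Contact "
            f"ORDER BY Id LIMIT {PAGE_SIZE} OFFSET {offset}"
        )
        records = sf.query(soql)["records"]

        for record in records:
            print(record["Id"], record["Email"])  # process each record as required

        if len(records) < PAGE_SIZE:
            break  # fewer records than the limit means this was the last page
        offset += PAGE_SIZE  # start the next page after the last record of this one


fetch_all_contacts()
```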