Notes on using Snowflake
Load data to table
The Load data to table operation has a files input field, which requires an array of the filenames to load.
Note that this operation cannot be used to stream files to a Snowflake data warehouse.
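As a rough sketch, the operation's input might look like the following. Only the files field (an array of filenames) is described above; the other field names and the filenames themselves are assumptions for illustration:

```python
# Hypothetical illustration of the Load data to table input.
# Only "files" is documented above; "table" and the filenames are
# assumed values for illustration.
load_input = {
    "table": "employee_data",
    "files": [  # the documented input field: an array of filenames to load
        "employees_2024_01.csv",
        "employees_2024_02.csv",
    ],
}
```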
Insert rows in table
The Insert Rows in Table operation allows you to insert both single and multiple records into a Snowflake table.
When inserting records, ensure they are formatted correctly according to the table's schema. You can verify the expected format using the provided sample payloads.
The workflow is designed to receive and process various types of data through a Webhook. For example, let's consider a scenario where the following JSON payload is received:
```json
{
  "table": "employee_data",
  "employee_data": [
    {
      "employee_id": "E001",
      "employee_name": "John Doe",
      "department": "Sales"
    },
    {
      "employee_id": "E002",
      "employee_name": "Jane Smith",
      "department": "Marketing"
    },
    {
      "employee_id": "E003",
      "employee_name": "Alice Johnson",
      "department": "HR"
    }
  ],
  "metadata": {
    "overwrite": true,
    "record_count": 3
  }
}
```
The JSON Transformer connector can be used to convert this data into the appropriate format for insertion into Snowflake. For instance, the following JSONata query is applied to transform the data:
```
{
  "rows": employee_data.{
    "data": [
      {
        "column": "employee_id",
        "value": employee_id
      },
      {
        "column": "employee_name",
        "value": employee_name
      },
      {
        "column": "department",
        "value": department
      }
    ]
  }
}
```
The transformed records are then inserted into the employee_data table using the Snowflake connector.
When using the Insert rows in table operation, note that Snowflake only accepts String/Varchar values enclosed in single quotes (' '). Please see the example below for instructions on doing this correctly using interpolated mode.
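As a minimal sketch of the quoting rule itself (independent of the connector), the hypothetical helper below wraps a value in single quotes and doubles any embedded single quote, which is how quotes are escaped inside Snowflake SQL string literals:

```python
def quote_sql_string(value: str) -> str:
    """Wrap a value in single quotes for use as a SQL string literal.

    Embedded single quotes are escaped by doubling them.
    """
    return "'" + value.replace("'", "''") + "'"

quote_sql_string("John Doe")  # → "'John Doe'"
quote_sql_string("O'Brien")   # → "'O''Brien'"
```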