Operations (sample payloads)

Main operations

Create database

Create a database.

Sample Input

{
  "auth": {},
  "database_type": {
    "standard": {
      "database": "NEW_ANALYTICS_DB",
      "replace": false,
      "state": "TRANSIENT",
      "if_not_exists": true,
      "data_retention_time_in_days": 7,
      "comment": "New database for analytics data"
    }
  }
}

Sample Output

{
  "status": "success",
  "message": "Database 'NEW_ANALYTICS_DB' created successfully"
}
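As an illustration of what this payload corresponds to, the Python sketch below renders the sample input into a Snowflake CREATE DATABASE statement. The field names are taken from the sample above; the exact SQL the connector generates is an assumption and may differ.

```python
def create_database_sql(payload):
    """Render a 'standard' create-database payload into Snowflake DDL (sketch)."""
    std = payload["database_type"]["standard"]
    parts = ["CREATE"]
    if std.get("replace"):
        parts.append("OR REPLACE")
    if std.get("state") == "TRANSIENT":
        parts.append("TRANSIENT")
    parts.append("DATABASE")
    if std.get("if_not_exists"):
        parts.append("IF NOT EXISTS")
    parts.append(std["database"])
    if "data_retention_time_in_days" in std:
        parts.append(f"DATA_RETENTION_TIME_IN_DAYS = {std['data_retention_time_in_days']}")
    if "comment" in std:
        parts.append(f"COMMENT = '{std['comment']}'")
    return " ".join(parts)
```

For the sample input this yields a `CREATE TRANSIENT DATABASE IF NOT EXISTS NEW_ANALYTICS_DB ...` statement.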

Create schema

Create a schema.

Sample Input

{
  "auth": {},
  "database": "SALES_DB",
  "schema": "CUSTOMER_DATA",
  "replace": false,
  "state": "TRANSIENT",
  "if_not_exists": true,
  "clone": {
    "source_database": "ARCHIVE_DB",
    "source_schema": "OLD_CUSTOMER_DATA",
    "time_travel": {
      "at_before": "AT",
      "statement": {
        "timestamp": "2023-06-15T10:30:00Z"
      }
    }
  },
  "data_retention_time_in_days": 14,
  "comment": "Schema for storing customer-related data"
}

Sample Output

{
  "status": "success",
  "message": "Schema 'CUSTOMER_DATA' created successfully in database 'SALES_DB'"
}
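The optional clone object maps onto Snowflake's CLONE ... AT | BEFORE syntax. A minimal sketch of that mapping, assuming the field names shown in the sample (the connector's actual rendering may differ):

```python
def clone_clause(clone):
    """Render the 'clone' object into a CLONE ... AT/BEFORE clause (sketch)."""
    clause = f"CLONE {clone['source_database']}.{clone['source_schema']}"
    time_travel = clone.get("time_travel")
    if time_travel:
        keyword = time_travel["at_before"]  # "AT" or "BEFORE"
        statement = time_travel["statement"]
        if "timestamp" in statement:
            clause += f" {keyword} (TIMESTAMP => '{statement['timestamp']}'::timestamp)"
    return clause
```

Applied to the sample, this produces `CLONE ARCHIVE_DB.OLD_CUSTOMER_DATA AT (TIMESTAMP => ...)`.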

Create stage

Create a new named internal or external stage to use for loading data from files into Snowflake tables and unloading data from tables into files.

Sample Input

{
  "auth": {
    "database": "MY_DATABASE",
    "schema": "MY_SCHEMA"
  },
  "stage_name": "MY_STAGE",
  "stage_type": {
    "aws_s3_bucket": {
      "bucket": "my-s3-bucket",
      "path": "data/",
      "credentials": {
        "aws_key_id": "AKIAIOSFODNN7EXAMPLE",
        "aws_secret_key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
      },
      "encryption": {
        "type": "AWS_SSE_S3"
      }
    }
  },
  "replace": true,
  "temporary": false,
  "stage_file_format": {
    "type": {
      "csv": {
        "compression": "GZIP",
        "record_delimiter": "\\n",
        "field_delimiter": ",",
        "skip_header": 1,
        "date_format": "AUTO",
        "time_format": "AUTO",
        "timestamp_format": "AUTO",
        "binary_format": "HEX",
        "escape": "\\",
        "trim_space": true,
        "field_optionally_enclosed_by": "\"",
        "null_if": [
          "NULL",
          ""
        ],
        "error_on_column_count_mismatch": true
      }
    }
  },
  "stage_copy_options": {
    "on_error": "CONTINUE",
    "size_limit": 1000000000,
    "purge": false,
    "enforce_length": true
  },
  "comment": "External stage for CSV files in S3"
}

Sample Output

{
  "status": "success",
  "message": "Stage 'MY_STAGE' created successfully"
}

Create table

Create a table.

Sample Input

{
  "auth": {
    "database": "MY_DATABASE",
    "schema": "MY_SCHEMA"
  },
  "table": "EMPLOYEES",
  "state": "PERMANENT",
  "columns": [
    {
      "name": "EMPLOYEE_ID",
      "type": {
        "simple_type": "INTEGER"
      },
      "not_null": true,
      "key_type": "primary_key"
    },
    {
      "name": "FIRST_NAME",
      "type": {
        "varchar": {
          "length": 50
        }
      },
      "not_null": true
    },
    {
      "name": "LAST_NAME",
      "type": {
        "varchar": {
          "length": 50
        }
      },
      "not_null": true
    },
    {
      "name": "EMAIL",
      "type": {
        "varchar": {
          "length": 100
        }
      },
      "not_null": true
    },
    {
      "name": "HIRE_DATE",
      "type": {
        "simple_type": "DATE"
      },
      "not_null": true
    },
    {
      "name": "SALARY",
      "type": {
        "number": {
          "precision": 10,
          "scale": 2
        }
      },
      "not_null": true
    }
  ],
  "comment": "Table to store employee information",
  "data_retention_time_in_days": 30
}

Sample Output

{
  "status": "success",
  "message": "Table EMPLOYEES created successfully in MY_DATABASE.MY_SCHEMA"
}
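Each entry in the columns array describes one column definition. A sketch of how such an entry maps to a SQL column definition, based only on the type variants visible in the sample (simple_type, varchar, number); other variants may exist:

```python
def column_sql(col):
    """Render one 'columns' entry into a SQL column definition (sketch)."""
    t = col["type"]
    if "simple_type" in t:
        sql_type = t["simple_type"]
    elif "varchar" in t:
        sql_type = f"VARCHAR({t['varchar']['length']})"
    elif "number" in t:
        n = t["number"]
        sql_type = f"NUMBER({n['precision']},{n['scale']})"
    else:
        raise ValueError("unsupported type variant")
    parts = [col["name"], sql_type]
    if col.get("not_null"):
        parts.append("NOT NULL")
    if col.get("key_type") == "primary_key":
        parts.append("PRIMARY KEY")
    return " ".join(parts)
```

For the first sample column this yields `EMPLOYEE_ID INTEGER NOT NULL PRIMARY KEY`.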

Create warehouse

Creates a new virtual warehouse in the system. Creating a warehouse includes specifying its size. Initial creation of a warehouse may take some time to provision the compute resources, unless the warehouse is created in an initially suspended state.

Sample Input

{
  "auth": {},
  "warehouse": "MY_NEW_WAREHOUSE",
  "replace": false,
  "if_not_exists": true,
  "warehouse_properties": {
    "warehouse_size": "MEDIUM",
    "max_cluster_count": 3,
    "min_cluster_count": 1,
    "scaling_policy": "STANDARD",
    "auto_suspend": 300,
    "auto_resume": true,
    "resource_monitor": "MY_RESOURCE_MONITOR",
    "initially_suspended": false
  },
  "warehouse_parameters": {
    "max_concurrency_level": 8,
    "statement_queued_timeout_in_seconds": 60,
    "statement_timeout_in_seconds": 3600
  },
  "comment": "New warehouse for data analytics team"
}

Sample Output

{
  "status": "success",
  "message": "Warehouse MY_NEW_WAREHOUSE created successfully"
}

Custom query

Send a custom SQL query to Snowflake.

Sample Input

{
  "auth": {
    "warehouse": "COMPUTE_WH"
  },
  "query": "SELECT * FROM sales_data WHERE total_amount > 1000 ORDER BY sale_date DESC LIMIT 5;"
}

Sample Output

{
  "status": "success",
  "rows": [
    {
      "sale_id": 1234,
      "customer_name": "John Doe",
      "product_name": "Laptop",
      "total_amount": 1599.99,
      "sale_date": "2023-05-15"
    },
    {
      "sale_id": 1235,
      "customer_name": "Jane Smith",
      "product_name": "Smartphone",
      "total_amount": 1299.99,
      "sale_date": "2023-05-14"
    },
    {
      "sale_id": 1236,
      "customer_name": "Bob Johnson",
      "product_name": "4K TV",
      "total_amount": 1799.99,
      "sale_date": "2023-05-13"
    },
    {
      "sale_id": 1237,
      "customer_name": "Alice Brown",
      "product_name": "Gaming Console",
      "total_amount": 1099.99,
      "sale_date": "2023-05-12"
    },
    {
      "sale_id": 1238,
      "customer_name": "Charlie Wilson",
      "product_name": "Desktop Computer",
      "total_amount": 1499.99,
      "sale_date": "2023-05-11"
    }
  ]
}
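The rows array comes back as plain JSON objects, so query results can be post-processed directly in the caller. A small example that totals the amounts in a subset of the sample rows above:

```python
# A subset of the sample response's "rows" array.
rows = [
    {"sale_id": 1234, "total_amount": 1599.99},
    {"sale_id": 1235, "total_amount": 1299.99},
    {"sale_id": 1236, "total_amount": 1799.99},
]

# Sum total_amount across the returned rows, rounded to cents.
grand_total = round(sum(r["total_amount"] for r in rows), 2)
```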

Delete database

Removes a database from the system.

Sample Input

{
  "auth": {
    "database": "SALES_DB"
  },
  "database": "SALES_DB",
  "if_exists": true,
  "cascade_restrict": "CASCADE"
}

Sample Output

{
  "status": "success",
  "message": "Database 'SALES_DB' has been successfully deleted."
}

Delete files from stage

Removes files that have been staged (i.e. uploaded from a local file system or unloaded from a table).

Sample Input

{
  "auth": {
    "database": "SALES_DB",
    "schema": "PUBLIC"
  },
  "stage": "MY_STAGE",
  "path": "customer_data/",
  "pattern": ".*\\.csv"
}

Sample Output

{
  "status": "SUCCESS",
  "rows": [
    {
      "name": "customer_data/customers_2023-01-01.csv",
      "result": "DELETED"
    },
    {
      "name": "customer_data/customers_2023-01-02.csv",
      "result": "DELETED"
    },
    {
      "name": "customer_data/customers_2023-01-03.csv",
      "result": "DELETED"
    }
  ]
}

Delete rows from table

Delete rows from the target table, optionally using a WHERE condition and additional (non-target) tables.

Sample Input

{
  "auth": {
    "warehouse": "COMPUTE_WH",
    "database": "SALES_DB",
    "schema": "PUBLIC"
  },
  "table": "CUSTOMERS",
  "using": "ORDERS",
  "where": "CUSTOMERS.CUSTOMER_ID NOT IN (SELECT DISTINCT CUSTOMER_ID FROM ORDERS WHERE ORDER_DATE > '2023-01-01')"
}

Sample Output

{
  "status": "success",
  "number of rows deleted": 152
}

Delete schema

Delete a schema from the current or specified database.

Sample Input

{
  "auth": {
    "database": "SALES_DB"
  },
  "schema": "CUSTOMER_SCHEMA",
  "if_exists": true,
  "cascade_restrict": "CASCADE"
}

Sample Output

{
  "status": "success",
  "message": "Schema CUSTOMER_SCHEMA has been successfully deleted from database SALES_DB."
}

Delete stage

Removes the specified named internal or external stage from the current/specified schema. The status of the files in the stage depends on the stage type.

Sample Input

{
  "auth": {
    "database": "SALES_DB",
    "schema": "PUBLIC"
  },
  "stage": "CUSTOMER_DATA_STAGE",
  "if_exists": true
}

Sample Output

{
  "status": "success",
  "message": "Stage 'CUSTOMER_DATA_STAGE' successfully deleted."
}

Delete table

Delete a table from the current or specified schema.

Sample Input

{
  "auth": {
    "database": "SALES_DB",
    "schema": "PUBLIC"
  },
  "table": "CUSTOMERS",
  "if_exists": true,
  "cascade_restrict": "CASCADE"
}

Sample Output

{
  "status": "success",
  "message": "Table CUSTOMERS successfully deleted from SALES_DB.PUBLIC schema."
}

Delete warehouse

Removes the specified virtual warehouse from the system.

Sample Input

{
  "auth": {},
  "warehouse": "MY_WAREHOUSE",
  "if_exists": true
}

Sample Output

{
  "status": "success",
  "message": "Warehouse 'MY_WAREHOUSE' has been successfully deleted."
}

Insert rows in table

Inserts one or more rows into the target table.

Sample Input

{
  "auth": {
    "warehouse": "COMPUTE_WH",
    "database": "SALES_DB",
    "schema": "PUBLIC"
  },
  "warehouse": "COMPUTE_WH",
  "table_database": "SALES_DB",
  "table_schema": "PUBLIC",
  "table": "CUSTOMERS",
  "rows": [
    {
      "data": [
        {
          "column": "CUSTOMER_ID",
          "value": "'C001'"
        },
        {
          "column": "FIRST_NAME",
          "value": "'John'"
        },
        {
          "column": "LAST_NAME",
          "value": "'Doe'"
        },
        {
          "column": "EMAIL",
          "value": "'john.doe@example.com'"
        }
      ]
    },
    {
      "data": [
        {
          "column": "CUSTOMER_ID",
          "value": "'C002'"
        },
        {
          "column": "FIRST_NAME",
          "value": "'Jane'"
        },
        {
          "column": "LAST_NAME",
          "value": "'Smith'"
        },
        {
          "column": "EMAIL",
          "value": "'jane.smith@example.com'"
        }
      ]
    }
  ],
  "overwrite": false
}

Sample Output

{
  "status": "success",
  "rows": [
    {
      "number of rows inserted": 2
    }
  ]
}
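Note that the sample wraps string values in single quotes (they are passed as SQL literals), so building the rows array from plain records takes a small transformation. A sketch, assuming string values should be quoted as in the sample and numbers passed bare:

```python
def to_rows_payload(records):
    """Build the 'rows' array from plain dicts.

    Assumption: values are SQL literals, so strings get single quotes
    (as in the sample payload) and numbers are rendered bare.
    """
    def literal(value):
        return f"'{value}'" if isinstance(value, str) else str(value)

    return [
        {"data": [{"column": col, "value": literal(val)} for col, val in record.items()]}
        for record in records
    ]
```

Real values containing single quotes would additionally need escaping before being embedded as literals.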

List copy history

Queries the Snowflake data loading history along various dimensions within the last 7 days.

Sample Input

{
  "auth": {},
  "table": "SALES",
  "warehouse": "COMPUTE_WH",
  "start_time": "2023-05-01T00:00:00Z",
  "end_time": "2023-05-07T23:59:59Z",
  "database": "MYDB",
  "schema": "PUBLIC"
}

Sample Output

{
  "status": "success",
  "rows": [
    {
      "FILE_NAME": "sales_data_20230502.csv",
      "STAGE_LOCATION": "@my_stage/sales/",
      "LAST_LOAD_TIME": "2023-05-02T10:15:30Z",
      "ROW_COUNT": 15000,
      "ROW_PARSED": 15000,
      "FILE_SIZE": 2500000,
      "FIRST_ERROR_MESSAGE": null,
      "FIRST_ERROR_LINE_NUMBER": 0,
      "FIRST_ERROR_CHARACTER_POS": 0,
      "FIRST_ERROR_COLUMN_NAME": null,
      "ERROR_COUNT": 0,
      "ERROR_LIMIT": 100,
      "STATUS": "LOADED",
      "TABLE_CATALOG_NAME": "MYDB",
      "TABLE_SCHEMA_NAME": "PUBLIC",
      "TABLE_NAME": "SALES",
      "PIPE_CATALOG_NAME": null,
      "PIPE_SCHEMA_NAME": null,
      "PIPE_NAME": null,
      "PIPE_RECEIVED_TIME": null
    },
    {
      "FILE_NAME": "sales_data_20230503.csv",
      "STAGE_LOCATION": "@my_stage/sales/",
      "LAST_LOAD_TIME": "2023-05-03T09:45:12Z",
      "ROW_COUNT": 14500,
      "ROW_PARSED": 14500,
      "FILE_SIZE": 2400000,
      "FIRST_ERROR_MESSAGE": null,
      "FIRST_ERROR_LINE_NUMBER": 0,
      "FIRST_ERROR_CHARACTER_POS": 0,
      "FIRST_ERROR_COLUMN_NAME": null,
      "ERROR_COUNT": 0,
      "ERROR_LIMIT": 100,
      "STATUS": "LOADED",
      "TABLE_CATALOG_NAME": "MYDB",
      "TABLE_SCHEMA_NAME": "PUBLIC",
      "TABLE_NAME": "SALES",
      "PIPE_CATALOG_NAME": null,
      "PIPE_SCHEMA_NAME": null,
      "PIPE_NAME": null,
      "PIPE_RECEIVED_TIME": null
    }
  ]
}

List databases

Lists the databases for which you have access privileges, including the dropped databases that are still within the Time Travel retention period.

Sample Input

{
  "auth": {},
  "warehouse": "MY_WAREHOUSE",
  "terse": false,
  "history": true,
  "like": "PROD%",
  "starts_with": "PROD",
  "limit": {
    "number": 5,
    "from": "PROD_DB1"
  }
}

Sample Output

{
  "status": "success",
  "rows": [
    {
      "created_on": "2023-05-15 09:30:22.123 -0700",
      "name": "PROD_DB1",
      "is_default": "N",
      "is_current": "N",
      "origin": "USER",
      "owner": "SYSADMIN",
      "comment": "Production database 1",
      "options": "RETENTION_TIME = 1",
      "retention_time": "1"
    },
    {
      "created_on": "2023-05-15 10:15:43.456 -0700",
      "name": "PROD_DB2",
      "is_default": "N",
      "is_current": "Y",
      "origin": "USER",
      "owner": "SYSADMIN",
      "comment": "Production database 2",
      "options": "RETENTION_TIME = 1",
      "retention_time": "1"
    },
    {
      "created_on": "2023-05-16 08:45:11.789 -0700",
      "name": "PROD_DB3",
      "is_default": "N",
      "is_current": "N",
      "origin": "USER",
      "owner": "SYSADMIN",
      "comment": "Production database 3",
      "options": "RETENTION_TIME = 1",
      "retention_time": "1"
    }
  ]
}

List files in stage

List the files currently in a stage.

Sample Input

{
  "auth": {
    "database": "SALES_DB",
    "schema": "PUBLIC"
  },
  "stage_name": "MY_STAGE",
  "path": "customer_data/",
  "pattern": ".*\\.csv$"
}

Sample Output

{
  "rows": [
    {
      "name": "customer_data/customers_2023_01.csv",
      "size": 1048576,
      "md5": "a1b2c3d4e5f6g7h8i9j0k1l2m3n4o5p6",
      "last_modified": "2023-01-15T10:30:00Z"
    },
    {
      "name": "customer_data/customers_2023_02.csv",
      "size": 2097152,
      "md5": "q7r8s9t0u1v2w3x4y5z6a7b8c9d0e1f2",
      "last_modified": "2023-02-15T11:45:00Z"
    },
    {
      "name": "customer_data/customers_2023_03.csv",
      "size": 3145728,
      "md5": "g3h4i5j6k7l8m9n0o1p2q3r4s5t6u7v8",
      "last_modified": "2023-03-15T09:15:00Z"
    }
  ],
  "status": "success"
}

List schemas

Lists the schemas for which you have access privileges, including the dropped schemas that are still within the Time Travel retention period.

Sample Input

{
  "auth": {
    "warehouse": "COMPUTE_WH"
  },
  "warehouse": "COMPUTE_WH",
  "terse": false,
  "history": true,
  "like": "SALES%",
  "in": {
    "database": "SNOWFLAKE_SAMPLE_DATA"
  },
  "starts_with": "SALES",
  "limit": {
    "number": 5,
    "from": "SALES_2022"
  }
}

Sample Output

{
  "status": "success",
  "rows": [
    {
      "is_current": "Y",
      "name": "SALES_2022",
      "is_default": "N",
      "retention_time": 1,
      "database_name": "SNOWFLAKE_SAMPLE_DATA",
      "owner": "SYSADMIN",
      "created_on": "2022-01-01 00:00:00.000 -0800",
      "options": "TRANSIENT = false",
      "comment": "Sales data for 2022"
    },
    {
      "is_current": "Y",
      "name": "SALES_2023",
      "is_default": "N",
      "retention_time": 1,
      "database_name": "SNOWFLAKE_SAMPLE_DATA",
      "owner": "SYSADMIN",
      "created_on": "2023-01-01 00:00:00.000 -0800",
      "options": "TRANSIENT = false",
      "comment": "Sales data for 2023"
    },
    {
      "is_current": "Y",
      "name": "SALES_ARCHIVE",
      "is_default": "N",
      "retention_time": 90,
      "database_name": "SNOWFLAKE_SAMPLE_DATA",
      "owner": "SYSADMIN",
      "created_on": "2021-01-01 00:00:00.000 -0800",
      "options": "TRANSIENT = false",
      "comment": "Archived sales data"
    },
    {
      "is_current": "N",
      "name": "SALES_2021",
      "is_default": "N",
      "retention_time": 1,
      "database_name": "SNOWFLAKE_SAMPLE_DATA",
      "owner": "SYSADMIN",
      "created_on": "2021-01-01 00:00:00.000 -0800",
      "options": "TRANSIENT = false",
      "comment": "Sales data for 2021 (dropped)"
    },
    {
      "is_current": "Y",
      "name": "SALES_FORECAST",
      "is_default": "N",
      "retention_time": 1,
      "database_name": "SNOWFLAKE_SAMPLE_DATA",
      "owner": "SYSADMIN",
      "created_on": "2023-06-01 00:00:00.000 -0700",
      "options": "TRANSIENT = true",
      "comment": "Sales forecast data"
    }
  ]
}

List stages

Lists the stages for which you have access privileges.

Sample Input

{
  "auth": {},
  "warehouse": "MY_WAREHOUSE",
  "in": {
    "database": "MY_DATABASE",
    "schema": "MY_SCHEMA"
  },
  "like": "STAGE_%"
}

Sample Output

{
  "status": "SUCCESS",
  "rows": [
    {
      "has_encryption_key": "false",
      "schema_name": "MY_SCHEMA",
      "name": "STAGE_1",
      "url": "s3://my-bucket/stage1/",
      "database_name": "MY_DATABASE",
      "owner": "ACCOUNTADMIN",
      "region": null,
      "cloud": null,
      "created_on": "2023-05-15T10:30:00.000Z",
      "has_credentials": "true",
      "type": "EXTERNAL",
      "comment": "External stage for data loading"
    },
    {
      "has_encryption_key": "true",
      "schema_name": "MY_SCHEMA",
      "name": "STAGE_2",
      "url": "azure://my-container/stage2/",
      "database_name": "MY_DATABASE",
      "owner": "SYSADMIN",
      "region": null,
      "cloud": null,
      "created_on": "2023-05-16T14:45:00.000Z",
      "has_credentials": "true",
      "type": "EXTERNAL",
      "comment": "Encrypted external stage for sensitive data"
    }
  ]
}

List table rows

Lists rows in one or more tables.

Sample Input

{
  "auth": {
    "warehouse": "COMPUTE_WH",
    "database": "SALES_DB",
    "schema": "PUBLIC"
  },
  "warehouse": "COMPUTE_WH",
  "table_database": "SALES_DB",
  "table_schema": "PUBLIC",
  "table": "CUSTOMERS",
  "distinct": false,
  "objects": [
    {
      "column_name": "CUSTOMER_ID",
      "column_alias": "ID"
    },
    {
      "column_name": "FIRST_NAME"
    },
    {
      "column_name": "LAST_NAME"
    },
    {
      "column_name": "EMAIL"
    }
  ],
  "query": "WHERE CUSTOMER_ID > 1000",
  "limit": 5
}

Sample Output

{
  "status": "success",
  "rows": [
    {
      "ID": 1001,
      "FIRST_NAME": "John",
      "LAST_NAME": "Doe",
      "EMAIL": "john.doe@example.com"
    },
    {
      "ID": 1002,
      "FIRST_NAME": "Jane",
      "LAST_NAME": "Smith",
      "EMAIL": "jane.smith@example.com"
    },
    {
      "ID": 1003,
      "FIRST_NAME": "Michael",
      "LAST_NAME": "Johnson",
      "EMAIL": "michael.johnson@example.com"
    },
    {
      "ID": 1004,
      "FIRST_NAME": "Emily",
      "LAST_NAME": "Brown",
      "EMAIL": "emily.brown@example.com"
    },
    {
      "ID": 1005,
      "FIRST_NAME": "David",
      "LAST_NAME": "Wilson",
      "EMAIL": "david.wilson@example.com"
    }
  ]
}
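The objects, query, and limit fields together describe an ordinary SELECT. A sketch of the statement such a payload corresponds to, assuming the field semantics visible in the sample (aliases become AS clauses, query is appended verbatim; the connector's actual SQL generation may differ):

```python
def select_sql(payload):
    """Render a list-table-rows payload into a SELECT statement (sketch)."""
    columns = []
    for obj in payload["objects"]:
        col = obj["column_name"]
        if "column_alias" in obj:
            col += f" AS {obj['column_alias']}"
        columns.append(col)
    sql = "SELECT "
    if payload.get("distinct"):
        sql += "DISTINCT "
    sql += ", ".join(columns) + f" FROM {payload['table']}"
    if payload.get("query"):
        sql += " " + payload["query"]  # e.g. a WHERE clause, appended verbatim
    if payload.get("limit"):
        sql += f" LIMIT {payload['limit']}"
    return sql
```

The sample input above would render as `SELECT CUSTOMER_ID AS ID, FIRST_NAME, LAST_NAME, EMAIL FROM CUSTOMERS WHERE CUSTOMER_ID > 1000 LIMIT 5`.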

List tables

Lists the tables for which you have access privileges, including the dropped tables that are still within the Time Travel retention period.

Sample Input

{
  "auth": {},
  "warehouse": "MY_WAREHOUSE",
  "terse": false,
  "history": true,
  "like": "SALES%",
  "in": {
    "database": "MY_DATABASE"
  },
  "starts_with": "SALES_",
  "limit": {
    "number": 5,
    "from": "SALES_2023"
  }
}

Sample Output

{
  "status": "SUCCESS",
  "rows": [
    {
      "schema_name": "PUBLIC",
      "bytes": 1048576,
      "name": "SALES_2023_Q1",
      "retention_time": "1",
      "database_name": "MY_DATABASE",
      "owner": "SALESADMIN",
      "kind": "TABLE",
      "created_on": "2023-01-01 00:00:00.000 -0800",
      "rows": 100000,
      "cluster_by": "ORDER_DATE",
      "comment": "Q1 sales data"
    },
    {
      "schema_name": "PUBLIC",
      "bytes": 2097152,
      "name": "SALES_2023_Q2",
      "retention_time": "1",
      "database_name": "MY_DATABASE",
      "owner": "SALESADMIN",
      "kind": "TABLE",
      "created_on": "2023-04-01 00:00:00.000 -0700",
      "rows": 150000,
      "cluster_by": "ORDER_DATE",
      "comment": "Q2 sales data"
    },
    {
      "schema_name": "PUBLIC",
      "bytes": 3145728,
      "name": "SALES_2023_Q3",
      "retention_time": "1",
      "database_name": "MY_DATABASE",
      "owner": "SALESADMIN",
      "kind": "TABLE",
      "created_on": "2023-07-01 00:00:00.000 -0700",
      "rows": 200000,
      "cluster_by": "ORDER_DATE",
      "comment": "Q3 sales data"
    }
  ]
}

List warehouses

Lists the warehouses for which you have access privileges.

Sample Input

{
  "auth": {},
  "like": "PROD%"
}

Sample Output

{
  "status": "SUCCESS",
  "rows": [
    {
      "name": "PROD_WAREHOUSE",
      "state": "STARTED",
      "type": "STANDARD",
      "size": "X-LARGE",
      "running": 5,
      "queued": 2,
      "is_default": "N",
      "is_current": "Y",
      "auto_suspend": 600,
      "auto_resume": "true",
      "available": "YES",
      "provisioning": "N",
      "quiescing": "N",
      "other": null,
      "created_on": "2023-05-01T10:00:00.000Z",
      "resumed_on": "2023-05-15T08:30:00.000Z",
      "updated_on": "2023-05-15T09:15:00.000Z",
      "owner": "SYSADMIN",
      "comment": "Production warehouse for data processing",
      "resource_monitor": "PROD_MONITOR",
      "actives": 3,
      "pendings": 1,
      "failed": 0,
      "suspended": 0,
      "uuid": "01a2b3c4-5d6e-7f8g-9h0i-jklmnopqrstu"
    },
    {
      "name": "PROD_REPORTING",
      "state": "SUSPENDED",
      "type": "STANDARD",
      "size": "LARGE",
      "running": 0,
      "queued": 0,
      "is_default": "N",
      "is_current": "N",
      "auto_suspend": 300,
      "auto_resume": "true",
      "available": "YES",
      "provisioning": "N",
      "quiescing": "N",
      "other": null,
      "created_on": "2023-04-15T14:30:00.000Z",
      "resumed_on": null,
      "updated_on": "2023-05-10T16:45:00.000Z",
      "owner": "REPORTADMIN",
      "comment": "Warehouse for generating reports",
      "resource_monitor": "REPORTING_MONITOR",
      "actives": 0,
      "pendings": 0,
      "failed": 0,
      "suspended": 1,
      "uuid": "98v7u6t5-4s3r-2q1p-0o9i-nmlkjihgfedcb"
    }
  ]
}

Load data to table

Load data from staged files into an existing table. The files must already be staged.

Sample Input

{
  "auth": {
    "database": "MYDB",
    "schema": "PUBLIC",
    "warehouse": "COMPUTE_WH"
  },
  "table": "CUSTOMERS",
  "warehouse": "COMPUTE_WH",
  "stage_type": {
    "internal": {
      "value": "my_stage",
      "path": "customer_data/"
    }
  },
  "file_format": {
    "type": {
      "csv": {
        "compression": "GZIP",
        "field_delimiter": ",",
        "skip_header": 1
      }
    }
  },
  "copy_options": {
    "on_error": "CONTINUE",
    "size_limit": 1000000000,
    "purge": false
  }
}

Sample Output

{
  "status": "LOADED",
  "file": "customer_data/customers_20230601.csv.gz",
  "rows_parsed": 10000,
  "rows_loaded": 9998,
  "error_limit": null,
  "errors_seen": 2,
  "first_error": "Invalid date format in column 'DATE_OF_BIRTH'",
  "first_error_line": null,
  "first_error_character": null,
  "first_error_column_name": null
}

Unload data from table

Unload data from a table (or query) into one or more files in a named internal stage, a named external stage, or an external location.

Sample Input

{
  "auth": {},
  "stage_type": {
    "internal": {
      "value": "my_internal_stage",
      "path": "unloaded_data/"
    }
  },
  "database": "MY_DATABASE",
  "schema": "MY_SCHEMA",
  "table": "CUSTOMERS",
  "warehouse": "MY_WAREHOUSE",
  "pattern": ".*\\.csv",
  "stage_copy_options": {
    "overwrite": true,
    "single": false,
    "max_file_size": 50000000
  },
  "stage_file_format": {
    "type": {
      "csv": {
        "compression": "GZIP",
        "record_delimiter": "\\n",
        "field_delimiter": ",",
        "file_extension": ".csv.gz",
        "date_format": "YYYY-MM-DD",
        "time_format": "HH24:MI:SS",
        "timestamp_format": "YYYY-MM-DD HH24:MI:SS",
        "binary_format": "HEX",
        "escape": "\\",
        "escape_unenclosed_field": "\\",
        "field_optionally_enclosed_by": "\"",
        "null_if": [
          "NULL",
          ""
        ]
      }
    }
  },
  "header": true
}

Sample Output

{
  "status": "SUCCESS",
  "rows_unloaded": 1000000,
  "input_bytes": 500000000,
  "output_bytes": 150000000
}

Update database

Modifies the properties for an existing database, including changing the name of the database and changing the Time Travel data retention period (if you are using Snowflake Enterprise Edition or higher).

Sample Input

{
  "auth": {},
  "database": "SALES_DB",
  "if_exists": true,
  "operation": {
    "set": {
      "data_retention_time_in_days": 90,
      "comment": "Updated sales database with 90-day data retention"
    }
  }
}

Sample Output

{
  "status": "success",
  "message": "Database SALES_DB updated successfully"
}

Update row in table

Updates specified rows in the target table with new values.

Sample Input

{
  "auth": {
    "warehouse": "COMPUTE_WH",
    "database": "SALES_DB",
    "schema": "PUBLIC"
  },
  "warehouse": "COMPUTE_WH",
  "table_database": "SALES_DB",
  "table_schema": "PUBLIC",
  "table": "CUSTOMERS",
  "set": [
    {
      "column": "EMAIL",
      "value": "'new.email@example.com'"
    },
    {
      "column": "LAST_UPDATED",
      "value": "CURRENT_TIMESTAMP()"
    }
  ],
  "where": "CUSTOMER_ID = 1001"
}

Sample Output

{
  "status": "success",
  "number of rows updated": 1,
  "number of multi-joined rows updated": 0
}

Update schema

Modifies the properties for an existing schema, including renaming the schema or swapping it with another schema, and changing the Time Travel data retention period (if you are using Snowflake Enterprise Edition or higher).

Sample Input

{
  "auth": {
    "database": "SALES_DB"
  },
  "schema": "CUSTOMER_DATA",
  "if_exists": true,
  "operation": {
    "set": {
      "data_retention_time_in_days": 30,
      "comment": "Updated schema for customer data analysis"
    }
  }
}

Sample Output

{
  "status": "success",
  "message": "Schema CUSTOMER_DATA updated successfully"
}

Update stage

Modifies the properties for an existing named internal or external stage.

Sample Input

{
  "auth": {
    "database": "MY_DATABASE",
    "schema": "MY_SCHEMA"
  },
  "stage": "MY_STAGE",
  "if_exists": true,
  "operation": {
    "set": {
      "type": {
        "aws_s3_bucket": {
          "bucket": "my-s3-bucket",
          "path": "data/",
          "credentials": {
            "aws_key_id": "AKIAIOSFODNN7EXAMPLE",
            "aws_secret_key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
          },
          "encryption": {
            "type": "AWS_SSE_S3"
          }
        }
      },
      "stage_file_format": {
        "type": {
          "csv": {
            "compression": "GZIP",
            "field_delimiter": ",",
            "skip_header": 1
          }
        }
      },
      "stage_copy_options": {
        "on_error": "CONTINUE",
        "size_limit": 1000000000,
        "purge": true
      },
      "comment": "Updated S3 stage for CSV files"
    }
  }
}

Sample Output

{
  "status": "success",
  "message": "Stage MY_STAGE updated successfully"
}

Update table

Modifies the properties, columns, or constraints for an existing table.

Sample Input

{
  "auth": {
    "database": "MY_DATABASE",
    "schema": "MY_SCHEMA"
  },
  "table": "CUSTOMERS",
  "operation": {
    "add_column": {
      "name": "LOYALTY_POINTS",
      "type": {
        "simple_type": "INTEGER"
      },
      "not_null": false,
      "default": {
        "default": "0"
      }
    }
  }
}

Sample Output

{
  "status": "success",
  "message": "Column 'LOYALTY_POINTS' added to table 'CUSTOMERS' successfully."
}

Update warehouse

Suspends or resumes a virtual warehouse, or aborts all queries (and other SQL statements) for a warehouse. Can also be used to rename or set/unset the properties for a warehouse.

Sample Input

{
  "auth": {},
  "warehouse": "MY_WAREHOUSE",
  "if_exists": true,
  "operation": {
    "set": {
      "comment": "Updated warehouse settings",
      "warehouse_size": "MEDIUM",
      "max_cluster_count": 3,
      "min_cluster_count": 1,
      "scaling_policy": "STANDARD",
      "auto_suspend": 300,
      "auto_resume": true,
      "resource_monitor": "MY_RESOURCE_MONITOR",
      "max_concurrency_level": 8,
      "statement_queued_timeout_in_seconds": 60,
      "statement_timeout_in_seconds": 3600
    }
  }
}

Sample Output

{
  "status": "success",
  "message": "Warehouse MY_WAREHOUSE updated successfully"
}

DDL operations

List table columns (DDL)

Note that DDL operations can only be called directly via the Connectors API, or when using CustomJS in the Embedded solution editor (e.g. for DDL-dependent data mapping).

Sample Input

{
  "auth": {}
}

Sample Output

[
  {
    "column_name": "id",
    "data_type": "NUMBER(38,0)",
    "is_nullable": "NO",
    "column_default": "NULL",
    "is_identity": "YES",
    "comment": "Unique identifier for the record"
  },
  {
    "column_name": "first_name",
    "data_type": "VARCHAR(50)",
    "is_nullable": "NO",
    "column_default": "NULL",
    "is_identity": "NO",
    "comment": "First name of the user"
  },
  {
    "column_name": "last_name",
    "data_type": "VARCHAR(50)",
    "is_nullable": "NO",
    "column_default": "NULL",
    "is_identity": "NO",
    "comment": "Last name of the user"
  },
  {
    "column_name": "email",
    "data_type": "VARCHAR(100)",
    "is_nullable": "NO",
    "column_default": "NULL",
    "is_identity": "NO",
    "comment": "Email address of the user"
  },
  {
    "column_name": "created_at",
    "data_type": "TIMESTAMP_NTZ(9)",
    "is_nullable": "NO",
    "column_default": "CURRENT_TIMESTAMP()",
    "is_identity": "NO",
    "comment": "Timestamp when the record was created"
  }
]