## Overview
Larger datasets may run into file size limitations set by BigQuery. In these instances, we recommend using the **Google BigQuery - Store Query Results in Google Cloud Storage** Blueprint.
## Variables
| Name | Reference | Type | Required | Default | Options | Description |
|---|---|---|---|---|---|---|
| Query | `BIGQUERY_QUERY` | Alphanumeric | ✅ | None | - | Standard SQL query to be executed against BigQuery. Legacy SQL is not supported. |
| Workflows File Name | `BIGQUERY_DESTINATION_FILE_NAME` | Alphanumeric | ✅ | None | - | Name of the file to be generated with the results. Should have a `.csv` extension. |
| Workflows Folder Name | `BIGQUERY_DESTINATION_FOLDER_NAME` | Alphanumeric | ➖ | None | - | Folder where the file should be downloaded. Leaving this blank places the file in the home directory. |
| Service Account | `GOOGLE_APPLICATION_CREDENTIALS` | Password | ✅ | None | - | JSON from a Google Cloud service account key. |
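As a sketch of how these inputs fit together, the hypothetical helpers below validate the destination file name and service account JSON the way the table describes. The function names and required-field list are illustrative assumptions, not part of the Blueprint itself.

```python
import json
import os


def resolve_destination(file_name, folder_name=""):
    """Build the output path as described by the variables above:
    an empty folder name places the file in the home directory."""
    if not file_name.endswith(".csv"):
        raise ValueError("destination file name should use a .csv extension")
    if folder_name:
        # Create the target folder if it does not already exist.
        os.makedirs(folder_name, exist_ok=True)
        return os.path.join(folder_name, file_name)
    return file_name


def load_service_account(raw_json):
    """Parse the service account key JSON and check for the fields a
    BigQuery client typically needs before attempting authentication."""
    key = json.loads(raw_json)
    for field in ("type", "project_id", "private_key", "client_email"):
        if field not in key:
            raise ValueError(f"service account key missing {field!r}")
    return key
```

For example, `resolve_destination("results.csv", "exports")` yields `exports/results.csv`, while passing a non-`.csv` name raises an error before any query runs.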
## YAML

Below is the YAML template:

```yaml
source:
  template: Google BigQuery - Download Query Results to Workflows
  inputs:
    BIGQUERY_QUERY:
    BIGQUERY_DESTINATION_FILE_NAME:
    BIGQUERY_DESTINATION_FOLDER_NAME:
    GOOGLE_APPLICATION_CREDENTIALS:
  type: TEMPLATE
guardrails:
  retry_count: 1
  retry_wait: 0h0m0s
  runtime_cutoff: 1h0m0s
  exclude_exit_code_ranges:
    - 101
    - 103
    - 104
    - 200
    - 203
    - 205
```
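A filled-in version might look like the sketch below. The query, file name, and folder name are placeholder values for illustration only; in practice the service account JSON should be supplied as a stored secret rather than written inline.

```yaml
source:
  template: Google BigQuery - Download Query Results to Workflows
  inputs:
    BIGQUERY_QUERY: SELECT * FROM `project.dataset.table` LIMIT 100
    BIGQUERY_DESTINATION_FILE_NAME: results.csv
    BIGQUERY_DESTINATION_FOLDER_NAME: exports
    GOOGLE_APPLICATION_CREDENTIALS: # reference a stored secret here
  type: TEMPLATE
```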