
Google BigQuery Connector Reference

Google BigQuery is a cloud data warehouse that enables customers to store and access their data while also offering tooling to run real-time analytics across datasets.

Before You Begin

To use Google BigQuery Connector, you must have a Google Workspace account. Connections from Composer to personal Google accounts are not permitted.

When you connect to your Google Workspace account, a message appears indicating that Composer is an unverified app. This behavior is expected. To bypass this message and use the connector, on the Google hasn’t verified this app dialog, click Advanced > Use anyway.

Connections

To connect to a system, whether it is a source of data or the target where you are sending data, you must create a connection with that system’s required credentials. You typically connect to different systems within a flow and can reuse each connection every time you need to connect.

This system requires the following credential information for its connections:

Connection Display Name

Enter a connection name that will help you remember the details of this connection. You can reuse a connection multiple times in a single flow, and in more than one flow, and the credentials are hidden after you create the connection. Therefore, it’s helpful to provide a name that easily distinguishes this connection from others.

Only you can see the connection. Other users in your Composer app must create their own connections.

Project ID

The project ID from Google BigQuery.

Username

The OAuth username that you use to log into Google BigQuery.

Password

The OAuth password that you use to log into Google BigQuery.

Data Types

The following data types are supported by Google BigQuery Connector:

  • INT64

  • BIGNUMERIC

  • BOOL

  • BYTES

  • DATE

  • DATETIME

  • FLOAT64

  • JSON

  • NUMERIC

  • STRING

  • TIME

  • TIMESTAMP

Triggers

Each flow starts with either a scheduler or a trigger. Schedulers are set to a default frequency of 15 minutes. Triggers, such as "each time a new record is created", are defined by you and provide multiple options related to the data in your system.

Triggers use a polling process to actively check for new updates from an external system. The polling frequency is 15 seconds.

This system provides the following trigger options:

On New Row Source

Polls every 15 seconds and returns all new rows of a table, based on a watermark. Provide a value for the following fields:

  • Dataset name

  • Table name

  • New Watermark Column: The following column data types are available for this field:

    • DATE

    • DATETIME
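Conceptually, watermark-based polling behaves like the following query, where the trigger remembers the highest watermark value it has seen and fetches only rows beyond it on the next poll. This is an illustrative sketch, not the connector's internal implementation; the dataset, table, and column names are placeholders:

```sql
-- Hypothetical sketch of a watermark poll (names are placeholders).
-- last_seen is the highest value of the watermark column from the
-- previous 15-second poll; only rows created after it are returned.
SELECT *
FROM my_dataset.orders
WHERE created_at > @last_seen   -- New Watermark Column (DATE or DATETIME)
ORDER BY created_at;
```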

On New or Updated Row Source

Polls every 15 seconds and returns all new or updated rows of a table, based on watermarks. Provide a value for the following fields:

  • Dataset name

  • Table name

  • New Watermark Column: The following column data types are available for this field:

    • DATE

    • DATETIME

  • Update Watermark Column: The following column data types are available for this field:

    • DATE

    • DATETIME

Polling Frequency

For triggers in this connector, the Polling Frequency field is required. This field enables you to select how often Composer makes a request to the underlying system API to look for changes to records in active flows. You can select a different polling frequency for each flow in your organization. With a shorter frequency, the flow executes more often; however, a shorter frequency consumes more API calls. For example, if you have an order management flow, you might require nearly real-time updates, which you can achieve with a 15-second polling interval (5760 API calls per day). In contrast, if your flow updates a pricing catalog, you could poll every hour (24 API calls per day).
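The API-call arithmetic above follows from dividing the seconds in a day by the polling interval, which you can verify with a trivial query in the BigQuery console:

```sql
-- API calls per day = seconds per day / polling interval in seconds
SELECT DIV(24 * 60 * 60, 15) AS calls_per_day_at_15s,  -- 5760
       DIV(24 * 60 * 60, 3600) AS calls_per_day_at_1h; -- 24
```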

Actions

After you connect to a system and define the trigger or configure a scheduler, you define the actions that the flow performs. For example, you can copy a record between different systems, such as Workday and Salesforce. The actions you can perform depend on which system you are connected to.

This system provides the following actions:

Insert Rows

Inserts one or more rows into a table. Provide a value for SQL string.
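A SQL string for this action might look like the following sketch; the dataset, table, and column names are illustrative, not part of the connector:

```sql
-- Insert two rows into a hypothetical customers table.
INSERT INTO my_dataset.customers (id, name, signup_date)
VALUES (101, 'Ada Lovelace', DATE '2024-01-15'),
       (102, 'Alan Turing', DATE '2024-01-16');
```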

Update Rows

Updates a set of rows. Provide a value for SQL string.
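For example, a SQL string for this action might resemble the following (the table and column names are illustrative):

```sql
-- Update the matching row in a hypothetical customers table.
UPDATE my_dataset.customers
SET name = 'Ada King'
WHERE id = 101;
```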

Delete Rows

Deletes a set of rows. Provide a value for SQL string.
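For example, a SQL string for this action might resemble the following (the table and column names are illustrative):

```sql
-- Delete all rows older than a cutoff date from a hypothetical table.
DELETE FROM my_dataset.customers
WHERE signup_date < DATE '2020-01-01';
```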

Select Rows

Retrieves a set of rows. Provide a value for SQL string.
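For example, a SQL string for this action might resemble the following (the table and column names are illustrative):

```sql
-- Retrieve recent rows from a hypothetical customers table.
SELECT id, name, signup_date
FROM my_dataset.customers
WHERE signup_date >= DATE '2024-01-01'
ORDER BY signup_date;
```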

Input Parameters

Query statements can include parameter values by adding : before the parameter name, such as Select field from dataset.table where field = :param1. You can define a value for the parameter in the Input Parameters section. For example, if you define param1 as 12345, the query statement is Select field from dataset.table where field = 12345.

You must define a value for the parameter as either a literal string or an output value from previous actions. Note that input parameter values of string or date data types must be enclosed in single quotes in the expression editor. For literal strings, add the single quotes to the value in the editor, such as '12345'. For an output value from previous actions, select the custom expression option, add the single quotes to the custom expression in the editor, and apply the changes, such as 'Field From Previous Step'.
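Putting the pieces together, a parameterized SQL string and its resolved form might look like the following sketch; the table, column, and parameter names are illustrative, and note the single quotes supplied around the string value:

```sql
-- SQL string entered in the action, with a :status parameter:
SELECT id, name
FROM my_dataset.customers
WHERE status = :status;

-- If the Input Parameters section defines status as 'active'
-- (single quotes included), the resolved statement is:
SELECT id, name
FROM my_dataset.customers
WHERE status = 'active';
```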

Google BigQuery™ is a trademark of Google LLC.