BigQuery
To write data into your BigQuery warehouse, Whaly needs additional configuration on the Google Cloud side. This guide details the necessary steps:
Creation of a Google Cloud Storage Bucket
Granting your service account permissions on the Storage Bucket
Steps where you need to paste information from the Google Cloud console into Whaly are called out below.
To connect BigQuery to Whaly, you need the following:
A Google Cloud Project
Billing enabled on your Google Cloud Project
Admin rights on the Google Cloud Console
A Service Account already configured in Whaly
In Google Cloud, Cloud Storage buckets are data repositories used to load or save data within Google Cloud. Whaly uses a Google Cloud Storage bucket to load data from its connectors into your BigQuery tables.
Go to the Cloud Storage section and open the Browser page.
Click on "Create Bucket"
Create the bucket with the following settings (a scripted alternative is sketched after this list):
Name: the name you want to use, e.g. "whaly-connectors-loading-deck"
Location type: We advise "Multi-region" for maximum reliability
Storage class: Standard
Access control: Fine grained
Protection tool: None
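If you prefer to script this step instead of using the console, here is a minimal sketch using the google-cloud-storage Python client. The project ID and bucket name are placeholders; it assumes your local credentials (e.g. application default credentials) are allowed to create buckets in the project.

```python
from google.cloud import storage

# Placeholder project ID; replace with your own Google Cloud project.
client = storage.Client(project="your-gcp-project-id")

# Example bucket name taken from the settings above.
bucket = client.bucket("whaly-connectors-loading-deck")
bucket.storage_class = "STANDARD"
# "Fine grained" access control means uniform bucket-level access is disabled.
bucket.iam_configuration.uniform_bucket_level_access_enabled = False

# "EU" and "US" are multi-region locations, matching the "Multi-region" advice above.
client.create_bucket(bucket, location="EU")
```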
You must give the service account (from the setup form) the Storage Admin permission on the bucket, so that it can read and write data in the bucket. A scripted alternative is sketched after these steps.
In your Google Cloud Console, go to Storage > Browser to see the list of buckets in your current project.
Select the bucket you want to use.
Go to Permissions and then click Add Members.
In the Add members window, enter the Whaly service account email.
From the Select a role dropdown, select Storage Admin.
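The same grant can be done programmatically. This is a sketch under assumptions: the project ID, bucket name, and Whaly service account email below are placeholders; use the service account email shown in the Whaly setup form.

```python
from google.cloud import storage

client = storage.Client(project="your-gcp-project-id")  # placeholder project ID
bucket = client.get_bucket("whaly-connectors-loading-deck")  # example bucket name

# Placeholder service account email; copy the real one from the Whaly setup form.
member = "serviceAccount:whaly-sa@your-gcp-project-id.iam.gserviceaccount.com"

# Add a Storage Admin binding for the service account on this bucket.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({"role": "roles/storage.admin", "members": {member}})
bucket.set_iam_policy(policy)
```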
In Whaly, go to the "Warehouse" page in the Settings and paste the gsutil URI value into the "GCS Bucket Name" field.
Location: the region in which your BigQuery data is stored (EU / US). Set the same value in the Whaly form.
You can find the gsutil URI in the details of the bucket you just created: open the Configuration tab and copy the gsutil URI into Whaly.
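If you'd rather look up these values with code, here is a minimal sketch (the project ID and bucket name are placeholders). The gsutil URI is simply gs://<bucket name>, and the bucket location should match the Location value you set in the Whaly form.

```python
from google.cloud import storage

client = storage.Client(project="your-gcp-project-id")  # placeholder project ID
bucket = client.get_bucket("whaly-connectors-loading-deck")  # example bucket name

# Value to paste into the "GCS Bucket Name" field in Whaly.
print("gsutil URI:", f"gs://{bucket.name}")
# Value to set as the Location in the Whaly form (e.g. EU or US).
print("Location:", bucket.location)
```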
That's it! Your BigQuery setup is ready to be used with Whaly connectors.