Private preview: This integration is available to select accounts. Reach out to your Immuta representative for details.
The Google BigQuery integration allows users to query policy protected data directly in BigQuery as secure views within an Immuta-created dataset. Immuta controls who can see what within the views, allowing data governors to create complex ABAC policies and data users to query the right data within the BigQuery console.
The Google BigQuery integration is configured through the Immuta console and a script provided by Immuta. While you can complete some steps within the BigQuery console, it is easiest to install using the gcloud CLI and the Immuta script.
Once Google BigQuery has been configured, data governors can start creating subscription and data policies to meet compliance requirements, and users can start querying policy-protected data directly in BigQuery:
Create a global subscription or supported data policy.
Revoke user access to the original datasets and grant users access to the Immuta-created datasets in BigQuery.
Users query data from the Immuta-created datasets directly in BigQuery.
What permissions will Immuta have in my BigQuery environment?
You can find a list of the permissions the custom Immuta role has here.
What integration features will Immuta support for BigQuery?
For private preview, Immuta supports a basic version of the BigQuery integration where Immuta can enforce specific policies on data in a single BigQuery project. At this time, workspaces, tag ingestion, user impersonation, native query audit, and multiple integrations are not supported.
In this policy push integration, Immuta creates views that contain all policy logic. Each view has a 1-to-1 relationship with the original table. Access controls are applied in the view, allowing customers to leverage Immuta’s powerful set of attribute-based policies and query data directly in BigQuery.
BigQuery is organized by projects (which can be thought of as databases), datasets (which can be compared to schemas), tables, and views. When you enable the integration, an Immuta dataset is created in BigQuery that contains the Immuta-required user entitlements information. These objects within the Immuta dataset are intended to only be used and altered by the Immuta application.
After data sources are registered, Immuta uses the custom user and role, created before the integration is enabled, to push the Immuta data sources as views into a mirrored dataset of the original table. Immuta manages grants on the created view to ensure only users subscribed to the Immuta data source will see the data.
The Immuta integration uses a mirrored dataset approach. That is, if the source dataset is named mydataset, Immuta will create a dataset named mydataset_secure, assuming that _secure is the specified Immuta dataset suffix. This mirrored dataset is an authorized dataset, allowing it to access the data of the original dataset. It will contain the Immuta-managed views, which have identical names to the original tables they're based on.
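For example, assuming a source dataset mydataset containing a table orders and the _secure suffix described above, a subscribed user would query the Immuta-managed view rather than the backing table. The sketch below uses the bq CLI; the project, dataset, and table names are placeholders.

```bash
# Query the Immuta-managed secure view in the mirrored dataset instead of the backing table.
# Project, dataset, and table names are placeholders.
bq query --use_legacy_sql=false \
  'SELECT * FROM `my-project.mydataset_secure.orders` LIMIT 10'
```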
Following the principle of least privilege, Immuta does not have permission to manage Google Cloud Platform users, specifically in granting or denying access to a project and its datasets. This means that data governors should limit user access to the original datasets to ensure data users access the data through the Immuta-created views rather than the backing tables. The only credentials that need access to the backing tables are those used to register the tables in Immuta.
Additionally, a data governor must grant users access to the mirrored datasets that Immuta will create and populate with views. Immuta and BigQuery’s best practice recommendation is to grant access via groups in Google Cloud Platform. Because users still must be registered in Immuta and subscribed to an Immuta data source to be able to query Immuta views, all Immuta users can be granted access to the mirrored datasets that Immuta creates.
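As one way to do this, the hedged sketch below uses the bq CLI to add a Google group to a mirrored dataset's access list; the project, dataset, and group names are placeholders, and granting access through the console works equally well.

```bash
# Illustrative sketch: grant a Google group read access to a mirrored dataset
# by editing the dataset's access entries (project, dataset, and group are placeholders).
bq show --format=prettyjson my-project:mydataset_secure > dataset.json

# Add an entry such as {"role": "READER", "groupByEmail": "data-users@example.com"}
# to the "access" array in dataset.json, then apply the change:
bq update --source dataset.json my-project:mydataset_secure
```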
This integration can only be enabled through a manual bootstrap using the Immuta API.
This integration can only work within a single region.
This integration supports the following policy types:
Column masking
Mask using hashing (SHA256())
Mask by making NULL
Mask using constant
Mask using a regular expression
Mask by date rounding
Mask by numeric rounding
Mask using custom functions
Row-level policies
Row visibility based on user attributes and/or object attributes
Only show rows that fall within a given time window
Minimize rows
Filter rows using custom WHERE clause
Always hide rows
See the resources below to start implementing and using the BigQuery integration:
Building global subscription and data policies to govern data
Creating projects to collaborate
Follow this guide to connect your Google BigQuery data warehouse to Immuta.
Immuta SaaS or Immuta v2023.1 or newer with the Google BigQuery integration (private preview) enabled.
Immuta role with SYSTEM_ADMIN permissions and an API key.
The Google BigQuery integration requires you to create a Google Cloud service account and role that will be used by Immuta to
create a Google BigQuery dataset that will be used to store a table of user entitlements, UDFs for policy enforcement, etc.
manage the table of user entitlements via updates when entitlements change in Immuta.
create datasets and secure views with access control policies enforced, which mirror tables inside of datasets you ingest as Immuta data sources.
You have two options to create the required Google Cloud service account and role:
The bootstrap.sh script is a shell script provided by Immuta that creates the prerequisite Google Cloud IAM objects for the integration to connect. When you run this script from your command line, it will create the following items:
A new Google Cloud IAM role
A new Google Cloud service account, which will be granted the newly-created role
A JSON keyfile for the newly-created service account
You will need to use the objects created in these steps to enable the Google BigQuery integration.
Google Cloud IAM roles required to run the script
To execute bootstrap.sh from your command line, you must be authenticated to the gcloud CLI utility as a user with all of the following roles:
roles/iam.roleAdmin
roles/iam.serviceAccountAdmin
roles/serviceusage.serviceUsageAdmin
These three roles are the least-privilege set of Google Cloud IAM roles required to successfully run the bootstrap.sh script from your command line. However, having either of the following Google Cloud IAM roles will also allow you to run the script:
roles/editor
roles/owner
Install gcloud.
Set the account property in the core section for Google Cloud CLI to the account gcloud should use for authentication. (You can run gcloud auth list to see your currently available accounts):
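For example (the account email is a placeholder):

```bash
# Authenticate to gcloud if you have not already, then list the available accounts.
gcloud auth login
gcloud auth list

# Set the account gcloud should use for authentication (replace with your own account email).
gcloud config set account admin@example.com
```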
In Immuta, navigate to the App Settings page and click the Integrations tab.
Click Add Native Integration and select Google BigQuery from the dropdown menu.
Click Select Authentication Method and select Key File.
Click Download Script(s).
Before you run the script, update your permissions to execute it:
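For example, assuming the script was downloaded to your working directory:

```bash
# Make the downloaded script executable (the path is a placeholder).
chmod +x ./bootstrap.sh
```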
Run the script (an illustrative invocation follows the parameter list below), where
PROJECT_ID is the Google Cloud Platform project to operate on.
ROLE_ID is the name of the custom role to create.
NAME is the name of the service account to create.
OUTPUT_FILE is the path where the resulting private key should be written. File system write permission will be checked on the specified path prior to the key creation.
undelete-role (optional) will undelete the custom role from the project. Roles that have been deleted for a long time can't be undeleted. This option can fail for the following reasons:
The role specified does not exist.
The active user does not have permission to access the given role.
enable-api (optional) enables the Google BigQuery API service, provided you have been granted access to enable it.
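An illustrative invocation is sketched below. The exact flag syntax is defined by the downloaded script, so treat the flag names as placeholders and check the script's usage output before running it.

```bash
# Flag names below are placeholders; consult the script's usage/help output for the exact syntax.
./bootstrap.sh \
  --project <PROJECT_ID> \
  --role <ROLE_ID> \
  --service-account-name <NAME> \
  --output-file <OUTPUT_FILE>
```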
Alternatively, you may use the Google Cloud Console to create the prerequisite role, service account, and private key file for the integration to connect to Google BigQuery.
Create a custom role using the console with the following privileges (a gcloud sketch of the equivalent command follows this list):
bigquery.datasets.create
bigquery.datasets.delete
bigquery.datasets.get
bigquery.datasets.update
bigquery.jobs.create
bigquery.jobs.get
bigquery.jobs.list
bigquery.jobs.listAll
bigquery.routines.create
bigquery.routines.delete
bigquery.routines.get
bigquery.routines.list
bigquery.routines.update
bigquery.tables.create
bigquery.tables.delete
bigquery.tables.export
bigquery.tables.get
bigquery.tables.getData
bigquery.tables.list
bigquery.tables.setCategory
bigquery.tables.update
bigquery.tables.updateData
bigquery.tables.updateTag
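If you prefer the command line to the console, a hedged equivalent with gcloud might look like the following; the role ID, title, and project are placeholders.

```bash
# Create the custom role with the privileges listed above (role ID and project are placeholders).
gcloud iam roles create immuta_bq_role --project=my-project \
  --title="Immuta BigQuery" \
  --permissions=bigquery.datasets.create,bigquery.datasets.delete,bigquery.datasets.get,bigquery.datasets.update,bigquery.jobs.create,bigquery.jobs.get,bigquery.jobs.list,bigquery.jobs.listAll,bigquery.routines.create,bigquery.routines.delete,bigquery.routines.get,bigquery.routines.list,bigquery.routines.update,bigquery.tables.create,bigquery.tables.delete,bigquery.tables.export,bigquery.tables.get,bigquery.tables.getData,bigquery.tables.list,bigquery.tables.setCategory,bigquery.tables.update,bigquery.tables.updateData,bigquery.tables.updateTag
```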
Create a service account and grant it the custom role you just created.
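A hedged gcloud sketch of this step is shown below; the service account name, project, role ID, and key file path are placeholders.

```bash
# Create the service account (names are placeholders).
gcloud iam service-accounts create immuta-bigquery \
  --display-name="Immuta BigQuery integration"

# Grant it the custom role created above.
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:immuta-bigquery@my-project.iam.gserviceaccount.com" \
  --role="projects/my-project/roles/immuta_bq_role"

# Generate the JSON key file you will upload to Immuta when enabling the integration.
gcloud iam service-accounts keys create immuta-keyfile.json \
  --iam-account=immuta-bigquery@my-project.iam.gserviceaccount.com
```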
Once the Google Cloud IAM custom role and service account are created, you can enable the Google BigQuery integration. This section illustrates how to enable the integration on the Immuta app settings page. To configure this integration via the Immuta API, see the Configure a Google BigQuery integration API guide.
In Immuta, navigate to the App Settings page and click the Integrations tab.
Click Add Native Integration and select Google BigQuery from the dropdown menu.
Click Select Authentication Method and select Key File.
Upload your GCP Service Account Key File. This is the private key file generated in create a Google Cloud service account and role for Immuta to use to connect to Google BigQuery. Uploading this file will auto-populate the following fields:
Project Id: The Google Cloud Platform project to operate on, where your Google BigQuery data warehouse is located. A new dataset will be provisioned in this Google BigQuery project to store the integration configuration.
Service Account: The service account you created in create a Google Cloud service account and role for Immuta to use to connect to Google BigQuery.
Complete the following fields:
Immuta Dataset: The name of the Google BigQuery dataset to provision inside of the project. Important: If you are using multiple environments in the same Google BigQuery project, this dataset name must be unique across environments.
Immuta Role: The custom role you created in create a Google Cloud service account and role for Immuta to use to connect to Google BigQuery.
Dataset Suffix: The suffix that will be appended to the name of each mirrored dataset created to store secure views; one such dataset is created for each dataset containing tables you register as Immuta data sources. Important: If you are using multiple environments in the same Google BigQuery project, this suffix must be unique across environments.
GCP Location: The dataset's location. After a dataset is created, the location can't be changed. Note that if you choose EU for the dataset location, your Core BigQuery Customer Data resides in the EU.
Click Test Google BigQuery Integration.
Click Save.
GCP location must match dataset region
The region set for the GCP location must match the region of your datasets. Set GCP location to a general region (for example, US) to include child regions.
You can disable the Google BigQuery integration automatically or manually.
Click the App Settings icon, and then click the Integrations tab.
Select the Google BigQuery integration you would like to disable, and select the Disable Integration checkbox.
Click Save.
The privileges required to run the cleanup script are the same as the Google Cloud IAM roles required to run the bootstrap.sh script.
Click the App Settings icon, and then click the Integrations tab.
Select the Google BigQuery integration you would like to disable, and click Download Scripts.
Click Save. Wait until Immuta has finished saving your configuration changes before proceeding.
Before you run the script, update your permissions to execute it:
Run the cleanup script.
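As with the bootstrap script, a minimal sketch of these two steps follows; the file name is a placeholder for whatever Download Scripts provides, and the script may require arguments, so check its usage output.

```bash
# File name is a placeholder; use the cleanup script downloaded from the app settings page.
chmod +x ./cleanup.sh
./cleanup.sh
```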
Build global subscription policies and data policies
Create projects to securely collaborate on analytical workloads
The table below provides definitions for each status and the state of configured data platform integrations. The status of the integration appears on the integrations tab of the Immuta application settings page and in the response schema of the integrations API.
If any errors occur with the integration configuration, a banner will appear in the Immuta UI with guidance for remediating the error.
| Status | Description | State |
|---|---|---|
| createError | Error occurred during creation of the integration. | |
| creating | Integration is in the process of being created and set up. | |
| deleted | Integration is deleted. | Not in use |
| deleteError | Error occurred while deleting the integration. The integration has been rolled back to the previous state. | |
| deleting | Integration is in the process of being disabled or deleted. | |
| disabled | Integration was force disabled and no cleanup was performed on the native platform. | Not in use |
| editError | Error occurred while editing the integration. The integration has been rolled back to the previous state. | |
| editing | The integration is in the process of being edited. | |
| enabled | The integration is enabled and active. | |
| migrateError | Error occurred while performing a migration of the integration. The integration has been rolled back to the previous state. | |
| migrating | Migration is being performed on the integration. An example of a migration is a stored procedure update. | |
| recurringValidationError | Validation has failed during the periodic check and the integration may be misconfigured. | |