Immuta Documentation - 2025.1

Copyright © 2014-2025 Immuta Inc. All rights reserved.

Integration Configuration Payload

The parameters for configuring an integration in Immuta are outlined in the table below.

| Parameter | Description | Required or optional | Default values | Accepted values |
| --- | --- | --- | --- | --- |
| `type` string | The type of integration to configure. | Required | - | Trino |
| `autoBootstrap` boolean | - | Required for all integrations except Starburst (Trino) | - | `true` or `false` |
| `config` object | The integration-specific configuration object. See the sections below for the child parameters of each integration type. | Required for all integrations except Starburst (Trino) | - | - |
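As an illustration of the table above, the top-level payload wraps the integration-specific `config` object. The sketch below uses placeholder values only; for Starburst (Trino), neither `autoBootstrap` nor `config` is required.

```python
# Top-level integration payload shapes (illustrative placeholders only).

# Starburst (Trino): autoBootstrap and config are not required.
trino_payload = {"type": "Trino"}

# Other integration types additionally require autoBootstrap and config.
# "<integration type>" and the empty config are placeholders, not real values.
generic_payload = {
    "type": "<integration type>",
    "autoBootstrap": True,
    "config": {},  # see the integration-specific sections below
}

# `type` is required in every payload.
assert "type" in trino_payload and "type" in generic_payload
```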

Amazon S3 configuration object

The config object configures the S3 integration. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
| --- | --- | --- | --- | --- |
| `name` string | A name for the integration that is unique across all Amazon S3 integrations configured in Immuta. | Required | - | - |
| `awsAccountId` string | The ID of your AWS account. | Required | - | - |
| `awsRegion` string | The AWS region to use. | Required | - | Any valid AWS region (`us-east-1`, for example) |
| `awsLocationRole` string | The AWS IAM role ARN assigned to the base access grants location. When a grantee accesses S3 data, the AWS Access Grants service attaches session policies and assumes this role to vend scoped-down credentials to the grantee. This role needs full access to all paths under the S3 location prefix. | Required | - | - |
| `awsLocationPath` string | The base S3 location prefix that Immuta will use for this connection when registering S3 data sources. This path must be unique across all S3 integrations configured in Immuta. | Required | - | - |
| `awsRoleToAssume` string | The AWS IAM role ARN Immuta assumes when interacting with AWS. | Required when `authenticationType` is `auto` | [] | - |
| `authenticationType` string | The method used to authenticate with AWS when configuring the S3 integration. | Required | - | `auto`, `accessKey` |
| `awsAccessKeyId` string | The AWS access key ID for the AWS account configuring the integration. | Required when `authenticationType` is `accessKey` | - | - |
| `awsSecretAccessKey` string | The AWS secret access key for the AWS account configuring the integration. | Required when `authenticationType` is `accessKey` | - | - |
| `port` number | The port to use when connecting to your S3 Access Grants instance. | Optional | 443 | 0-65535 |
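Putting the table together, here is a sketch of an S3 `config` object. Every value is a placeholder (the account ID, role ARNs, and path are not real resources), and the helper function simply restates the table's conditional requirements.

```python
# Illustrative S3 config object; all values are placeholders.
s3_config = {
    "name": "example-s3-integration",
    "awsAccountId": "123456789012",
    "awsRegion": "us-east-1",
    "awsLocationRole": "arn:aws:iam::123456789012:role/access-grants-location",
    "awsLocationPath": "s3://example-bucket/immuta",
    "authenticationType": "auto",
    # awsRoleToAssume is required because authenticationType is "auto":
    "awsRoleToAssume": "arn:aws:iam::123456789012:role/immuta-assume",
    "port": 443,  # optional; 443 is the default
}

def missing_s3_keys(cfg):
    # The conditional requirements from the table, expressed as a check.
    required = {"name", "awsAccountId", "awsRegion", "awsLocationRole",
                "awsLocationPath", "authenticationType"}
    if cfg.get("authenticationType") == "auto":
        required.add("awsRoleToAssume")
    elif cfg.get("authenticationType") == "accessKey":
        required |= {"awsAccessKeyId", "awsSecretAccessKey"}
    return required - cfg.keys()

assert missing_s3_keys(s3_config) == set()
```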

Azure Synapse Analytics configuration objects

The config object configures the Azure Synapse Analytics integration. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
| --- | --- | --- | --- | --- |
| `host` string | The URL of your Azure Synapse Analytics account. | Required | - | Valid URL hostnames |
| `database` string | Name of an existing database where the Immuta system user will store all Immuta-generated schemas and views. | Required | - | - |
| `schema` string | Name of the Immuta-managed schema where all your secure views will be created and stored. | Required | - | - |
| `authenticationType` string | The method used to authenticate with Azure Synapse Analytics when configuring the integration. | Required | - | `userPassword` |
| `username` string | The username of the system account that can act on Azure Synapse Analytics objects and configure the integration. | Required | - | - |
| `password` string | The password of the system account that can act on Azure Synapse Analytics objects and configure the integration. | Required | - | - |
| `port` number | The port to use when connecting to your Azure Synapse Analytics account host. | Optional | 1433 | 0-65535 |
| `connectArgs` string | The connection string arguments to pass to the ODBC driver when connecting as the Immuta system user. | Optional | - | - |
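A sketch of the Azure Synapse Analytics `config` object with placeholder values (the host, database, schema, and credentials are illustrative):

```python
# Illustrative Azure Synapse Analytics config object.
synapse_config = {
    "host": "example-workspace.sql.azuresynapse.net",  # placeholder hostname
    "database": "immuta_metadata",   # existing database for Immuta-generated schemas and views
    "schema": "immuta_views",        # Immuta-managed schema for secure views
    "authenticationType": "userPassword",  # the only accepted value
    "username": "immuta_system",
    "password": "<secret>",
    "port": 1433,                    # optional; 1433 is the default
}

required = {"host", "database", "schema", "authenticationType", "username", "password"}
assert required <= synapse_config.keys()
```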

Azure Synapse Analytics impersonation object

The impersonation object enables and defines roles for user impersonation for Azure Synapse Analytics. The table below outlines its child parameters.

| Parameter | Description | Default values | Accepted values |
| --- | --- | --- | --- |
| `enabled` boolean | When `true`, enables user impersonation. | `false` | `true` or `false` |
| `role` string | The name of the user impersonation role. | IMMUTA_IMPERSONATION | - |

Delete Azure Synapse Analytics payload

If autoBootstrap was set to true when you configured the integration, the credentials you used to configure it are required in the payload when deleting the integration. If autoBootstrap was set to false, no payload is required.

| Parameter | Description | Required or optional | Accepted values |
| --- | --- | --- | --- |
| `authenticationType` string | The type of authentication used when originally configuring the Azure Synapse Analytics integration. | Required | `userPassword` |
| `username` string | The username of the system account that configured the integration. | Required if `autoBootstrap` was `true` when setting up the integration | - |
| `password` string | The password of the system account that configured the integration. | Required if `autoBootstrap` was `true` when setting up the integration | - |

Metadata delimiters object

The metadataDelimiters object specifies the delimiters that Immuta uses to store profile data in Azure Synapse Analytics. The table below outlines its child parameters.

| Parameter | Description | Default values | Accepted values |
| --- | --- | --- | --- |
| `hashDelimiter` string | A delimiter used to separate key-value pairs. | `\|` | - |
| `hashKeyDelimiter` string | A delimiter used to separate a key from its value. | `:` | - |
| `arrayDelimiter` string | A delimiter used to separate array elements. | `,` | - |
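To make the delimiters concrete, the sketch below serializes a profile record using illustrative delimiter choices (a pipe, a colon, and a comma); the exact storage format Immuta uses is not specified here.

```python
# Illustrative delimiter values; the record and format are examples only.
delimiters = {"hashDelimiter": "|", "hashKeyDelimiter": ":", "arrayDelimiter": ","}

profile = {"groups": ["analysts", "hr"], "country": "US"}

def serialize(record, d):
    # Arrays are joined first, then each key/value pair, then the pairs.
    pairs = []
    for key, value in record.items():
        if isinstance(value, list):
            value = d["arrayDelimiter"].join(value)
        pairs.append(f"{key}{d['hashKeyDelimiter']}{value}")
    return d["hashDelimiter"].join(pairs)

assert serialize(profile, delimiters) == "groups:analysts,hr|country:US"
```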

Databricks Unity Catalog configuration objects

The config object configures the Databricks Unity Catalog integration. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
| --- | --- | --- | --- | --- |
| `port` number | The port to use when connecting to your Databricks account host. | Optional | 443 | 0-65535 |
| `workspaceUrl` string | Databricks workspace URL. For example, `my-workspace.cloud.databricks.com`. | Required | - | - |
| `httpPath` string | The HTTP path of your Databricks cluster or SQL warehouse. | Required | - | - |
| `authenticationType` string | The type of authentication to use when connecting to Databricks. | Required | - | `token`, `oAuthM2M` |
| `token` string | The Databricks personal access token. This is the access token for the Immuta service principal. | Required if `authenticationType` is `token` | - | - |
| `catalog` string | The name of the Databricks catalog Immuta will create to store internal entitlements and other user data specific to Immuta. This catalog will only be readable for the Immuta service principal and should not be granted to other users. The catalog name may only contain letters, numbers, and underscores and cannot start with a number. | Optional | immuta | - |
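A sketch of the Databricks Unity Catalog `config` object; the workspace URL, HTTP path, and token are placeholders. The regex check mirrors the catalog-name rule from the table.

```python
import re

# Illustrative Databricks Unity Catalog config object; all values are placeholders.
databricks_config = {
    "port": 443,                                         # optional; 443 is the default
    "workspaceUrl": "my-workspace.cloud.databricks.com",
    "httpPath": "/sql/1.0/warehouses/0000000000000000",  # placeholder warehouse path
    "authenticationType": "token",
    "token": "<personal-access-token>",  # required because authenticationType is "token"
    "catalog": "immuta",                 # optional; "immuta" is the default
}

# Catalog names may only contain letters, numbers, and underscores,
# and cannot start with a number.
catalog_name = re.compile(r"[A-Za-z_][A-Za-z0-9_]*")
assert catalog_name.fullmatch(databricks_config["catalog"])
```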

Additional workspace connections

The additionalWorkspaceConnections array allows you to configure additional workspace connections for your Databricks Unity Catalog integration. The table below outlines its child attributes.

| Attribute | Description | Required or optional | Default values | Accepted values |
| --- | --- | --- | --- | --- |
| `workspaceUrl` string | Databricks workspace URL. For example, `my-workspace.cloud.databricks.com`. | Required | - | - |
| `httpPath` string | The HTTP path of the compute for the workspace. | Required | - | - |
| `authenticationType` string | The type of authentication to use when connecting to Databricks. The additional workspace credentials will be used when processing objects in bound catalogs that are not accessible via the default workspace. | Required | - | `token`, `oAuthM2M` |
| `token` string | The Databricks personal access token. This is the access token for the Immuta service principal. The additional workspace credentials will be used when processing objects in bound catalogs that are not accessible via the default workspace. | Required if `authenticationType` is `token` | - | - |
| `catalogs` string | The name of the catalog to use for this additional workspace connection. The catalog name may only contain letters, numbers, and underscores and cannot start with a number. Users may configure one additional workspace connection per catalog. Users may still bind a catalog to more than one workspace in Databricks, as long as there is only one additional workspace connection in Immuta, as Immuta requires a single connection from which to control the catalog. | Required | - | - |
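A sketch of the `additionalWorkspaceConnections` array (the URLs, paths, and tokens are placeholders); the uniqueness check reflects the one-connection-per-catalog rule described above.

```python
# Illustrative additional workspace connections; all values are placeholders.
additional_workspace_connections = [
    {
        "workspaceUrl": "second-workspace.cloud.databricks.com",
        "httpPath": "/sql/1.0/warehouses/1111111111111111",
        "authenticationType": "token",
        "token": "<personal-access-token>",
        "catalogs": "sales",    # catalog controlled through this workspace
    },
    {
        "workspaceUrl": "third-workspace.cloud.databricks.com",
        "httpPath": "/sql/1.0/warehouses/2222222222222222",
        "authenticationType": "token",
        "token": "<personal-access-token>",
        "catalogs": "finance",
    },
]

# One additional workspace connection per catalog.
names = [c["catalogs"] for c in additional_workspace_connections]
assert len(names) == len(set(names))
```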

Databricks Unity Catalog audit object

The audit object enables Databricks Unity Catalog query audit. The table below outlines its child parameter.

| Parameter | Description | Default values | Accepted values |
| --- | --- | --- | --- |
| `enabled` boolean | This setting enables or disables Databricks Unity Catalog query audit. | `false` | `true` or `false` |

Group pattern object

The object excludes the listed group from having data policies applied in the Databricks Unity Catalog integration. The table below outlines its child parameter.

| Parameter | Description | Default values | Accepted values |
| --- | --- | --- | --- |
| `deny` string | The name of a group in Databricks that will be excluded from having data policies applied. This account-level group should be used for privileged users and service accounts that require an unmasked view of data. | immuta_exemption_group | - |

Databricks Unity Catalog proxy options object

The proxyOptions object represents your proxy server configuration in Databricks Unity Catalog. The table below outlines the object's child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
| --- | --- | --- | --- | --- |
| `host` string | The hostname of the proxy server. | Required | - | Valid URL hostnames |
| `port` number | The port to use when connecting to your proxy server. | Optional | 443 | 0-65535 |
| `username` string | The username to use with the proxy server. | Optional | [] | - |
| `password` string | The password to use with the proxy server. | Optional | [] | - |

Databricks Unity Catalog OAuth configuration object

The table below outlines the child parameters of the OAuth configuration object for Databricks Unity Catalog.

| Parameter | Description | Required or optional | Default values | Accepted values |
| --- | --- | --- | --- | --- |
| `clientId` string | The client identifier of the Immuta service principal you configured. This is the client ID displayed in Databricks when creating the client secret for the service principal. | Required | - | - |
| `authorityUrl` string | Authority URL of your identity provider. | Required | `https://<your workspace name>.cloud.databricks.com/oidc/v1/token` | - |
| `scope` | - | Optional | [] | - |
| `clientSecret` string | The client secret created in Databricks for the Immuta service principal. | Required | - | - |
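A sketch of the OAuth configuration object for Databricks Unity Catalog; the client ID and secret are placeholders, and the authority URL follows the default pattern from the table.

```python
# Illustrative OAuth configuration object; all values are placeholders.
oauth_client_config = {
    "clientId": "0a1b2c3d-0000-0000-0000-000000000000",  # placeholder client ID
    "authorityUrl": "https://my-workspace.cloud.databricks.com/oidc/v1/token",
    "scope": [],                 # optional; [] is the default
    "clientSecret": "<client-secret>",
}

# The default authority URL pattern ends with the workspace token endpoint.
assert oauth_client_config["authorityUrl"].endswith("/oidc/v1/token")
```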

Google BigQuery configuration object

The config object configures the Google BigQuery integration. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
| --- | --- | --- | --- | --- |
| `role` string | Google Cloud role used to connect to Google BigQuery. | Required | - | - |
| `datasetSuffix` string | Suffix appended to the name of each dataset created to store secure views. This string must start with an underscore. | Required | - | - |
| `dataset` string | Name of the BigQuery dataset to provision inside of the project for Immuta metadata storage. | Optional | immuta | - |
| `location` string | The dataset's location. After a dataset is created, the location can't be changed. | Required | - | Any valid GCP location (`us-east1`, for example) |
| `credential` string | - | Required | - | - |
| `port` number | The port to use when connecting to your BigQuery account host. | Optional | 443 | 0-65535 |
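A sketch of the Google BigQuery `config` object; the role, credential, and location are placeholders. The assertion restates the underscore rule for `datasetSuffix`.

```python
# Illustrative Google BigQuery config object; all values are placeholders.
bigquery_config = {
    "role": "immuta-connect-role",   # placeholder Google Cloud role
    "datasetSuffix": "_secure",      # must start with an underscore
    "dataset": "immuta",             # optional; "immuta" is the default
    "location": "us-east1",          # cannot be changed after the dataset is created
    "credential": "<service-account-credential>",  # placeholder
    "port": 443,                     # optional; 443 is the default
}

assert bigquery_config["datasetSuffix"].startswith("_")
```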

Redshift configuration objects

The config object configures the Redshift integration. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
| --- | --- | --- | --- | --- |
| `host` string | The URL of your Redshift account. | Required | - | Valid URL hostnames |
| `database` string | Name of a new empty database that the Immuta system user will manage and store metadata in. | Required | - | - |
| `initialDatabase` string | Name of the existing database in Redshift that Immuta initially connects to in order to create the Immuta-managed database. | Required if `autoBootstrap` is `true` | - | - |
| `authenticationType` string | The type of authentication to use when connecting to Redshift. | Required | - | `userPassword`, `accessKey` |
| `username` string | The username of the system account that can act on Redshift objects and configure the integration. | Required if you selected `userPassword` as your `authenticationType` | - | - |
| `password` string | The password of the system account that can act on Redshift objects and configure the integration. | Required if you selected `userPassword` as your `authenticationType` | - | - |
| `databaseUser` string | The Redshift database username. | Required if you selected `accessKey` as your `authenticationType` | - | - |
| `accessKeyId` string | The Redshift access key ID. | Required if you selected `accessKey` as your `authenticationType` | - | - |
| `secretKey` string | The Redshift secret key. | Required if you selected `accessKey` as your `authenticationType` | - | - |
| `sessionToken` string | The Redshift session token. | Optional if you selected `accessKey` as your `authenticationType` | - | - |
| `port` number | The port to use when connecting to your Redshift account host. | Optional | 5439 | 0-65535 |
| `connectArgs` string | The connection string arguments to pass to the Node.js driver when connecting as the Immuta system user. | Optional | - | - |
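A sketch of the Redshift `config` object using `userPassword` authentication; the host, databases, and credentials are placeholders. The helper maps each `authenticationType` to the credential parameters the table requires.

```python
# Illustrative Redshift config object; all values are placeholders.
redshift_config = {
    "host": "example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    "database": "immuta_db",        # new empty database for Immuta metadata
    "initialDatabase": "dev",       # required only when autoBootstrap is true
    "authenticationType": "userPassword",
    "username": "immuta_system",
    "password": "<secret>",
    "port": 5439,                   # optional; 5439 is the default
}

def redshift_auth_keys(auth_type):
    # Credential parameters implied by each authenticationType in the table.
    return {
        "userPassword": {"username", "password"},
        "accessKey": {"databaseUser", "accessKeyId", "secretKey"},
    }[auth_type]

assert redshift_auth_keys(redshift_config["authenticationType"]) <= redshift_config.keys()
```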

Delete Redshift integration payload

If autoBootstrap was set to true when you configured the integration, the authentication type and credentials you used to configure it are required in the payload when deleting the integration. If autoBootstrap was set to false, no payload is required.

| Parameter | Description | Required or optional | Accepted values |
| --- | --- | --- | --- |
| `authenticationType` string | The type of authentication used when originally configuring the Redshift integration. | Required if `autoBootstrap` was `true` when setting up the integration | `userPassword`, `accessKey` |
| `username` string | The username of the system account that configured the integration. | Required if you selected `userPassword` as your `authenticationType` | - |
| `password` string | The password of the system account that configured the integration. | Required if you selected `userPassword` as your `authenticationType` | - |
| `databaseUser` string | The Redshift database username. | Required if you selected `accessKey` as your `authenticationType` | - |
| `accessKeyId` string | The Redshift access key ID. | Required if you selected `accessKey` as your `authenticationType` | - |
| `secretKey` string | The Redshift secret key. | Required if you selected `accessKey` as your `authenticationType` | - |
| `sessionToken` string | The Redshift session token. | Optional if you selected `accessKey` as your `authenticationType` | - |

Redshift impersonation object

The impersonation object enables and defines roles for user impersonation for Redshift. The table below outlines its child parameters.

| Parameter | Description | Default values | Accepted values |
| --- | --- | --- | --- |
| `enabled` boolean | When `true`, enables user impersonation. | `false` | `true` or `false` |
| `role` string | The name of the user impersonation role. | immuta_impersonation | - |

Snowflake configuration objects

The config object configures the Snowflake integration. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
|---|---|---|---|---|
| host string | The URL of your Snowflake account. | Required | - | Valid URL hostnames |
| warehouse string | The default pool of compute resources the Immuta system user will use to run queries and perform other Snowflake operations. | Required | - | - |
| database string | Name of a new, empty database that the Immuta system user will manage and store metadata in. | Required | - | - |
| authenticationType string | The type of authentication to use when connecting to Snowflake. | Required | - | userPassword, keyPair, or oAuthClientCredentials |
| username string | The username of a Snowflake account that can act on Snowflake objects and configure the integration. | Required if you selected userPassword as your authenticationType. | - | - |
| password string | The password of a Snowflake account that can act on Snowflake objects and configure the integration. | Required if you selected userPassword as your authenticationType. | - | - |
| privateKey string | The private key. If you paste the key directly into a JSON payload, replace each newline in the key with the characters "\n" (a backslash followed by n). If you supply the key through another means, such as a Python script, do not add the "\n" escapes. | Required if you selected keyPair as your authenticationType. | - | - |
| oAuthClientConfig object | This object represents your OAuth configuration. To use this authentication method, autoBootstrap must be false. See the Snowflake OAuth configuration object description for parameters. | Required if you selected oAuthClientCredentials as your authenticationType. | - | - |
| role string | The privileged Snowflake role used by the Immuta system account when configuring the Snowflake integration. | Required when autoBootstrap is true. | - | - |
| port number | The port to use when connecting to your Snowflake account host. | Optional | 443 | 0-65535 |
| audit object | This object enables Snowflake query audit. Disabled by default. See the audit object description for the parameter. | Optional | - | - |
| impersonation object | Enables user impersonation. Disabled by default. See the Snowflake impersonation object description for parameters. | Optional | - | - |
| userRolePattern object | This object excludes roles and users from authorization checks. See the user role pattern object description for parameters. | Optional | {[]} | - |
| workspaces object | This object represents an Immuta project workspace configured for Snowflake. Disabled by default. See the workspaces object description for parameters. | Optional | - | - |
| connectArgs string | The connection string arguments to pass to the Node.js driver when connecting as the Immuta system user. | Optional | - | - |
| privilegedConnectArgs string | The connection string arguments to pass to the Node.js driver when connecting as the privileged user. | Optional when autoBootstrap is true. | - | - |
| lineage object | Enables Snowflake lineage ingestion so that Immuta can apply tags added to Snowflake tables to their descendant data source columns. See the lineage object description for parameters. | Optional | - | - |
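The privateKey note above is easy to get wrong, so here is an illustrative sketch of a config object built from a script, with placeholder host, warehouse, database, and key values. When a JSON serializer such as json.dumps produces the request body, it emits the required "\n" escapes automatically, which is why they must not be added by hand in this case.

```python
import json

# Placeholder PEM key with real newlines; never hand-escape it in a script.
pem_key = (
    "-----BEGIN PRIVATE KEY-----\n"
    "MIIEvQIBADANBgEXAMPLE\n"
    "-----END PRIVATE KEY-----"
)

config = {
    "host": "example.snowflakecomputing.com",
    "port": 443,                      # optional; 443 is the default
    "warehouse": "IMMUTA_WH",
    "database": "IMMUTA_DB",          # new, empty database for Immuta metadata
    "authenticationType": "keyPair",
    "privateKey": pem_key,            # real newlines, no hand-written "\n"
    "audit": {"enabled": True},
    "impersonation": {"enabled": True, "role": "IMMUTA_IMPERSONATION"},
}

# The serialized request body contains "\n" escapes, not raw newlines.
body = json.dumps(config)
```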

Audit object

The audit object enables Snowflake query audit. The table below outlines its child parameter.

| Parameter | Description | Default values | Accepted values |
|---|---|---|---|
| enabled boolean | This setting enables or disables Snowflake query audit. | false | true or false |

Delete Snowflake integration payload

If autoBootstrap was set to true when you set up your integration, the authentication type and credentials you used when configuring it are required in the delete payload. If autoBootstrap was set to false, no payload is required when deleting the integration.

| Parameter | Description | Required or optional | Accepted values |
|---|---|---|---|
| authenticationType string | The type of authentication used when originally configuring the integration. | Required if autoBootstrap was true when configuring the integration. | userPassword, keyPair, or oAuthClientCredentials |
| username string | The username of the system account that configured the integration. | Required for the Azure Synapse Analytics integration, or if you selected userPassword as your authenticationType for Redshift or Snowflake. | - |
| password string | The password of the system account that configured the integration. | Required for the Azure Synapse Analytics integration, or if you selected userPassword as your authenticationType for Redshift or Snowflake. | - |
| privateKey string | The private key. If you paste the key directly into a JSON payload, replace each newline in the key with the characters "\n" (a backslash followed by n). If you supply the key through another means, such as a Python script, do not add the "\n" escapes. | Required if you selected keyPair as your authenticationType. | - |
| oAuthClientConfig object | This object represents your OAuth configuration. See the Snowflake OAuth configuration object description for parameters. | Required if you selected oAuthClientCredentials as your authenticationType. | - |
| role string | The privileged Snowflake role used by the Immuta system account when configuring the Snowflake integration. | Required when autoBootstrap is true for Snowflake. | - |
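The two delete cases can be sketched as follows. The helper name and credential values are illustrative, not part of the documented API; the point is that the original credentials are resent only when the integration was configured with autoBootstrap set to true.

```python
def snowflake_delete_body(auto_bootstrap, credentials=None):
    """Return the request body for deleting a Snowflake integration."""
    if not auto_bootstrap:
        return None  # DELETE is sent with no payload at all
    if credentials is None:
        raise ValueError("autoBootstrap integrations require the original credentials")
    return credentials

body = snowflake_delete_body(
    auto_bootstrap=True,
    credentials={
        "authenticationType": "userPassword",
        "username": "immuta_system",
        "password": "example-password",   # placeholder
        "role": "ACCOUNTADMIN",           # the privileged role used during setup
    },
)
```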

Snowflake impersonation object

The impersonation object enables and defines roles for user impersonation for Snowflake. The table below outlines its child parameters.

| Parameter | Description | Default values | Accepted values |
|---|---|---|---|
| enabled boolean | When true, enables user impersonation. | false | true or false |
| role string | The name of the user impersonation role. | IMMUTA_IMPERSONATION | - |

Lineage object

The lineage object enables Snowflake lineage ingestion. When this setting is enabled, Immuta automatically applies tags added to a Snowflake table to its descendant data source columns in Immuta, so you can build policies using those tags to restrict access to sensitive data. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
|---|---|---|---|---|
| enabled boolean | When true, enables Snowflake lineage so that Immuta can apply tags added to Snowflake data sources to their descendant data source columns in Immuta. | Optional | false | true or false |
| lineageConfig object | Configures which tables Immuta will ingest lineage history for, the number of rows to ingest per batch, and which tags to propagate. Child parameters include tableFilter, tagFilterRegex, and ingestBatchSize. | Required if enabled is true. | - | - |
| lineageConfig.tableFilter string | Determines which tables Immuta will ingest lineage for. Use a regular expression without the surrounding slashes (/) to filter tables. Without this filter, Immuta will attempt to ingest lineage for every table on your Snowflake instance. | Optional | ^.*$ | A regular expression without surrounding slashes. |
| lineageConfig.tagFilterRegex string | Determines which tags to propagate using lineage. Use a regular expression without the surrounding slashes (/) to filter tags. Without this filter, Immuta will propagate every tag on your Snowflake instance. | Optional | ^.*$ | A regular expression without surrounding slashes. |
| lineageConfig.ingestBatchSize number | The number of rows Immuta ingests per batch when streaming Access History data from your Snowflake instance. | Optional | 1000 | Minimum value of 1. |
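As an illustrative example, the lineage object below limits ingestion to tables in an invented ANALYTICS schema and propagates only tags whose names start with PII. The schema and tag names are assumptions for demonstration; note that both regular expressions are written without surrounding slashes, per the accepted values above.

```python
import re

lineage = {
    "enabled": True,
    "lineageConfig": {
        "tableFilter": "^ANALYTICS\\..*$",   # only tables under the ANALYTICS schema
        "tagFilterRegex": "^PII.*$",         # only tags whose names start with PII
        "ingestBatchSize": 500,              # rows per batch; minimum allowed value is 1
    },
}

# Quick sanity check of what the table filter would match:
table_filter = re.compile(lineage["lineageConfig"]["tableFilter"])
```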

Snowflake OAuth configuration object

The oAuthClientConfig object represents your OAuth configuration in Snowflake. This object is required if you set oAuthClientCredentials as your authentication type in the Snowflake integration configuration, and you must set autoBootstrap to false. The table below outlines the object's child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
|---|---|---|---|---|
| provider string | The identity provider for OAuth, such as Okta. | Required | - | - |
| clientId string | The client identifier of your registered application. | Required | - | - |
| authorityUrl string | The authority URL of your identity provider. | Required | - | - |
| useCertificate boolean | Specifies whether to use a certificate and private key for authenticating with OAuth. | Required | - | true or false |
| publicCertificateThumbprint string | Your certificate thumbprint. | Required if useCertificate is true. | - | - |
| oauthPrivateKey string | The private key content. | Required if useCertificate is true. | - | - |
| clientSecret string | The client secret of the application. | Required if useCertificate is false. | - | - |
| resource string | An optional resource to pass to the token provider. | Optional | - | - |
| scope string | Limits the operations and roles allowed in Snowflake by the access token. See the OAuth 2.0 documentation for details about scopes. | Optional | [] | - |
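The pairing between useCertificate and its dependent fields can be sketched with a small helper. The helper itself, the provider value, and all credential values are illustrative assumptions; only the field names come from the table above.

```python
def oauth_client_config(provider, client_id, authority_url,
                        certificate=None, client_secret=None):
    """Assemble an oAuthClientConfig object from either a certificate or a client secret."""
    config = {
        "provider": provider,
        "clientId": client_id,
        "authorityUrl": authority_url,
        "useCertificate": certificate is not None,
    }
    if certificate is not None:
        thumbprint, private_key = certificate
        config["publicCertificateThumbprint"] = thumbprint
        config["oauthPrivateKey"] = private_key
    elif client_secret is not None:
        config["clientSecret"] = client_secret
    else:
        raise ValueError("provide either a certificate or a client secret")
    return config

cfg = oauth_client_config(
    "okta",
    "0oaEXAMPLE",                                  # placeholder client ID
    "https://example.okta.com/oauth2/default",     # placeholder authority URL
    client_secret="example-secret",
)
```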

User role pattern object

The userRolePattern object excludes roles and users from authorization checks in the Snowflake integration. The table below outlines its child parameter.

| Parameter | Description | Default values | Accepted values |
|---|---|---|---|
| exclude array[string] | A list of roles and users (both case-sensitive) to exclude from authorization checks. Wildcards are unsupported. | [] | - |
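Because entries are case-sensitive and wildcards are unsupported, each role or user must be listed exactly as it appears in Snowflake. A minimal sketch, with invented role and user names:

```python
user_role_pattern = {"exclude": ["SVC_ETL_ROLE", "MONITORING_USER"]}

def is_excluded(name):
    # Exact, case-sensitive membership check: "svc_etl_role" would not match,
    # and a pattern like "SVC_ETL_*" is treated as a literal string.
    return name in user_role_pattern["exclude"]
```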

Workspaces object

The workspaces object represents an Immuta project workspace configured for Snowflake. The table below outlines its child parameters.

| Parameter | Description | Default values | Accepted values |
|---|---|---|---|
| enabled boolean | This setting enables or disables Snowflake project workspaces. If you use Snowflake secure data sharing with Immuta, set this property to true, as project workspaces are required. If you use Snowflake table grants, set this property to false; project workspaces cannot be used when Snowflake table grants are enabled. | false | true or false |
| warehouses array[string] | A list of warehouses workspace users have usage privileges on. | [] | - |
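An illustrative workspaces object, plus a guard encoding the constraint described above: Snowflake project workspaces cannot be combined with Snowflake table grants. The validate helper and its flag names are stand-ins for this sketch, not parameters documented on this page.

```python
workspaces = {
    "enabled": True,
    # Warehouses that workspace users have usage privileges on (invented names).
    "warehouses": ["DATA_SCIENCE_WH", "REPORTING_WH"],
}

def validate(workspaces_enabled, table_grants_enabled):
    """Reject configurations that enable both workspaces and table grants."""
    if workspaces_enabled and table_grants_enabled:
        raise ValueError("project workspaces and Snowflake table grants are mutually exclusive")
    return True
```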
