Copyright © 2014-2024 Immuta Inc. All rights reserved.

Integration Configuration Payload

The parameters for configuring an integration in Immuta are outlined in the table below.

| Parameter | Description | Required or optional | Default values | Accepted values |
|---|---|---|---|---|
| `type` string | The type of integration to configure. | Required | - | Trino |
| `autoBootstrap` boolean | When `true`, Immuta automatically configures the integration using the privileged credentials you provide. | Required for all integrations except Starburst (Trino) | - | `true` or `false` |
| `config` object | The integration-specific configuration; see the configuration object sections below. | Required for all integrations except Starburst (Trino) | - | - |
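As a minimal sketch of the request body shape: for a Starburst (Trino) integration, where neither `autoBootstrap` nor `config` is required, the payload can be as small as the `type` field alone.

```json
{
  "type": "Trino"
}
```

For all other integrations, `autoBootstrap` and an integration-specific `config` object accompany `type`, as shown in the examples in the sections below.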

Amazon S3 configuration object

The config object configures the S3 integration. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
|---|---|---|---|---|
| `name` string | A name for the integration that is unique across all Amazon S3 integrations configured in Immuta. | Required | - | - |
| `awsAccountId` string | The ID of your AWS account. | Required | - | - |
| `awsRegion` string | The AWS region to use. | Required | - | Any valid AWS region (`us-east-1`, for example) |
| `awsLocationRole` string | The AWS IAM role ARN assigned to the base access grants location. This is the role the AWS Access Grants service assumes to vend credentials to the grantee. When a grantee accesses S3 data, the AWS Access Grants service attaches session policies and assumes this role in order to vend scoped-down credentials to the grantee. This role needs full access to all paths under the S3 location prefix. | Required | - | - |
| `awsLocationPath` string | The base S3 location prefix that Immuta will use for this connection when registering S3 data sources. This path must be unique across all S3 integrations configured in Immuta. | Required | - | - |
| `awsRoleToAssume` string | The AWS IAM role ARN Immuta assumes when interacting with AWS. | Required when `authenticationType` is `auto` | [] | - |
| `authenticationType` string | The method used to authenticate with AWS when configuring the S3 integration. | Required | - | `auto` or `accessKey` |
| `awsAccessKeyId` string | The AWS access key ID for the AWS account configuring the integration. | Required when `authenticationType` is `accessKey` | - | - |
| `awsSecretAccessKey` string | The AWS secret access key for the AWS account configuring the integration. | Required when `authenticationType` is `accessKey` | - | - |
| `port` number | The port to use when connecting to your S3 Access Grants instance. | Optional | 443 | 0-65535 |
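As a sketch, an S3 `config` object using `accessKey` authentication might look like the following; every value here is an illustrative placeholder, not a default. With `authenticationType` set to `auto`, you would supply `awsRoleToAssume` instead of the key pair.

```json
{
  "name": "s3-example-integration",
  "awsAccountId": "123456789012",
  "awsRegion": "us-east-1",
  "awsLocationRole": "arn:aws:iam::123456789012:role/example-access-grants-role",
  "awsLocationPath": "example-bucket/prefix",
  "authenticationType": "accessKey",
  "awsAccessKeyId": "AKIAIOSFODNN7EXAMPLE",
  "awsSecretAccessKey": "<your secret access key>"
}
```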

Azure Synapse Analytics configuration objects

The config object configures the Azure Synapse Analytics integration. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
|---|---|---|---|---|
| `host` string | The URL of your Azure Synapse Analytics account. | Required | - | Valid URL hostnames |
| `database` string | Name of an existing database where the Immuta system user will store all Immuta-generated schemas and views. | Required | - | - |
| `schema` string | Name of the Immuta-managed schema where all your secure views will be created and stored. | Required | - | - |
| `authenticationType` string | The method used to authenticate with Azure Synapse Analytics when configuring the integration. | Required | - | `userPassword` |
| `username` string | The username of the system account that can act on Azure Synapse Analytics objects and configure the integration. | Required | - | - |
| `password` string | The password of the system account that can act on Azure Synapse Analytics objects and configure the integration. | Required | - | - |
| - | - | Optional | - | - |
| `port` number | The port to use when connecting to your Azure Synapse Analytics account host. | Optional | 1433 | 0-65535 |
| - | - | Optional | - | - |
| `connectArgs` string | The connection string arguments to pass to the ODBC driver when connecting as the Immuta system user. | Optional | - | - |
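A sketch of an Azure Synapse Analytics `config` object, with placeholder values standing in for your own account details:

```json
{
  "host": "example-workspace.sql.azuresynapse.net",
  "database": "immuta_db",
  "schema": "immuta",
  "authenticationType": "userPassword",
  "username": "immuta_system",
  "password": "<password>",
  "port": 1433
}
```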

Azure Synapse Analytics impersonation object

The impersonation object enables and defines roles for user impersonation for Azure Synapse Analytics. The table below outlines its child parameters.

| Parameter | Description | Default values | Accepted values |
|---|---|---|---|
| `enabled` boolean | When `true`, enables user impersonation. | `false` | `true` or `false` |
| `role` string | The name of the user impersonation role. | IMMUTA_IMPERSONATION | - |

Delete Azure Synapse Analytics payload

If `autoBootstrap` was set to `true` when you set up your integration, the payload must include the credentials you used to configure it. If `autoBootstrap` was set to `false`, no payload is required when deleting the integration.

| Parameter | Description | Required or optional | Accepted values |
|---|---|---|---|
| `authenticationType` string | The type of authentication used when originally configuring the Azure Synapse Analytics integration. | Required | `userPassword` |
| `username` string | The username of the system account that configured the integration. | Required if `autoBootstrap` was `true` when setting up the integration | - |
| `password` string | The password of the system account that configured the integration. | Required if `autoBootstrap` was `true` when setting up the integration | - |
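For an integration that was set up with `autoBootstrap` set to `true`, the delete payload carries the original credentials; a sketch with placeholder values:

```json
{
  "authenticationType": "userPassword",
  "username": "immuta_system",
  "password": "<password>"
}
```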

Metadata delimiters object

The metadataDelimiters object specifies the delimiters that Immuta uses to store profile data in Azure Synapse Analytics. The table below outlines its child parameters.

| Parameter | Description | Default values | Accepted values |
|---|---|---|---|
| `hashDelimiter` string | A delimiter used to separate key-value pairs. | \| | - |
| `hashKeyDelimiter` string | A delimiter used to separate a key from its value. | : | - |
| `arrayDelimiter` string | A delimiter used to separate array elements. | , | - |

Databricks Unity Catalog configuration objects

The config object configures the Databricks Unity Catalog integration. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
|---|---|---|---|---|
| `port` number | The port to use when connecting to your Databricks account host. | Optional | 443 | 0-65535 |
| `workspaceUrl` string | Databricks workspace URL. For example, `my-workspace.cloud.databricks.com`. | Required | - | - |
| `httpPath` string | The HTTP path of your Databricks cluster or SQL warehouse. | Required | - | - |
| `authenticationType` string | The type of authentication to use when connecting to Databricks. | Required | - | `token` or `oAuthM2M` |
| `token` string | The Databricks personal access token. This is the access token for the Immuta service principal. | Required if `authenticationType` is `token` | - | - |
| - | - | Optional | [] | - |
| `oAuthClientConfig` object | Your OAuth configuration. See the Databricks Unity Catalog OAuth configuration object below. | Required if you selected `oAuthM2M` as your `authenticationType` | - | - |
| - | - | Optional | - | - |
| `catalog` string | The name of the Databricks catalog Immuta will create to store internal entitlements and other user data specific to Immuta. This catalog will only be readable for the Immuta service principal and should not be granted to other users. The catalog name may only contain letters, numbers, and underscores and cannot start with a number. | Optional | immuta | - |
| - | - | Optional | - | - |
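A sketch of a Databricks Unity Catalog `config` object using personal access token authentication; the workspace URL, HTTP path, and token shown here are placeholders:

```json
{
  "workspaceUrl": "my-workspace.cloud.databricks.com",
  "httpPath": "/sql/1.0/warehouses/<warehouse id>",
  "authenticationType": "token",
  "token": "<Immuta service principal access token>",
  "catalog": "immuta"
}
```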

Databricks Unity Catalog audit object

The audit object enables Databricks Unity Catalog query audit. The table below outlines its child parameter.

| Parameter | Description | Default values | Accepted values |
|---|---|---|---|
| `enabled` boolean | This setting enables or disables Databricks Unity Catalog query audit. | `false` | `true` or `false` |

Group pattern object

The object excludes the listed group from having data policies applied in the Databricks Unity Catalog integration. This account-level group should be used for privileged users and service accounts that require an unmasked view of data. The table below outlines its child parameters.

| Parameter | Description | Default values | Accepted values |
|---|---|---|---|
| `deny` string | The name of a group in Databricks that will be excluded from having data policies applied. This account-level group should be used for privileged users and service accounts that require an unmasked view of data. | immuta_exemption_group | - |

Databricks Unity Catalog proxy options object

The proxyOptions object represents your proxy server configuration in Databricks Unity Catalog. The table below outlines the object's child attributes.

| Parameter | Description | Required or optional | Default values | Accepted values |
|---|---|---|---|---|
| `host` string | The hostname of the proxy server. | Required | - | Valid URL hostnames |
| `port` number | The port to use when connecting to your proxy server. | Optional | 443 | 0-65535 |
| `username` string | The username to use with the proxy server. | Optional | [] | - |
| `password` string | The password to use with the proxy server. | Optional | [] | - |

Databricks Unity Catalog OAuth configuration object

The oAuthClientConfig object represents your OAuth configuration in Databricks Unity Catalog. This object is required if you set oAuthM2M as your authentication type in the Databricks Unity Catalog integration configuration. The table below outlines the object's child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
|---|---|---|---|---|
| `clientId` string | The client identifier of the Immuta service principal you configured. This is the client ID displayed in Databricks when creating the client secret for the service principal. | Required | - | - |
| `authorityUrl` string | Authority URL of your identity provider. | Required | https://\<your workspace name\>.cloud.databricks.com/oidc/v1/token | - |
| `scope` | - | Optional | [] | - |
| `clientSecret` string | The client secret created in Databricks for the Immuta service principal. | Required | - | - |
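Combining the two objects above, a Databricks Unity Catalog `config` using `oAuthM2M` authentication might be sketched as follows (workspace URL, HTTP path, and credentials are placeholders):

```json
{
  "workspaceUrl": "my-workspace.cloud.databricks.com",
  "httpPath": "/sql/1.0/warehouses/<warehouse id>",
  "authenticationType": "oAuthM2M",
  "oAuthClientConfig": {
    "clientId": "<service principal client ID>",
    "authorityUrl": "https://my-workspace.cloud.databricks.com/oidc/v1/token",
    "clientSecret": "<client secret>"
  }
}
```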

Google BigQuery configuration object

The config object configures the Google BigQuery integration. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
|---|---|---|---|---|
| `role` string | Google Cloud role used to connect to Google BigQuery. | Required | - | - |
| `datasetSuffix` string | The suffix appended to the name of each dataset created to store secure views. This string must start with an underscore. | Required | - | - |
| `dataset` string | Name of the BigQuery dataset to provision inside the project for Immuta metadata storage. | Optional | immuta | - |
| `location` string | The dataset's location. After a dataset is created, the location can't be changed. | Required | - | Any valid GCP location (`us-east1`, for example) |
| `credential` string | - | Required | - | - |
| `port` number | The port to use when connecting to your BigQuery account host. | Optional | 443 | 0-65535 |
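A sketch of a Google BigQuery `config` object; the role name, suffix, and credential value are illustrative placeholders:

```json
{
  "role": "example-immuta-bigquery-role",
  "datasetSuffix": "_secure",
  "dataset": "immuta",
  "location": "us-east1",
  "credential": "<credential>"
}
```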

Redshift configuration objects

The config object configures the Redshift integration. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
|---|---|---|---|---|
| `host` string | The URL of your Redshift account. | Required | - | Valid URL hostnames |
| `database` string | Name of a new empty database that the Immuta system user will manage and store metadata in. | Required | - | - |
| `initialDatabase` string | Name of the existing database in Redshift that Immuta initially connects to in order to create the Immuta-managed database. | Required if `autoBootstrap` is `true` | - | - |
| `authenticationType` string | The type of authentication to use when connecting to Redshift. | Required | - | `userPassword`, `accessKey`, or `okta` |
| `username` string | The username of the system account that can act on Redshift objects and configure the integration. | Required if you selected `userPassword` as your `authenticationType` | - | - |
| `password` string | The password of the system account that can act on Redshift objects and configure the integration. | Required if you selected `userPassword` as your `authenticationType` | - | - |
| `okta` object | Your Okta configuration. See the Okta object below. | Required if you selected `okta` as your `authenticationType` | - | - |
| `databaseUser` string | The Redshift database username. | Required if you selected `accessKey` as your `authenticationType` | - | - |
| `accessKeyId` string | The Redshift access key ID. | Required if you selected `accessKey` as your `authenticationType` | - | - |
| `secretKey` string | The Redshift secret key. | Required if you selected `accessKey` as your `authenticationType` | - | - |
| `sessionToken` string | The Redshift session token. | Optional if you selected `accessKey` as your `authenticationType` | - | - |
| `port` number | The port to use when connecting to your Redshift account host. | Optional | 5439 | 0-65535 |
| - | - | Optional | - | - |
| `connectArgs` string | The connection string arguments to pass to the ODBC driver when connecting as the Immuta system user. | Optional | - | - |
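A sketch of a Redshift `config` object using `userPassword` authentication, with placeholder values throughout:

```json
{
  "host": "example-cluster.abc123.us-east-1.redshift.amazonaws.com",
  "database": "immuta_db",
  "initialDatabase": "dev",
  "authenticationType": "userPassword",
  "username": "immuta_system",
  "password": "<password>",
  "port": 5439
}
```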

Delete Redshift integration payload

If `autoBootstrap` was set to `true` when you set up your integration, the payload must include the authentication type and credentials you used to configure it. If `autoBootstrap` was set to `false`, no payload is required when deleting the integration.

| Parameter | Description | Required or optional | Accepted values |
|---|---|---|---|
| `authenticationType` string | The type of authentication used when originally configuring the Redshift integration. | Required if `autoBootstrap` was `true` when setting up the integration | `userPassword`, `accessKey`, or `okta` |
| `username` string | The username of the system account that configured the integration. | Required if you selected `userPassword` as your `authenticationType` | - |
| `password` string | The password of the system account that configured the integration. | Required if you selected `userPassword` as your `authenticationType` | - |
| `databaseUser` string | The Redshift database username. | Required if you selected `accessKey` as your `authenticationType` | - |
| `accessKeyId` string | The Redshift access key ID. | Required if you selected `accessKey` as your `authenticationType` | - |
| `secretKey` string | The Redshift secret key. | Required if you selected `accessKey` as your `authenticationType` | - |
| `sessionToken` string | The Redshift session token. | Optional if you selected `accessKey` as your `authenticationType` | - |
| `okta` object | Your Okta configuration. See the Okta object below. | Required if you selected `okta` as your `authenticationType` | - |
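For a Redshift integration that was configured with `autoBootstrap` set to `true` and `accessKey` authentication, the delete payload might be sketched as follows (all values are placeholders):

```json
{
  "authenticationType": "accessKey",
  "databaseUser": "immuta_system",
  "accessKeyId": "AKIAIOSFODNN7EXAMPLE",
  "secretKey": "<secret key>"
}
```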

Redshift impersonation object

The impersonation object enables and defines roles for user impersonation for Redshift. The table below outlines its child parameters.

| Parameter | Description | Default values | Accepted values |
|---|---|---|---|
| `enabled` boolean | When `true`, enables user impersonation. | `false` | `true` or `false` |
| `role` string | The name of the user impersonation role. | immuta_impersonation | - |

Okta object

The okta object represents your Okta configuration. This object is required if you set okta as your authentication type in the Redshift integration configuration. The table below outlines its child parameters.

| Parameter | Description | Required or optional | Default values | Accepted values |
|---|---|---|---|---|
| `username` string | The username of the system account that can act on Redshift objects and configure the integration. | Required | - | - |
| `password` string | The password of the system account that can act on Redshift objects and configure the integration. | Required | - | - |
| `appId` string | The Okta application ID. | Required | - | - |
| `idpHost` string | The Okta identity provider host URL. | Required | - | - |
| `role` string | The Okta role. | Required | - | - |
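Nested inside the Redshift `config`, an `okta` object might be sketched as follows; every value is an illustrative placeholder:

```json
{
  "host": "example-cluster.abc123.us-east-1.redshift.amazonaws.com",
  "database": "immuta_db",
  "authenticationType": "okta",
  "okta": {
    "username": "immuta_system",
    "password": "<password>",
    "appId": "<Okta application ID>",
    "idpHost": "https://example.okta.com",
    "role": "<Okta role>"
  }
}
```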

Snowflake configuration objects

The config object configures the Snowflake integration. The table below outlines its child parameters.

Parameter
Description
Required or optional
Default values
Accepted values

host string

The URL of your Snowflake account.

Required

-

Valid URL hostnames

warehouse string

The default pool of compute resources the Immuta system user will use to run queries and perform other Snowflake operations.

Required

-

-

database string

Name of a new empty database that the Immuta system user will manage and store metadata in.

Required

-

-

authenticationType string

The type of authentication to use when connecting to Snowflake.

Required

-

  • userPassword

  • keyPair

  • oAuthClientCredentials

username string

The username of a Snowflake account that can act on Snowflake objects and configure the integration.

Required if you selected userPassword as your authenticationType.

-

-

password string

The password of a Snowflake account that can act on Snowflake objects and configure the integration.

Required if you selected userPassword as your authenticationType.

-

-

privateKey string

The private key. Replace new lines in the private key with a backslash before the new line character: \n. If you are using another means of configuration, such as a Python script, the \n should not be added.

Required if you selected keyPair as your authenticationType.

-

-

oAuthClientConfig object

This object represents your OAuth configuration. See the Snowflake OAuth configuration object for parameters.

Required if you selected oAuthClientCredentials as your authenticationType.

-

-

role string

The privileged Snowflake role used by the Immuta system account when configuring the Snowflake integration.

Required when autoBootstrap is true.

-

-

port number

The port to use when connecting to your Snowflake account host.

Optional

443

0-65535

audit object

This object enables Snowflake query audit. See the audit object description for the parameter.

Optional

-

-

impersonation object

Enables user impersonation. See the Snowflake impersonation object description for parameters.

Optional

-

-

userRolePattern object

This object excludes roles and users from authorization checks. See the user role pattern object description for parameters.

Optional

{ "exclude": [] }

-

workspaces object

This object represents an Immuta project workspace configured for Snowflake. See the workspaces object description for parameters.

Optional

-

-

connectArgs string

The connection string arguments to pass to the Node.js driver when connecting as the Immuta system user.

Optional

-

-

privilegedConnectArgs string

The connection string arguments to pass to the Node.js driver when connecting as the privileged user.

Optional when autoBootstrap is true.

-

-

lineage object

Enables Snowflake lineage ingestion so that Immuta can apply tags added to Snowflake tables to their descendant data source columns. See the lineage object description for parameters.

Optional

-

-
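Putting the parameters above together, a hypothetical payload for a Snowflake integration using userPassword authentication might look like the following. The host, warehouse, database, and credential values are placeholders:

```json
{
  "type": "Snowflake",
  "autoBootstrap": true,
  "config": {
    "host": "organization.us-east-1.snowflakecomputing.com",
    "warehouse": "SAMPLE_WAREHOUSE",
    "database": "IMMUTA",
    "authenticationType": "userPassword",
    "username": "<username>",
    "password": "<password>",
    "role": "ACCOUNTADMIN",
    "port": 443,
    "audit": { "enabled": true },
    "impersonation": { "enabled": false }
  }
}
```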

Audit object

The audit object enables Snowflake query audit. The table below outlines its child parameter.

Parameter
Description
Default values
Accepted values

enabled boolean

This setting enables or disables Snowflake query audit.

false

true or false

Delete Snowflake integration payload

If autoBootstrap was set to true when you configured your integration, the payload for deleting it must include the authentication type and credentials you used during configuration. If autoBootstrap was set to false, no payload is required when deleting the integration.

Parameter
Description
Required or optional
Accepted values

authenticationType string

The type of authentication used when originally configuring the integration.

Required if autoBootstrap was true when configuring the integration.

  • userPassword

  • keyPair

  • oAuthClientCredentials

username string

The username of the system account that configured the integration.

Required for the Azure Synapse Analytics integration or if you selected userPassword as your authenticationType for Redshift or Snowflake.

-

password string

The password of the system account that configured the integration.

Required for the Azure Synapse Analytics integration or if you selected userPassword as your authenticationType for Redshift or Snowflake.

-

privateKey string

The private key. Replace new lines in the private key with a backslash before the new line character: \n. If you are using another means of configuration, such as a Python script, the \n should not be added.

Required if you selected keyPair as your authenticationType.

-

oAuthClientConfig object

This object represents your OAuth configuration. See the Snowflake OAuth configuration object for parameters.

Required if you selected oAuthClientCredentials as your authenticationType.

-

role string

The privileged Snowflake role used by the Immuta system account when configuring the Snowflake integration.

Required when autoBootstrap is true for Snowflake.

-
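For example, if the integration was configured with autoBootstrap set to true and userPassword authentication, a hypothetical delete payload might look like the following. The credential values are placeholders:

```json
{
  "authenticationType": "userPassword",
  "username": "<username>",
  "password": "<password>",
  "role": "ACCOUNTADMIN"
}
```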

Snowflake impersonation object

The impersonation object enables and defines roles for user impersonation for Snowflake. The table below outlines its child parameters.

Parameter
Description
Default values
Accepted values

enabled boolean

When true, enables user impersonation.

false

true or false

role string

The name of the user impersonation role.

IMMUTA_IMPERSONATION

-

Lineage object

The lineage object enables Snowflake lineage ingestion. When this setting is enabled, Immuta automatically applies tags added to a Snowflake table to its descendant data source columns in Immuta so you can build policies using those tags to restrict access to sensitive data. The table below outlines its child parameters.

Parameter
Description
Required or optional
Default values
Accepted values

enabled boolean

When true, enables Snowflake lineage so that Immuta can apply tags added to Snowflake data sources to their descendant data source columns in Immuta.

Optional

false

true or false

lineageConfig object

Configures what tables Immuta will ingest lineage history for, the number of rows to ingest per batch, and what tags to propagate. Child parameters include tableFilter, tagFilterRegex, and ingestBatchSize.

Required if enabled is true.

-

-

lineageConfig.tableFilter string

This child parameter of lineageConfig determines which tables Immuta will ingest lineage for. Use a regular expression that excludes / from the beginning and end to filter tables. Without this filter, Immuta will attempt to ingest lineage for every table on your Snowflake instance.

Optional

^.*$

Regular expression that excludes / from the beginning and end.

lineageConfig.tagFilterRegex string

This child parameter of lineageConfig determines which tags to propagate using lineage. Use a regular expression that excludes / from the beginning and end to filter tags. Without this filter, Immuta will ingest lineage for every tag on your Snowflake instance.

Optional

^.*$

Regular expression that excludes / from the beginning and end.

lineageConfig.ingestBatchSize number

This child parameter of lineageConfig configures the number of rows Immuta ingests per batch when streaming Access History data from your Snowflake instance.

Optional

1000

Minimum value of 1.
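As an illustration, a hypothetical lineage object that ingests lineage only for tables in a database named ANALYTICS and propagates only tags beginning with pii might look like the following. The database name and tag prefix are placeholders; note that the backslash in the regular expression must be escaped in JSON:

```json
{
  "lineage": {
    "enabled": true,
    "lineageConfig": {
      "tableFilter": "^ANALYTICS\\..*$",
      "tagFilterRegex": "^pii.*$",
      "ingestBatchSize": 1000
    }
  }
}
```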

Snowflake OAuth configuration object

The oAuthClientConfig object represents your OAuth configuration in Snowflake. This object is required if you set oAuthClientCredentials as your authentication type in the Snowflake integration configuration, and you must set autoBootstrap to false. The table below outlines the object's child parameters.

Parameter
Description
Required or optional
Default values
Accepted values

provider string

The identity provider for OAuth, such as Okta.

Required

-

-

clientId string

The client identifier of your registered application.

Required

-

-

authorityUrl string

Authority URL of your identity provider.

Required

-

-

useCertificate boolean

Specifies whether or not to use a certificate and private key for authenticating with OAuth.

Required

-

true or false

publicCertificateThumbprint string

Your certificate thumbprint.

Required if useCertificate is true.

-

-

oauthPrivateKey string

The private key content.

Required if useCertificate is true.

-

-

clientSecret string

Client secret of the application.

Required if useCertificate is false.

-

-

resource string

An optional resource to pass to the token provider.

Optional

-

-

scope string

The scope limits the operations and roles allowed in Snowflake by the access token. See the OAuth 2.0 documentation for details about scopes.

Optional

[]

-
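A hypothetical oAuthClientConfig object that authenticates with a client secret (rather than a certificate and private key) might look like the following. The provider, client ID, authority URL, and secret are placeholders:

```json
{
  "oAuthClientConfig": {
    "provider": "okta",
    "clientId": "<client ID>",
    "authorityUrl": "https://example.okta.com/oauth2/default",
    "useCertificate": false,
    "clientSecret": "<client secret>"
  }
}
```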

User role pattern object

The userRolePattern object excludes roles and users from authorization checks in the Snowflake integration. The table below outlines its child parameter.

Parameter
Description
Default values
Accepted values

exclude array[string]

This array is a list of roles and users (both case-sensitive) to exclude from authorization checks. Wildcards are unsupported.

[]

-

Workspaces object

The workspaces object represents an Immuta project workspace configured for Snowflake. The table below outlines its child parameters.

Parameter
Description
Default values
Accepted values

enabled boolean

This setting enables or disables Snowflake project workspaces. If you use Snowflake secure data sharing with Immuta, set this property to true, as project workspaces are required. If you use Snowflake table grants, set this property to false; project workspaces cannot be used when Snowflake table grants are enabled.

false

true or false

warehouses array[string]

This array is a list of warehouses workspace users have usage privileges on.

[]

-
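To illustrate the two objects above, hypothetical userRolePattern and workspaces objects might look like the following. The role, user, and warehouse names are placeholders:

```json
{
  "userRolePattern": {
    "exclude": ["ACCOUNTADMIN", "SERVICE_USER"]
  },
  "workspaces": {
    "enabled": true,
    "warehouses": ["WORKSPACE_WAREHOUSE_1", "WORKSPACE_WAREHOUSE_2"]
  }
}
```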

