Integrations API Endpoints

The integrations resource allows you to create, configure, and manage your integrations. How Immuta manages and administers policies in your data platform varies by integration.

To configure or manage an integration, users must have the APPLICATION_ADMIN Immuta permission.

Endpoints

  • GET /integrations: Gets all integration configurations.
  • POST /integrations: Creates an integration.
  • DELETE /integrations/{id}: Deletes a configured integration.
  • GET /integrations/{id}: Gets an integration configuration.
  • PUT /integrations/{id}: Updates a configured integration.
  • POST /integrations/{id}/regenerate: Regenerates an Immuta API key for the configured integration.
  • GET /integrations/{id}/status: Gets the status of the specified integration.
  • POST /integrations/scripts/cleanup: Creates a script to remove Immuta-managed resources from your platform for integrations that were not successfully created.
  • POST /integrations/scripts/create: Creates a script to set up Immuta-managed resources in your platform.
  • POST /integrations/{id}/scripts/delete: Creates a script to remove Immuta-managed resources from your platform for integrations that were successfully configured.
  • POST /integrations/{id}/scripts/edit: Creates a script to edit existing Immuta-managed resources in your platform.
  • POST /integrations/scripts/initial-create: Creates the first script to set up Immuta-managed resources in your Azure Synapse Analytics or Redshift platform.
  • POST /integrations/scripts/post-cleanup: Creates the second script to remove Immuta-managed resources from your Azure Synapse Analytics integration if it was not successfully created.

GET /integrations

Gets all integration configurations.

curl -X 'GET' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f'

Response

The response returns the configuration for all integrations. See the response schema reference for details about the response schema. An unsuccessful request returns the HTTP status code and an error message; see the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

[
  {
    "id": "1",
    "status": "enabled",
    "validationResults": {
      "status": "passed",
      "validationTests": [
      {
        "name": "Initial Validation: Basic Connection Test",
        "status": "passed"
      },
      {
        "name": "Initial Validation: Default Warehouse Access Test",
        "status": "passed",
        "result": []
      },
      {
        "name": "Initial Validation: Validate access to Privileged Role",
        "status": "passed",
        "result": []
      },
      {
        "name": "Validate Automatic: Database Does Not Exist",
        "status": "passed"
      },
      {
        "name": "Validate Automatic: Impersonation Role Does Not Exist",
        "status": "skipped"
      },
      {
        "name": "Validate Automatic Bootstrap User Grants",
        "status": "passed"
      }
    ] },
    "type": "Snowflake",
    "autoBootstrap": true,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "port": 443,
      "audit": {
        "enabled": false
      },
      "workspaces": {
        "enabled": false
      },
      "impersonation": {
        "enabled": false
      },
      "lineage": {
        "enabled": false
      },
      "authenticationType": "userPassword",
      "username": "<REDACTED>",
      "password": "<REDACTED>",
      "role": "ACCOUNTADMIN"
    }
  },
  {
    "id": "2",
    "status": "enabled",
    "type": "Databricks",
    "validationResults": {
      "status": "passed",
      "validationTests": [
        {
          "name": "Metastore validation",
          "status": "passed"
        },
        {
          "name": "Basic Connection Test",
          "result": [
            {
              "1": 1
            }
          ],
          "status": "passed"
        }
      ]
    },
    "autoBootstrap": true,
    "config": {
      "workspaceUrl": "www.example-workspace.cloud.databricks.com",
      "httpPath": "sql/protocolv1/o/0/0000-00000-abc123",
      "authenticationType": "token",
      "token": "REDACTED",
      "audit": {
        "enabled": false
      },
      "catalog": "immuta"
    }
  }
]
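
As a quick sanity check, you can pipe this response through jq to list each integration's id, type, and status. This is only a sketch; it assumes jq is installed and reuses the placeholder host and API key from the example above.

curl -s -X 'GET' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
  | jq '.[] | {id, type, status}'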

POST /integrations

Creates an integration configuration that allows Immuta to manage access policies on data registered in Immuta.

Amazon S3 example

When you connect Immuta to your AWS account, the awsLocationPath is the base S3 location prefix that Immuta will use for this connection when registering S3 data sources.

This request configures the integration using the AWS access key authentication method.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Native S3",
    "autoBootstrap": false,
    "config": {
      "name": "S3 integration",
      "awsAccountId": "123456789",
      "awsRegion": "us-east-1",
      "awsLocationRole": "arn:aws:iam::123456789:role/access-grants-instance-role",
      "awsLocationPath": "s3://",
      "authenticationType": "accessKey",
      "awsAccessKeyId": "123456789",
      "awsSecretAccessKey": "123456789"
    }
    }'

Azure Synapse Analytics example

When you connect Immuta to your Azure Synapse Analytics account, the schema you specify is where all the policy-enforced views will be created and managed by Immuta.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Azure Synapse Analytics",
    "autoBootstrap": true,
    "config": {
      "host": "organization.azure.com",
      "schema": "sample_schema",
      "database": "immuta",
      "metadataDelimiters": {
        "hashDelimiter": "|",
        "hashKeyDelimiter": "-",
        "arrayDelimiter": ","
      },
      "username": "taylor@synapse.com",
      "password": "abc1234",
      "authenticationType": "userPassword"
    }
    }'

Databricks Unity Catalog example

This request creates a Databricks Unity Catalog integration configuration that allows Immuta to administer Unity Catalog policies on data registered in Immuta.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Databricks",
    "autoBootstrap": true,
    "config": {
      "workspaceUrl": "www.example-workspace.cloud.databricks.com",
      "httpPath": "sql/protocolv1/o/0/0000-00000-abc123",
      "authenticationType": "token",
      "token": "REDACTED",
      "catalog": "immuta"
    }
    }'

Google BigQuery example

When you connect Immuta to your Google BigQuery account, the dataset you specify is where all the policy-enforced views will be created and managed by Immuta.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Google BigQuery",
    "autoBootstrap": false,
    "config": {
      "role": "immuta",
      "datasetSuffix": "_secureView",
      "dataset": "immuta",
      "location": "us-east1",
      "credential": "{\"type\":\"service_account\",\"project_id\":\"innate-conquest-123456\",\"private_key_id\":\"9163c12345690924f5dd218ff39\",\"private_key\":\"-----BEGIN PRIVATE KEY-----\nXXXXXXXro0s\n/yQlPQijowkccmrmWJyr93kdLnwJzBvLHCto/+W\ncvF2ygX9oM/dyUK//z//4nptMp+Ck//Yw3D4rIBwGu4DWiR1qRnf\nDoGyXfThPTQ==\n-----END PRIVATE KEY-----\n\",\"client_email\":\"service-account-id@innate-conquest-123456.iam.gserviceaccount.com\",\"client_id\":\"1166290***432952487857\",\"auth_uri\":\"https://accounts.google.com/o/oauth2/auth\",\"token_uri\":\"https://oauth2.googleapis.com/token\",\"auth_provider_x509_cert_url\":\"https://www.googleapis.com/oauth2/v1/certs\",\"client_x509_cert_url\":\"https://www.googleapis.com/robot/v1/metadata/x509/service-accound-id%40innate-conquest-123456.iam.gserviceaccount.com\",\"universe_domain\":\"googleapis.com\"}"
    }
    }'

Redshift example

When you connect Immuta to your Redshift account, the Immuta system user will use the database you specify to manage and store metadata. The initial database (REDSHIFT_SAMPLE_DATA, in the request below) is an existing Redshift database that Immuta connects to in order to create the Immuta-managed database (immuta, in the request below).

This request specifies userPassword as the authentication type for the Immuta system user. The username and password provided are credentials for a system account that can manage the database.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Redshift",
    "autoBootstrap": true,
    "config": {
      "host": "organization.aws.amazon.com",
      "database": "immuta",
      "initialDatabase": "REDSHIFT_SAMPLE_DATA",
      "authenticationType": "userPassword",
      "username": "taylor@redshift.com",
      "password": "abc1234"
    }
    }'

Snowflake example

When you connect Immuta to your Snowflake account, the warehouse you specify is the default pool of compute resources the Immuta system user will use to run queries and perform other Snowflake operations.

This request specifies the userPassword authentication type. The username and password provided are credentials of a Snowflake account attached to a role with the privileges Immuta requires. These credentials are not stored; they are used by Immuta to configure the integration.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Snowflake",
    "autoBootstrap": true,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "authenticationType": "userPassword",
      "username": "taylor@snowflake.com",
      "password": "abc1234",
      "role": "ACCOUNTADMIN"
    }
    }'

Starburst (Trino) example

When you configure the Starburst (Trino) integration, Immuta generates an API key and configuration snippet on the Immuta app settings page that you will use to configure your Starburst cluster.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Trino"
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

  • type string (required): The type of integration to configure. Accepted values: Azure Synapse Analytics, Databricks, Google BigQuery, Native S3, Redshift, Snowflake, or Trino.
  • autoBootstrap boolean (required for all integrations except Starburst (Trino)): When true, Immuta will automatically configure the integration in your Azure Synapse Analytics, Databricks Unity Catalog, Redshift, or Snowflake environment for you. When false, you must set up your environment manually before configuring the integration with the API. This parameter must be set to false in the Amazon S3 and Google BigQuery configurations. See the how-to guide for configuring your integration for details. Accepted values: true or false.
  • config object (required for all integrations except Starburst (Trino)): This object specifies the integration settings. See the config object description for your integration for details.

Query parameter

  • dryRun boolean (optional): When true, the integration configuration will not actually be created, and the response returns the validation test statuses.
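
For example, to validate a Snowflake payload without persisting the configuration, append the dryRun query parameter to the POST request. This is only a sketch; the host and credentials are the same placeholders used in the examples above.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations?dryRun=true' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Snowflake",
    "autoBootstrap": true,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "authenticationType": "userPassword",
      "username": "taylor@snowflake.com",
      "password": "abc1234",
      "role": "ACCOUNTADMIN"
    }
    }'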

Response

A successful response includes the validation test statuses and the status of the integration configuration connection. See the response schema reference for details about the response schema. An unsuccessful request returns the HTTP status code and an error message; see the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

{
  "id": "123456789",
  "status": "creating",
  "validationResults": {
    "status": "passed",
    "validationTests": [
    {
      "name": "Initial Validation: Basic Connection Test",
      "status": "passed"
    },
    {
      "name": "Initial Validation: Default Warehouse Access Test",
      "status": "passed",
      "result": []
    },
    {
      "name": "Initial Validation: Validate access to Privileged Role",
      "status": "passed",
      "result": []
    },
    {
      "name": "Validate Automatic: Database Does Not Exist",
      "status": "passed"
    },
    {
      "name": "Validate Automatic: Impersonation Role Does Not Exist",
      "status": "skipped"
    },
    {
      "name": "Validate Automatic Bootstrap User Grants",
      "status": "passed"
    }
    ]
  }
}

For example, a 409 Conflict is returned when an integration already exists on the specified host:

{
  "statusCode": 409,
  "error": "Conflict",
  "message": "Snowflake integration already exists on host organization.us-east-1.snowflakecomputing.com (id = 123456789)"
}

DELETE /integrations/{id}

Deletes the integration configuration you specify in the request.

curl -X 'DELETE' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "authenticationType": "userPassword",
    "username": "taylor@snowflake.com",
    "password": "abc1234",
    "role": "ACCOUNTADMIN"
    }'

Request parameter

  • id number (required): The unique identifier of the integration configuration.

Query parameter

  • dryRun boolean (optional): When true, the integration configuration will not actually be deleted, and the response returns the validation test statuses.
  • forceDisable boolean (optional): When true, the integration will be deleted in Immuta; users must then manually remove all Immuta objects in the remote data platform.
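
For example, to preview the deletion without removing anything, append dryRun=true to the same request. This sketch reuses the placeholder ID, credentials, and payload from the example above.

curl -X 'DELETE' \
    'https://www.organization.immuta.com/integrations/123456789?dryRun=true' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "authenticationType": "userPassword",
    "username": "taylor@snowflake.com",
    "password": "abc1234",
    "role": "ACCOUNTADMIN"
    }'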

Body parameters

For Amazon S3 integrations, Databricks Unity Catalog integrations, Google BigQuery integrations, Starburst (Trino) integrations, or integration configurations with autoBootstrap set to false, no payload is required to delete the integration.

For the integrations below, the request accepts a JSON or YAML payload when autoBootstrap is set to true. See the payload description for your integration for parameters and details:

  • Azure Synapse Analytics
  • Redshift
  • Snowflake

Response

The response returns the status of the integration configuration that has been deleted. See the response schema reference for details about the response schema. An unsuccessful request returns the HTTP status code and an error message; see the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

{
  "id": "123456789",
  "status": "deleting",
  "validationResults": {
    "status": "passed",
    "validationTests": [
    {
      "name": "Initial Validation: Basic Connection Test",
      "status": "passed"
    },
    {
      "name": "Initial Validation: Default Warehouse Access Test",
      "status": "passed",
      "result": []
    },
    {
      "name": "Initial Validation: Validate access to Privileged Role",
      "status": "passed",
      "result": []
    },
    {
      "name": "Validate Automatic: Database Does Not Exist",
      "status": "passed"
    },
    {
      "name": "Validate Automatic: Impersonation Role Does Not Exist",
      "status": "skipped"
    },
    {
      "name": "Validate Automatic Bootstrap User Grants",
      "status": "passed"
    }
    ]
  }
}

GET /integrations/{id}

Gets the integration configuration you specify in the request.

curl -X 'GET' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f'

Request parameter

  • id number (required): The unique identifier of the integration configuration.

Response

The response returns an integration configuration. See the response schema reference for details about the response schema. An unsuccessful request returns the HTTP status code and an error message; see the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

{
  "id": "123456789",
  "status": "enabled",
  "validationResults": {
    "status": "passed",
    "validationTests": [
    {
      "name": "Initial Validation: Basic Connection Test",
      "status": "passed"
    }, {
      "name": "Initial Validation: Default Warehouse Access Test",
      "result": [],
      "status": "passed"
    }, {
      "name": "Initial Validation: Table Grants Role Prefix is Unique",
      "status": "passed"
    }, {
      "name": "Initial Validation: Validate access to Privileged Role",
      "result": [],
      "status": "passed"
    }, {
      "name": "Validate Automatic: Database Does Not Exist",
      "status": "passed"
    }, {
      "name": "Validate Automatic: Impersonation Role Does Not Exist",
      "status": "skipped"
    }, {
      "name": "Validate Automatic Bootstrap User Grants",
      "status": "passed"
    }]
  },
  "type": "Snowflake",
  "autoBootstrap": true,
  "config": {
    "host": "organization.us-east-1.snowflakecomputing.com",
    "warehouse": "SAMPLE_WAREHOUSE",
    "database": "SNOWFLAKE_SAMPLE_DATA",
    "port": 443,
    "audit": {
      "enabled": false
    },
    "workspaces": {
      "enabled": false
    },
    "impersonation": {
      "enabled": false
    },
    "lineage": {
      "enabled": false
    },
    "authenticationType": "userPassword",
    "username": "<REDACTED>",
    "password": "<REDACTED>",
    "role": "ACCOUNTADMIN"
  }
}

PUT /integrations/{id}

Updates an existing integration configuration.

Amazon S3 example

This request changes the name of the integration.

curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Native S3",
    "autoBootstrap": false,
    "config": {
      "name": "S3 integration edited",
      "awsAccountId": "123456789",
      "awsRegion": "us-east-1",
      "awsLocationRole": "arn:aws:iam::123456789:role/access-grants-instance-role",
      "awsLocationPath": "s3://",
      "authenticationType": "accessKey",
      "awsAccessKeyId": "123456789",
      "awsSecretAccessKey": "123456789"
    }
    }'

Azure Synapse Analytics example

This request enables user impersonation for the Azure Synapse Analytics integration.

curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Azure Synapse Analytics",
    "autoBootstrap": true,
    "config": {
      "host": "organization.azure.com",
      "schema": "sample_schema",
      "database": "immuta",
      "impersonation": {
        "enabled": true,
        "role": "IMMUTA_IMPERSONATION"
      },
      "metadataDelimiters": {
        "hashDelimiter": "|",
        "hashKeyDelimiter": "-",
        "arrayDelimiter": ","
      },
      "username": "taylor@synapse.com",
      "password": "abc1234",
      "authenticationType": "userPassword"
    }
    }'

Databricks Unity Catalog example

This request updates the access token.

curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Databricks",
    "autoBootstrap": true,
    "config": {
      "workspaceUrl": "www.example-workspace.cloud.databricks.com",
      "httpPath": "sql/protocolv1/o/0/0000-00000-abc123",
      "authenticationType": "token",
      "token": "REDACTED",
      "catalog": "immuta"
    }
    }'

Google BigQuery example

This request updates the private key for the Google BigQuery integration.

curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Google BigQuery",
    "autoBootstrap": false,
    "config": {
      "role": "immuta",
      "datasetSuffix": "_secureView",
      "dataset": "immuta",
      "location": "us-east1",
      "credential": "{\"type\":\"service_account\",\"project_id\":\"innate-conquest-123456\",\"private_key_id\":\"9163c12345690924f5dd218ff39\",\"private_key\":\"-----BEGIN PRIVATE KEY-----\nXXXXXXXro0s\n/yQlPQijowkccmrmWJyr93kdLnwJzBvLHCto/+W\ncvF2ygX9oM/dyUK//z//4nptMp+Ck//Yw3D4rIBwGu4DWiR1qRnf\nDoGyXfThPTQ==\n-----END PRIVATE KEY-----\n\",\"client_email\":\"service-account-id@innate-conquest-123456.iam.gserviceaccount.com\",\"client_id\":\"1166290***432952487857\",\"auth_uri\":\"https://accounts.google.com/o/oauth2/auth\",\"token_uri\":\"https://oauth2.googleapis.com/token\",\"auth_provider_x509_cert_url\":\"https://www.googleapis.com/oauth2/v1/certs\",\"client_x509_cert_url\":\"https://www.googleapis.com/robot/v1/metadata/x509/service-accound-id%40innate-conquest-123456.iam.gserviceaccount.com\",\"universe_domain\":\"googleapis.com\"}"
    }
    }'

Redshift example

This request enables user impersonation for the Redshift integration.

curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Redshift",
    "autoBootstrap": true,
    "config": {
      "host": "organization.aws.amazon.com",
      "database": "immuta",
      "initialDatabase": "REDSHIFT_SAMPLE_DATA",
      "impersonation": {
        "enabled": true,
        "role": "immuta_impersonation"
      },
      "authenticationType": "userPassword",
      "username": "taylor@redshift.com",
      "password": "abc1234"
    }
    }'

Snowflake example

This request enables auditing queries run in Snowflake.

curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Snowflake",
    "autoBootstrap": true,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "audit": {
        "enabled": true
      },
      "authenticationType": "userPassword",
      "username": "taylor@snowflake.com",
      "password": "abc1234",
      "role": "ACCOUNTADMIN"
    }
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

  • type string (required): The type of integration to configure. Accepted values: Azure Synapse Analytics, Databricks, Google BigQuery, Redshift, or Snowflake.
  • autoBootstrap boolean (required): When true, Immuta will automatically configure the integration in your Azure Synapse Analytics, Databricks Unity Catalog, Redshift, or Snowflake environment for you. When false, you must set up your environment manually before configuring the integration with the API. This parameter must be set to false in the Google BigQuery configuration. See the how-to guide for configuring your integration for details. Accepted values: true or false.
  • config object (required): This object specifies the integration settings. See the config object description for your integration for details.

Query parameter

  • dryRun boolean (optional): When true, the integration configuration will not actually be updated, and the response returns the validation test statuses.

Response

A successful response includes the validation test statuses and the status of the integration configuration connection. See the response schema reference for details about the response schema. An unsuccessful request returns the HTTP status code and an error message; see the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

{
  "id": "123456789",
  "status": "editing",
  "validationResults": {
    "status": "passed",
    "validationTests": [
    {
      "name": "Initial Validation: Basic Connection Test",
      "status": "passed"
    },
    {
      "name": "Initial Validation: Default Warehouse Access Test",
      "status": "passed",
      "result": []
    },
    {
      "name": "Initial Validation: Validate access to Privileged Role",
      "status": "passed",
      "result": []
    },
    {
      "name": "Validate Automatic: Database Does Not Exist",
      "status": "passed"
    },
    {
      "name": "Validate Automatic: Impersonation Role Does Not Exist",
      "status": "skipped"
    },
    {
      "name": "Validate Automatic Bootstrap User Grants",
      "status": "passed"
    }
    ]
  }
}

For example, a 409 Conflict is returned when the integration cannot be edited in its current state:

{
  "statusCode": 409,
  "error": "Conflict",
  "message": "Unable to edit integration with ID 123456789 in current state editing."
}

POST /integrations/{id}/regenerate

Regenerates an Immuta API key for the configured integration.

Starburst (Trino) example

This request regenerates an Immuta API key for the configured Starburst (Trino) integration. Once you make this request, your old Immuta API key will be deleted and will no longer be valid. See the Configure a Starburst (Trino) integration page for instructions on updating your Starburst (Trino) integration to use the new API key.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/123456789/regenerate' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f'

Response

The response returns the new Immuta API key. An unsuccessful request returns the HTTP status code and an error message; see the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

{ "newKey": "5bb6cae9******300c21acbb" }

GET /integrations/{id}/status

Gets the status of the integration specified in the request.

curl -X 'GET' \
    'https://www.organization.immuta.com/integrations/123456789/status' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f'

Request parameter

  • id number (required): The unique identifier of the integration configuration.

Response

The response returns the status of the specified integration. An unsuccessful request returns the HTTP status code and an error message; see the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

{"id":123456789,"status":"enabled"}

POST /integrations/scripts/cleanup

Creates a script to remove Immuta-managed resources from your platform. This endpoint is for Azure Synapse Analytics, Redshift, and Snowflake integrations that were not successfully created and, therefore, do not have an integration ID.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/scripts/cleanup' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Snowflake",
    "autoBootstrap": false,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "audit": {
        "enabled": true
      },
      "workspaces": {
        "enabled": false
      },
      "impersonation": {
        "enabled": false
      },
      "authenticationType": "userPassword",
      "username": "IMMUTA_SYSTEM_ACCOUNT",
      "password": "abc1234"
    }
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

  • type string (required): The type of integration to clean up. Accepted values: Azure Synapse Analytics, Redshift, or Snowflake.
  • autoBootstrap boolean (required): Set to false to specify that you will run the script in your environment yourself to clean up the integration resources. See the manual setup section of the guide for your integration for details. Accepted value: false.
  • config object (required): This object specifies the integration settings. See the config object description for your integration for details.

Response

The response returns the script that you will run in your Azure Synapse Analytics, Redshift, or Snowflake environment.

Once you have run the script:

  • Use the DELETE /integrations/{id} endpoint to delete your Redshift or Snowflake integration in Immuta.
  • For Azure Synapse Analytics integrations, use the /integrations/scripts/post-cleanup endpoint to create another script that will finish removing Immuta-managed resources from your platform.

POST /integrations/scripts/create

Creates a script for you to run manually to set up objects and resources for Immuta to manage and enforce access controls on your data. This endpoint is available for Azure Synapse Analytics, Databricks Unity Catalog, Redshift, and Snowflake integrations.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/scripts/create' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Snowflake",
    "autoBootstrap": false,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "audit": {
        "enabled": false
      },
      "workspaces": {
        "enabled": false
      },
      "impersonation": {
        "enabled": false
      },
      "authenticationType": "userPassword",
      "username": "IMMUTA_SYSTEM_ACCOUNT",
      "password": "abc1234"
    }
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

  • type string (required): The type of integration to configure. Accepted values: Azure Synapse Analytics, Databricks, Redshift, or Snowflake.
  • autoBootstrap boolean (required): Set to false to specify that you will run the script in your environment yourself to configure the integration. You must run the Immuta script before creating the integration. See the manual setup guide for your integration for details. Accepted value: false.
  • config object (required): This object specifies the integration settings. See the config object description for your integration for details.

Response

The response returns the script that you will run in your Azure Synapse Analytics, Databricks Unity Catalog, Redshift, or Snowflake environment.
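
Put together, the manual bootstrap flow is: request the script, run it in your data platform, then create the integration with autoBootstrap set to false. The sketch below is only illustrative; snowflake-config.json is a placeholder payload file matching the body parameters above, and the exact shape of the script response is described in the response schema reference.

# 1. Generate the setup script and save the response for review.
curl -s -X 'POST' \
    'https://www.organization.immuta.com/integrations/scripts/create' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d @snowflake-config.json > create-script.out

# 2. Review create-script.out and run the script in your Snowflake environment.

# 3. Create the integration using the same config, keeping autoBootstrap set to false.
curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d @snowflake-config.json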

POST /integrations/{id}/scripts/delete

Creates a script to remove Immuta-managed resources from your platform. This endpoint is for Azure Synapse Analytics, Redshift, and Snowflake integrations that were successfully created.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/1/scripts/delete' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f'

Response

The response returns the script that you will run in your Azure Synapse Analytics, Redshift, or Snowflake environment.

Once you have run the script, use the DELETE /integrations/{id} endpoint to delete your integration in Immuta. For Azure Synapse Analytics integrations, you must also make a request to the /integrations/scripts/post-cleanup endpoint to create another script that will finish removing Immuta-managed resources from the platform.

POST /integrations/{id}/scripts/edit

Creates a script for you to run manually to edit objects and resources managed by Immuta in your platform. This endpoint is available for Azure Synapse Analytics, Redshift, and Snowflake integrations.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/1/scripts/edit' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Snowflake",
    "autoBootstrap": false,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "audit": {
        "enabled": true
      },
      "workspaces": {
        "enabled": false
      },
      "impersonation": {
        "enabled": false
      },
      "authenticationType": "userPassword",
      "username": "IMMUTA_SYSTEM_ACCOUNT",
      "password": "abc1234"
    }
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

  • type string (required): The type of integration to configure. Accepted values: Azure Synapse Analytics, Redshift, or Snowflake.
  • autoBootstrap boolean (required): Set to false to specify that you will run the script in your environment yourself to configure the integration. You must run the Immuta script before finishing your edits with the PUT /integrations/{id} endpoint. See the manual setup guide for your integration for details. Accepted value: false.
  • config object (required): This object specifies the integration settings. Some settings cannot be changed once an integration is configured. See the config object description for your integration for details.

Response

The response returns the script that you will run in your Azure Synapse Analytics, Redshift, or Snowflake environment. Once you have run the script, use the PUT /integrations/{id} endpoint to finish editing your integration.
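
A sketch of that sequence follows. It assumes the updated settings live in a placeholder snowflake-config.json file and reuses the placeholder integration ID and API key from the example above.

# 1. Generate the edit script and save the response for review.
curl -s -X 'POST' \
    'https://www.organization.immuta.com/integrations/1/scripts/edit' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d @snowflake-config.json > edit-script.out

# 2. Review edit-script.out and run the script in your Snowflake environment.

# 3. Finish the edit with the same settings, keeping autoBootstrap set to false.
curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/1' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d @snowflake-config.json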

POST /integrations/scripts/initial-create

Creates the first script for you to run manually to set up objects and resources for Immuta to manage and enforce access controls on your data in Azure Synapse Analytics or Redshift integrations.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/scripts/initial-create' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Azure Synapse Analytics",
    "autoBootstrap": false,
    "config": {
      "host": "organization.azure.com",
      "schema": "sample_schema",
      "database": "immuta",
      "metadataDelimiters": {
        "hashDelimiter": "|",
        "hashKeyDelimiter": "-",
        "arrayDelimiter": ","
      },
      "username": "taylor@synapse.com",
      "password": "abc1234",
      "authenticationType": "userPassword"
    }
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

  • type string (required): The type of integration to configure. Accepted values: Azure Synapse Analytics or Redshift.
  • autoBootstrap boolean (required): Set to false to specify that you will run the script in your environment yourself to configure the integration. You must run the Immuta script before creating the integration. See the manual setup guide for your integration for details. Accepted value: false.
  • config object (required): This object specifies the integration settings. See the config object description of the Azure Synapse Analytics or Redshift integration configuration for details.

Response

The response returns the script that you will run in your Azure Synapse Analytics or Redshift environment. Once you have run this script, use the /integrations/scripts/create endpoint to generate a script to finish creating the Immuta-managed resources in your platform.

POST /integrations/scripts/post-cleanup

Creates a second script to remove the final Immuta-managed resources from your Azure Synapse Analytics platform. This endpoint is for Azure Synapse Analytics integrations that were not successfully created and, therefore, do not have an integration ID. Before making a request like the one below, you must make a request to the /integrations/scripts/cleanup endpoint to create the first script that will remove the initial Immuta-managed resources from the platform.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/scripts/post-cleanup' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Azure Synapse Analytics",
    "autoBootstrap": false,
    "config": {
      "host": "organization.azure.com",
      "schema": "sample_schema",
      "database": "immuta",
      "metadataDelimiters": {
        "hashDelimiter": "|",
        "hashKeyDelimiter": "-",
        "arrayDelimiter": ","
      },
      "username": "taylor@synapse.com",
      "password": "abc1234",
      "authenticationType": "userPassword"
    }
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

  • type string (required): The type of integration to clean up. Accepted value: Azure Synapse Analytics.
  • autoBootstrap boolean (required): Set to false to specify that you will run the script in your environment yourself to clean up the integration resources. See the Azure Synapse Analytics manual setup section for details. Accepted value: false.
  • config object (required): This object specifies the integration settings. See the config object description of the Azure Synapse Analytics integration configuration for details.

Response

The response returns the script that you will run in your Azure Synapse Analytics environment.
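
For a failed Azure Synapse Analytics setup, the overall teardown therefore involves two script requests. The sketch below only outlines the order of calls; synapse-config.json is a placeholder payload file matching the body parameters above.

# 1. Generate the first cleanup script, then run it in your Azure Synapse Analytics environment.
curl -s -X 'POST' \
    'https://www.organization.immuta.com/integrations/scripts/cleanup' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d @synapse-config.json > cleanup-script.out

# 2. Generate the second cleanup script, then run it to remove the final Immuta-managed resources.
curl -s -X 'POST' \
    'https://www.organization.immuta.com/integrations/scripts/post-cleanup' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d @synapse-config.json > post-cleanup-script.out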

Related guides

See the following how-to guides for configuration examples and steps for creating, managing, or deleting your integration:

  • Configure an Amazon S3 integration
  • Configure an Azure Synapse Analytics integration
  • Configure a Databricks Unity Catalog integration
  • Configure a Google BigQuery integration
  • Configure a Redshift integration
  • Configure a Snowflake integration
  • Configure a Starburst (Trino) integration

