Integrations API Endpoints

The integrations resource allows you to create, configure, and manage your integrations. How Immuta manages and administers policies in your data platform varies by integration.

To configure or manage an integration, users must have the APPLICATION_ADMIN Immuta permission.

Endpoints

| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | /integrations | Gets all integration configurations |
| POST | /integrations | Creates an integration |
| DELETE | /integrations/{id} | Deletes a configured integration |
| GET | /integrations/{id} | Gets an integration configuration |
| PUT | /integrations/{id} | Updates a configured integration |
| POST | /integrations/{id}/regenerate | Regenerates an Immuta API key for the configured integration |
| GET | /integrations/{id}/status | Gets the status of the specified integration |
| POST | /integrations/scripts/cleanup | Creates a script to remove Immuta-managed resources from your platform for integrations that were not successfully created |
| POST | /integrations/scripts/create | Creates a script to set up Immuta-managed resources in your platform |
| POST | /integrations/{id}/scripts/delete | Creates a script to remove Immuta-managed resources from your platform for integrations that were successfully configured |
| POST | /integrations/{id}/scripts/edit | Creates a script to edit existing Immuta-managed resources in your platform |
| POST | /integrations/scripts/initial-create | Creates the first script to set up Immuta-managed resources in your Azure Synapse Analytics or Redshift platform |
| POST | /integrations/scripts/post-cleanup | Creates the second script to remove Immuta-managed resources from your Azure Synapse Analytics integration if it was not successfully created |

GET /integrations

Gets all integration configurations.

curl -X 'GET' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f'

Response

The response returns the configuration for all integrations. See the response schema reference for details about the response schema. An unsuccessful request returns the status code and an error message. See the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

[
  {
    "id": "1",
    "status": "enabled",
    "validationResults": {
      "status": "passed",
      "validationTests": [
      {
        "name": "Initial Validation: Basic Connection Test",
        "status": "passed"
      },
      {
        "name": "Initial Validation: Default Warehouse Access Test",
        "status": "passed",
        "result": []
      },
      {
        "name": "Initial Validation: Validate access to Privileged Role",
        "status": "passed",
        "result": []
      },
      {
        "name": "Validate Automatic: Database Does Not Exist",
        "status": "passed"
      },
      {
        "name": "Validate Automatic: Impersonation Role Does Not Exist",
        "status": "skipped"
      },
      {
        "name": "Validate Automatic Bootstrap User Grants",
        "status": "passed"
      }
    ] },
    "type": "Snowflake",
    "autoBootstrap": true,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "port": 443,
      "audit": {
        "enabled": false
      },
      "workspaces": {
        "enabled": false
      },
      "impersonation": {
        "enabled": false
      },
      "lineage": {
        "enabled": false
      },
      "authenticationType": "userPassword",
      "username": "<REDACTED>",
      "password": "<REDACTED>",
      "role": "ACCOUNTADMIN"
    }
  },
  {
    "id": "2",
    "status": "enabled",
    "type": "Databricks",
    "validationResults": {
      "status": "passed",
      "validationTests": [
        {
          "name": "Metastore validation",
          "status": "passed"
        },
        {
          "name": "Basic Connection Test",
          "result": [
            {
              "1": 1
            }
          ],
          "status": "passed"
        }
      ]
    },
    "autoBootstrap": true,
    "config": {
      "workspaceUrl": "www.example-workspace.cloud.databricks.com",
      "httpPath": "sql/protocolv1/o/0/0000-00000-abc123",
      "authenticationType": "token",
      "token": "REDACTED",
      "audit": {
        "enabled": false
      },
      "catalog": "immuta"
    }
  }
]
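
The list response pairs well with a JSON processor. For example, a minimal sketch that prints each integration's id, type, and status, assuming the jq CLI is installed:

curl -s -X 'GET' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
  | jq -r '.[] | "\(.id)\t\(.type)\t\(.status)"'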

POST /integrations

Creates an integration configuration that allows Immuta to manage access policies on data registered in Immuta.

Amazon S3 example

When you connect Immuta to your AWS account, the awsLocationPath is the base S3 location prefix that Immuta will use for this connection when registering S3 data sources.

This request configures the integration using the AWS access key authentication method.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Native S3",
    "autoBootstrap": false,
    "config": {
      "name": "S3 integration",
      "awsAccountId": "123456789",
      "awsRegion": "us-east-1",
      "awsLocationRole": "arn:aws:iam::123456789:role/access-grants-instance-role",
      "awsLocationPath": "s3://",
      "authenticationType": "accessKey",
      "awsAccessKeyId": "123456789",
      "awsSecretAccessKey": "123456789"
    }
    }'

Azure Synapse Analytics example

When you connect Immuta to your Azure Synapse Analytics account, the schema you specify is where all the policy-enforced views will be created and managed by Immuta.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Azure Synapse Analytics",
    "autoBootstrap": true,
    "config": {
      "host": "organization.azure.com",
      "schema": "sample_schema",
      "database": "immuta",
      "metadataDelimiters": {
        "hashDelimiter": "|",
        "hashKeyDelimiter": "-",
        "arrayDelimiter": ","
      },
      "username": "taylor@synapse.com",
      "password": "abc1234",
      "authenticationType": "userPassword"
    }
    }'

Databricks Unity Catalog example

This request creates a Databricks Unity Catalog integration configuration that allows Immuta to administer Unity Catalog policies on data registered in Immuta.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Databricks",
    "autoBootstrap": true,
    "config": {
      "workspaceUrl": "www.example-workspace.cloud.databricks.com",
      "httpPath": "sql/protocolv1/o/0/0000-00000-abc123",
      "authenticationType": "token",
      "token": "REDACTED",
      "catalog": "immuta"
    }
    }'

Google BigQuery example

When you connect Immuta to your Google BigQuery account, the dataset you specify is where all the policy-enforced views will be created and managed by Immuta.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Google BigQuery",
    "autoBootstrap": false,
    "config": {
      "role": "immuta",
      "datasetSuffix": "_secureView",
      "dataset": "immuta",
      "location": "us-east1",
      "credential": "{\"type\":\"service_account\",\"project_id\":\"innate-conquest-123456\",\"private_key_id\":\"9163c12345690924f5dd218ff39\",\"private_key\":\"-----BEGIN PRIVATE KEY-----\nXXXXXXXro0s\n/yQlPQijowkccmrmWJyr93kdLnwJzBvLHCto/+W\ncvF2ygX9oM/dyUK//z//4nptMp+Ck//Yw3D4rIBwGu4DWiR1qRnf\nDoGyXfThPTQ==\n-----END PRIVATE KEY-----\n\",\"client_email\":\"service-account-id@innate-conquest-123456.iam.gserviceaccount.com\",\"client_id\":\"1166290***432952487857\",\"auth_uri\":\"https://accounts.google.com/o/oauth2/auth\",\"token_uri\":\"https://oauth2.googleapis.com/token\",\"auth_provider_x509_cert_url\":\"https://www.googleapis.com/oauth2/v1/certs\",\"client_x509_cert_url\":\"https://www.googleapis.com/robot/v1/metadata/x509/service-accound-id%40innate-conquest-123456.iam.gserviceaccount.com\",\"universe_domain\":\"googleapis.com\"}"
    }
    }'

Redshift example

When you connect Immuta to your Redshift account, the Immuta system user will use the database you specify to manage and store metadata. The initial database (REDSHIFT_SAMPLE_DATA, in the request below) is an existing Redshift database that Immuta connects to in order to create the Immuta-managed database (immuta, in the request below).

This request specifies userPassword as the authentication type for the Immuta system user. The username and password provided are credentials for a system account that can manage the database.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Redshift",
    "autoBootstrap": true,
    "config": {
      "host": "organization.aws.amazon.com",
      "database": "immuta",
      "initialDatabase": "REDSHIFT_SAMPLE_DATA",
      "authenticationType": "userPassword",
      "username": "taylor@redshift.com",
      "password": "abc1234"
    }
    }'

Snowflake example

When you connect Immuta to your Snowflake account, the warehouse you specify is the default pool of compute resources the Immuta system user will use to run queries and perform other Snowflake operations.

This request specifies the userPassword authentication type. The username and password provided are credentials of a Snowflake account attached to a role with the required privileges. These credentials are not stored; they are used by Immuta to configure the integration.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Snowflake",
    "autoBootstrap": true,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "authenticationType": "userPassword",
      "username": "taylor@snowflake.com",
      "password": "abc1234",
      "role": "ACCOUNTADMIN"
    }
    }'

Starburst (Trino) example

When you configure the Starburst (Trino) integration, Immuta generates an API key and configuration snippet on the Immuta app settings page that you will use to configure your Starburst cluster.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Trino"
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

| Parameter | Description | Required or optional | Default values | Accepted values |
|-----------|-------------|----------------------|----------------|-----------------|
| type string | The type of integration to configure. | Required | - | Azure Synapse Analytics, Databricks, Google BigQuery, Native S3, Redshift, Snowflake, or Trino |
| autoBootstrap boolean | When true, Immuta will automatically configure the integration in your Azure Synapse Analytics, Databricks Unity Catalog, Redshift, or Snowflake environment for you. When false, you must set up your environment manually before configuring the integration with the API. This parameter must be set to false in the Amazon S3 and Google BigQuery configurations. See the specific how-to guide for configuring your integration for details: Azure Synapse Analytics, Databricks Unity Catalog, Redshift, or Snowflake. | Required for all integrations except Starburst (Trino) | - | true or false |
| config object | This object specifies the integration settings. See the config object description for your integration for details: Amazon S3, Azure Synapse Analytics, Databricks Unity Catalog, Google BigQuery, Redshift, or Snowflake. | Required for all integrations except Starburst (Trino) | - | - |
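
Because the endpoint accepts YAML as well as JSON, the Snowflake payload shown above could equivalently be written as the following sketch (same placeholder values as earlier; set the request's Content-Type header accordingly):

type: Snowflake
autoBootstrap: true
config:
  host: organization.us-east-1.snowflakecomputing.com
  warehouse: SAMPLE_WAREHOUSE
  database: SNOWFLAKE_SAMPLE_DATA
  authenticationType: userPassword
  username: taylor@snowflake.com
  password: abc1234
  role: ACCOUNTADMIN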

Query parameter

| Parameter | Description | Required or optional |
|-----------|-------------|----------------------|
| dryRun boolean | When true, the integration configuration will not actually be created, and the response returns the validation test statuses. | Optional |
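
For example, a minimal dry-run sketch that validates the Snowflake configuration from the example above without creating the integration; the validation tests still run and their statuses are returned:

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations?dryRun=true' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Snowflake",
    "autoBootstrap": true,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "authenticationType": "userPassword",
      "username": "taylor@snowflake.com",
      "password": "abc1234",
      "role": "ACCOUNTADMIN"
    }
    }'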

Response

A successful response includes the status of the integration configuration connection and the validation test statuses. See the response schema reference for details about the response schema. An unsuccessful request returns the status code and an error message. See the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

{
  "id": "123456789",
  "status": "creating",
  "validationResults": {
    "status": "passed",
    "validationTests": [
    {
      "name": "Initial Validation: Basic Connection Test",
      "status": "passed"
    },
    {
      "name": "Initial Validation: Default Warehouse Access Test",
      "status": "passed",
      "result": []
    },
    {
      "name": "Initial Validation: Validate access to Privileged Role",
      "status": "passed",
      "result": []
    },
    {
      "name": "Validate Automatic: Database Does Not Exist",
      "status": "passed"
    },
    {
      "name": "Validate Automatic: Impersonation Role Does Not Exist",
      "status": "skipped"
    },
    {
      "name": "Validate Automatic Bootstrap User Grants",
      "status": "passed"
    }
    ]
  }
}

An unsuccessful request returns the status code and an error message, for example:

{
  "statusCode": 409,
  "error": "Conflict",
  "message": "Snowflake integration already exists on host organization.us-east-1.snowflakecomputing.com (id = 123456789)"
}

DELETE /integrations/{id}

Deletes the integration configuration you specify in the request.

curl -X 'DELETE' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "authenticationType": "userPassword",
    "username": "taylor@snowflake.com",
    "password": "abc1234",
    "role": "ACCOUNTADMIN"
    }'

Request parameter

| Parameter | Description | Required or optional |
|-----------|-------------|----------------------|
| id number | The unique identifier of the integration configuration. | Required |

Query parameters

| Parameter | Description | Required or optional |
|-----------|-------------|----------------------|
| dryRun boolean | When true, the integration configuration will not actually be deleted, and the response returns the validation test statuses. | Optional |
| forceDisable boolean | When true, the integration will be deleted in Immuta, but users must manually remove all Immuta objects in the remote data platform. | Optional |
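
A minimal force-disable sketch: Immuta deletes the integration on its side, and you then remove the remaining Immuta objects in the data platform manually. Whether a payload is also required depends on your integration type and autoBootstrap setting, as described under Body parameters below:

curl -X 'DELETE' \
    'https://www.organization.immuta.com/integrations/123456789?forceDisable=true' \
    -H 'accept: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f'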

Body parameters

For Amazon S3 integrations, Databricks Unity Catalog integrations, Google BigQuery integrations, Starburst (Trino) integrations, or integration configurations with autoBootstrap set to false, no payload is required to delete the integration.

For Azure Synapse Analytics, Redshift, and Snowflake integrations with autoBootstrap set to true, the request accepts a JSON or YAML payload. See the delete integration payload description for your integration (Azure Synapse Analytics, Redshift, or Snowflake) for parameters and details.

Response

The response returns the status of the integration configuration that has been deleted. See the response schema reference for details about the response schema. An unsuccessful request returns the status code and an error message. See the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

{
  "id": "123456789",
  "status": "deleting",
  "validationResults": {
    "status": "passed",
    "validationTests": [
    {
      "name": "Initial Validation: Basic Connection Test",
      "status": "passed"
    },
    {
      "name": "Initial Validation: Default Warehouse Access Test",
      "status": "passed",
      "result": []
    },
    {
      "name": "Initial Validation: Validate access to Privileged Role",
      "status": "passed",
      "result": []
    },
    {
      "name": "Validate Automatic: Database Does Not Exist",
      "status": "passed"
    },
    {
      "name": "Validate Automatic: Impersonation Role Does Not Exist",
      "status": "skipped"
    },
    {
      "name": "Validate Automatic Bootstrap User Grants",
      "status": "passed"
    }
    ]
  }
}

GET /integrations/{id}

Gets the integration configuration you specify in the request.

curl -X 'GET' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f'

Request parameter

| Parameter | Description | Required or optional |
|-----------|-------------|----------------------|
| id number | The unique identifier of the integration configuration. | Required |

Response

The response returns an integration configuration. See the response schema reference for details about the response schema. An unsuccessful request returns the status code and an error message. See the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

{
  "id": "123456789",
  "status": "enabled",
  "validationResults": {
    "status": "passed",
    "validationTests": [
    {
      "name": "Initial Validation: Basic Connection Test",
      "status": "passed"
    }, {
      "name": "Initial Validation: Default Warehouse Access Test",
      "result": [],
      "status": "passed"
    }, {
      "name": "Initial Validation: Table Grants Role Prefix is Unique",
      "status": "passed"
    }, {
      "name": "Initial Validation: Validate access to Privileged Role",
      "result": [],
      "status": "passed"
    }, {
      "name": "Validate Automatic: Database Does Not Exist",
      "status": "passed"
    }, {
      "name": "Validate Automatic: Impersonation Role Does Not Exist",
      "status": "skipped"
    }, {
      "name": "Validate Automatic Bootstrap User Grants",
      "status": "passed"
    }]
  },
  "type": "Snowflake",
  "autoBootstrap": true,
  "config": {
    "host": "organization.us-east-1.snowflakecomputing.com",
    "warehouse": "SAMPLE_WAREHOUSE",
    "database": "SNOWFLAKE_SAMPLE_DATA",
    "port": 443,
    "audit": {
      "enabled": false
    },
    "workspaces": {
      "enabled": false
    },
    "impersonation": {
      "enabled": false
    },
    "lineage": {
      "enabled": false
    },
    "authenticationType": "userPassword",
    "username": "<REDACTED>",
    "password": "<REDACTED>",
    "role": "ACCOUNTADMIN"
  }
}

PUT /integrations/{id}

Updates an existing integration configuration.

Amazon S3 example

This request changes the name of the integration.

curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Native S3",
    "autoBootstrap": false,
    "config": {
      "name": "S3 integration edited",
      "awsAccountId": "123456789",
      "awsRegion": "us-east-1",
      "awsLocationRole": "arn:aws:iam::123456789:role/access-grants-instance-role",
      "awsLocationPath": "s3://",
      "authenticationType": "accessKey",
      "awsAccessKeyId": "123456789",
      "awsSecretAccessKey": "123456789"
    }
    }'

Azure Synapse Analytics example

This request enables user impersonation for the Azure Synapse Analytics integration.

curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Azure Synapse Analytics",
    "autoBootstrap": true,
    "config": {
      "host": "organization.azure.com",
      "schema": "sample_schema",
      "database": "immuta",
      "impersonation": {
        "enabled": true,
        "role": "IMMUTA_IMPERSONATION"
      },
      "metadataDelimiters": {
        "hashDelimiter": "|",
        "hashKeyDelimiter": "-",
        "arrayDelimiter": ","
      },
      "username": "taylor@synapse.com",
      "password": "abc1234",
      "authenticationType": "userPassword"
    }
    }'

Databricks Unity Catalog example

This request updates the access token.

curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Databricks",
    "autoBootstrap": true,
    "config": {
      "workspaceUrl": "www.example-workspace.cloud.databricks.com",
      "httpPath": "sql/protocolv1/o/0/0000-00000-abc123",
      "authenticationType": "token",
      "token": "REDACTED",
      "catalog": "immuta"
    }
    }'

Google BigQuery example

This request updates the private key for the Google BigQuery integration.

curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/{id}' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Google BigQuery",
    "autoBootstrap": false,
    "config": {
      "role": "immuta",
      "datasetSuffix": "_secureView",
      "dataset": "immuta",
      "location": "us-east1",
      "credential": "{\"type\":\"service_account\",\"project_id\":\"innate-conquest-123456\",\"private_key_id\":\"9163c12345690924f5dd218ff39\",\"private_key\":\"-----BEGIN PRIVATE KEY-----\nXXXXXXXro0s\n/yQlPQijowkccmrmWJyr93kdLnwJzBvLHCto/+W\ncvF2ygX9oM/dyUK//z//4nptMp+Ck//Yw3D4rIBwGu4DWiR1qRnf\nDoGyXfThPTQ==\n-----END PRIVATE KEY-----\n\",\"client_email\":\"service-account-id@innate-conquest-123456.iam.gserviceaccount.com\",\"client_id\":\"1166290***432952487857\",\"auth_uri\":\"https://accounts.google.com/o/oauth2/auth\",\"token_uri\":\"https://oauth2.googleapis.com/token\",\"auth_provider_x509_cert_url\":\"https://www.googleapis.com/oauth2/v1/certs\",\"client_x509_cert_url\":\"https://www.googleapis.com/robot/v1/metadata/x509/service-accound-id%40innate-conquest-123456.iam.gserviceaccount.com\",\"universe_domain\":\"googleapis.com\"}"
    }
    }'

Redshift example

This request enables user impersonation for the Redshift integration.

curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Redshift",
    "autoBootstrap": true,
    "config": {
      "host": "organization.aws.amazon.com",
      "database": "immuta",
      "initialDatabase": "REDSHIFT_SAMPLE_DATA",
      "impersonation": {
        "enabled": true,
        "role": "immuta_impersonation"
      },
      "authenticationType": "okta",
      "okta": {
        "username": "taylor@redshift.com",
        "password": "abc1234",
        "appId": "Okta",
        "idpHost": "organization.okta.com",
        "role": "admin"
      }
    }
    }'

Snowflake example

This request enables auditing queries run in Snowflake.

curl -X 'PUT' \
    'https://www.organization.immuta.com/integrations/123456789' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Snowflake",
    "autoBootstrap": true,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "audit": {
        "enabled": true
      },
      "authenticationType": "userPassword",
      "username": "taylor@snowflake.com",
      "password": "abc1234",
      "role": "ACCOUNTADMIN"
    }
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

| Parameter | Description | Required or optional | Default values | Accepted values |
|-----------|-------------|----------------------|----------------|-----------------|
| type string | The type of integration to configure. | Required | - | Azure Synapse Analytics, Databricks, Google BigQuery, Redshift, or Snowflake |
| autoBootstrap boolean | When true, Immuta will automatically configure the integration in your Azure Synapse Analytics, Databricks Unity Catalog, Redshift, or Snowflake environment for you. When false, you must set up your environment manually before configuring the integration with the API. This parameter must be set to false in the Google BigQuery configuration. See the specific how-to guide for configuring other integrations: Azure Synapse Analytics, Databricks Unity Catalog, Redshift, or Snowflake. | Required | - | true or false |
| config object | This object specifies the integration settings. See the config object description for your integration for details: Azure Synapse Analytics, Databricks Unity Catalog, Google BigQuery, Redshift, or Snowflake. | Required | - | - |

Query parameter

| Parameter | Description | Required or optional |
|-----------|-------------|----------------------|
| dryRun boolean | When true, the integration configuration will not actually be updated, and the response returns the validation test statuses. | Optional |

Response

A successful response includes the validation test statuses. See the response schema reference for details about the response schema. An unsuccessful request returns the status code and an error message. See the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

{
  "id": "123456789",
  "status": "editing",
  "validationResults": {
    "status": "passed",
    "validationTests": [
    {
      "name": "Initial Validation: Basic Connection Test",
      "status": "passed"
    },
    {
      "name": "Initial Validation: Default Warehouse Access Test",
      "status": "passed",
      "result": []
    },
    {
      "name": "Initial Validation: Validate access to Privileged Role",
      "status": "passed",
      "result": []
    },
    {
      "name": "Validate Automatic: Database Does Not Exist",
      "status": "passed"
    },
    {
      "name": "Validate Automatic: Impersonation Role Does Not Exist",
      "status": "skipped"
    },
    {
      "name": "Validate Automatic Bootstrap User Grants",
      "status": "passed"
    }
    ]
  }
}

An unsuccessful request returns the status code and an error message, for example:

{
  "statusCode": 409,
  "error": "Conflict",
  "message": "Unable to edit integration with ID 123456789 in current state editing."
}

POST /integrations/{id}/regenerate

Regenerates an Immuta API key for the configured Starburst (Trino) integration. Once you make this request, your old Immuta API key is deleted and will no longer be valid. See the Configure a Starburst (Trino) integration page for instructions on updating your Starburst (Trino) integration to use the new API key.

Starburst (Trino) example

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/123456789/regenerate' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f'

Response

The response returns the new Immuta API key. An unsuccessful request returns the status code and an error message. See the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

{ "newKey": "5bb6cae9******300c21acbb" }

GET /integrations/{id}/status

Gets the status of the integration specified in the request.

curl -X 'GET' \
    'https://www.organization.immuta.com/integrations/123456789/status' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f'

Request parameter

| Parameter | Description | Required or optional |
|-----------|-------------|----------------------|
| id number | The unique identifier of the integration configuration. | Required |

Response

The response returns the status of the specified integration. An unsuccessful request returns the HTTP status code and an error message. See the HTTP status codes and error messages guide for a list of statuses, error messages, and troubleshooting guidance.

{
  "id": 123456789,
  "status": "enabled"
}

POST /integrations/scripts/cleanup

Creates a script to remove Immuta-managed resources from your platform. This endpoint is for Azure Synapse Analytics, Redshift, and Snowflake integrations that were not successfully created and, therefore, do not have an integration ID.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/scripts/cleanup' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Snowflake",
    "autoBootstrap": false,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "audit": {
        "enabled": true
      },
      "workspaces": {
        "enabled": false
      },
      "impersonation": {
        "enabled": false
      },
      "authenticationType": "userPassword",
      "username": "IMMUTA_SYSTEM_ACCOUNT",
      "password": "abc1234"
    }
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

| Parameter | Description | Required or optional | Default values | Accepted values |
|-----------|-------------|----------------------|----------------|-----------------|
| type string | The type of integration to clean up. | Required | - | Azure Synapse Analytics, Redshift, or Snowflake |
| autoBootstrap boolean | Set to false to specify that you will run the script in your environment yourself to clean up the integration resources. See the Azure Synapse Analytics, Redshift, or Snowflake manual setup section for details. | Required | - | false |
| config object | This object specifies the integration settings. See the config object description for your integration for details: Azure Synapse Analytics, Redshift, or Snowflake. | Required | - | - |

Response

The response returns the script that you will run in your Azure Synapse Analytics, Redshift, or Snowflake environment.

Once you have run the script,

  • use the DELETE /integrations/{id} endpoint to delete your Redshift or Snowflake integration in Immuta.
  • for Azure Synapse Analytics integrations, use the /integrations/scripts/post-cleanup endpoint to create another script that will finish removing Immuta-managed resources from your platform.

POST /integrations/scripts/create

Creates a script for you to run manually to set up objects and resources for Immuta to manage and enforce access controls on your data. This endpoint is available for Azure Synapse Analytics, Databricks Unity Catalog, Redshift, and Snowflake integrations.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/scripts/create' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Snowflake",
    "autoBootstrap": false,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "audit": {
        "enabled": false
      },
      "workspaces": {
        "enabled": false
      },
      "impersonation": {
        "enabled": false
      },
      "authenticationType": "userPassword",
      "username": "IMMUTA_SYSTEM_ACCOUNT",
      "password": "abc1234"
    }
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

| Parameter | Description | Required or optional | Default values | Accepted values |
|-----------|-------------|----------------------|----------------|-----------------|
| type string | The type of integration to configure. | Required | - | Azure Synapse Analytics, Databricks, Redshift, or Snowflake |
| autoBootstrap boolean | Set to false to specify that you will run the script in your environment yourself to configure the integration. You must run the Immuta script before creating the integration. See the Azure Synapse Analytics, Databricks Unity Catalog, Redshift, or Snowflake manual setup guides for details. | Required | - | false |
| config object | This object specifies the integration settings. See the config object description for your integration for details: Azure Synapse Analytics, Databricks Unity Catalog, Redshift, or Snowflake. | Required | - | - |

Response

The response returns the script that you will run in your Azure Synapse Analytics, Databricks Unity Catalog, Redshift, or Snowflake environment.
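
One way to capture the returned script for review before running it is to save the raw response to a file. A sketch, assuming the Snowflake payload from the example above has been saved locally as snowflake-config.json (both file names here are arbitrary):

# Save the raw response so the setup script can be reviewed before it is
# run in the Snowflake environment.
curl -s -X 'POST' \
    'https://www.organization.immuta.com/integrations/scripts/create' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d @snowflake-config.json \
    -o create-script-response.json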

POST /integrations/{id}/scripts/delete

Creates a script to remove Immuta-managed resources from your platform. This endpoint is for Azure Synapse Analytics, Redshift, and Snowflake integrations that were successfully created.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/1/scripts/delete' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f'

Response

The response returns the script that you will run in your Azure Synapse Analytics, Redshift, or Snowflake environment.

Once you have run the script, use the DELETE /integrations/{id} endpoint to delete your integration in Immuta. For Azure Synapse Analytics integrations, you must also make a request to the /integrations/scripts/post-cleanup endpoint to create another script that will finish removing Immuta-managed resources from the platform.

POST /integrations/{id}/scripts/edit

Creates a script for you to run manually to edit objects and resources managed by Immuta in your platform. This endpoint is available for Azure Synapse Analytics, Redshift, and Snowflake integrations.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/1/scripts/edit' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Snowflake",
    "autoBootstrap": false,
    "config": {
      "host": "organization.us-east-1.snowflakecomputing.com",
      "warehouse": "SAMPLE_WAREHOUSE",
      "database": "SNOWFLAKE_SAMPLE_DATA",
      "audit": {
        "enabled": true
      },
      "workspaces": {
        "enabled": false
      },
      "impersonation": {
        "enabled": false
      },
      "authenticationType": "userPassword",
      "username": "IMMUTA_SYSTEM_ACCOUNT",
      "password": "abc1234"
    }
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

| Parameter | Description | Required or optional | Default values | Accepted values |
|-----------|-------------|----------------------|----------------|-----------------|
| type string | The type of integration to configure. | Required | - | Azure Synapse Analytics, Redshift, or Snowflake |
| autoBootstrap boolean | Set to false to specify that you will run the script in your environment yourself to configure the integration. You must run the Immuta script before creating the integration. See the Azure Synapse Analytics, Redshift, or Snowflake manual setup guides for details. | Required | - | false |
| config object | This object specifies the integration settings. Some settings cannot be changed once an integration is configured. See the config object description for your integration for details: Azure Synapse Analytics, Redshift, or Snowflake. | Required | - | - |

Response

The response returns the script that you will run in your Azure Synapse Analytics, Redshift, or Snowflake environment. Once you have run the script, use the PUT /integrations/{id} endpoint to finish editing your integration.

POST /integrations/scripts/initial-create

Creates the first script for you to run manually to set up objects and resources for Immuta to manage and enforce access controls on your data in Azure Synapse Analytics or Redshift integrations.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/scripts/initial-create' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Azure Synapse Analytics",
    "autoBootstrap": false,
    "config": {
      "host": "organization.azure.com",
      "schema": "sample_schema",
      "database": "immuta",
      "metadataDelimiters": {
        "hashDelimiter": "|",
        "hashKeyDelimiter": "-",
        "arrayDelimiter": ","
      },
      "username": "taylor@synapse.com",
      "password": "abc1234",
      "authenticationType": "userPassword"
    }
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

| Parameter | Description | Required or optional | Default values | Accepted values |
|-----------|-------------|----------------------|----------------|-----------------|
| type string | The type of integration to configure. | Required | - | Azure Synapse Analytics or Redshift |
| autoBootstrap boolean | Set to false to specify that you will run the script in your environment yourself to configure the integration. You must run the Immuta script before creating the integration. See the Azure Synapse Analytics or Redshift manual setup guides for details. | Required | - | false |
| config object | This object specifies the integration settings. See the config object description of the Azure Synapse Analytics or Redshift integration configuration for details. | Required | - | - |

Response

The response returns the script that you will run in your Azure Synapse Analytics or Redshift environment. Once you have run this script, use the /integrations/scripts/create endpoint to generate a script that finishes creating the Immuta-managed resources in your platform.

POST /integrations/scripts/post-cleanup

Creates a second script to remove the final Immuta-managed resources from your Azure Synapse Analytics platform. This endpoint is for Azure Synapse Analytics integrations that were not successfully created and, therefore, do not have an integration ID. Before making a request like the one below, you must make a request to the /integrations/scripts/cleanup endpoint to create the first script, which removes the initial Immuta-managed resources from the platform.

curl -X 'POST' \
    'https://www.organization.immuta.com/integrations/scripts/post-cleanup' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: 846e9e43c86a4ct1be14290d95127d13f' \
    -d '{
    "type": "Azure Synapse Analytics",
    "autoBootstrap": false,
    "config": {
      "host": "organization.azure.com",
      "schema": "sample_schema",
      "database": "immuta",
      "metadataDelimiters": {
        "hashDelimiter": "|",
        "hashKeyDelimiter": "-",
        "arrayDelimiter": ","
      },
      "username": "taylor@synapse.com",
      "password": "abc1234",
      "authenticationType": "userPassword"
    }
    }'

Body parameters

The request accepts a JSON or YAML payload with the parameters outlined below.

| Parameter | Description | Required or optional | Default values | Accepted values |
|-----------|-------------|----------------------|----------------|-----------------|
| type string | The type of integration to clean up. | Required | - | Azure Synapse Analytics |
| autoBootstrap boolean | Set to false to specify that you will run the script in your environment yourself to clean up the integration resources. See the Azure Synapse Analytics manual setup section for details. | Required | - | false |
| config object | This object specifies the integration settings. See the Azure Synapse Analytics config object description for details. | Required | - | - |

Response

The response returns the script that you will run in your Azure Synapse Analytics environment. Once you have run the script, use the DELETE /integrations/{id} endpoint to delete your integration in Immuta by following the Delete Azure Synapse Analytics integration instructions.

Related guides

See the following how-to guides for configuration examples and steps for creating, managing, or deleting your integration:

  • Configure an Amazon S3 integration
  • Configure an Azure Synapse Analytics integration
  • Configure a Databricks Unity Catalog integration
  • Configure a Google BigQuery integration
  • Configure a Redshift integration
  • Configure a Snowflake integration
  • Configure a Starburst (Trino) integration
