Create and Manage an Amazon S3 Data Source


This page describes the native-s3 endpoint, which manages Amazon S3 data sources.

Private preview: This integration is available to select accounts. Contact your Immuta representative for details.

Some responses may include additional fields; these attributes are for internal use and are therefore undocumented.

Amazon S3 workflow

  1. Create an Amazon S3 data source.

  2. Search Amazon S3 data sources.

  3. Delete an Amazon S3 data source.
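
A consolidated sketch of these three calls is shown below; it reuses the placeholder URL, API key, and request body from the examples in the sections that follow, and the IDs (1) are illustrative.

# 1. Create a data source.
curl -X 'POST' \
    'https://<your-immuta-url.com>/native-s3/handler' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: <your-api-key>' \
    -d '{"type": "Native S3", "integrationId": <id-of-your-s3-integration>, "dataSources": [{"dataSourceName": "<name-of-data-source>", "prefix": "</data-source-prefix>"}]}'

# 2. Get the handler metadata for a data source.
curl -X 'GET' \
    -H 'Authorization: <your-api-key>' \
    'https://<your-immuta-url.com>/native-s3/handler/1'

# 3. Delete the data source (soft delete on the first call, hard delete on the second).
curl -X 'DELETE' \
    -H 'Authorization: <your-api-key>' \
    'https://<your-immuta-url.com>/dataSource/1'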

Create a data source

POST /native-s3/handler

Save the provided connection information as a data source.

Required Immuta permission: CREATE_S3_DATASOURCE

Payload parameters

The table below outlines the parameters for creating an S3 data source.

Parameter | Description | Required or optional | Accepted values
type string | The type of integration. | Required | Native S3
integrationId number | The unique identifier of the S3 integration. | Required | -
dataSources.dataSourceName string | The name of the S3 data source you want to create. | Required | -
dataSources.prefix string | The S3 prefix that creates a data source for the prefix, bucket, or object provided in the path. | Required | -
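
For illustration only, a prefix can point at an entire bucket, a prefix within a bucket, or a single object. The bucket and object names below are hypothetical, and the exact path format is an assumption; confirm it against the awsLocationPath your integration reports (see the search endpoint below):

"prefix": "my-bucket"                        (hypothetical: an entire bucket)
"prefix": "my-bucket/research/"              (hypothetical: everything under a prefix)
"prefix": "my-bucket/research/records.csv"   (hypothetical: a single object)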

Response schema

The table below outlines the response schema for successful requests.

Property | Description
dataSourceId integer | The unique identifier of the data source.
prefix string | The S3 path of the prefix, bucket, or object used to create the data source.
dataSourceName string | The name of the data source.

Request example

The following request saves the provided connection information as a data source.

curl -X 'POST' \
    'https://<your-immuta-url.com>/native-s3/handler' \
    -H 'accept: application/json' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: <your-api-key>' \
    -d '{
    "type": "Native S3",
    "integrationId": <id-of-your-s3-integration>,
    "dataSources": [
      {
        "dataSourceName": "<name-of-data-source>",
        "prefix": "</data-source-prefix>"
      }
    ]
    }'

Response example

The response returns the ID, name, and prefix of the data source (see the response schema above for details). An unsuccessful response returns a status code and an error message.

{
  "dataSourceId": 1,
  "prefix": "research-data-source",
  "dataSourceName": "s3://research-data-source (64*****50499/us-east-2)"
}
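
If you script against this endpoint, the returned ID can be captured for later calls. A minimal sketch, assuming jq is installed and that payload.json (a hypothetical file) holds the request body shown above:

# Capture the new data source ID from the JSON response.
DATA_SOURCE_ID=$(curl -s -X 'POST' \
    'https://<your-immuta-url.com>/native-s3/handler' \
    -H 'Content-Type: application/json' \
    -H 'Authorization: <your-api-key>' \
    -d @payload.json | jq '.dataSourceId')
echo "$DATA_SOURCE_ID"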

Search Amazon S3 data sources

GET /native-s3/handler/{handlerId}

Get the handler metadata associated with the provided handler ID.

Path parameter

Attribute | Description | Required or optional
handlerId integer | The specific handler ID. | Required

Response schema

Attribute | Description
id integer | The data source ID.
metadata.prefix string | The S3 path of the prefix, bucket, or object used to create the data source.
metadata.integrationId integer | The unique identifier of the integration.
metadata.dataSourceName string | The name of the data source.
type string | The handler type (nativeS3Handler).
awsLocationPath string | The base S3 location prefix that Immuta used to register the S3 data source.

Request example

The following request returns the handler metadata associated with the provided handler ID.

curl \
    --request GET \
    --header "Content-Type: application/json" \
    --header "Authorization: Bearer dea464c07bd07300095caa8" \
    https://your-immuta-url.com/native-s3/handler/1

Response example

{
  "id": 1,
  "metadata": {
    "prefix": "research-data-source",
    "integrationId": 2,
    "dataSourceName": "s3://research-data-source (64*****50499/us-east-2)"
  },
  "type": "nativeS3Handler",
  "awsLocationPath": "s3://"
}

Delete an Amazon S3 data source

DELETE /dataSource/{dataSourceId}

Delete a data source. The first call performs a soft delete (the data source is disabled); a second call performs a hard delete.

Required Immuta permission: GOVERNANCE (global), or you must be the data source owner.

Path parameter

Attribute | Description | Required or optional
dataSourceId integer | The data source ID. | Required

Response schema

Attribute | Description
success boolean | If true, the request to disable or delete the data source was successful.
id integer | The data source ID.
schemaEvolutionId integer | The schema evolution ID.
name string | The data source name.
disabled boolean | If true, the data source is disabled.
handlerDeleteErrorMessage string | The delete error message.

Request example

The following request disables data source 1 (a soft delete).

curl \
    --request DELETE \
    --header "Content-Type: application/json" \
    --header "Authorization: Bearer dea464c07bd07300095caa8" \
    https://your-immuta-url.com/dataSource/1

Response example

{
  "success": true,
  "id": 1,
  "schemaEvolutionId": 1,
  "name": "s3://research-data-source (64*****50499/us-east-2)",
  "disabled": true,
  "handlerDeleteErrorMessage": null
}
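
Because the first call only disables the data source, issuing the same request a second time performs the hard delete. A sketch, reusing the sample token and URL from the request above:

# Second call: hard deletes the already disabled data source.
curl \
    --request DELETE \
    --header "Content-Type: application/json" \
    --header "Authorization: Bearer dea464c07bd07300095caa8" \
    https://your-immuta-url.com/dataSource/1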