The databricks endpoint allows you to connect and manage Databricks data sources in Immuta.
Additional fields may be included in some responses you receive; however, these attributes are for internal purposes and are therefore undocumented.
Requirements
Databricks Spark integration
When exposing a table or view from an Immuta-enabled Databricks cluster, be sure that at least one of these traits is true:

- The user exposing the tables has READ_METADATA and SELECT permissions on the target views/tables (specifically if Table ACLs are enabled); see the grant sketch after this list.
- The user exposing the tables is listed in the immuta.spark.acl.whitelist configuration on the target cluster.
- The user exposing the tables is a Databricks workspace administrator.
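If you are relying on the first condition, the privileges can be granted with standard Databricks SQL. Below is a minimal sketch using the `databricks-sql-connector` Python package; the workspace hostname, HTTP path, token, table name, and principal are placeholders, not values from your deployment.

```python
from databricks import sql  # pip install databricks-sql-connector

# Grant the legacy table ACL privileges (READ_METADATA and SELECT) to the user
# who will expose the table. All connection values and identifiers are placeholders.
with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abcdef1234567890",
    access_token="dapiXXXXXXXXXXXXXXXX",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute(
            "GRANT READ_METADATA, SELECT ON TABLE analytics.sales "
            "TO `data.owner@example.com`"
        )
```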
Databricks Unity Catalog integration
When exposing a table from Databricks Unity Catalog, be sure the credentials used to register the data sources have the Databricks privileges listed below; a grant sketch follows the Azure Databricks Unity Catalog limitation note.

- The following privileges on the parent catalogs and schemas of those tables:
  - SELECT
  - USE CATALOG
  - USE SCHEMA
- USE SCHEMA on system.information_schema
Azure Databricks Unity Catalog limitation
Set all table-level ownership on your Unity Catalog data sources to an individual user or service principal instead of a Databricks group before proceeding. Otherwise, Immuta cannot apply data policies to the table in Unity Catalog. See the Azure Databricks Unity Catalog limitation for details.
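If the registering credentials are missing the privileges above, they can be granted with Databricks SQL, and the ownership change described in this limitation can be made the same way. The sketch below uses the `databricks-sql-connector` Python package; every catalog, schema, table, principal, and connection value is a placeholder.

```python
from databricks import sql  # pip install databricks-sql-connector

# Unity Catalog grants from the requirements list above, plus the ownership
# reassignment required by the Azure limitation. All identifiers are placeholders.
STATEMENTS = [
    "GRANT USE CATALOG ON CATALOG my_catalog TO `immuta_registration_principal`",
    "GRANT USE SCHEMA ON SCHEMA my_catalog.my_schema TO `immuta_registration_principal`",
    "GRANT SELECT ON SCHEMA my_catalog.my_schema TO `immuta_registration_principal`",
    "GRANT USE SCHEMA ON SCHEMA system.information_schema TO `immuta_registration_principal`",
    # Move ownership from a Databricks group to an individual user or service principal.
    "ALTER TABLE my_catalog.my_schema.my_table SET OWNER TO `data.owner@example.com`",
]

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abcdef1234567890",
    access_token="dapiXXXXXXXXXXXXXXXX",
) as connection:
    with connection.cursor() as cursor:
        for statement in STATEMENTS:
            cursor.execute(statement)
```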
**Duplicate data sources**: To prevent two data sources from referencing the same table, users cannot create duplicate data sources. If you attempt to create a duplicate data source using the API, you will receive a warning stating "duplicate tables are specified in the payload."
POST /databricks/handler
Save the provided connection information as a data source.
Payload parameters
| Attribute | Description | Required |
|---|---|---|
| `private` | `boolean` When false, the data source will be publicly available in the Immuta UI. | Yes |
| `blobHandler` | `array[object]` A list of full URLs providing the locations of all blob store handlers to use with this data source. | Yes |
| `blobHandlerType` | `string` Describes the type of underlying blob handler that will be used with this data source (e.g., MS SQL). | Yes |
| `recordFormat` | `string` The data format of blobs in the data source, such as json, xml, html, or jpeg. | Yes |
| `type` | `string` The type of data source: ingested (metadata will exist in Immuta) or queryable (metadata is dynamically queried). | Yes |
| `name` | `string` The name of the data source. It must be unique within the Immuta tenant. | Yes |
| `sqlTableName` | `string` A string that represents this data source's table in Immuta. | Yes |
| `organization` | `string` The organization that owns the data source. | Yes |
| `category` | `string` The category of the data source. | No |
| `description` | `string` The description of the data source. | No |
| `hasExamples` | `boolean` When true, the data source contains examples. | No |
Response parameters
| Attribute | Description |
|---|---|
| `id` | `integer` The handler ID. |
| `dataSourceId` | `integer` The ID of the data source. |
| `warnings` | `string` This message describes issues with the created data source, such as the data source being unhealthy. |
| `connectionString` | `string` The connection string used to connect the data source to Immuta. |
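A minimal sketch of creating a data source with this endpoint, using Python's `requests` library, is below. The Immuta URL, authentication header, blob handler entry, and handler type value are illustrative assumptions rather than documented values; replace them with values from your own deployment.

```python
import requests

IMMUTA_URL = "https://your-immuta-tenant.example.com"  # assumed base URL for your Immuta tenant
HEADERS = {
    "Authorization": "Bearer <your-api-token>",         # assumed auth scheme; use your tenant's method
    "Content-Type": "application/json",
}

# Required payload attributes from the table above; all values are placeholders.
payload = {
    "private": True,
    "blobHandler": [{"scheme": "https", "url": ""}],     # placeholder blob handler entry
    "blobHandlerType": "Databricks",                     # assumed handler type for this endpoint
    "recordFormat": "json",
    "type": "queryable",
    "name": "Marketing Data",
    "sqlTableName": "marketing_data",
    "organization": "my-organization",
    # Connection metadata for the Databricks handler (hostname, port, database,
    # credentials) is typically supplied as well but is omitted from this sketch.
}

response = requests.post(f"{IMMUTA_URL}/databricks/handler", json=payload, headers=HEADERS)
response.raise_for_status()
print(response.json())  # includes attributes such as id, dataSourceId, and connectionString
```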
PUT /databricks/handler/{handlerId}
Update the data source metadata associated with the provided handler ID. This endpoint does not perform partial updates; however, the data dictionary may be omitted, in which case the current dictionary is used.
Query parameters
| Attribute | Description | Required |
|---|---|---|
| `handlerId` | `integer` The ID of the handler. | Yes |
| `skipCache` | `boolean` When true, will skip the handler cache when retrieving metadata. | No |
Payload parameters
| Attribute | Description | Required |
|---|---|---|
| `handler` | `metadata` Includes metadata about the handler, such as ssl, port, database, hostname, username, and password. | Yes |
| `connectionString` | `string` The connection string used to connect to the data source. | Yes |
Response parameters
| Attribute | Description |
|---|---|
| `id` | `integer` The ID of the handler. |
| `ca` | `string` The certificate authority. |
| `columns` | `array[object]` The data dictionary object, which provides metadata about the columns in the data source, including the name and data type of each column. |
Request example
This request updates the metadata for the data source with the handler ID 48.
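A minimal sketch of such a request using Python's `requests` library is below. The Immuta URL, authentication header, connection metadata, and the placement of the handler ID in the path are illustrative assumptions; substitute the values for your own deployment.

```python
import requests

IMMUTA_URL = "https://your-immuta-tenant.example.com"  # assumed base URL for your Immuta tenant
HEADERS = {
    "Authorization": "Bearer <your-api-token>",         # assumed auth scheme; use your tenant's method
    "Content-Type": "application/json",
}

# Updated handler metadata and connection string; all values are placeholders,
# and the nesting of the connection fields under "metadata" is an assumption.
payload = {
    "handler": {
        "metadata": {
            "ssl": True,
            "port": 443,
            "database": "my_database",
            "hostname": "adb-1234567890123456.7.azuredatabricks.net",
            "username": "token",
            "password": "dapiXXXXXXXXXXXXXXXX",
        }
    },
    "connectionString": "token@adb-1234567890123456.7.azuredatabricks.net:443/my_database",
}

response = requests.put(
    f"{IMMUTA_URL}/databricks/handler/48",  # handler ID 48, assumed to be supplied in the path
    params={"skipCache": "true"},           # optional query parameter from the table above
    json=payload,
    headers=HEADERS,
)
response.raise_for_status()
print(response.json())  # includes attributes such as id, ca, and columns
```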