Create a Databricks Data Source
Databricks data source API reference guide
The databricks endpoint allows you to connect and manage Databricks data sources in Immuta.
Requirements
Databricks Spark integration
When exposing a table or view from an Immuta-enabled Databricks cluster, be sure that at least one of these traits is true:
The user exposing the tables has READ_METADATA and SELECT permissions on the target views/tables (specifically if Table ACLs are enabled).
The user exposing the tables is listed in the immuta.spark.acl.whitelist configuration on the target cluster (see the sample configuration entry below).
The user exposing the tables is a Databricks workspace administrator.
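The immuta.spark.acl.whitelist value is set in the target cluster's Spark configuration. As a rough sketch only, the entry might look like the following; the comma-separated list of usernames is an assumed format, so confirm it against your Databricks Spark integration configuration:

immuta.spark.acl.whitelist user1@example.com,user2@example.com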
Databricks Unity Catalog integration
When registering Databricks Unity Catalog securables in Immuta, use the service principal from the integration configuration and ensure it has the privileges listed below. Immuta uses this service principal continuously to orchestrate Unity Catalog policies and maintain state between Immuta and Databricks.
USE CATALOG and MANAGE on all catalogs containing securables registered as Immuta data sources.
USE SCHEMA on all schemas containing securables registered as Immuta data sources.
MODIFY and SELECT on all securables you want registered as Immuta data sources.
Azure Databricks Unity Catalog limitation
Set all table-level ownership on your Unity Catalog data sources to an individual user or service principal instead of a Databricks group before proceeding. Otherwise, Immuta cannot apply data policies to the table in Unity Catalog. See the Azure Databricks Unity Catalog limitation for details.
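The privileges listed above and the ownership change described in this limitation are applied in Databricks, not through the Immuta API. The following is a minimal sketch using the databricks-sql-connector Python package; the catalog, schema, table, and service principal names are placeholders, and you would repeat the grants for every securable you intend to register:

from databricks import sql  # pip install databricks-sql-connector

# Placeholder connection details for a Databricks SQL warehouse or cluster.
connection = sql.connect(
    server_hostname="your-hostname.cloud.databricks.com",
    http_path="sql/your/http/0/path",
    access_token="your-access-token",
)

statements = [
    # Privileges for the Immuta service principal (hypothetical name).
    "GRANT USE CATALOG, MANAGE ON CATALOG my_catalog TO `immuta-service-principal`",
    "GRANT USE SCHEMA ON SCHEMA my_catalog.my_schema TO `immuta-service-principal`",
    "GRANT SELECT, MODIFY ON TABLE my_catalog.my_schema.my_table TO `immuta-service-principal`",
    # Azure limitation: table ownership must be an individual user or service principal, not a group.
    "ALTER TABLE my_catalog.my_schema.my_table OWNER TO `immuta-service-principal`",
]

cursor = connection.cursor()
for statement in statements:
    cursor.execute(statement)
cursor.close()
connection.close()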
Databricks workflow
Create a data source
POST /databricks/handler
Save the provided connection information as a data source.
Payload parameters
private (boolean, required): When false, the data source will be publicly available in the Immuta UI.
blobHandler (array[object], required): A list of full URLs providing the locations of all blob store handlers to use with this data source.
blobHandlerType (string, required): Describes the type of underlying blob handler that will be used with this data source (e.g., Databricks).
recordFormat (string, required): The data format of blobs in the data source, such as json, xml, html, or jpeg.
type (string, required): The type of data source: ingested (metadata will exist in Immuta) or queryable (metadata is dynamically queried).
name (string, required): The name of the data source. It must be unique within the Immuta tenant.
sqlTableName (string, required): A string that represents this data source's table in Immuta.
organization (string, required): The organization that owns the data source.
category (string, optional): The category of the data source.
description (string, optional): The description of the data source.
hasExamples (boolean, optional): When true, the data source contains examples.
Response parameters
id (integer): The handler ID.
dataSourceId (integer): The ID of the data source.
warnings (string): A message describing issues with the created data source, such as the data source being unhealthy.
connectionString (string): The connection string used to connect the data source to Immuta.
Request example
This request creates two Databricks data sources.
curl \
--request POST \
--header "Content-Type: application/json" \
--header "Authorization: Bearer dea464c07bd07300095caa8" \
--data @example-payload.json \
https://demo.immuta.com/databricks/handler
Payload example
{
"handler": [{
"metadata": {
"ssl": true,
"userFiles": [],
"authenticationMethod": "Access Token",
"password": "your-password",
"port": 443,
"hostname": "your-hostname.cloud.databricks.com",
"database": "default",
"httpPath": "sql/your/http/0/path",
"schemaProjectName": "Default",
"staleDataTolerance": 86400,
"bodataSchemaName": "default",
"bodataTableName": "applicant_data",
"dataSourceName": "Default Applicant Data",
"table": "applicant_data",
"schema": "default"
}
}, {
"metadata": {
"ssl": true,
"userFiles": [],
"authenticationMethod": "Access Token",
"password": "your-password",
"port": 443,
"hostname": "your-hostname.cloud.databricks.com",
"database": "default",
"httpPath": "sql/your/http/0/path",
"schemaProjectName": "Default",
"staleDataTolerance": 86400,
"bodataSchemaName": "default",
"bodataTableName": "cities",
"dataSourceName": "Default Cities",
"table": "cities",
"schema": "default"
}
}],
"dataSource": {
"blobHandler": {
"scheme": "https",
"url": ""
},
"blobHandlerType": "Databricks",
"recordFormat": "",
"type": "queryable",
"schemaEvolutionId": null,
"columnEvolutionEnabled": true
},
"schemaEvolution": {
"ownerProfileId": 2,
"config": {
"nameTemplate": {
"nameFormat": "<Schema> <Tablename>",
"tableFormat": "<tablename>",
"sqlSchemaNameFormat": "<schema>",
"schemaProjectNameFormat": "<Schema>"
}
},
"schemas": []
}
}
Response example
{
"connectionString": "your-hostname.cloud.databricks.com:443/default"
}
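The same call can be made from a script instead of curl. A minimal Python sketch, assuming the payload above is saved as example-payload.json and using the placeholder hostname and token from the request example; it posts the payload and prints the connection string and any warnings returned:

import json
import requests

# Placeholder values; substitute your Immuta hostname and API token.
IMMUTA_URL = "https://demo.immuta.com"
HEADERS = {
    "Content-Type": "application/json",
    "Authorization": "Bearer dea464c07bd07300095caa8",
}

with open("example-payload.json") as f:
    payload = json.load(f)

response = requests.post(f"{IMMUTA_URL}/databricks/handler", headers=HEADERS, json=payload)
response.raise_for_status()
result = response.json()

# The response may include warnings (such as an unhealthy data source) alongside the connection string.
print(result.get("connectionString"))
print(result.get("warnings"))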
Get information about a data source
GET /databricks/handler/{handlerId}
Get the handler metadata associated with the provided handler ID.
Query parameters
handlerId (integer, required): The ID of the handler.
skipCache (boolean, optional): When true, will skip the handler cache when retrieving metadata.
Response parameters
body (array[object]): Metadata about the data source, including the data source ID, schema, database, and connection string.
Request example
This request returns metadata for the handler with the ID 48.
curl \
--request GET \
--header "Content-Type: application/json" \
--header "Authorization: Bearer dea464c07bd07300095caa8" \
https://demo.immuta.com/databricks/handler/48
Response example
{
"dataSourceId": 49,
"metadata": {
"ssl": true,
"port": 443,
"paths": ["/user/hive/warehouse/cities"],
"query": null,
"table": "cities",
"schema": "default",
"scheme": "dbfs",
"database": "default",
"hostname": "your-hostname.cloud.databricks.com",
"httpPath": "sql/your/http/0/path",
"pathUris": ["dbfs:/user/hive/warehouse/cities"],
"ephemeral": true,
"eventTime": null,
"userFiles": [],
"clusterName": null,
"dataSourceName": "Default Cities",
"bodataTableName": "cities",
"metastoreTables": ["default.cities"],
"bodataSchemaName": "default",
"columnsNormalized": false,
"schemaProjectName": "Default",
"staleDataTolerance": 86400,
"authenticationMethod": "Access Token"
},
"type": "queryable",
"connectionString": "your-hostname.cloud.databricks.com:443/default",
"id": 48,
"createdAt": "2021-10-06T17:53:09.640Z",
"updatedAt": "2021-10-06T17:53:09.882Z",
"dbms": {
"name": "databricks"
}
}
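The skipCache query parameter is not shown in the request above. A short Python sketch of how it might be passed, using the placeholder hostname, token, and handler ID from this section:

import requests

# Placeholder values; substitute your Immuta hostname, API token, and handler ID.
response = requests.get(
    "https://demo.immuta.com/databricks/handler/48",
    headers={"Authorization": "Bearer dea464c07bd07300095caa8"},
    params={"skipCache": "true"},  # skip the handler cache when retrieving metadata
)
response.raise_for_status()
print(response.json()["connectionString"])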
Manage data sources
PUT /databricks/handler/{handlerId}
Update the data source metadata associated with the provided handler ID. This endpoint does not perform partial updates; however, the Data Dictionary may be omitted, in which case the current dictionary is used.
PUT /databricks/bulk
Update the data source metadata associated with the provided connection string.
PUT /databricks/handler/{handlerId}/triggerHighCardinalityJob
Recalculate the high cardinality column for the specified data source.
Update a specific data source
PUT /databricks/handler/{handlerId}
Update the data source metadata associated with the provided handler ID. This endpoint does not perform partial updates; however, the Data Dictionary may be omitted, in which case the current dictionary is used.
Query parameters
handlerId (integer, required): The ID of the handler.
skipCache (boolean, optional): When true, will skip the handler cache when retrieving metadata.
Payload parameters
handler (metadata, required): Includes metadata about the handler, such as ssl, port, database, hostname, username, and password.
connectionString (string, required): The connection string used to connect to the data source.
Response parameters
id (integer): The ID of the handler.
ca (string): The certificate authority.
columns (array[object]): A Data Dictionary object, which provides metadata about the columns in the data source, including the name and data type of the column.
Request example
This request updates the metadata for the data source with the handler ID 48.
curl \
--request PUT \
--header "Content-Type: application/json" \
--header "Authorization: Bearer dea464c07bd07300095caa8" \
--data @example-payload.json \
https://demo.immuta.com/databricks/handler/48
Payload example
The payload below updates the dataSourceName to Cities.
{
"handler": {
"policyHandler": null,
"dataSourceId": 49,
"metadata": {
"ssl": true,
"port": 443,
"paths": ["/user/hive/warehouse/cities"],
"table": "cities",
"schema": "default",
"scheme": "dbfs",
"database": "default",
"hostname": "your-hostname.cloud.databricks.com",
"httpPath": "sql/your/http/0/path",
"pathUris": ["dbfs:/user/hive/warehouse/cities"],
"ephemeral": true,
"eventTime": null,
"userFiles": [],
"clusterName": null,
"dataSourceName": "Cities",
"bodataTableName": "cities",
"metastoreTables": ["default.cities"],
"bodataSchemaName": "default",
"columnsNormalized": false,
"schemaProjectName": "Default",
"staleDataTolerance": 86400,
"authenticationMethod": "Access Token",
"columns": [{
"name": "OBJECTID",
"dataType": "bigint",
"remoteType": "bigint",
"nullable": true
}, {
"name": "URBID",
"dataType": "bigint",
"remoteType": "bigint",
"nullable": true
}, {
"name": "LIGHTDCW",
"dataType": "bigint",
"remoteType": "bigint",
"nullable": true
}, {
"name": "ES90POP",
"dataType": "double precision",
"remoteType": "double",
"nullable": true
}, {
"name": "ES95POP",
"dataType": "double precision",
"remoteType": "double",
"nullable": true
}, {
"name": "ES00POP",
"dataType": "double precision",
"remoteType": "double",
"nullable": true
}, {
"name": "PCOUNT",
"dataType": "bigint",
"remoteType": "bigint",
"nullable": true
}, {
"name": "SCHNM",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "NAME",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "SQKM_FINAL",
"dataType": "double precision",
"remoteType": "double",
"nullable": true
}, {
"name": "ISO3",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "ISOURBID",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "REMOVED_PO",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "ADDED_POIN",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "YEAR_V1_01",
"dataType": "double precision",
"remoteType": "double",
"nullable": true
}, {
"name": "POP_V1_01",
"dataType": "double precision",
"remoteType": "double",
"nullable": true
}, {
"name": "Unsdcode",
"dataType": "bigint",
"remoteType": "bigint",
"nullable": true
}, {
"name": "Countryeng",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "Continent",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "geometry",
"dataType": "struct",
"remoteType": "struct<__geom__:bigint,_is_empty:boolean,_ndim:bigint>",
"nullable": true,
"children": [{
"name": "__geom__",
"dataType": "bigint"
}, {
"name": "_is_empty",
"dataType": "boolean"
}, {
"name": "_ndim",
"dataType": "bigint"
}]
}, {
"name": "wkt",
"dataType": "text",
"remoteType": "string",
"nullable": true
}],
"password": "your-password"
},
"type": "queryable",
"connectionString": "dbc-d3fe40ca-b4fb.cloud.databricks.com:443/default",
"id": 48,
"createdAt": "2021-10-06T17:53:09.640Z",
"updatedAt": "2021-10-06T17:53:09.882Z",
"dbms": {
"name": "databricks"
}
}
}
Response example
{
"id": 48,
"ca": ["-----BEGIN CERTIFICATE-----\ncertificatedata\n-----END CERTIFICATE-----"],
"metadata": {
"columns": [{
"name": "OBJECTID",
"dataType": "bigint",
"remoteType": "bigint",
"nullable": true
}, {
"name": "URBID",
"dataType": "bigint",
"remoteType": "bigint",
"nullable": true
}, {
"name": "LIGHTDCW",
"dataType": "bigint",
"remoteType": "bigint",
"nullable": true
}, {
"name": "ES90POP",
"dataType": "double precision",
"remoteType": "double",
"nullable": true
}, {
"name": "ES95POP",
"dataType": "double precision",
"remoteType": "double",
"nullable": true
}, {
"name": "ES00POP",
"dataType": "double precision",
"remoteType": "double",
"nullable": true
}, {
"name": "PCOUNT",
"dataType": "bigint",
"remoteType": "bigint",
"nullable": true
}, {
"name": "SCHNM",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "NAME",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "SQKM_FINAL",
"dataType": "double precision",
"remoteType": "double",
"nullable": true
}, {
"name": "ISO3",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "ISOURBID",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "REMOVED_PO",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "ADDED_POIN",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "YEAR_V1_01",
"dataType": "double precision",
"remoteType": "double",
"nullable": true
}, {
"name": "POP_V1_01",
"dataType": "double precision",
"remoteType": "double",
"nullable": true
}, {
"name": "Unsdcode",
"dataType": "bigint",
"remoteType": "bigint",
"nullable": true
}, {
"name": "Countryeng",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "Continent",
"dataType": "text",
"remoteType": "string",
"nullable": true
}, {
"name": "geometry",
"dataType": "struct",
"remoteType": "struct<__geom__:bigint,_is_empty:boolean,_ndim:bigint>",
"nullable": true,
"children": [{
"name": "__geom__",
"dataType": "bigint"
}, {
"name": "_is_empty",
"dataType": "boolean"
}, {
"name": "_ndim",
"dataType": "bigint"
}]
}, {
"name": "wkt",
"dataType": "text",
"remoteType": "string",
"nullable": true
}]
}
}
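Because this endpoint replaces the handler metadata rather than patching it, a common pattern is to read the current handler, change the field you need, and submit the full object back. The sketch below assumes the object returned by the GET endpoint can be wrapped in a handler key and re-submitted, mirroring the payload example above; the hostname, token, and handler ID are placeholders:

import requests

# Placeholder values; substitute your Immuta hostname, API token, and handler ID.
IMMUTA_URL = "https://demo.immuta.com"
HEADERS = {"Authorization": "Bearer dea464c07bd07300095caa8"}
HANDLER_ID = 48

# Read the current handler metadata.
handler = requests.get(f"{IMMUTA_URL}/databricks/handler/{HANDLER_ID}", headers=HEADERS).json()

# Change a single field, then submit the full object back (no partial updates).
handler["metadata"]["dataSourceName"] = "Cities"
response = requests.put(
    f"{IMMUTA_URL}/databricks/handler/{HANDLER_ID}",
    headers=HEADERS,
    json={"handler": handler},
)
response.raise_for_status()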
Update multiple data sources
PUT /databricks/bulk
Update the data source metadata associated with the provided connection string.
Payload parameters
handler (metadata, required): Includes metadata about the handler, such as ssl, port, database, hostname, username, and password.
connectionString (string, required): The connection string used to connect to the data sources.
Response parameters
bulkId (string): The ID of the bulk data source update.
connectionString (string): The connection string shared by the bulk-updated data sources.
jobsCreated (integer): The number of jobs that ran to update the data sources; this number corresponds to the number of data sources updated.
Request example
This request updates the metadata for all data sources with the connection string specified in example-payload.json.
curl \
--request PUT \
--header "Content-Type: application/json" \
--header "Authorization: Bearer dea464c07bd07300095caa8" \
--data @example-payload.json \
https://demo.immuta.com/databricks/bulk
Payload example
The payload below adds a certificate (certificate.json) for connecting to the data sources with the provided connection string.
{
"handler": {
"metadata": {
"ssl": true,
"port": 443,
"database": "default",
"hostname": "your-hostname.cloud.databricks.com",
"userFiles": [{
"keyName": "test",
"filename": "6dc06a3310b9ba33c543e483d1e745b3ac9bc648.json",
"userFilename": "certificate.json"
}],
"authenticationMethod": "Access Token",
"password": "your-password",
"httpPath": "sql/your/http/0/path"
}
},
"connectionString": "your-hostname.cloud.databricks.com:443/default/default"
}
Response example
{
"bulkId": "bulk_ds_update_9ae5bfd85a3a47a8b454c618043e2aa3",
"connectionString": "your-hostname.cloud.databricks.com:443/default",
"jobsCreated": 2
}
Recalculate the high cardinality column for a data source
PUT /databricks/handler/{handlerId}/triggerHighCardinalityJob
Recalculate the high cardinality column for the specified data source.
Query parameters
handlerId (integer, required): The ID of the handler.
Response parameters
The response returns a string of characters that identifies the high cardinality job run.
Request example
This request re-runs the job that calculates the high cardinality column for the data source with the handler ID 47.
curl \
--request PUT \
--header "Content-Type: application/json" \
--header "Authorization: Bearer dea464c07bd07300095caa8" \
https://demo.immuta.com/databricks/handler/47/triggerHighCardinalityJob
Response example
f6ac1ad0-26d0-11ec-8078-d36bbf5b90fb
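The response body appears to be the bare job identifier rather than a JSON object, so a client would read it as plain text. A minimal Python sketch with the placeholder hostname, token, and handler ID from the request example:

import requests

# Placeholder values; substitute your Immuta hostname, API token, and handler ID.
response = requests.put(
    "https://demo.immuta.com/databricks/handler/47/triggerHighCardinalityJob",
    headers={"Authorization": "Bearer dea464c07bd07300095caa8"},
)
response.raise_for_status()
job_id = response.text  # e.g., "f6ac1ad0-26d0-11ec-8078-d36bbf5b90fb"
print(job_id)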