# Create a Databricks Data Source

{% hint style="warning" %}
**Deprecation notice**

Support for registering Databricks Unity Catalog data sources using this legacy workflow has been deprecated. Instead, register your data using [connections](https://documentation.immuta.com/saas/developer-guides/api-intro/connections-api/how-to-guides/register-a-connection/register-a-databricks-unity-catalog-connection).
{% endhint %}

The `databricks` endpoint allows you to register and manage Databricks data sources in Immuta.

{% hint style="info" %}
Additional fields may be included in some responses you receive; however, these attributes are for internal purposes and are therefore undocumented.
{% endhint %}

## Requirements

**Databricks Spark integration**

When exposing a table or view from an Immuta-enabled Databricks cluster, ensure that at least one of the following is true:

* The user exposing the tables has `READ_METADATA` and `SELECT` permissions on the target views or tables (required if table ACLs are enabled).
* The user exposing the tables is listed in the `immuta.spark.acl.allowlist` configuration on the target cluster.
* The user exposing the tables is a Databricks workspace administrator.

**Databricks Unity Catalog integration**

When registering Databricks Unity Catalog securables in Immuta, use the service principal from the integration configuration and ensure it has the privileges listed below. Immuta uses this service principal continuously to orchestrate Unity Catalog policies and maintain state between Immuta and Databricks.

* `USE CATALOG` and `MANAGE` on all catalogs containing securables registered as Immuta data sources.
* `USE SCHEMA` on all schemas containing securables registered as Immuta data sources.
* `MODIFY` and `SELECT` on all securables you want registered as Immuta data sources. *The* `MODIFY` *privilege is not required for materialized views registered as Immuta data sources, since* `MODIFY` *is not a supported privilege on that object type in* [*Databricks*](https://docs.databricks.com/aws/en/data-governance/unity-catalog/manage-privileges/privileges#privilege-types-by-securable-object-in-unity-catalog)*.*

{% hint style="info" %}
`MANAGE` and `MODIFY` are required so that the service principal can apply row filters and column masks on the securable; to do so, the service principal must also have `SELECT` on the securable, `USE CATALOG` on its parent catalog, and `USE SCHEMA` on its parent schema. Because privileges are inherited, you can grant the service principal the `MODIFY` and `SELECT` privileges on all catalogs or schemas containing Immuta data sources, which automatically grants those privileges on all current and future securables in the catalog or schema. The service principal also inherits `MANAGE` from the parent catalog for the purpose of applying row filters and column masks, but that privilege must be granted directly on the parent catalog for grants to be fully applied.
{% endhint %}
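The privilege requirements above can be expressed as a small helper that generates the Databricks `GRANT` statements for the service principal. This is an illustrative sketch, not part of the Immuta API: the catalog and principal names are placeholders, and granting `MODIFY` and `SELECT` at the catalog level relies on Unity Catalog privilege inheritance.

```python
# Hypothetical helper: emit the Unity Catalog GRANT statements the Immuta
# service principal needs on a catalog that will hold Immuta data sources.
# The catalog and principal names below are placeholders.

def unity_catalog_grants(catalog: str, principal: str) -> list[str]:
    """Return the GRANT statements to run for one catalog.

    MODIFY and SELECT granted at the catalog level propagate to all current
    and future securables via privilege inheritance; MANAGE must be granted
    directly on the catalog for grants to be fully applied.
    """
    privileges = ["USE CATALOG", "MANAGE", "USE SCHEMA", "MODIFY", "SELECT"]
    return [
        f"GRANT {priv} ON CATALOG {catalog} TO `{principal}`;"
        for priv in privileges
    ]

statements = unity_catalog_grants("analytics", "immuta-service-principal")
```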

{% hint style="warning" %}
**Azure Databricks Unity Catalog limitation**

Set all table-level ownership on your Unity Catalog data sources to an individual user or service principal instead of a Databricks group before proceeding. Otherwise, Immuta cannot apply data policies to the table in Unity Catalog. See the [Azure Databricks Unity Catalog limitation](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-unity-catalog/unity-catalog-overview#azure-databricks-unity-catalog-limitation) for details.
{% endhint %}

## Databricks workflow

1. [Create a data source](#create-a-data-source).
2. [Get information about a data source](#get-information-about-a-data-source).
3. [Manage data sources](#manage-data-sources).

## Create a data source

{% hint style="warning" %}
**Databricks Unity Catalog behavior**

If you register a connection and a data object has no subscription policy set on it, Immuta will REVOKE access to the data in Databricks for all Immuta users, even if they had been directly granted access to the table in Unity Catalog.

If you disable a Unity Catalog data source in Immuta, all existing grants and policies on that object will be removed in Databricks for all Immuta users, regardless of whether they were set in Immuta or directly in Unity Catalog.

If a user is not registered in Immuta, Immuta will have no effect on that user's access to data in Unity Catalog.

See the [Databricks Unity Catalog reference guide](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-unity-catalog/unity-catalog-overview#user-permissions-immuta-revokes) for more details about permissions Immuta revokes and how to configure this behavior for your connection.
{% endhint %}

<mark style="color:green;">`POST`</mark> `/databricks/handler`

Save the provided connection information as a data source.

**Required Immuta permission**: `CREATE_DATA_SOURCE`

#### Payload parameters

| Attribute       | Description                                                                                                          | Required |
| --------------- | -------------------------------------------------------------------------------------------------------------------- | -------- |
| private         | `boolean` When `false`, the data source will be publicly available in the Immuta UI.                                 | **Yes**  |
| blobHandler     | `array[object]` A list of full URLs providing the locations of all blob store handlers to use with this data source. | **Yes**  |
| blobHandlerType | `string` Describes the type of underlying blob handler that will be used with this data source (e.g., `Databricks`).     | **Yes**  |
| recordFormat    | `string` The data format of blobs in the data source, such as `json`, `xml`, `html`, or `jpeg`.                      | **Yes**  |
| type            | `string` The type of data source: `queryable` (metadata is dynamically queried).                                     | **Yes**  |
| name            | `string` The name of the data source. It must be unique within the Immuta tenant.                                    | **Yes**  |
| sqlTableName    | `string` A string that represents this data source's table in Immuta.                                                | **Yes**  |
| organization    | `string` The organization that owns the data source.                                                                 | **Yes**  |
| category        | `string` The category of the data source.                                                                            | No       |
| description     | `string` The description of the data source.                                                                         | No       |
| hasExamples     | `boolean` When `true`, the data source contains examples.                                                            | No       |

#### Response parameters

| Attribute        | Description                                                                                                   |
| ---------------- | ------------------------------------------------------------------------------------------------------------- |
| id               | `integer` The handler ID.                                                                                     |
| dataSourceId     | `integer` The ID of the data source.                                                                          |
| warnings         | `string` A message describing issues with the created data source, such as the data source being unhealthy.   |
| connectionString | `string` The connection string used to connect the data source to Immuta.                                     |

### Request example

This request creates two Databricks data sources.

```bash
curl \
    --request POST \
    --header "Content-Type: application/json" \
    --header "Authorization: Bearer dea464c07bd07300095caa8" \
    --data @example-payload.json \
    https://demo.immuta.com/databricks/handler
```

#### Payload example

```json
{
  "handler": [{
    "metadata": {
      "ssl": true,
      "userFiles": [],
      "authenticationMethod": "Access Token",
      "password": "your-password",
      "port": 443,
      "hostname": "your-hostname.cloud.databricks.com",
      "database": "default",
      "httpPath": "sql/your/http/0/path",
      "schemaProjectName": "Default",
      "staleDataTolerance": 86400,
      "bodataSchemaName": "default",
      "bodataTableName": "applicant_data",
      "dataSourceName": "Default Applicant Data",
      "table": "applicant_data",
      "schema": "default"
    }
  }, {
    "metadata": {
      "ssl": true,
      "userFiles": [],
      "authenticationMethod": "Access Token",
      "password": "your-password",
      "port": 443,
      "hostname": "your-hostname.cloud.databricks.com",
      "database": "default",
      "httpPath": "sql/your/http/0/path",
      "schemaProjectName": "Default",
      "staleDataTolerance": 86400,
      "bodataSchemaName": "default",
      "bodataTableName": "cities",
      "dataSourceName": "Default Cities",
      "table": "cities",
      "schema": "default"
    }
  }],
  "dataSource": {
    "blobHandler": {
      "scheme": "https",
      "url": ""
    },
    "blobHandlerType": "Databricks",
    "recordFormat": "",
    "type": "queryable",
    "schemaEvolutionId": null,
    "columnEvolutionEnabled": true
  },
  "schemaEvolution": {
    "ownerProfileId": 2,
    "config": {
      "nameTemplate": {
        "nameFormat": "<Schema> <Tablename>",
        "tableFormat": "<tablename>",
        "sqlSchemaNameFormat": "<schema>",
        "schemaProjectNameFormat": "<Schema>"
      }
    },
    "schemas": []
  }
}
```
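For reference, the curl call above can be reproduced with only the Python standard library. This is a minimal sketch assuming the same placeholder tenant URL and token; the request is built but not sent, and `payload` stands in for the JSON document shown above.

```python
import json
import urllib.request

def build_create_request(base_url: str, token: str, payload: dict) -> urllib.request.Request:
    """Build (but do not send) the POST /databricks/handler request."""
    return urllib.request.Request(
        url=f"{base_url}/databricks/handler",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

# Placeholder URL, token, and payload:
req = build_create_request(
    "https://demo.immuta.com", "dea464c07bd07300095caa8", {"handler": []}
)
# Sending is a one-liner once the request is built:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["connectionString"])
```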

### Response example

```json
{
  "connectionString": "your-hostname.cloud.databricks.com:443/default"
}
```

## Get information about a data source

<mark style="color:green;">`GET`</mark> `/databricks/handler/{handlerId}`

Get the handler metadata associated with the provided handler ID.

#### Path and query parameters

| Attribute | Description                                                                  | Required |
| --------- | ---------------------------------------------------------------------------- | -------- |
| handlerId | `integer` The ID of the handler.                                             | **Yes**  |
| skipCache | `boolean` When `true`, will skip the handler cache when retrieving metadata. | No       |

#### Response parameters

| Attribute | Description                                                                                                            |
| --------- | ---------------------------------------------------------------------------------------------------------------------- |
| body      | `array[object]` Metadata about the data source, including the data source ID, schema, database, and connection string. |

### Request example

This request returns metadata for the handler with the ID `48`.

```bash
curl \
    --request GET \
    --header "Content-Type: application/json" \
    --header "Authorization: Bearer dea464c07bd07300095caa8" \
    https://demo.immuta.com/databricks/handler/48
```

### Response example

```json
{
  "dataSourceId": 49,
  "metadata": {
    "ssl": true,
    "port": 443,
    "paths": ["/user/hive/warehouse/cities"],
    "query": null,
    "table": "cities",
    "schema": "default",
    "scheme": "dbfs",
    "database": "default",
    "hostname": "your-hostname.cloud.databricks.com",
    "httpPath": "sql/your/http/0/path",
    "pathUris": ["dbfs:/user/hive/warehouse/cities"],
    "ephemeral": true,
    "eventTime": null,
    "userFiles": [],
    "clusterName": null,
    "dataSourceName": "Default Cities",
    "bodataTableName": "cities",
    "metastoreTables": ["default.cities"],
    "bodataSchemaName": "default",
    "columnsNormalized": false,
    "schemaProjectName": "Default",
    "staleDataTolerance": 86400,
    "authenticationMethod": "Access Token"
  },
  "type": "queryable",
  "connectionString": "your-hostname.cloud.databricks.com:443/default",
  "id": 48,
  "createdAt": "2021-10-06T17:53:09.640Z",
  "updatedAt": "2021-10-06T17:53:09.882Z",
  "dbms": {
    "name": "databricks"
  }
}
```
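A small sketch of building the request URL for this endpoint, including the optional `skipCache` query parameter; the base URL is a placeholder.

```python
from urllib.parse import urlencode

def handler_url(base_url: str, handler_id: int, skip_cache: bool = False) -> str:
    """Build the GET /databricks/handler/{handlerId} URL.

    handlerId is a path parameter; skipCache is an optional query parameter
    that bypasses the handler cache when retrieving metadata.
    """
    url = f"{base_url}/databricks/handler/{handler_id}"
    if skip_cache:
        url += "?" + urlencode({"skipCache": "true"})
    return url

url = handler_url("https://demo.immuta.com", 48, skip_cache=True)
```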

## Manage data sources

| Method | Path                                                        | Purpose                                                                                                                                                                                                                                               |
| ------ | ----------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| PUT    | `/databricks/handler/{handlerId}`                           | [Update the data source metadata associated with the provided handler ID](#update-a-specific-data-source). This endpoint does not perform partial updates, but the data dictionary may be omitted; in that case, the current dictionary is retained. |
| PUT    | `/databricks/bulk`                                          | [Update the data source metadata associated with the provided connection string](#update-multiple-data-sources).                                                                                                                                      |
| PUT    | `/databricks/handler/{handlerId}/triggerHighCardinalityJob` | [Recalculate the high cardinality column for the specified data source](#recalculate-the-high-cardinality-column-for-a-data-source).                                                                                                                  |

### Update a specific data source

<mark style="color:green;">`PUT`</mark> `/databricks/handler/{handlerId}`

Update the data source metadata associated with the provided handler ID. This endpoint does not perform partial updates, but the data dictionary may be omitted; in that case, the current dictionary is retained.

**Required**: The global `GOVERNANCE` permission, or ownership of the data source

#### Path and query parameters

| Attribute | Description                                                                  | Required |
| --------- | ---------------------------------------------------------------------------- | -------- |
| handlerId | `integer` The ID of the handler.                                             | **Yes**  |
| skipCache | `boolean` When `true`, will skip the handler cache when retrieving metadata. | No       |

#### Payload parameters

| Attribute        | Description                                                                                                                | Required |
| ---------------- | -------------------------------------------------------------------------------------------------------------------------- | -------- |
| handler          | `metadata` Includes metadata about the handler, such as `ssl`, `port`, `database`, `hostname`, `username`, and `password`. | **Yes**  |
| connectionString | `string` The connection string used to connect to the data source.                                                         | **Yes**  |

#### Response parameters

| Attribute | Description                                                                                                                         |
| --------- | ----------------------------------------------------------------------------------------------------------------------------------- |
| id        | `integer` The ID of the handler.                                                                                                    |
| ca        | `string` The certificate authority.                                                                                                 |
| columns   | `array[object]` This object provides metadata about the columns in the data source, including the name and data type of the column. |

#### Request example

This request updates the metadata for the data source with the handler ID `48`.

```bash
curl \
    --request PUT \
    --header "Content-Type: application/json" \
    --header "Authorization: Bearer dea464c07bd07300095caa8" \
    --data @example-payload.json \
    https://demo.immuta.com/databricks/handler/48
```

**Payload example**

The payload below updates the `dataSourceName` to `Cities`.

```json
{
  "handler": {
    "policyHandler": null,
    "dataSourceId": 49,
    "metadata": {
      "ssl": true,
      "port": 443,
      "paths": ["/user/hive/warehouse/cities"],
      "table": "cities",
      "schema": "default",
      "scheme": "dbfs",
      "database": "default",
      "hostname": "your-hostname.cloud.databricks.com",
      "httpPath": "sql/your/http/0/path",
      "pathUris": ["dbfs:/user/hive/warehouse/cities"],
      "ephemeral": true,
      "eventTime": null,
      "userFiles": [],
      "clusterName": null,
      "dataSourceName": "Cities",
      "bodataTableName": "cities",
      "metastoreTables": ["default.cities"],
      "bodataSchemaName": "default",
      "columnsNormalized": false,
      "schemaProjectName": "Default",
      "staleDataTolerance": 86400,
      "authenticationMethod": "Access Token",
      "columns": [{
        "name": "OBJECTID",
        "dataType": "bigint",
        "remoteType": "bigint",
        "nullable": true
      }, {
        "name": "URBID",
        "dataType": "bigint",
        "remoteType": "bigint",
        "nullable": true
      }, {
        "name": "LIGHTDCW",
        "dataType": "bigint",
        "remoteType": "bigint",
        "nullable": true
      }, {
        "name": "ES90POP",
        "dataType": "double precision",
        "remoteType": "double",
        "nullable": true
      }, {
        "name": "ES95POP",
        "dataType": "double precision",
        "remoteType": "double",
        "nullable": true
      }, {
        "name": "ES00POP",
        "dataType": "double precision",
        "remoteType": "double",
        "nullable": true
      }, {
        "name": "PCOUNT",
        "dataType": "bigint",
        "remoteType": "bigint",
        "nullable": true
      }, {
        "name": "SCHNM",
        "dataType": "text",
        "remoteType": "string",
        "nullable": true
      }, {
        "name": "NAME",
        "dataType": "text",
        "remoteType": "string",
        "nullable": true
      }, {
        "name": "SQKM_FINAL",
        "dataType": "double precision",
        "remoteType": "double",
        "nullable": true
      }, {
        "name": "ISO3",
        "dataType": "text",
        "remoteType": "string",
        "nullable": true
      }, {
        "name": "ISOURBID",
        "dataType": "text",
        "remoteType": "string",
        "nullable": true
      }, {
        "name": "REMOVED_PO",
        "dataType": "text",
        "remoteType": "string",
        "nullable": true
      }, {
        "name": "ADDED_POIN",
        "dataType": "text",
        "remoteType": "string",
        "nullable": true
      }, {
        "name": "YEAR_V1_01",
        "dataType": "double precision",
        "remoteType": "double",
        "nullable": true
      }, {
        "name": "POP_V1_01",
        "dataType": "double precision",
        "remoteType": "double",
        "nullable": true
      }, {
        "name": "Unsdcode",
        "dataType": "bigint",
        "remoteType": "bigint",
        "nullable": true
      }, {
        "name": "Countryeng",
        "dataType": "text",
        "remoteType": "string",
        "nullable": true
      }, {
        "name": "Continent",
        "dataType": "text",
        "remoteType": "string",
        "nullable": true
      }, {
        "name": "geometry",
        "dataType": "struct",
        "remoteType": "struct<__geom__:bigint,_is_empty:boolean,_ndim:bigint>",
        "nullable": true,
        "children": [{
          "name": "__geom__",
          "dataType": "bigint"
        }, {
          "name": "_is_empty",
          "dataType": "boolean"
        }, {
          "name": "_ndim",
          "dataType": "bigint"
        }]
      }, {
        "name": "wkt",
        "dataType": "text",
        "remoteType": "string",
        "nullable": true
      }],
      "password": "your-password"
    },
    "type": "queryable",
    "connectionString": "dbc-d3fe40ca-b4fb.cloud.databricks.com:443/default",
    "id": 48,
    "createdAt": "2021-10-06T17:53:09.640Z",
    "updatedAt": "2021-10-06T17:53:09.882Z",
    "dbms": {
      "name": "databricks"
    }
  }
}
```
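Because this endpoint replaces the handler metadata wholesale, a common pattern is read-modify-write: fetch the handler with the GET endpoint, change the field you need, and PUT the full object back. The following is a minimal sketch of the modify step, using an abbreviated, illustrative handler object rather than a real API response.

```python
import json

def rename_data_source(handler: dict, new_name: str) -> dict:
    """Return a PUT payload with only dataSourceName changed.

    The handler is deep-copied (via a JSON round-trip) so the original
    object fetched from the API is left untouched.
    """
    updated = json.loads(json.dumps(handler))
    updated["metadata"]["dataSourceName"] = new_name
    return {"handler": updated}

# Abbreviated handler object for illustration only:
payload = rename_data_source(
    {"metadata": {"dataSourceName": "Default Cities", "table": "cities"}},
    "Cities",
)
```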

#### Response example

```json
{
  "id": 48,
  "ca": ["-----BEGIN CERTIFICATE-----\ncertificatedata\n-----END CERTIFICATE-----"],
  "metadata": {
    "columns": [{
      "name": "OBJECTID",
      "dataType": "bigint",
      "remoteType": "bigint",
      "nullable": true
    }, {
      "name": "URBID",
      "dataType": "bigint",
      "remoteType": "bigint",
      "nullable": true
    }, {
      "name": "LIGHTDCW",
      "dataType": "bigint",
      "remoteType": "bigint",
      "nullable": true
    }, {
      "name": "ES90POP",
      "dataType": "double precision",
      "remoteType": "double",
      "nullable": true
    }, {
      "name": "ES95POP",
      "dataType": "double precision",
      "remoteType": "double",
      "nullable": true
    }, {
      "name": "ES00POP",
      "dataType": "double precision",
      "remoteType": "double",
      "nullable": true
    }, {
      "name": "PCOUNT",
      "dataType": "bigint",
      "remoteType": "bigint",
      "nullable": true
    }, {
      "name": "SCHNM",
      "dataType": "text",
      "remoteType": "string",
      "nullable": true
    }, {
      "name": "NAME",
      "dataType": "text",
      "remoteType": "string",
      "nullable": true
    }, {
      "name": "SQKM_FINAL",
      "dataType": "double precision",
      "remoteType": "double",
      "nullable": true
    }, {
      "name": "ISO3",
      "dataType": "text",
      "remoteType": "string",
      "nullable": true
    }, {
      "name": "ISOURBID",
      "dataType": "text",
      "remoteType": "string",
      "nullable": true
    }, {
      "name": "REMOVED_PO",
      "dataType": "text",
      "remoteType": "string",
      "nullable": true
    }, {
      "name": "ADDED_POIN",
      "dataType": "text",
      "remoteType": "string",
      "nullable": true
    }, {
      "name": "YEAR_V1_01",
      "dataType": "double precision",
      "remoteType": "double",
      "nullable": true
    }, {
      "name": "POP_V1_01",
      "dataType": "double precision",
      "remoteType": "double",
      "nullable": true
    }, {
      "name": "Unsdcode",
      "dataType": "bigint",
      "remoteType": "bigint",
      "nullable": true
    }, {
      "name": "Countryeng",
      "dataType": "text",
      "remoteType": "string",
      "nullable": true
    }, {
      "name": "Continent",
      "dataType": "text",
      "remoteType": "string",
      "nullable": true
    }, {
      "name": "geometry",
      "dataType": "struct",
      "remoteType": "struct<__geom__:bigint,_is_empty:boolean,_ndim:bigint>",
      "nullable": true,
      "children": [{
        "name": "__geom__",
        "dataType": "bigint"
      }, {
        "name": "_is_empty",
        "dataType": "boolean"
      }, {
        "name": "_ndim",
        "dataType": "bigint"
      }]
    }, {
      "name": "wkt",
      "dataType": "text",
      "remoteType": "string",
      "nullable": true
    }]
  }
}
```

### Update multiple data sources

<mark style="color:green;">`PUT`</mark> `/databricks/bulk`

Update the data source metadata associated with the provided connection string.

**Required**: The global `GOVERNANCE` permission, or ownership of the data source

#### Payload parameters

| Attribute        | Description                                                                                                                | Required |
| ---------------- | -------------------------------------------------------------------------------------------------------------------------- | -------- |
| handler          | `metadata` Includes metadata about the handler, such as `ssl`, `port`, `database`, `hostname`, `username`, and `password`. | **Yes**  |
| connectionString | `string` The connection string used to connect to the data sources.                                                        | **Yes**  |

#### Response parameters

| Attribute        | Description                                                                                                                      |
| ---------------- | -------------------------------------------------------------------------------------------------------------------------------- |
| bulkId           | `string` The ID of the bulk data source update.                                                                                  |
| connectionString | `string` The connection string shared by the data sources bulk updated.                                                          |
| jobsCreated      | `integer` The number of jobs that ran to update the data sources; this number corresponds to the number of data sources updated. |

#### Request example

This request updates the metadata for all data sources with the connection string specified in `example-payload.json`.

```bash
curl \
    --request PUT \
    --header "Content-Type: application/json" \
    --header "Authorization: Bearer dea464c07bd07300095caa8" \
    --data @example-payload.json \
    https://demo.immuta.com/databricks/bulk
```

**Payload example**

The payload below adds a certificate file (`certificate.json`) used to connect to the data sources with the provided connection string.

```json
{
  "handler": {
    "metadata": {
      "ssl": true,
      "port": 443,
      "database": "default",
      "hostname": "your-hostname.cloud.databricks.com",
      "userFiles": [{
        "keyName": "test",
        "filename": "6dc06a3310b9ba33c543e483d1e745b3ac9bc648.json",
        "userFilename": "certificate.json"
      }],
      "authenticationMethod": "Access Token",
      "password": "your-password",
      "httpPath": "sql/your/http/0/path"
    }
  },
  "connectionString": "your-hostname.cloud.databricks.com:443/default/default"
}
```

#### Response example

```json
{
  "bulkId": "bulk_ds_update_9ae5bfd85a3a47a8b454c618043e2aa3",
  "connectionString": "your-hostname.cloud.databricks.com:443/default",
  "jobsCreated": 2
}
```

### Recalculate the high cardinality column for a data source

<mark style="color:green;">`PUT`</mark> `/databricks/handler/{handlerId}/triggerHighCardinalityJob`

Recalculate the high cardinality column for the specified data source.

**Required**: The global `GOVERNANCE` permission, or ownership of the data source

#### Path parameters

| Attribute | Description                      | Required |
| --------- | -------------------------------- | -------- |
| handlerId | `integer` The ID of the handler. | **Yes**  |

#### Response parameters

The response is a string that identifies the high cardinality job run.

#### Request example

This request re-runs the job that calculates the high cardinality column for the data source with the handler ID `47`.

```bash
curl \
    --request PUT \
    --header "Content-Type: application/json" \
    --header "Authorization: Bearer dea464c07bd07300095caa8" \
    https://demo.immuta.com/databricks/handler/47/triggerHighCardinalityJob
```

#### Response example

```json
"f6ac1ad0-26d0-11ec-8078-d36bbf5b90fb"
```
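As a sketch, the same call can be built with the Python standard library; the URL and token are placeholders, and the request is constructed but not sent. Note that the response body is a plain job-run identifier rather than a JSON object.

```python
import urllib.request

def trigger_high_cardinality(base_url: str, token: str, handler_id: int) -> urllib.request.Request:
    """Build (but do not send) the PUT .../triggerHighCardinalityJob request."""
    return urllib.request.Request(
        url=f"{base_url}/databricks/handler/{handler_id}/triggerHighCardinalityJob",
        headers={"Authorization": f"Bearer {token}"},
        method="PUT",
    )

# Placeholder tenant URL and token:
req = trigger_high_cardinality("https://demo.immuta.com", "dea464c07bd07300095caa8", 47)
```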
