# Install a Trusted Library

## 1 - Install the Library

1. In the Databricks Clusters UI, install your third-party library (a .jar or a Maven artifact) with the **Library Source** set to `Upload`, `DBFS`, `DBFS/S3`, or `Maven`. Alternatively, use the Databricks Libraries API.

   ![](https://969552016-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FLnuUzWSfU9nJeB2EJkrh%2Fuploads%2Fgit-blob-e7f65c0b7e01f896cc9624a5ec6b047347632a1a%2Finstall-library.png?alt=media)
2. In the Databricks Clusters UI, add the `IMMUTA_SPARK_DATABRICKS_TRUSTED_LIB_URIS` property as a Spark environment variable and set it to your artifact's URI:

{% tabs %}
{% tab title="Maven artifacts" %}
For Maven artifacts, the URI is `maven:/<maven_coordinates>`, where `<maven_coordinates>` is the value of the **Coordinates** field shown when you click the installed artifact on the **Libraries** tab in the Databricks Clusters UI. Here's an example of an installed artifact:

<figure><img src="https://969552016-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FLnuUzWSfU9nJeB2EJkrh%2Fuploads%2Fgit-blob-90ac31e05ccc18233be5a47ad2179197c3887328%2Fmaven-artifacts.png?alt=media" alt=""><figcaption></figcaption></figure>

In this example, you would add the following Spark environment variable:

```shell
IMMUTA_SPARK_DATABRICKS_TRUSTED_LIB_URIS=maven:/com.github.immuta.hadoop.immuta-spark-third-party-maven-lib-test:2020-11-17-144644
```

{% endtab %}

{% tab title=".jar artifacts" %}
For .jar artifacts, the URI is the value of the **Source** field shown when you click the installed artifact on the **Libraries** tab in the Databricks Clusters UI. For artifacts installed from DBFS or S3, this is the original URI of your artifact. For uploaded artifacts, Databricks renames the .jar and places it in a directory in DBFS. Here's an example of an installed artifact:

<figure><img src="https://969552016-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FLnuUzWSfU9nJeB2EJkrh%2Fuploads%2Fgit-blob-aaf1c375c5effea54bc12e02b12a76f9cebc6f50%2Fjar-artifact.png?alt=media" alt=""><figcaption></figcaption></figure>

In this example, you would add the following Spark environment variable:

```shell
IMMUTA_SPARK_DATABRICKS_TRUSTED_LIB_URIS=dbfs:/immuta/bstabile/jars/immuta-spark-third-party-lib-test.jar
```

{% endtab %}
{% endtabs %}
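
If you prefer the Databricks Libraries API mentioned in step 1, the library can be installed with a REST call like the one below. This is a sketch only: the workspace URL, token, cluster ID, and Maven coordinates are all placeholders you must replace with your own values.

```shell
# Install a Maven artifact on a cluster via the Databricks Libraries API.
# <workspace-url>, $DATABRICKS_TOKEN, the cluster ID, and the coordinates
# are placeholders for this example.
curl -X POST "https://<workspace-url>/api/2.0/libraries/install" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d '{
        "cluster_id": "0123-456789-abcd123",
        "libraries": [
          { "maven": { "coordinates": "my.group.id:my-package-id:1.2.3" } }
        ]
      }'
```

However you install the library, you still need to add its URI to `IMMUTA_SPARK_DATABRICKS_TRUSTED_LIB_URIS` as described above.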

Once you've finished making your changes, restart the cluster.

{% hint style="info" %}
**Specifying more than one trusted library**

To specify more than one trusted library, comma-delimit the URIs:

```shell
IMMUTA_SPARK_DATABRICKS_TRUSTED_LIB_URIS=maven:/my.group.id:my-package-id:1.2.3,dbfs:/path/to/my/library.jar
```

{% endhint %}
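
If you manage cluster configuration programmatically, the comma-delimited value can be assembled from a list of artifact URIs. A minimal sketch, using the example URIs from the hint above:

```python
# Build the IMMUTA_SPARK_DATABRICKS_TRUSTED_LIB_URIS value from a list of
# artifact URIs (a Maven coordinate and a DBFS path, as in the examples above).
trusted_lib_uris = [
    "maven:/my.group.id:my-package-id:1.2.3",
    "dbfs:/path/to/my/library.jar",
]

env_value = ",".join(trusted_lib_uris)
print(env_value)
# → maven:/my.group.id:my-package-id:1.2.3,dbfs:/path/to/my/library.jar
```

The resulting string is what you would set as the Spark environment variable's value in the cluster configuration.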

## 2 - Execute a Command in a Notebook

Once the cluster is up, execute any command in a notebook. If the trusted library installation succeeded, you should see driver log messages like the following:

```shell
TrustedLibraryUtils: Successfully found all configured Immuta configured trusted libraries in Databricks.
TrustedLibraryUtils: Wrote trusted libs file to [/databricks/immuta/immutaTrustedLibs.json]: true.
TrustedLibraryUtils: Added trusted libs file with 1 entries to spark context.
TrustedLibraryUtils: Trusted library installation complete.
```
