# Install a Trusted Library

1. In the Databricks Clusters UI, install your third-party library (.jar or Maven artifact) with **Library Source** set to `Upload`, `DBFS`, `DBFS/S3`, or `Maven`. Alternatively, install it with the Databricks Libraries API.
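   For the API route, the legacy Databricks CLI wraps the Libraries API. A minimal sketch, assuming a configured legacy `databricks` CLI, a placeholder cluster ID, and the example Maven coordinates used below:

   ```bash
   # Sketch only: the cluster ID is a placeholder, and my.group.id:my-package-id:1.2.3
   # stands in for your artifact's real Maven coordinates
   databricks libraries install \
     --cluster-id 0123-456789-abcde000 \
     --maven-coordinates my.group.id:my-package-id:1.2.3
   ```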
2. In the Databricks Clusters UI, add the `IMMUTA_SPARK_DATABRICKS_TRUSTED_LIB_URIS` property as a Spark environment variable and set it to your artifact's URI. To specify more than one trusted library, comma-delimit the URIs:

   ```bash
   IMMUTA_SPARK_DATABRICKS_TRUSTED_LIB_URIS=maven:/my.group.id:my-package-id:1.2.3
   ```
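
   For example, to trust both a Maven artifact and a DBFS-hosted .jar (both hypothetical), build the value as a single comma-delimited string:

   ```bash
   # Hypothetical artifacts: one Maven coordinate and one DBFS-hosted .jar
   LIB_A="maven:/my.group.id:my-package-id:1.2.3"
   LIB_B="dbfs:/FileStore/jars/my-other-lib.jar"

   # Comma-delimit the URIs in a single environment variable
   IMMUTA_SPARK_DATABRICKS_TRUSTED_LIB_URIS="${LIB_A},${LIB_B}"
   echo "$IMMUTA_SPARK_DATABRICKS_TRUSTED_LIB_URIS"
   ```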

{% tabs %}
{% tab title="Maven artifacts" %}
For Maven artifacts, the URI is `maven:/<maven_coordinates>`, where `<maven_coordinates>` is the **Coordinates** field found when clicking on the installed artifact on the **Libraries** tab in the Databricks Clusters UI. Here's an example of an installed artifact:

<figure><img src="https://1751699907-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FlWBda5Pt4s8apEhzXGl7%2Fuploads%2Fgit-blob-90ac31e05ccc18233be5a47ad2179197c3887328%2Fmaven-artifacts.png?alt=media" alt="The coordinate field includes the following Maven coordinate: com.github.immuta.hadoop.immuta-spark-third-party-maven-lib-test:2020-11-17-144644."><figcaption></figcaption></figure>

In this example, you would add the following Spark environment variable:

```bash
IMMUTA_SPARK_DATABRICKS_TRUSTED_LIB_URIS=maven:/com.github.immuta.hadoop.immuta-spark-third-party-maven-lib-test:2020-11-17-144644
```

{% endtab %}

{% tab title=".jar artifacts" %}
For .jar artifacts, the URI is the **Source** field found when clicking on the installed artifact on the **Libraries** tab in the Databricks Clusters UI. For artifacts installed from DBFS or S3, this is the original URI to your artifact. For uploaded artifacts, Databricks renames the .jar and places it in a DBFS directory. Here's an example of an installed artifact:

<figure><img src="https://1751699907-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FlWBda5Pt4s8apEhzXGl7%2Fuploads%2Fgit-blob-aaf1c375c5effea54bc12e02b12a76f9cebc6f50%2Fjar-artifact.png?alt=media" alt="The source field includes the following URI: dbfs:/immuta/bstabile/jars/immuta-spark-third-party-lib-test.jar."><figcaption></figcaption></figure>

In this example, you would add the following Spark environment variable:

```bash
IMMUTA_SPARK_DATABRICKS_TRUSTED_LIB_URIS=dbfs:/immuta/bstabile/jars/immuta-spark-third-party-lib-test.jar
```

{% endtab %}
{% endtabs %}

3. Once you've finished making your changes, restart the cluster.
4. Once the cluster is up, execute a command in a notebook. If the trusted library installation succeeded, you should see driver log messages similar to the following:

   ```
   TrustedLibraryUtils: Successfully found all configured Immuta trusted libraries in Databricks.
   TrustedLibraryUtils: Wrote trusted libs file to [/databricks/immuta/immutaTrustedLibs.json]: true.
   TrustedLibraryUtils: Added trusted libs file with 1 entries to spark context.
   TrustedLibraryUtils: Trusted library installation complete.
   ```
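
   To search for these messages from a notebook rather than the UI, a `%sh` cell can grep the live driver log. The path below is the standard Databricks driver log location, not anything Immuta-specific, and only exists on a running cluster:

   ```bash
   # Run in a %sh notebook cell on the cluster; log4j-active.log is the
   # currently active driver log on Databricks clusters
   grep TrustedLibraryUtils /databricks/driver/logs/log4j-active.log
   ```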
