# Manually Update Your Databricks Cluster

If a Databricks cluster needs to be manually updated to reflect changes in the Immuta init script or cluster policies, you can remove and set up your integration again to get the updated policies and init script.

1. Log in to Immuta as an Application Admin.
2. Click the **App Settings** (gear) icon in the navigation menu and scroll to the **Integration Settings** section.
3. Your existing Databricks Spark integration should be listed here; expand it and note the configuration values. Now select **Remove** to remove your integration.
4. Click **Add Integration** and select **Databricks Integration** to add a new integration.
5. Re-enter your Databricks Spark integration settings, using the configuration values you noted previously.
6. Click **Add Integration** to add the integration, and then select **Configure Cluster Policies** to set up the updated cluster policies and init script.
7. Select the cluster policies you wish to use for your Immuta-enabled Databricks clusters.
8. Push the cluster policies and init script automatically (recommended), or update your cluster policies manually.
   * **Automatically push cluster policies**
     1. Select **Automatically Push Cluster Policies** and enter your privileged Databricks access token. This token must have privileges to write to cluster policies.
     2. Select **Apply Policies** to push the cluster policies and init script again.
     3. Click **Save** and **Confirm** to deploy your changes.
   * **Manually update cluster policies**
     1. Download the init script and the new cluster policies to your local computer.
     2. Click **Save** and **Confirm** to save your changes in Immuta.
     3. Log in to your Databricks workspace with your administrator account to set up cluster policies.
     4. Determine the DBFS path where the init script (`immuta_cluster_init_script_proxy.sh`) must be uploaded: open one of the downloaded cluster policy `.json` files and find the `defaultValue` of the `init_scripts.0.dbfs.destination` field. It will be a DBFS path in the form `dbfs:/immuta-plugin/hostname/immuta_cluster_init_script_proxy.sh`.
     5. Click **Data** in the left pane and upload the init script to the DBFS path you found in the previous step.
     6. Click **Compute** in the left pane and select the **Cluster policies** tab to locate the existing cluster policies you need to update.
     7. Edit each previously configured cluster policy and overwrite its JSON with the corresponding new cluster policy JSON you downloaded.
9. Restart any Databricks clusters using these updated policies for the changes to take effect.
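If you prefer to script the manual path rather than clicking through the Databricks UI, the steps above can be sketched with the Databricks REST API. This is a minimal sketch, not an official tool: the helper names, workspace host, token, and cluster ID are illustrative assumptions. Only the `init_scripts.0.dbfs.destination` field name comes from the cluster policy JSON described above; `/api/2.0/dbfs/put` and `/api/2.0/clusters/restart` are standard Databricks REST endpoints.

```python
import base64
import json
import urllib.request


def init_script_destination(policy_json: str) -> str:
    """Extract the DBFS destination for the Immuta init script from a
    downloaded cluster policy JSON document (manual step 4)."""
    policy = json.loads(policy_json)
    # Cluster policy definitions key each attribute by its dotted path.
    return policy["init_scripts.0.dbfs.destination"]["defaultValue"]


def upload_init_script(host: str, token: str, dbfs_path: str, script: bytes) -> None:
    """Upload the init script via the DBFS API, as an alternative to the
    Data pane in manual step 5. The DBFS API expects the path without
    the `dbfs:` scheme prefix."""
    _post(host, token, "/api/2.0/dbfs/put", {
        "path": dbfs_path.removeprefix("dbfs:"),
        "contents": base64.b64encode(script).decode(),
        "overwrite": True,
    })


def restart_cluster(host: str, token: str, cluster_id: str) -> None:
    """Restart a cluster so the updated policy and init script take effect
    (step 9)."""
    _post(host, token, "/api/2.0/clusters/restart", {"cluster_id": cluster_id})


def _post(host: str, token: str, endpoint: str, body: dict) -> None:
    # Minimal authenticated POST helper; real code should handle errors.
    req = urllib.request.Request(
        f"https://{host}{endpoint}",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}"},
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    # Example: find where the init script must land, given a downloaded policy.
    sample_policy = json.dumps({
        "init_scripts.0.dbfs.destination": {
            "type": "fixed",
            "defaultValue": "dbfs:/immuta-plugin/hostname/immuta_cluster_init_script_proxy.sh",
        }
    })
    print(init_script_destination(sample_policy))
```

The upload and restart helpers require a Databricks personal access token with the appropriate permissions, just as the automatic-push option does.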
