# Databricks Spark

This integration enforces policies on Databricks securables registered in the legacy Hive metastore. Once these securables are registered as Immuta data sources, users can query policy-enforced data on Databricks clusters.

The guides in this section outline how to integrate Databricks Spark with Immuta.

## [Getting started](/SaaS/configuration/integrations/databricks/databricks-spark/getting-started-with-databricks-spark.md)

This getting started guide outlines how to integrate Databricks Spark with Immuta.

## How-to guides

* [Configure a Databricks Spark integration](/SaaS/configuration/integrations/databricks/databricks-spark/how-to-guides/simplified.md): Configure the Databricks Spark integration.
* [Manually update your Databricks cluster](/SaaS/configuration/integrations/databricks/databricks-spark/how-to-guides/cluster-update.md): Manually update your cluster to reflect changes in the Immuta init script or cluster policies.
* [Install a trusted library](/SaaS/configuration/integrations/databricks/databricks-spark/how-to-guides/installation.md): Register a Databricks library with Immuta as a trusted library to avoid Immuta security manager errors when using third-party libraries.
* [Project UDFs cache settings](/SaaS/configuration/integrations/databricks/databricks-spark/how-to-guides/project-udfs.md): Increase on-cluster caching and lower the cache timeouts for the Immuta web service to allow the use of project UDFs in Spark jobs.
* [Run R and Scala spark-submit jobs on Databricks](/SaaS/configuration/integrations/databricks/databricks-spark/how-to-guides/spark-submit.md): Run R and Scala `spark-submit` jobs on your Databricks cluster.
* [DBFS access](/SaaS/configuration/integrations/databricks/databricks-spark/how-to-guides/access-dbfs.md): Access DBFS in Databricks for non-sensitive data.
* [Troubleshooting](/SaaS/configuration/integrations/databricks/databricks-spark/how-to-guides/troubleshooting.md): Resolve errors in the Databricks Spark configuration.

## Reference guides

* [Databricks Spark integration configuration](/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/databricks.md): This guide describes the design and components of the integration.
* [Security and compliance](/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/security-and-compliance.md): This guide provides an overview of the Immuta features that provide security for your users and Databricks clusters and that allow you to prove compliance and monitor for anomalies.
* [Registering and protecting data](/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/registering-and-protecting-data.md): This guide provides an overview of registering Databricks securables and protecting them with Immuta policies.
* [Accessing data](/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/accessing-data.md): This guide provides an overview of how Databricks users access data registered in Immuta.


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
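As a sketch of how a client might build such a request, the snippet below constructs the query URL with the question properly URL-encoded. The base URL comes from the example above; the sample question and helper name are illustrative, and any HTTP client can then fetch the resulting URL.

```python
from urllib.parse import urlencode

# Base URL of this documentation page (from the example above)
BASE = "https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark.md"

def build_ask_url(question: str) -> str:
    """Build the documentation-query URL, URL-encoding the question."""
    return f"{BASE}?{urlencode({'ask': question})}"

url = build_ask_url("How do I register a trusted library on a Databricks cluster?")
# The URL can then be fetched with any HTTP client, e.g.:
#   urllib.request.urlopen(url).read()
```

Keeping the question specific and self-contained, as noted above, yields the most useful answers.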
