# Databricks Spark

This integration enforces Immuta policies on Databricks securables registered in the legacy Hive metastore. Once those securables are registered as Immuta data sources, users can query policy-enforced data on Databricks clusters.

The guides in this section outline how to integrate Databricks Spark with Immuta.

## [Getting started](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-spark/getting-started-with-databricks-spark)

This guide outlines the steps to get started with the Databricks Spark integration.

## How-to guides

* [Configure a Databricks Spark integration](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-spark/how-to-guides/simplified): Enable the Databricks Spark integration and configure your clusters to enforce Immuta policies.
* [Manually update your Databricks cluster](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-spark/how-to-guides/cluster-update): Manually update your cluster to reflect changes in the Immuta init script or cluster policies.
* [Install a trusted library](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-spark/how-to-guides/installation): Register a Databricks library with Immuta as a trusted library to avoid Immuta security manager errors when using third-party libraries.
* [Project UDFs cache settings](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-spark/how-to-guides/project-udfs): Raise the on-cluster cache settings and lower the Immuta web service cache timeouts to allow the use of project UDFs in Spark jobs.
* [Run R and Scala spark-submit jobs on Databricks](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-spark/how-to-guides/spark-submit): Run R and Scala `spark-submit` jobs on your Databricks cluster.
* [DBFS access](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-spark/how-to-guides/access-dbfs): Access DBFS in Databricks for non-sensitive data.
* [Troubleshooting](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-spark/how-to-guides/troubleshooting): Resolve errors in the Databricks Spark configuration.

## Reference guides

* [Databricks Spark integration configuration](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-spark/reference-guides/databricks): This guide describes the design and components of the integration.
* [Security and compliance](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-spark/reference-guides/security-and-compliance): This guide provides an overview of the Immuta features that secure your users and Databricks clusters, help you prove compliance, and monitor for anomalies.
* [Registering and protecting data](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-spark/reference-guides/registering-and-protecting-data): This guide provides an overview of registering Databricks securables and protecting them with Immuta policies.
* [Accessing data](https://documentation.immuta.com/saas/configuration/integrations/databricks/databricks-spark/reference-guides/accessing-data): This guide provides an overview of how Databricks users access data registered in Immuta.
