# Reference Guides

- [Databricks Spark Integration Configuration](https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/databricks.md): Learn how the Databricks Spark integration works to govern data access
- [Installation and Compliance](https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/databricks/installation-and-compliance.md): Learn about what Immuta creates in your Databricks environment to enforce access controls
- [Customizing the Integration](https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/databricks/customizing-the-integration.md): Learn how to adjust settings for the Databricks Spark integration
- [Setting Up Users](https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/databricks/setting-up-users.md): Learn about user impersonation and how to map your Databricks users into Immuta
- [Spark Environment Variables](https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/databricks/configuration.md): Learn about the available Spark environment variables so that you can customize your Databricks Spark integration
- [Ephemeral Overrides](https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/databricks/ephemeral-overrides.md): Learn about how Immuta uses ephemeral overrides to determine which cluster compute to use when connecting to Databricks for maintenance operations
- [Security and Compliance](https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/security-and-compliance.md): Understand the authentication methods, cluster security, and audit features supported by the Databricks Spark integration so you can meet your organization's security and compliance needs
- [Registering and Protecting Data](https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/registering-and-protecting-data.md): Learn how Immuta enforces policies on data in your Databricks Spark environment
- [Accessing Data](https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/accessing-data.md): Learn how end users can access policy-enforced data in Databricks Spark
- [Delta Lake API](https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides/accessing-data/delta-lake-api.md): Learn how to use Spark SQL to achieve the same functionality as the Delta Lake API


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available on this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://documentation.immuta.com/SaaS/configuration/integrations/databricks/databricks-spark/reference-guides.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language; URL-encode it before sending it as a query parameter.
The response contains a direct answer to the question along with relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present on the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
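As a minimal sketch of the request described above, the following Python snippet builds the query URL with a properly URL-encoded `ask` parameter. The sample question is illustrative, not taken from the documentation, and the snippet only constructs the URL; issuing the actual GET request (e.g., with `urllib.request.urlopen` or an HTTP client of your choice) is left commented out.

```python
from urllib.parse import urlencode

# URL of the current documentation page
base = (
    "https://documentation.immuta.com/SaaS/configuration/integrations/"
    "databricks/databricks-spark/reference-guides.md"
)

# Hypothetical example question; any specific, self-contained question works
question = "Which authentication methods does the Databricks Spark integration support?"

# urlencode handles escaping of spaces and punctuation in the question
url = f"{base}?{urlencode({'ask': question})}"
print(url)

# To send the request, uncomment the lines below:
# from urllib.request import urlopen
# with urlopen(url) as resp:
#     print(resp.read().decode())
```

The response body is the answer text with supporting excerpts, so it can be read directly rather than parsed as structured data.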
