# Delta Lake API

Delta Lake API calls do not go through the normal Spark execution path, so Immuta's Spark extensions cannot enforce policies on them. To ensure Immuta retains control over what a user can access, the Delta Lake API is blocked.

Spark SQL can be used instead to provide the same functionality with all of Immuta's data protections.

## Requests

The table below maps each Delta Lake API method to the Spark SQL command that may be used instead.

| Delta Lake API            | Spark SQL                                                                                |
| ------------------------- | ---------------------------------------------------------------------------------------- |
| DeltaTable.convertToDelta | CONVERT TO DELTA parquet.`/path/to/parquet/`                                             |
| DeltaTable.delete         | DELETE FROM \[table\_identifier delta.`/path/to/delta/`] WHERE condition                 |
| DeltaTable.generate       | GENERATE symlink\_format\_manifest FOR TABLE \[table\_identifier delta.`/path/to/delta`] |
| DeltaTable.history        | DESCRIBE HISTORY \[table\_identifier delta.`/path/to/delta`] (LIMIT x)                   |
| DeltaTable.merge          | MERGE INTO                                                                               |
| DeltaTable.update         | UPDATE \[table\_identifier delta.`/path/to/delta/`] SET column = value WHERE (condition) |
| DeltaTable.vacuum         | VACUUM \[table\_identifier delta.`/path/to/delta`]                                       |

For a complete list of Delta SQL commands, see the [Databricks Delta Lake documentation](https://docs.databricks.com/delta/index.html).
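For example, a blocked Delta Lake API call such as `DeltaTable.update` or `DeltaTable.delete` can be replaced with the equivalent Spark SQL statement from the table above. The sketch below uses a hypothetical table named `claims` with hypothetical columns; substitute your own table identifier or `delta.`/path/to/delta/`` path:

```sql
-- Instead of DeltaTable.update(...), run the equivalent Spark SQL,
-- which goes through the normal Spark execution path and is
-- protected by Immuta. Table and column names are hypothetical.
UPDATE claims
SET status = 'closed'
WHERE closed_date IS NOT NULL;

-- Instead of DeltaTable.delete(...):
DELETE FROM claims
WHERE status = 'expired';
```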

## Merging tables in project workspaces

When a table is created in a project workspace, you can merge a separate Immuta data source into that table from within the workspace.

1. Create a table in the project workspace.
2. Create a temporary view of the Immuta data source you want to merge into that table.
3. Use that temporary view as the data source you add to the project workspace.
4. Run the following command:

   ```sql
   MERGE INTO delta_native.target_native as target
   USING immuta_temp_view_data_source as source
   ON target.dr_number = source.dr_number
   WHEN MATCHED THEN
   UPDATE SET target.date_reported = source.date_reported
   ```
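The steps above can be sketched end to end in Spark SQL. The table schema and the source table `immuta.crime_data` below are hypothetical assumptions; only `delta_native.target_native` and `immuta_temp_view_data_source` come from the example above:

```sql
-- Step 1: create a table in the project workspace (hypothetical schema).
CREATE TABLE delta_native.target_native (
  dr_number STRING,
  date_reported DATE
) USING DELTA;

-- Steps 2-3: expose the Immuta data source you want to merge
-- as a temporary view (source table name is hypothetical).
CREATE OR REPLACE TEMPORARY VIEW immuta_temp_view_data_source AS
SELECT dr_number, date_reported
FROM immuta.crime_data;

-- Step 4: merge the view into the workspace table.
MERGE INTO delta_native.target_native AS target
USING immuta_temp_view_data_source AS source
ON target.dr_number = source.dr_number
WHEN MATCHED THEN
  UPDATE SET target.date_reported = source.date_reported;
```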

