Delta Lake API
Audience: Data Owners and Data Users
Content Summary: This page describes the Spark SQL statements that can be used in place of the Delta Lake API methods a user may need.
Introduction
When using Delta Lake, the API does not go through the normal Spark execution path, which means Immuta's Spark extensions cannot enforce policies on it. To ensure that Immuta retains control over what a user can access, the Delta Lake API is blocked.
Spark SQL can be used instead to provide the same functionality with all of Immuta's data protections.
Requests
The table below maps each Delta Lake API method to the Spark SQL statement that may be used instead.
| Delta Lake API | Spark SQL |
|---|---|
| `DeltaTable.convertToDelta` | ``CONVERT TO DELTA parquet.`/path/to/parquet/` `` |
| `DeltaTable.delete` | ``DELETE FROM [table_identifier delta.`/path/to/delta/`] WHERE condition`` |
| `DeltaTable.generate` | ``GENERATE symlink_format_manifest FOR TABLE [table_identifier delta.`/path/to/delta/`]`` |
| `DeltaTable.history` | ``DESCRIBE HISTORY [table_identifier delta.`/path/to/delta/`] (LIMIT x)`` |
| `DeltaTable.merge` | `MERGE INTO` |
| `DeltaTable.update` | ``UPDATE [table_identifier delta.`/path/to/delta/`] SET column = value WHERE (condition)`` |
| `DeltaTable.vacuum` | ``VACUUM [table_identifier delta.`/path/to/delta/`]`` |
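For example, instead of deleting rows through the blocked `DeltaTable.delete` API, the same operation can be expressed in Spark SQL. This is a minimal sketch: the table path and the `date` column condition are hypothetical placeholders, not values from this page.

```sql
-- Delete rows matching a condition (Spark SQL equivalent of DeltaTable.delete);
-- the path and condition below are illustrative placeholders.
DELETE FROM delta.`/path/to/delta/` WHERE date < '2020-01-01';

-- Inspect the table's change history (Spark SQL equivalent of DeltaTable.history).
DESCRIBE HISTORY delta.`/path/to/delta/` LIMIT 10;
```

Because these statements go through the normal Spark execution path, Immuta's Spark extensions can apply policies before any data is returned.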
See the Delta Lake documentation for a complete list of the Delta SQL commands.