Migrating to Unity Catalog
When you enable Unity Catalog, Immuta automatically migrates your existing Databricks data sources in Immuta to reference the legacy hive_metastore catalog, accounting for Unity Catalog's three-level namespace (catalog.schema.table). New data sources will reference the Unity Catalog metastore you create and attach to your Databricks workspace. Because the hive_metastore catalog is not managed by Unity Catalog, existing data sources in hive_metastore cannot have Unity Catalog access controls applied to them; data sources in the Hive metastore must instead be managed by the Databricks Spark integration.
To allow Immuta to administer Unity Catalog access controls on that data, move the data to Unity Catalog and re-register those tables in Immuta by completing the steps below. If you don't move all data before configuring the integration, metastore magic will protect your existing data sources throughout the migration process.
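The split described above can be sketched as a small classification rule: a table reference is governed by Unity Catalog access controls only if it uses a three-level name under a non-legacy catalog. This is a hypothetical helper for illustration; the function name and return labels are not part of Immuta's API.

```python
def governing_integration(table_ref: str) -> str:
    """Return which integration should manage access for a table reference.

    Two-level references (schema.table) and three-level references under the
    legacy hive_metastore catalog are outside Unity Catalog's control, so
    they stay with the Databricks Spark integration; everything else can use
    Unity Catalog access controls.
    """
    parts = table_ref.split(".")
    if len(parts) == 2:
        # A two-level name implicitly means hive_metastore.schema.table
        return "databricks-spark"
    if len(parts) == 3 and parts[0] == "hive_metastore":
        return "databricks-spark"
    if len(parts) == 3:
        return "unity-catalog"
    raise ValueError(f"unexpected table reference: {table_ref!r}")

print(governing_integration("sales.orders"))                # databricks-spark
print(governing_integration("hive_metastore.sales.orders")) # databricks-spark
print(governing_integration("main.sales.orders"))           # unity-catalog
```

Any data source that this rule maps to "databricks-spark" is the kind that metastore magic continues to protect during the migration.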
- Disable any existing Databricks Spark integrations with Unity Catalog support and any Databricks SQL integrations. Note: Immuta supports running the standard Databricks Spark integration concurrently with the Unity Catalog integration, so standard Databricks Spark integrations do not have to be disabled before migrating to Unity Catalog.
- Ensure that all Databricks clusters that have Immuta installed are stopped and the Immuta configuration is removed from the cluster. Immuta-specific cluster configuration is no longer needed with the Databricks Unity Catalog integration.
- Move all data into Unity Catalog before configuring Immuta with Unity Catalog. Existing data sources will need to be re-created after they are moved to Unity Catalog and the Unity Catalog integration is configured.
- Enable Unity Catalog.
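The data-movement step above can be sketched as generating one copy statement per legacy table. This is a minimal illustration: the catalog and table names are placeholders, and a real migration may prefer Databricks' SYNC or DEEP CLONE commands depending on table format and storage location.

```python
def migration_sql(uc_catalog: str, tables: list[str]) -> list[str]:
    """Build CREATE TABLE ... AS SELECT statements that copy each legacy
    hive_metastore schema.table into the target Unity Catalog catalog."""
    statements = []
    for table in tables:
        schema, name = table.split(".")
        statements.append(
            f"CREATE TABLE {uc_catalog}.{schema}.{name} "
            f"AS SELECT * FROM hive_metastore.{schema}.{name}"
        )
    return statements

# Example: copy two legacy tables into a Unity Catalog catalog named "main"
for stmt in migration_sql("main", ["sales.orders", "finance.invoices"]):
    print(stmt)
```

After the copies exist in Unity Catalog and the integration is configured, the new three-level tables are what you re-register as Immuta data sources.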