Release Notes 2021
- Bulk Approve Subscription Requests: Data Owners can approve all pending access requests at once.
- Databricks Runtime 9.1 LTS Support.
- User Impersonation in Databricks: Databricks users can impersonate Immuta users.
- Multiple Immuta instances are supported in a single Databricks workspace: This change adds a new field in the Databricks Integration UI: a Unique ID that ties the set of cluster policies to their instance of Immuta. This feature makes it easier for users to configure the integration and avoid cluster policy conflicts.
- Cluster policy option added for sparklyr.
- Support for Notebook-Scoped Libraries on Machine Learning Clusters: Users on Databricks Runtimes 8 and above can manage notebook-scoped libraries.
- GCM TLS ciphers: Enabled by default in the Databricks init script.
- TLS verification can be disabled in the Databricks init script when necessary, such as when JAR files for the init script are hosted where a self-signed or internal TLS CA is used.
- Data source creation performance improvements.
- Redshift: Support for Redshift is now generally available.
- Permanently Delete Users: User data can be deleted and permanently removed from Immuta, which aligns with the GDPR requirement.
- Spark Direct File Reads: Users can manage Immuta policies against direct file reads in Spark.
- Apply Immuta Attributes to Groups from External IAMs: User Admins can apply attributes in Immuta to groups from external IAMs.
- User profile sync performance improvements.
- Views with `WHERE` clauses that included a string containing the SQL comment characters `--` caused Immuta data source failures.
- Aliases in view create statements were case-sensitive.
- `mlflow.spark.log_model` was blocked by the Immuta Security Manager and caused other errors.
- Databricks and Redshift integrations: Attributes with two or more single quotes were not handled correctly.
- Snowflake row access policy performance improvements.
- Requesting access to a schema project with a large number of data sources (approximately ten thousand) caused 502 errors.
- When creating data sources after an Alation catalog was configured, tags were not automatically added to the data sources.
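The `WHERE`-clause fix above concerns the SQL comment characters `--` appearing inside a string literal. As a minimal sketch of that failure mode (the helper functions below are illustrative, not Immuta code): a naive comment stripper that cuts at the first `--` corrupts any view definition whose string data happens to contain those characters, while a string-aware scan does not.

```python
def strip_comment_naive(sql):
    """Buggy: treats the first -- anywhere as a comment start."""
    return sql.split("--", 1)[0]

def strip_comment_string_aware(sql):
    """Only treat -- as a comment when outside single-quoted strings."""
    in_string = False
    i = 0
    while i < len(sql):
        if sql[i] == "'":
            in_string = not in_string
        elif not in_string and sql[i:i + 2] == "--":
            return sql[:i]
        i += 1
    return sql

view_sql = "SELECT * FROM t WHERE note = 'a--b'"
print(strip_comment_naive(view_sql))         # truncated mid-literal
print(strip_comment_string_aware(view_sql))  # full statement preserved
```

A real SQL tokenizer must also handle escaped quotes and block comments; the sketch only shows why `--` inside a string is not a comment.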
- Change Data Feed: Immuta supports Change Data Feed (CDF), which shows the row-level changes between versions of a Delta table. The changes displayed include row data and metadata indicating whether the row was inserted, deleted, or updated.
- Databricks Runtime 8.4 Support.
- Databricks Runtime 9.1: This runtime is only supported on Python/SQL clusters.
- User Impersonation: User impersonation allows users to natively query data in Snowflake, Redshift, and Synapse as another Immuta user.
- Snowflake as a Catalog: When enabled, Immuta automatically registers an external Snowflake catalog using the provided hostname and the generated Immuta system account. Any Snowflake sources registered from that host automatically have their relevant tags ingested into Immuta.
- Snowflake Integration: Immuta manages and applies Snowflake row access and column masking policies on individual tables directly in Snowflake instead of creating views for Immuta data sources.
- Snowflake Audit: Users can view audit records on the Immuta Audit page for queries run natively in Snowflake.
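The Change Data Feed entry above describes per-row metadata indicating whether a row was inserted, deleted, or updated. A plain-Python sketch of consuming that metadata follows; the `_change_type` values are the ones Delta Lake documents, while `summarize_changes` and the sample rows are hypothetical (in practice CDF rows come back as a Spark DataFrame, e.g. via `spark.read.format("delta").option("readChangeFeed", "true")`).

```python
# Change types emitted in the Delta Lake CDF `_change_type` column.
CHANGE_TYPES = {"insert", "delete", "update_preimage", "update_postimage"}

def summarize_changes(rows):
    """Count CDF rows by change type; each row is a dict with _change_type."""
    counts = {}
    for row in rows:
        ct = row["_change_type"]
        if ct not in CHANGE_TYPES:
            raise ValueError(f"unexpected change type: {ct}")
        counts[ct] = counts.get(ct, 0) + 1
    return counts

# Illustrative rows: an insert, one update (pre- and post-image), a delete.
rows = [
    {"id": 1, "_change_type": "insert"},
    {"id": 2, "_change_type": "update_preimage"},
    {"id": 2, "_change_type": "update_postimage"},
    {"id": 3, "_change_type": "delete"},
]
print(summarize_changes(rows))
```

Updates appear as a pre-image/post-image pair, which is why the one update above contributes two rows.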
- Security Manager error on AWS metadata service.
- Schema detection failure occurred in Snowflake instances with tens of thousands of tables.
- Case mismatch between Databricks and SCIM users.
- Error occurred ("You may not access raw data directly") when querying Delta tables in Databricks.
- Advanced Subscription policies AND'ed together could prevent auto-subscription policies from applying correctly.
- Snowflake Governance Features Integration: If users created a database with non-default collation, and then they created a table with an explicit collation that didn't match the default collation, applying native policies failed.
- LDAP Sync:
  - Each web worker (instead of a single web worker) kicked off LDAP Sync.
  - Users who belonged to more than one group with the same name in Immuta could not log in.
- Native Snowflake: Schema monitoring, Global or Local policy changes, and data source creation timed out or failed to make updates until web pods were restarted.
- Power BI: The client failed to list out databases on a Databricks cluster because of a query timeout.
- Creating a view in a scratch path database from a Snowflake data source resulted in an error: `Error in SQL statement: NoSuchElementException: key not found: <masked column>`.