

Copyright © 2014-2025 Immuta Inc. All rights reserved.


2022


December 2022

December 20, 2022

Bug fixes

  • If users registered tables from the same schema as Immuta data sources, users could break data sources they didn't own if they deleted or changed the schema project connection.

  • The Databricks Unity Catalog integration configuration on the App Settings page asked for an "Instance Role ARN" instead of the "Instance Profile ARN."

  • Users were unable to add data sources from the Hive Metastore in the Databricks Spark integration with Unity Catalog.

December 8, 2022

Features and changes

  • Databricks Spark Integration with Unity Catalog Support: Enable Unity Catalog support on Immuta clusters to use the Metastore across your Databricks workspaces and enforce Immuta policies on your data. This integration provides a migration pathway for you to add your tables in Unity Catalog while using Immuta policies. Consequently, when additional Unity Catalog features are available, you will be ready to use them. Databricks SQL policies will continue to be enforced through a view-based method, and interactive cluster policies through the Immuta plugin method.

  • Databricks Runtime 11.2 support.

  • Write Fewer, Simpler ABAC Policies: Enhanced Subscription Policy Variables empower users to write fewer, simpler ABAC (Users with Specific Groups/Attributes) policies. Previously, policy writers had to specify groups in separate policies to grant access. With Enhanced Subscription Policy Variables, Immuta's policy engine compares users' groups with data source or column tags in a single policy to determine if there is a match. Users who have a group that matches a tag on a data source or column will be subscribed to that data source.

  • Immuta supports registering data sources that exceed 1600 columns. However, sensitive data discovery and health checks will not run on those data sources.

  • The maximum length for the Snowflake role prefix when using Snowflake table grants is 50 characters.

  • Users cannot enable or disable impersonation when editing a previously configured integration.
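
The matching rule behind Enhanced Subscription Policy Variables described above can be illustrated with a short sketch. This is a simplification for intuition, not Immuta's actual policy engine: a user is subscribed when any of their groups matches a tag on the data source or one of its columns.

```python
def matches(user_groups, data_source_tags, column_tags):
    """Return True when any of the user's groups matches a tag on the
    data source or on one of its columns. Case-insensitive here; the
    real engine's comparison rules may differ."""
    groups = {g.lower() for g in user_groups}
    tags = {t.lower() for t in data_source_tags}
    for tags_on_column in column_tags:
        tags |= {t.lower() for t in tags_on_column}
    return bool(groups & tags)

# A user in the "hr" group matches a data source tagged "HR".
print(matches(["hr", "analysts"], ["HR"], [["PII"]]))        # True
print(matches(["finance"], ["HR"], [["PII"], ["Address"]]))  # False
```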

Bug fixes

  • Alternative owners of data sources were not included in the subscription audit records if the data source was created using the Immuta V2 API.

  • Snowflake Table Grants: If a user who was added to a Snowflake data source through a group Subscription Policy was removed from that data source, the user could still see the columns (without any data) of the table when querying it in Snowflake.

  • When users edited a Snowflake integration configuration and changed the authentication type to Snowflake External OAuth, the configuration was still saved as Username and Password for the authentication type.

  • Vulnerability: CVE-2022-39299

Known bugs

  • Editing a schema project to a database that already exists fails.

UI and workflow removals

The following UI elements and workflows have been removed. Reach out to your Immuta representative if you need one of these elements re-enabled.

  • Data source Metrics tab.

  • Data source Queries tab.

  • Creating data sources with a SQL statement.

  • Selecting specific columns to hide when creating a data source in the UI or V2 API.

November 2022

November 22, 2022

Enhancement

  • Tag enhancements (public preview): The tag enhancements feature improves the user experience by updating various tag-related components of the UI.

Bug fixes

  • Azure Synapse Analytics: If a user was granted access to about 1300 data sources, access to those tables was delayed.

  • Deleting an integration on the App Settings page and saving the configuration caused the Immuta UI to crash.

Known bugs

  • Editing a schema project to a database that already exists fails.

November 17, 2022

Enhancements

  • Collibra integration performance improvements.

  • Immuta's Collibra integration recognizes the implicit relationship between the Database View in Collibra and Immuta data source columns so that tags are properly applied to those columns in Immuta.

  • The Immuta V1 API /dataSource endpoint returns the remote table name so that users can get the schema and table name of a data source in one API call.

Bug fixes

  • The data source Relationships tab only displayed up to 10 associated projects.

  • If creating the Immuta database failed in the Snowflake without Snowflake Governance Controls or Databricks SQL integration, the error returned was incorrect.

  • Removed historical schema monitoring metrics that contained database connection strings.

  • Subqueries that referenced a table that didn't exist never resolved.

  • Policies:

    • Disabling a Global conditional masking policy on a data source could sometimes disable all policies or none of the policies on the data source.

    • If users submitted a Global Policy payload to the API that was missing the subscriptionType from the actions, the Global Policies page broke when trying to display Subscription Policies.

    • Global Subscription Policies that contained the @hasTagAsAttribute variable caused errors and degraded performance.

    • Snowflake with Snowflake Governance Features: Changing a column's masking policy type resulted in errors until users manually synced the policy in Immuta.

  • Redshift:

    • Users were unable to query tables that had a policy with a Limit usage to purpose(s) <ANY PURPOSE> applied to them.

    • There were error-handling inconsistencies between the Immuta UI and the database logs.

  • Vulnerabilities:

    • CVE-2022-3517

    • CVE-2022-3602

    • CVE-2022-37616

    • CVE-2022-39353

Known bugs

  • Editing a schema project to a database that already exists fails.

October 2022

October 19, 2022

Bug fixes

  • Deleting a tag hierarchy deleted any tags with a matching name prefix. For example, deleting the tag department would also delete the tag department_marketing.

  • The Refresh External Tags button appeared on the Tag page even if no external catalogs were configured.

  • Users couldn't change the schema detection owners for schema projects.

  • Collibra: If multiple values were assigned to an attribute in Collibra, they were added as a single tag in Immuta. For example, if an attribute list called Color contained values Blue, Green, and Yellow, and Blue and Green were selected in Collibra, Immuta displayed the data tag as Color.Blue,Green. Instead, Immuta should have created two tags: Color.Blue and Color.Green.

  • Webhooks that were listening to setUserAuthorizations were not triggered.

  • Deleting a Data Policy did not enable the Save Policy button.

  • With Approve to Promote enabled, adding a comment to a policy did not enable the Save Policy button.
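
The tag-hierarchy deletion bug above is a classic prefix-matching pitfall. Below is a hedged sketch of the wrong and the corrected behavior (illustrative only, not Immuta's code), assuming a dot separates hierarchy levels:

```python
def tags_to_delete_buggy(all_tags, hierarchy):
    # Buggy: a bare startswith() also matches "department_marketing"
    # when deleting the "department" hierarchy.
    return [t for t in all_tags if t.startswith(hierarchy)]

def tags_to_delete_fixed(all_tags, hierarchy):
    # Fixed: match the tag itself or a true child such as "department.sales".
    return [t for t in all_tags
            if t == hierarchy or t.startswith(hierarchy + ".")]

tags = ["department", "department.sales", "department_marketing"]
print(tags_to_delete_buggy(tags, "department"))  # wrongly includes department_marketing
print(tags_to_delete_fixed(tags, "department"))  # ['department', 'department.sales']
```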
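The Collibra fix above amounts to expanding a multi-value attribute into one tag per value instead of joining the values into a single tag. A minimal sketch of the corrected mapping (illustrative, not the actual connector code):

```python
def attribute_to_tags(attribute_name, values):
    """Map one Collibra attribute with N selected values to N tags,
    e.g. Color + [Blue, Green] -> ['Color.Blue', 'Color.Green']."""
    return [f"{attribute_name}.{value}" for value in values]

print(attribute_to_tags("Color", ["Blue", "Green"]))
# ['Color.Blue', 'Color.Green']
```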

Known bugs

  • Editing a schema project to a database that already exists fails.

September 2022

September 27, 2022

Features and enhancements

  • Use the latest Databricks Runtime with Immuta. Databricks Runtime 11.0 is now supported in Immuta.

Public preview

  • Ensure that policies are adequately reviewed and approved before they are eligible for production environments. Instead of creating policies directly in production, Approve to Promote allows policy authors to create, assess, and revise policies in a policy-authoring environment. Then, the policy must be approved by a configured number of users before it is promoted to the production environment and enforced on data sources.

Deprecations and breaking changes

  • The undocumented deletedHandlerSubscribers attribute, which indicated that a subscription policy had changed, was removed from the data source notifications webhook payload. If you were depending on that attribute in your customized webhooks, that code will no longer work.

Bug fixes

  • IAMs:

    • Microsoft Entra ID: When SCIM was enabled for Microsoft Entra ID, sometimes user attributes were removed from users in Immuta when they should not have been.

  • Policies:

    • Global Subscription Policies that were applied “When selected by data owners” could not be deleted when using Approve to Promote.

    • If a Global Subscription Policy was disabled for a data source, staging that Global Policy on the policies page caused the Subscription policy to change on the data source.

    • Local Policies using @columnTagged() were not properly applied to data in Databricks when the column was tagged.

  • Projects:

    • Project owners could not edit projects with approved purposes and data sources.

    • The baseline percent null values could not be adjusted for k-anonymized columns on the Expert Determination tab in projects.

  • Snowflake:

    • Instances that used the Snowflake integration without Snowflake Governance features were sometimes automatically migrated to using Snowflake Governance features when Immuta upgraded.

  • Vulnerability:

    • CVE-2022-25647

  • Tags sometimes did not update on data sources if those tags were quickly added or removed, which could cause policies to not be updated.

  • The data source page sometimes took several minutes to load if there were over 100,000 data sources registered in Immuta.

  • If a user was a member of a large number of groups (about 2,000), the UI search was sometimes slow.

  • When searching for data sources on an instance with over 30,000 data sources and tables with complex struct columns, the search could take several minutes to return or freeze the Immuta tenant.

  • An Adobe Font requirement caused timeout issues in the Immuta UI.

Known bugs

  • Editing a schema project to a database that already exists fails.

August 2022

August 17, 2022

Enhancements

  • Application Admins can enable policy adjustments separately from HIPAA Expert Determination on the App Settings page.

Bug fixes

  • Snowflake Integration:

    • Schema detection caused non-date columns to be incorrectly tagged "New" for data sources that were added in bulk.

    • Migrating from a Snowflake Using Snowflake Governance Controls integration to a Snowflake Without Using Snowflake Governance Controls integration failed.

    • Enabling a Snowflake Using Snowflake Governance Controls integration using the automatic setup method failed.

  • Sensitive Data Discovery did not automatically run when users bulk created data sources.

  • If Immuta was unable to communicate with an external IAM provider because of a connection failure, groups were removed from Immuta, even if the IAM was still active.

  • When creating 100,000 tables, the data source creation job sometimes expired.

  • User Admins could not delete attributes assigned to an Immuta Accounts user.

  • After configuring SAML and OpenID IAMs, users could not initially log in.

  • In Databricks Runtime 10.4, SHOW PARTITIONS commands on Delta tables failed.

  • Users were unable to edit Global policies that were not on the first page of results.

  • Automatic Subscription policies could cause out of memory issues if they added about 300 users to a data source.

Known bugs

  • Editing a schema project to a database that already exists fails.

  • Project owners are unable to edit projects with approved purposes and data sources.

August 4, 2022

Breaking change

  • IAM Signing Certificate Required for SAML. You are required to upload your IAM signing certificate to Immuta to add or edit SAML-based IAMs. If you are already using Immuta's SAML integration, provide a signing certificate to existing configured IAMs for them to continue working.

Bug fixes

  • In the Snowflake Governance features integration, unmasked data was sometimes visible for a fraction of a second while data policies were being applied.

  • Databricks user impersonation did not work if backticks enclosed the username.

  • Clicking the Sync User Metadata button in the Immuta UI could queue an infinite number of profile refresh background jobs.

  • The enriched audit logs created an error if data policies did not exist on a data source.

  • The attributes type for users was inconsistent with policy attributes type in the audit logs.

  • Advanced Subscription Policies: If an advanced Subscription policy that did not contain special variables was created, customers with over 100,000 users could experience out-of-memory (OOM) issues.

  • Okta/SCIM: When adding users to Okta to sync with Immuta, TypeError: attributeValues is not iterable appeared in the logs.

  • LDAP users with parentheses in their common name caused authentication to fail when group sync was enabled.
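
The LDAP bug above is typical of building search filters without escaping special characters: RFC 4515 requires (, ), *, backslash, and NUL in filter values to be escaped as a backslash plus two hex digits. Here is a sketch of the escaping a group-sync query would need (not Immuta's actual code):

```python
def escape_ldap_filter_value(value):
    # RFC 4515: escape \, *, (, ) and NUL as backslash + two hex digits.
    # Escape the backslash first so already-escaped output is not re-escaped.
    for ch, code in [("\\", "5c"), ("*", "2a"), ("(", "28"), (")", "29"), ("\0", "00")]:
        value = value.replace(ch, "\\" + code)
    return value

cn = "Jane Doe (Contractor)"
print(f"(cn={escape_ldap_filter_value(cn)})")
# (cn=Jane Doe \28Contractor\29)
```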

Known bugs

  • Editing a schema project to a database that already exists fails.

  • Project owners are unable to edit projects with approved purposes and data sources.

  • Databricks Runtime 10.4: SHOW PARTITIONS on Delta tables fails.

July 2022

July 7, 2022

New features

  • Access background jobs with enhanced visibility. This feature surfaces information about background jobs so that you can debug issues and identify their cause.

  • Use the latest Databricks Runtime with Immuta. Databricks Runtime 10.4 LTS is now supported in Immuta.

  • Prove compliance with Databricks audit trails that include denial events. When Immuta users query Databricks tables that have been registered in Immuta, the query audit logs will include denial events and the policies associated with the decision. Such audit trails are required by some information security teams to prove compliance with secure data access.

Private preview

  • Snowflake:

Removed features

Removed features are no longer available in the product.

Feature | Deprecation notice | End of life (EOL) | Alternative solutions
Advanced rules DSL for data policies | 2022.1.0 | 2022.2 |
Differential privacy | 2022.1.0 | 2022.2 |
The custom / external policy handler | 2022.1.0 | 2022.2 |
Policy export/import | 2021.4 | 2022.2 |

Bug fixes

  • Creating a policy using the Advanced DSL Data policy builder in the view-based Snowflake integration sometimes caused errors.

  • When a user's entitlements changed, Immuta did not properly notify the integration to GRANT or REVOKE access to tables in the remote system.

  • Entering a single quotation mark in the search bar sometimes caused an error.

  • After an Alation or Collibra catalog was configured, new data sources were not linked to the catalogs automatically.

  • Logging in to Immuta after being logged out due to inactivity sometimes displayed a blank page.

  • Local policies sometimes appeared on the Global policies page.

  • The activity panel covered the policy builder when long SQL statements were entered for conditional policies.

  • Clicking the Policies icon in the left sidebar while editing a Subscription policy displayed an empty Data Policy Builder instead of the Policies page.

  • When configuring an External REST Catalog, users could not click the Test Connection button if the No Authentication option was selected.

  • The Immuta login page did not display in some older versions of the Edge browser.

Known bugs

  • LDAP users with parentheses in their common name cause authentication to fail when group sync is enabled.

  • Databricks Runtime 10.4: Running SHOW PARTITIONS on a Delta table fails.

June 2022

June 9, 2022

Enhancements

  • The visual styles in the application have been updated.

  • Users can add multiple alternative owners to data sources at once.

Feature removal

  • Policy import/export

Bug fixes

  • When attributes were added to groups that affected an Automatic Subscription policy, users were added or removed from the data source(s) appropriately, but these changes were not audited.

  • Deleting the last values or all values from user or group attributes caused errors when processing Automatic Subscription policies.

  • Local policies that were created or updated sometimes displayed on the Global Policy page.

  • Writing a Global ABAC Subscription policy using @username in the Advanced DSL builder did not subscribe the user to the data source.

  • Changing a Global Allow Individually Selected Users Subscription policy back to a Global ABAC policy that used special functions caused an error: Error: "actions[0].exceptions.conditions[0]" does not match any of the allowed types.

  • If a policy was added through the Immuta CLI, editing that policy in the Immuta UI sometimes caused an error.

  • After being added to a data source through an Automatic Subscription policy, users sometimes encountered an error when making unmasking requests.

  • Creating a Global conditional masking policy in the Advanced DSL builder that used @iam or @username caused an error when the policy was applied to a data source.

  • Redshift:

    • Regex masking policies that used metacharacters with backslashes (\d, \s, etc.) did not mask columns.

    • Users' metadata was not updated in the integration if their usernames contained apostrophes.
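
The regex metacharacter bug above is an instance of a common escaping pitfall: a pattern such as \d must reach the regex engine with its backslash intact, or the engine matches the literal character instead. As an illustration of the failure class only (a hypothetical helper, not Immuta's implementation), consider:

```python
import re

def mask_column(value: str, pattern: str, mask: str = "X") -> str:
    """Hypothetical masking helper: replace every regex match with the mask character."""
    return re.sub(pattern, mask, value)

# With the backslash intact, \d masks every digit.
print(mask_column("ssn 123-45-6789", r"\d"))  # ssn XXX-XX-XXXX
# If a translation layer strips the backslash, only literal "d" characters
# are matched, so the digits pass through unmasked.
print(mask_column("ssn 123-45-6789", "d"))    # ssn 123-45-6789
```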

May 2022

May 11, 2022

Enhancement

  • Improved performance of auto-subscription policies.

Bug fixes

  • If an SSL CA cert was used when setting up an LDAP IAM, clicking the Test LDAP Sync button resulted in an error.

  • Tags were removed from data sources if they were applied after data source creation and before the external catalog health check (which is triggered by navigating to the data source). However, tags applied to a data source during creation remained on the data source.

  • Group permissions were not considered when users attempted to create data sources or Global Policies. For example, if a user was a member of a group that had the GOVERNANCE permission assigned to it, that user was not inheriting the GOVERNANCE permission. Consequently, when that user tried to apply a Global Policy to a data source, they received an error. However, if a user had the GOVERNANCE permissions applied to their account directly, they were able to create a Global Policy. This same behavior occurred with the CREATE_DATA_SOURCE permission.

  • Creating an Immuta data source from a Databricks view that contained an implicit column alias failed.

Known bugs

  • Editing a schema project to point to a database that already exists fails.

  • The App Settings page freezes when a user selects Migrate Users from BIM when configuring an external IAM.

  • An auto-subscription policy that adds more than 64,000 users to a data source can cause errors in the logs and impact subscription reports.

  • Integration jobs can end up in an expired state, even if they are successfully processed, under certain load conditions.

March 2022

March 31, 2022

    • Before this release, if someone was manually added by an owner or Governor and didn’t meet the ABAC policy requirements, they could query the table, but no rows would come back because they didn’t have the groups or attributes specified in the policy. Now, manually adding users overrides the ABAC policy. Therefore, any users who had been manually subscribed to a data source but could not see any data will see data after this upgrade. You can prevent this behavior by either switching the Subscription policy to auto-subscribe (which removes users who don't meet the Subscription policy) or adding a Data Policy that redacts rows for users who do not have the groups or attributes specified in the Subscription policy.

    • If users have existing Global Subscription policies that were combined, those will not change on the data source after the upgrade. However, the **Require Manual Subscription** option will automatically be enabled on those existing policies, so users who meet the conditions of the policy will not be automatically subscribed.

  • Sensitive data discovery global template and default sample size UI (public preview): Users can adjust these configurations on the App Settings page. If users already had a Global Template or default sample size configured in the Advanced Configuration section, these configurations will migrate to the new Sensitive Data Discovery section on the App Settings page when they upgrade their Immuta tenant.

  • Support for PrivateLink with Snowflake on AWS: Contact Immuta to enable this feature.

Bug fixes

  • "Active" tags on merged Share Responsibility Global policies did not show the active number of data sources they were enforced on.

  • The configuration section for project workspaces could break if a handler was not enabled.

  • Databricks:

    • If a table in Databricks had been created from an AVRO schema file, queries against the table on Immuta-enabled clusters only returned results for partition columns. Additionally, trying to create tables from an AVRO schema file on Immuta-enabled clusters returned an error: "Unable to infer the schema."

    • Fixed Databricks init script error handling when artifacts weren't downloading correctly.

    • Errors occurred when using mlflow.spark.log_model on non-Machine Learning clusters.

  • Because Immuta's built-in identity manager (BIM) is not enabled in SaaS, the App Settings page froze when a user selected Migrate Users when configuring an external IAM.

  • Snowflake:

    • When enabling a Snowflake integration with an external catalog, if the host contained multiple periods in the account name, the Snowflake plugin was considered invalid.

    • When users tried to edit the Excepted Roles/Users List for the integration, the configuration saved correctly. However, when the App Settings page refreshed, the Excepted Roles/Users List was empty and the allow list in Snowflake was not updated.

    • When a user's group was deleted in an external IAM, that update appeared in Immuta but was not syncing properly in Snowflake.

    • When using Snowflake controls with Excepted Roles specified, if users tried to do an outer join using a column that had a masking policy applied, it resulted in an error: SQL compilation error: Invalid expression [] in VALUES clause.

Known bugs

  • Editing a schema project to point to a database that already exists fails.

  • Project owners are unable to edit projects with approved purposes and data sources.

March 14, 2022

  • Disable query engine: Application Admins can disable the Query Engine on the App Settings page.

  • New Immuta UI: Although the most significant change is the adjustment to the visual styles in the application, other UI changes include an expandable left navigation and dark mode support.

  • Support for AWS-Sydney.

Databricks Spark integration note

Bug fixes

  • Databricks:

    • Views: Although users could create views in Databricks from Immuta data sources they were subscribed to, when users tried to select from those views, they received an error saying that the Immuta data source the view was created against did not exist or that they did not have access to it.

    • External Delta Tables: Querying an external Delta table that had been added as an Immuta data source as a non-admin resulted in a NoSuchDataSourceException error if the table path had a space in it.

    • Sensitive Data Discovery failed for Databricks data sources when initiated in the UI if the cluster was configured to use ephemeral overrides.

    • The integration did not work with the Databricks Runtime 9.1 maintenance update.

  • Ephemeral Overrides:

    • The UI was not displaying the checkbox to apply the ephemeral override to multiple data sources.

    • Ephemeral overrides were not being used when calculating column detection.

  • Out of memory errors occurred when several actions or jobs ran simultaneously, such as:

    • Bulk disabling data sources

    • Bulk creating data sources

    • Column detection

    • Schema detection

  • Sensitive Data Discovery: Users could not configure sampleSize to override the default number of records sampled from a data source.

  • Snowflake Governance Features Integration: When a data source existed in Immuta but not in Snowflake and a user tried to refresh the policies, Immuta continuously retried to update the policies and then failed with the following error: Execution error in store procedure UPSERT_POLICIES: SQL compilation error: Table does not exist or not authorized.

  • Vulnerabilities

    • CVE-2022-0355: Information Exposure in simple-get

    • CVE-2022-0235: Information Exposure in node-fetch

    • CVE-2022-0155: Information Exposure in follow-redirects

    • CVE-2021-3807: Regular Expression Denial of Service (ReDoS) in ansi-regex

    • CWE-451: User Interface (UI) Misrepresentation of Critical Information in swagger-ui-dist

Known bugs

  • Databricks: Errors occur when using mlflow.spark.log_model on non-Machine Learning clusters.

  • Editing a schema project to point to a database that already exists fails.

  • Because Immuta's built-in identity manager (BIM) is not enabled in SaaS, the App Settings page freezes when a user selects Migrate Users when configuring an external IAM.

Connect Snowflake data to Immuta without providing your account credentials. Immuta supports Snowflake External OAuth as a non-password authentication mechanism when configuring the Snowflake integration or creating Snowflake data sources.

Let Immuta manage privileges on your Snowflake tables instead of manually granting table access to users. With Snowflake table grants enabled, Snowflake Administrators no longer have to manually grant table access to users; instead, Immuta manages privileges on Snowflake tables and views according to the subscription policies on the corresponding Immuta data sources.

Share policy-protected data in Snowflake with other Snowflake accounts using Snowflake Data Sharing. This integration allows you to author policies in Immuta and protect data shared with other Snowflake accounts in real time. For example, if a pharmaceutical company needed to share trial results outside their Snowflake account and needed to protect PHI, they could share that data outside their account and still have Immuta policies enforced.

Instead of using differential privacy, combine k-anonymization and randomized response policies on your data. Immuta requires that you opt in to use k-anonymization. To enable k-anonymization for your account, contact your Immuta representative.
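
Immuta enforces k-anonymization for you; purely to illustrate the underlying guarantee (a hypothetical helper, not Immuta's API), a dataset is k-anonymous when every combination of quasi-identifier values appears in at least k rows:

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    """Check that every combination of quasi-identifier values
    appears in at least k rows (the k-anonymity guarantee)."""
    groups = Counter(tuple(row[c] for c in quasi_identifiers) for row in rows)
    return all(count >= k for count in groups.values())

# Hypothetical records: zip + age_band are the quasi-identifiers.
records = [
    {"zip": "02139", "age_band": "30-39", "diagnosis": "flu"},
    {"zip": "02139", "age_band": "30-39", "diagnosis": "cold"},
    {"zip": "02142", "age_band": "40-49", "diagnosis": "flu"},
]
# ("02142", "40-49") appears only once, so the data is not 2-anonymous.
print(is_k_anonymous(records, ["zip", "age_band"], 2))  # False
```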

As an alternative to the policy export/import feature, use the Immuta CLI to clone your Global policies.

Users can now specify column tags instead of just data source tags with the @hasTagAsAttribute Enhanced Subscription Policy variable.

Enhanced Subscription Policy Variables (Public Preview): This feature empowers users to write fewer, simpler ABAC (Users with Specific Groups/Attributes) policies. Previously, policy writers had to specify user attribute keys in separate policies to grant access. With Enhanced Subscription Policy Variables, Immuta's policy engine compares user attributes with data source properties (database, host, schema, table, or tag) in a single policy to determine if there is a match. When attribute keys match the property specified, users will be able to subscribe to the data source(s).
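
As a rough sketch of the matching described above (hypothetical names and simplified logic, not Immuta's policy engine), a user's values under an attribute key are compared against a data source property such as the schema:

```python
def user_can_subscribe(user_attrs: dict, source_props: dict,
                       attr_key: str, prop: str) -> bool:
    """Hypothetical sketch: a user may subscribe when one of their
    values under attr_key matches the data source property named by
    prop (database, host, schema, table, or tag)."""
    user_values = {v.lower() for v in user_attrs.get(attr_key, [])}
    source_values = source_props.get(prop, [])
    if isinstance(source_values, str):
        source_values = [source_values]
    return any(v.lower() in user_values for v in source_values)

# A user whose "AuthorizedSchemas" attribute contains "hr" can subscribe
# to any data source in the hr schema under a single policy.
user = {"AuthorizedSchemas": ["hr", "finance"]}
source = {"database": "analytics", "schema": "hr", "tag": ["PII"]}
print(user_can_subscribe(user, source, "AuthorizedSchemas", "schema"))  # True
```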

Snowflake Table Grants: With this feature enabled with the Snowflake with Governance Controls integration, Snowflake Administrators no longer have to manually grant table access to users; instead, Immuta manages privileges on Snowflake tables and views according to the subscription policies on the corresponding Immuta data sources.
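
Conceptually, subscription-driven grant management diffs the Immuta subscriber list against current table grantees and issues the necessary statements. A hypothetical sketch (role naming and exact statements are assumptions, not Immuta's actual implementation):

```python
def sync_table_grants(subscribers, current_grantees, table, role_prefix="IMMUTA_"):
    """Hypothetical sketch: compute the Snowflake GRANT/REVOKE statements
    needed so table access matches the Immuta data source's subscribers."""
    to_grant = set(subscribers) - set(current_grantees)
    to_revoke = set(current_grantees) - set(subscribers)
    statements = [f"GRANT SELECT ON TABLE {table} TO ROLE {role_prefix}{user}"
                  for user in sorted(to_grant)]
    statements += [f"REVOKE SELECT ON TABLE {table} FROM ROLE {role_prefix}{user}"
                   for user in sorted(to_revoke)]
    return statements

# alice gains access, carol loses it, bob is unchanged.
for stmt in sync_table_grants(["alice", "bob"], ["bob", "carol"],
                              "analytics.hr.salaries"):
    print(stmt)
```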

Edit configuration for integrations: Users can edit the configuration for Snowflake, Databricks SQL, Azure Synapse, and Redshift without disabling the integration.

Manual approvals in ABAC global subscription policies: Governors can now add an approval workflow as an alternative method of access to data sources if a user does not meet the conditions of the Users with Specific Groups/Attributes (ABAC) Global Subscription Policy.

Starburst integration: Through this integration, Immuta applies policies directly in Starburst so that users can keep their existing tools and workflows (querying, reporting, etc.) and have per-user policies dynamically applied at query time.

Redshift integration performance issues related to Python UDF concurrency capabilities.

Databricks init script: To use the updated Immuta init script and cluster policies, existing SaaS users must update their Databricks cluster configuration following the Manually Update Your Databricks Cluster guide.
