
Universal Audit Model (UAM)

Public preview

This feature is in public preview.

Immuta’s universal audit model (UAM) provides audit logs with a consistent structure for query, authentication, policy, project, and tag events from your Immuta users and data sources. You can view the information in these UAM audit logs on the Detect dashboards or export the full audit logs to S3 and ADLS for long-term backup and processing with log data processors and tools. This capability fosters convenient integrations with log monitoring services and data pipelines.

For S3, you specify the bucket destination where Immuta will periodically export audit logs; for ADLS, you specify the container destination. If desired, you can configure both export options to send your audit logs to S3 and ADLS simultaneously.
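
As a rough sketch of how exported logs might be consumed downstream, the example below lists and reads export objects from an S3 bucket with boto3. The bucket name, key prefix, and the assumption that each export object is newline-delimited JSON are illustrative placeholders rather than Immuta's documented export layout; check your own export configuration for the real values.

```python
# Minimal sketch: read exported UAM audit logs from the S3 bucket configured as
# an export destination. Bucket name, prefix, and the NDJSON format are assumptions.
import json

import boto3

s3 = boto3.client("s3")

BUCKET = "my-immuta-audit-exports"  # hypothetical bucket name
PREFIX = "immuta/audit/"            # hypothetical key prefix


def read_exported_events(bucket: str, prefix: str):
    """Yield parsed audit events from every export object under the prefix."""
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            # Assumes each export object is newline-delimited JSON (one event per line).
            body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
            for line in body.decode("utf-8").splitlines():
                if line.strip():
                    yield json.loads(line)


if __name__ == "__main__":
    for event in read_exported_events(BUCKET, PREFIX):
        print(event)
```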

The captured events cover user and system actions that affect Immuta or the integrated data platforms, such as creating policies or data sources and running queries.

Immuta audit service

The Immuta audit service is an independent microservice that captures audit events from Immuta and queries run against your Snowflake, Databricks, or Unity Catalog integration.

Immuta stores the export endpoints you provide during configuration, retrieves the audit records pushed to the audit service by your integration, and manages the audit exports based on an export schedule you define. These audit records are also stored to support future reporting and user interface enhancements that will allow you to easily search across the entire body of audit events by keywords and facets.

Universal audit model (UAM) events captured

The following sets of events are captured and can be exported to S3 or ADLS. The export will only contain data access logs and the following configuration events:

  • Attribute events
    • AttributeApplied
    • AttributeRemoved
  • Configuration event: ConfigurationUpdated
  • Data source events
    • DatasourceCreated
    • DatasourceUpdated
    • DatasourceDeleted
    • DatasourceDisabled
    • Data source synced from data access pattern or integration events: DatasourceCatalogSynced
  • Domain events
    • DomainCreated
    • DomainDeleted
    • DomainUpdated
    • DomainDataSourcesUpdated
    • DomainPermissionsUpdated
  • Group events
    • GroupCreated
    • GroupDeleted
    • GroupUpdated
  • License events
    • LicenseCreated
    • LicenseDeleted
  • Permissions events
    • PermissionRemoved
    • PermissionApplied
  • Policy events
    • DatasourceGlobalPolicyRemoved
    • DatasourceGlobalPolicyApplied
    • DatasourcePolicyCertified
    • DatasourcePolicyDecertified
    • DatasourcePolicyCertificationExpired
    • DatasourceGlobalPolicyDisabled
    • DatasourceGlobalPolicyConflictResolved
    • GlobalPolicyCreated
    • GlobalPolicyUpdated
    • GlobalPolicyDeleted
    • GlobalPolicyReviewRequested
    • GlobalPolicyChangeRequested
    • GlobalPolicyApprovalRescinded
    • GlobalPolicyApproved
    • GlobalPolicyPromoted
    • LocalPolicyCreated
    • LocalPolicyUpdated
    • PolicyAdjustmentCreated
    • PolicyAdjustmentDeleted
  • Project events
    • ProjectCreated
    • ProjectUpdated
    • ProjectDisabled
    • ProjectDeleted
  • Purpose events
    • PurposeUpserted
    • PurposeUpdated
    • PurposeDeleted
    • DatasourceAppliedToProject
    • DatasourceRemovedFromProject
    • ProjectPurposesAcknowledged
    • ProjectPurposeApproved
    • ProjectPurposeDenied
  • Query engine events: QueryEngineQuery
  • Snowflake query events: SnowflakeQuery
  • Databricks query events: DatabricksQuery
  • Databricks Unity Catalog query events: DatabricksQuery (see footnote 1)
  • Starburst query events: TrinoQuery
  • Sensitive data discovery events:
    • SDDClassifierDeleted
    • SDDClassifierCreatedRegex
    • SDDClassifierCreatedColumnNameRegex
    • SDDClassifierCreatedDictionary
    • SDDClassifierUpdatedRegex
    • SDDClassifierUpdatedColumnNameRegex
    • SDDClassifierUpdatedDictionary
    • SDDTemplateCreated
    • SDDTemplateUpdated
    • SDDTemplateDeleted
    • SDDTemplateApplied
    • SDDTemplateCloned
  • SQL credentials events
    • SqlCreate
    • SqlDelete
    • SqlResetPassword
  • Subscription events
    • SubscriptionCreated
    • SubscriptionUpdated
    • SubscriptionDeleted
    • SubscriptionRequestDenied
    • SubscriptionRequestApproved
  • Tag events
    • TagApplied
    • TagCreated
    • TagDeleted
    • TagRemoved
    • TagUpdated
  • User authentication events
    • UserAuthenticated
    • UserLogout
  • User events
    • UserUpdated
    • UserCreated
    • UserDeleted
    • UserCloned
    • UserPasswordUpdated
    • UserOneTimeTokenCreated
  • Webhook events
    • WebhookCreated
    • WebhookDeleted

For example audit events captured in UAM, see the Query audit logs pages.
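
Once exports land in your bucket or container, a quick way to get oriented is to tally records by event type against the list above. The sketch below assumes each exported record stores its event type in an action field; that field name is an assumption, so inspect a real exported record to confirm where the event name lives.

```python
# Minimal sketch: count exported UAM records by event type (e.g. how many
# TagApplied or SnowflakeQuery events a given export contains).
from collections import Counter


def count_events_by_type(events):
    """Tally events by type, e.g. 'SnowflakeQuery' or 'GlobalPolicyCreated'."""
    # "action" as the event-type field is an assumption; verify against a real record.
    return Counter(event.get("action", "Unknown") for event in events)


# Example usage with read_exported_events() from the earlier S3 sketch:
# counts = count_events_by_type(read_exported_events(BUCKET, PREFIX))
# for event_type, total in counts.most_common(10):
#     print(f"{event_type}: {total}")
```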

Audit export workflow

  1. When you configure the audit export for S3 or ADLS using the CLI, the audit service stores the export endpoint you provide.
  2. After the integration endpoint has been configured, the export scheduler will run on the schedule you defined in your configuration.
  3. When users query data and the event is audited, the audit service receives events from your Snowflake, Databricks Spark, Databricks Unity Catalog, or Starburst (Trino) integration.
  4. Immuta exports the audit logs to your configured S3 bucket or ADLS container.
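
For the ADLS side of step 4, the sketch below reads export files from a container with the azure-storage-file-datalake and azure-identity client libraries. The account URL, container name, directory, and newline-delimited JSON format are assumptions mirroring the S3 sketch earlier, not documented Immuta defaults.

```python
# Minimal sketch: read exported UAM audit logs from the ADLS container configured
# as an export destination. Account, container, directory, and NDJSON format are assumptions.
import json

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://myaccount.dfs.core.windows.net"  # hypothetical storage account
CONTAINER = "immuta-audit-exports"                      # hypothetical container name
PREFIX = "immuta/audit"                                 # hypothetical directory


def read_adls_exports():
    """Yield parsed audit events from every export file under the directory."""
    service = DataLakeServiceClient(account_url=ACCOUNT_URL,
                                    credential=DefaultAzureCredential())
    filesystem = service.get_file_system_client(CONTAINER)
    for path in filesystem.get_paths(path=PREFIX):
        if path.is_directory:
            continue
        # Assumes each export file is newline-delimited JSON (one event per line).
        data = filesystem.get_file_client(path.name).download_file().readall()
        for line in data.decode("utf-8").splitlines():
            if line.strip():
                yield json.loads(line)
```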

Supported audit details by integration

The table below outlines the audit support provided by each of Immuta's integrations with UAM and the information included in the audit logs.

|                          | Snowflake | Databricks Spark | Databricks Unity Catalog | Starburst (Trino) | Redshift | Azure Synapse Analytics |
|--------------------------|-----------|------------------|--------------------------|-------------------|----------|-------------------------|
| Table and user coverage  | Registered data sources and users | Registered data sources and users | All tables and users | Registered data sources and users | ❌ | ❌ |
| Object queried           | ✅ | ✅ | 1 | ✅ | ❌ | ❌ |
| Columns returned         | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ |
| Query text               | ✅ | ✅ | 2 | ✅ | ❌ | ❌ |
| Unauthorized information | 3 | ✅ | 4 | ❌ | ❌ | ❌ |
| Policy details           | ❌ | ✅ | ❌ | ❌ | ❌ | ❌ |
| User's entitlements      | ❌ | ✅ | ❌ | ❌ | ❌ | ❌ |
| Column tags              | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ |
| Table tags               | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ |

Legend:

  • ✅ This is available and the information is included in audit logs.
  • ❌ This is not available and the information is not included in audit logs.
  • A number indicates limited availability; see the corresponding footnote for more details.

Limitations

  • The audit service does not capture system-level logging and debugging information, such as 404 errors.

Snowflake query audit limitations

  • Snowflake query audit events from a query using cached results will show 0 for the rowsProduced field.

Unity Catalog query audit limitations

  • Enrichment of audit logs with Immuta entitlements information is not supported. While you will see these entitlements in the Databricks Spark audit logs, the following will not be in the Databricks Unity Catalog audit logs:
    • Immuta policies information
    • User attributes
    • Groups
  • Immuta determines unauthorized events based on error messages within Unity Catalog records. When the error messages contain expected language, unauthorized events will be available in the Databricks Unity Catalog audit logs; in other cases, it is not possible to determine the cause of an error.
  • Unauthorized logs for cluster queries are not marked as unauthorized; they will always appear as failures.
  • Data source information will be provided when available:
    • For some queries, Databricks Unity Catalog does not report the target data source for the data access operation. In these cases the activity is audited, yet the audit record in Immuta will not include the target data source information.
    • The target data source information is not available for unauthorized queries and events.
  • Auditing the columns affected by a query is not currently supported.
  • The cluster for the Unity Catalog integration must always be running for Immuta to audit activity and present audit logs.

Starburst (Trino) limitations

  • Audit for the columns accessed in the query is not currently supported.
  • Audit for unauthorized access is not currently supported.
  • Audit including the user’s entitlements is not currently supported.

  1. Databricks Spark and Databricks Unity Catalog audit logs will have the same event type of DatabricksQuery. They can be distinguished from each other by the service value: Databricks Spark queries will be plugin, Databricks Unity Catalog queries on a cluster will be cluster, and Databricks Unity Catalog queries on a SQL warehouse will be warehouse. A short sketch of this distinction follows these notes.

  2. Audit will return the commandText which often shows the query made. 

  3. Unauthorized information is only available when the integration has table replacements enabled. 

  4. Unauthorized queries will be audited when available.
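
As noted in footnote 1, a DatabricksQuery event's origin can be told apart by its service value. The sketch below shows that mapping; the action and service field names on an exported record are assumptions to verify against a real UAM record.

```python
# Minimal sketch: map the service value on a DatabricksQuery event to the
# integration it came from, per footnote 1 above.
DATABRICKS_SERVICES = {
    "plugin": "Databricks Spark",
    "cluster": "Databricks Unity Catalog (cluster)",
    "warehouse": "Databricks Unity Catalog (SQL warehouse)",
}


def classify_databricks_query(event: dict):
    """Return a human-readable origin for a DatabricksQuery event, else None."""
    # "action" and "service" as field names are assumptions; confirm on a real record.
    if event.get("action") != "DatabricksQuery":
        return None
    return DATABRICKS_SERVICES.get(event.get("service"), "Unknown Databricks service")
```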