Databricks Spark Query Audit Logs
For every audited Spark job, Immuta captures the executed Spark plan, the tables involved, the tables' underlying paths, and the code or query that triggered the plan. Immuta audits the activity of Immuta users on Immuta data sources.
Requirements
Databricks users registered as Immuta users: The users' Databricks usernames must be mapped to their Immuta accounts. Without this mapping, Immuta cannot recognize them as Immuta users and will not collect audit events for their data access activity.
Store audit logs
By default, Databricks audit logs expire after 90 days. To retain audit logs long-term, export the universal audit model (UAM) logs to S3 or ADLS Gen2 and store them outside of Immuta.
Audit schema
Each audit message from the Immuta platform will be a one-line JSON object containing the properties listed below.
| Property | Description | Example |
|---|---|---|
| action | The action associated with the audit log. | QUERY |
| actor.type | The Immuta user type of the actor who made the query. | USER_ACTOR |
| actor.id | The Immuta user ID of the actor who made the query. | taylor@databricks.com |
| actor.name | The Immuta name of the user who made the query. | Taylor |
| actor.identityProvider | The IAM the user is registered in. bim is the built-in Immuta IAM. | bim |
| sessionId | The session ID of the user who performed the action. | 01ee14d9-cab3-1ef6-9cc4-f0c315a53788 |
| actionStatus | Indicates whether or not the user was granted access to the data. Possible values are UNAUTHORIZED, FAILURE, or SUCCESS. | SUCCESS |
| actionStatusReason | When a user's query is denied, this property explains why. When a query is successful, this value is null. | - |
| eventTimestamp | The time the query occurred. | 2023-06-27T11:03:59.000Z |
| id | The unique ID of the audit record. | 9f542dfd-5099-4362-a72d-8377306db3b8 |
| customerId | The unique Databricks customer ID. | 9f542dfd-5099-4362-a72d-8377306db3b8 |
| targetType | The type of targets affected by the query; this value will always be DATASOURCE. | DATASOURCE |
| targets | A list of the targets affected by the query. | See the example below |
| auditPayload.type | The type of audit record; this value will always be QueryAuditPayload. | QueryAuditPayload |
| auditPayload.queryId | The unique ID of the query. If the query joins multiple tables, each table appears as a separate log, but all share the same query ID. | 01ee14da-517a-1670-afce-0c3e0fdcf7d4 |
| auditPayload.query | The query that was run in the integration. Immuta truncates the query text to the first 2048 characters. | See the example below |
| auditPayload.startTime | The date and time the query started in UTC. | 2023-06-27T11:03:59.000Z |
| auditPayload.duration | Not available for Databricks Spark audit events. | null |
| auditPayload.accessControls | Includes the user's groups, attributes, and current project at the time of the query. | - |
| auditPayload.policySet | Provides policy details. | - |
| auditPayload.technologyContext.type | The technology the query was made in. | DatabricksContext |
| auditPayload.technologyContext.clusterId | The Databricks cluster ID. | null |
| auditPayload.technologyContext.clusterName | The Databricks cluster name. | databricks-cluster-name |
| auditPayload.technologyContext.workspaceId | The Databricks workspace ID. | 8765531160949612 |
| auditPayload.technologyContext.pathUris | The Databricks URI scheme for the storage type. | ["dbfs:/user/hive/warehouse/your_database.db/movies"] |
| auditPayload.technologyContext.metastoreTables | The Databricks metastore tables. | ["your_database.movies"] |
| auditPayload.technologyContext.queryLanguage | The programming language used: SQL, Python, Scala, or R. Audited JDBC queries are marked as JDBC here. | python |
| auditPayload.technologyContext.queryText | Contains either the full notebook cell (when the query is the result of a notebook) or the full SQL query (when it is a query from a JDBC connection). | See the example below |
| auditPayload.technologyContext.immutaPluginVersion | The Immuta plugin version for the Databricks Spark integration. | 2022.3.0-spark-3.1.1 |
| receivedTimestamp | The timestamp of when the audit event was received and stored by Immuta. | 2023-06-27T15:18:22.314Z |
Example queryText
Below is an example of the queryText, which contains the full notebook cell (since the query was the result of a notebook). If the query had been from a JDBC connection, the queryText would contain the full SQL query. This notebook cell had multiple audit records associated with it.
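For illustration only, a comparable notebook cell might look like the sketch below. The database, table, and column names other than your_database.movies (which is taken from the example values above) are hypothetical. Because the cell reads and joins two Immuta-protected tables, each table produces its own audit record, and all of those records share the same queryId:

```python
# Illustrative Databricks notebook cell (hypothetical tables and columns).
# Each table read below is audited as a separate record; all records from
# this cell share the same queryId, and the full cell text becomes queryText.
movies = spark.table("your_database.movies")
ratings = spark.table("your_database.ratings")

top_rated = (
    movies.join(ratings, "movie_id")
          .groupBy("title")
          .avg("rating")
)
top_rated.show()
```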
Example audit record
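The following is only a sketch assembled from the example values in the schema table above, with the nesting inferred from the dotted property names. The targets, accessControls, and policySet values are left empty here (the entitlement and policy fields they carry are described in the sections below), and the query text is abbreviated:

```json
{
  "action": "QUERY",
  "actionStatus": "SUCCESS",
  "actionStatusReason": null,
  "actor": {
    "type": "USER_ACTOR",
    "id": "taylor@databricks.com",
    "name": "Taylor",
    "identityProvider": "bim"
  },
  "sessionId": "01ee14d9-cab3-1ef6-9cc4-f0c315a53788",
  "eventTimestamp": "2023-06-27T11:03:59.000Z",
  "id": "9f542dfd-5099-4362-a72d-8377306db3b8",
  "customerId": "9f542dfd-5099-4362-a72d-8377306db3b8",
  "targetType": "DATASOURCE",
  "targets": [],
  "auditPayload": {
    "type": "QueryAuditPayload",
    "queryId": "01ee14da-517a-1670-afce-0c3e0fdcf7d4",
    "query": "...",
    "startTime": "2023-06-27T11:03:59.000Z",
    "duration": null,
    "accessControls": {},
    "policySet": [],
    "technologyContext": {
      "type": "DatabricksContext",
      "clusterId": null,
      "clusterName": "databricks-cluster-name",
      "workspaceId": "8765531160949612",
      "pathUris": ["dbfs:/user/hive/warehouse/your_database.db/movies"],
      "metastoreTables": ["your_database.movies"],
      "queryLanguage": "python",
      "queryText": "...",
      "immutaPluginVersion": "2022.3.0-spark-3.1.1"
    }
  },
  "receivedTimestamp": "2023-06-27T15:18:22.314Z"
}
```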
Enriched Databricks audit logs
Beyond raw audit events (such as "John Doe queried Table X in Databricks"), the Databricks audit records include the policy information enforced during query execution, even if a query was denied.
Queries will be denied if at least one of the conditions below is true (a denied record is sketched after this list):

- The user does not meet the policy conditions.
- The user is not subscribed to the data source.
- The data source is not in the user's current project.
- The data source is in the user's current project, but the user is not subscribed to it.
- The data source is not registered in Immuta.
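For example, a denied query surfaces in the record with an UNAUTHORIZED status and a populated reason; the reason text below is illustrative only, not a guaranteed message:

```json
"actionStatus": "UNAUTHORIZED",
"actionStatusReason": "User is not subscribed to the data source"
```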
User entitlements
The user's entitlements represent the state at the time of the query. This includes the following fields; an illustrative sketch follows the table.

| Field | Description |
|---|---|
| project | The user's current project. |
| attributes | The user's attributes. |
| groups | The user's groups. |
| impersonatedUsers | The user that the current user is impersonating. |
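The exact nesting of these fields inside auditPayload.accessControls is not spelled out in this table, so the following is only an assumed shape with hypothetical values:

```json
"accessControls": {
  "entitlements": {
    "project": "analytics_project",
    "groups": ["analysts"],
    "attributes": {},
    "impersonatedUsers": []
  }
}
```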
Policy information
The policySet includes the following fields; an illustrative entry follows the table.

| Field | Description | Possible values |
|---|---|---|
| subscriptionPolicyType | The type of subscription policy. | MANUAL, ADVANCED, or ENTITLEMENTS |
| type | Indicates whether the policy is a subscription or data policy. Query-denied records will always have a subscription policy type. | SUBSCRIPTION or DATA |
| ruleAppliedForUser | True if the policy was applied for the user. If false, the user was an exception to the policy. | true or false |
| rationale | The policy rationale written by the policy creator. | - |
| global | True if the policy was a global policy. If false, the policy is local. | true or false |
| mergedPolicies | Shows the policy information for each of the merged global subscription policies, if available. | - |
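Putting these fields together, a single policySet entry might look like the following sketch; whether policySet holds one object or a list of entries is not specified here, and the values shown are illustrative:

```json
{
  "subscriptionPolicyType": "MANUAL",
  "type": "SUBSCRIPTION",
  "ruleAppliedForUser": true,
  "rationale": null,
  "global": true,
  "mergedPolicies": []
}
```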