Universal Audit Model (UAM)
This feature is in public preview.
Immuta’s universal audit model (UAM) provides audit logs with a consistent structure for query, authentication, policy, project, and tag events from your Immuta users and data sources. You can view this information on the Detect dashboards, or export the full audit logs to S3 and process them with your own log monitoring services, data pipelines, and tooling.
You can specify an S3 bucket destination to which Immuta will periodically export audit logs. Only events relevant to user and system actions that affect Immuta or the integrated data platforms are captured, such as creating policies or data sources and running queries.
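As a rough illustration of consuming such an export downstream, the sketch below parses one exported file as newline-delimited JSON and tallies events by type. This is a minimal sketch under stated assumptions: the file layout (one JSON object per line) and the `eventType` field name are illustrative guesses, not the documented UAM schema; check an actual exported record before relying on them.

```python
import json
from collections import Counter

def tally_events(ndjson_text: str) -> Counter:
    """Count audit events by type in one exported file.

    Assumes newline-delimited JSON with a hypothetical
    'eventType' field; adjust to the actual UAM schema.
    """
    counts = Counter()
    for line in ndjson_text.splitlines():
        line = line.strip()
        if not line:
            continue
        event = json.loads(line)
        counts[event.get("eventType", "unknown")] += 1
    return counts

# Example with made-up events:
sample = "\n".join([
    '{"eventType": "SnowflakeQuery", "actor": "alice"}',
    '{"eventType": "DatabricksQuery", "actor": "bob"}',
    '{"eventType": "SnowflakeQuery", "actor": "carol"}',
])
print(tally_events(sample))  # Counter({'SnowflakeQuery': 2, 'DatabricksQuery': 1})
```

The same loop works unchanged whether the file is read from local disk or streamed from the S3 object your export writes to.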
Requirements
- Snowflake Governance Features integration, Databricks integration, or Databricks Unity Catalog integration
- Immuta CLI
Immuta audit service
The Immuta audit service is an independent microservice that captures audit events from Immuta and queries run against your Snowflake, Databricks, or Unity Catalog integration.
Immuta stores the export endpoints you provide during configuration, retrieves the audit records your integration pushes to the audit service, and manages the audit exports on the export schedule you define. These audit records are also stored to support future reporting and user interface enhancements that will let you easily search across the entire body of audit events by keyword and facet.
Universal audit model (UAM) events captured
The following sets of events are captured and can be exported to S3. The export will only contain data access logs and the following configuration events:
- Attribute events
- Configuration events
- Data source events
- Data source synced from data access pattern or integration events
- Domain events
- Group events
- License events
- Permissions events
- Policy events
- Project events
- Purpose events
- Snowflake query events
- Databricks query events
- Databricks Unity Catalog query events
- Sensitive data discovery events
- SQL credentials events
- Subscription events
- Tag events
- User authentication events
- User events
- Webhook events
For examples of audit events captured in UAM, see the Query audit logs pages.
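Pipelines that ingest the full export often only need a coarse split across the categories above. A minimal sketch, assuming a hypothetical `eventType` field and made-up type names (the real field name and values may differ; verify against an actual exported record):

```python
# Hypothetical mapping from an assumed 'eventType' field to a few of
# the UAM categories listed above; the real field name and values
# may differ -- check an actual exported record.
CATEGORY_BY_TYPE = {
    "SnowflakeQuery": "query",
    "DatabricksQuery": "query",
    "PolicyCreated": "policy",   # made-up example value
    "TagApplied": "tag",         # made-up example value
}

def category_of(event: dict) -> str:
    """Return a coarse category for a UAM-style event, or 'other'."""
    return CATEGORY_BY_TYPE.get(event.get("eventType", ""), "other")

print(category_of({"eventType": "SnowflakeQuery"}))  # query
print(category_of({"eventType": "UserLogin"}))       # other
```

A lookup table like this keeps the routing logic in one place when new event types are added to the model.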
Audit export workflow
1. When you configure the audit S3 export using the CLI, the audit service stores the export endpoint you provide.
2. After the export endpoint has been configured, the export scheduler runs on the schedule you defined in your configuration.
3. When users query data and the event is audited, the audit service receives events from your Snowflake, Databricks, or Unity Catalog integration.
4. Immuta exports the audit logs to your configured S3 bucket.

- The export will only contain data access logs; other logs, such as those related to policy configuration, data source and project creation, and tags, are not included.
- The audit service does not capture system-level logging and debugging information, such as 404 errors.
Snowflake query audit limitations
- Snowflake query audit events from a query using cached results will show
Unity Catalog query audit limitations
- Enrichment of audit logs with Immuta entitlements information is not supported. While you will see these entitlements in the Databricks Spark audit logs, the following will not be in the Databricks Unity Catalog audit logs:
  - Immuta policies information
  - User attributes
- Immuta determines unauthorized events based on error messages within Unity Catalog records. When the error messages contain the expected language, unauthorized events will be available in Databricks Unity Catalog audit logs; in other cases, it is not possible to determine the cause of an error.
- Unauthorized logs for cluster queries are not marked as unauthorized; they will always be marked as failures.
- Data source information will be provided when available:
  - For some queries, Databricks Unity Catalog does not report the target data source for the data access operation. In these cases the activity is still audited, but the audit record in Immuta will not include the target data source information.
  - The target data source information is not available for unauthorized queries and events.
- Reporting the column affected by a query is not currently supported.
- The cluster for the Unity Catalog integration must always be running for Immuta to audit activity and present audit logs.
Databricks Spark and Databricks Unity Catalog audit logs will have the same event type of `DatabricksQuery`. They can be distinguished from each other by the `service` value. Databricks Spark queries will be `plugin`. Databricks Unity Catalog queries on a cluster will be `cluster`, and Databricks Unity Catalog queries on a SQL warehouse will be
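Consumers that ingest both kinds of Databricks events can branch on that service value. A minimal sketch, assuming the event is a dict with `eventType` and `service` fields (the field names are assumptions, not the documented schema); only the `plugin` and `cluster` values named above are handled, and anything else falls through generically:

```python
def databricks_query_source(event: dict) -> str:
    """Classify a DatabricksQuery audit event by its 'service' value.

    Field names ('eventType', 'service') are assumptions for
    illustration; verify them against a real UAM record.
    """
    if event.get("eventType") != "DatabricksQuery":
        raise ValueError("not a DatabricksQuery event")
    service = event.get("service")
    if service == "plugin":
        return "Databricks Spark"
    if service == "cluster":
        return "Unity Catalog (cluster)"
    # SQL warehouse queries carry a different service value;
    # treat anything unrecognized generically.
    return f"unrecognized service: {service}"

print(databricks_query_source(
    {"eventType": "DatabricksQuery", "service": "plugin"}
))  # Databricks Spark
```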