Databricks Unity Catalog
Connect to Delta Lake tables in Databricks Unity Catalog through Dremio. Unity Catalog acts as a centralized metastore for managing tables across your Databricks environment, and Dremio can query these tables when they use Delta Lake's Universal Format (UniForm).
UniForm Iceberg Required
To query Delta tables through Dremio, you must enable UniForm on your tables in Databricks. UniForm provides an Iceberg metadata layer that allows Iceberg-compatible clients like Dremio to read Delta tables.
Requirements:
- UniForm must be enabled on your Delta tables. See Enable UniForm Iceberg in the Databricks documentation.
- Your tables must use Delta protocol minReaderVersion 2 or higher. See Features by protocol version.
UniForm tables have certain limitations. Review Limitations in the Databricks documentation before proceeding.
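For orientation, the following is a minimal sketch of enabling UniForm on an existing Delta table from a Databricks notebook. The table name is a placeholder, and the exact set of table properties should be confirmed against the Databricks pages linked above:

```python
# Minimal sketch (assumptions): run in a Databricks notebook where `spark`
# is the ambient SparkSession; `main.sales.orders` is a placeholder table.
spark.sql("""
    ALTER TABLE main.sales.orders SET TBLPROPERTIES (
        'delta.columnMapping.mode'             = 'name',
        'delta.enableIcebergCompatV2'          = 'true',
        'delta.universalFormat.enabledFormats' = 'iceberg'
    )
""")

# Optional sanity check: the UniForm properties set above should appear in
# the `properties` column of the table's detail output.
spark.sql("DESCRIBE DETAIL main.sales.orders").show(truncate=False)
```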
Configure Databricks Unity Catalog as a Source
To add a Unity Catalog source:
- On the Datasets page, to the right of Sources in the left panel, click the Add Source icon.
- In the Add Data Source dialog, under Lakehouse Catalogs, select Unity Catalog.
General
To configure the source connection:
- For Name, enter a unique name for the source. Choose a name that is easy for users to reference, because it cannot be changed after the source is created. The name cannot exceed 255 characters and may contain only the following characters: 0-9, A-Z, a-z, underscore (_), and hyphen (-).
- For Unity Catalog, enter the catalog name.
- For Endpoint URI, specify the catalog service URI. For more information on how to find your Unity Catalog URI, see Read using the Unity Catalog Iceberg catalog endpoint. (A sketch for sanity-checking the endpoint and token outside Dremio appears after this list.)
- Under Authentication Type, select one of the following:
  - Databricks Personal Access Token: Provide a Databricks personal access token (PAT). Depending on your deployment, see the following:
    - AWS-based Unity deployment: See Databricks personal access tokens for service principals to create a PAT.
    - Azure Databricks-based Unity deployment: See Azure Databricks personal access tokens for service principals to create a PAT.
  - Microsoft Entra ID: Provide the following information from your application registered in Microsoft Entra ID:
    - Application ID: The application ID of the registered application.
    - OAuth 2.0 Token Endpoint: The OAuth 2.0 token endpoint URL (for example, https://login.microsoftonline.com/{tenantId}/oauth2/v2.0/token).
    - Application Secret: The application secret of the registered application.
- By default, Use vended credentials is turned on. This allows Dremio to connect to the catalog and receive temporary credentials for the underlying storage location. When this option is enabled, you do not need to add storage authentication in Advanced Options.
- (Optional) For Allowed Namespaces, add each namespace and check the option if you want to include its entire subtree. Tables are organized into namespaces, which can be at the top level or nested within one another. Namespace names cannot contain periods or spaces.
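Before saving, it can help to confirm that the Endpoint URI and token work outside Dremio. Below is a minimal sketch using the open-source PyIceberg client against the Unity Catalog Iceberg REST endpoint; the workspace URL, endpoint path, and token are placeholders, and the exact URI comes from the Databricks page linked in the Endpoint URI step:

```python
# Minimal sketch (assumptions): pyiceberg is installed (pip install pyiceberg)
# and the endpoint URI and PAT below are placeholders for your own values.
from pyiceberg.catalog import load_catalog

catalog = load_catalog(
    "unity",
    **{
        "type": "rest",
        # Same value you would enter for Endpoint URI in the Dremio dialog.
        "uri": "https://<workspace-url>/api/2.1/unity-catalog/iceberg",
        "token": "<databricks-pat>",
    },
)

# Listing namespaces exercises both the endpoint and the token; an error here
# usually means a wrong URI or an expired/underprivileged token.
print(catalog.list_namespaces())
```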
Advanced Options
Storage Authentication
If you disabled Use vended credentials in the General settings, you must provide storage authentication manually.
Dremio supports Amazon S3, Azure Storage, and Google Cloud Storage (GCS) as object storage services. For acceptable storage authentication configurations, see the following catalog properties and credentials for each service option.
S3 Access Key

| Property or credential | Value | Description |
|---|---|---|
| fs.s3a.aws.credentials.provider (property) | org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider | Required field for a Unity Catalog source. |
| fs.s3a.access.key (credential) | <your_access_key> | AWS access key ID used by the S3A file system. Omit for IAM role-based or provider-based authentication. |
| fs.s3a.secret.key (credential) | <your_secret_key> | AWS secret key used by the S3A file system. Omit for IAM role-based or provider-based authentication. |

S3 Assumed Role

| Property or credential | Value | Description |
|---|---|---|
| fs.s3a.assumed.role.arn (property) | arn:aws:iam::*******:role/OrganizationAccountAccessRole | AWS ARN for the role to be assumed. |
| fs.s3a.aws.credentials.provider (property) | com.dremio.plugins.s3.store.STSCredentialProviderV1 | Required field for a Unity Catalog source. |
| fs.s3a.assumed.role.credentials.provider (property) | org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider | Use only if the credential provider is AssumedRoleCredentialProvider; lists credential providers to authenticate with the STS endpoint and retrieve short-lived role credentials. |

(A sketch for validating the role ARN outside Dremio appears after these tables.)

Azure Shared Key

| Property or credential | Value | Description |
|---|---|---|
| fs.azure.account.key (credential) | <your_account_key> | Storage account key. |

GCS KeyFile

| Property or credential | Value | Description |
|---|---|---|
| fs.AbstractFileSystem.gs.impl (property) | com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS | Required field for a Unity Catalog source. |
| fs.gs.auth.service.account.enable (property) | true | Required field for a Unity Catalog source. |
| fs.gs.impl (property) | com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem | Required field for a Unity Catalog source. |
| fs.gs.bucket (property) | <your_bucket> | Bucket where your data is stored for Unity Catalog in GCS. |
| fs.gs.project.id (property) | <your_project_ID> | Project ID from GCS. |
| fs.gs.auth.service.account.email (property) | <your_client_email> | Client email from GCS. |
| fs.gs.auth.service.account.private.key.id (property) | <your_private_key_id> | Private key ID from GCS. |
| dremio.gcs.use_keyfile (property) | true | Required field for a Unity Catalog source. |
| fs.gs.auth.service.account.private.key (credential) | <your_private_key> | Private key from GCS. |
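For the S3 Assumed Role option, one way to validate the role ARN and key pair before entering them above is to call AWS STS directly. A minimal sketch with boto3; every identifier shown is a placeholder:

```python
# Minimal sketch (assumptions): boto3 is installed (pip install boto3) and
# the key pair and role ARN are placeholders for your own values.
import boto3

sts = boto3.client(
    "sts",
    aws_access_key_id="<your_access_key>",
    aws_secret_access_key="<your_secret_key>",
)

# Mirrors what Dremio's STS credential provider does: exchange long-lived
# keys for short-lived role credentials.
resp = sts.assume_role(
    RoleArn="arn:aws:iam::<account_id>:role/OrganizationAccountAccessRole",
    RoleSessionName="dremio-unity-catalog-check",
)
print("Temporary credentials expire at:", resp["Credentials"]["Expiration"])
```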
Cache Options
- Enable local caching when possible: Selected by default. Along with asynchronous access for cloud caching, local caching can improve query performance.
- Max percent of total available cache space to use when possible: Specifies the disk quota, as a percentage, that a source can use on any single executor node only when local caching is enabled. The default is 100 percent of the total disk space available on the mount point provided for caching. You can either manually enter a percentage in the value field or use the arrows to the far right to adjust the percentage.
Reflection Refresh
You can set the policy that controls how often reflections are scheduled to be refreshed automatically, as well as the time limit after which reflections expire and are removed.
Refresh Settings
- Never refresh: Prevent automatic reflection refresh. By default, reflections refresh automatically.
- Refresh every: Set the refresh interval in hours, days, or weeks. Ignored if Never refresh is selected.
- Set refresh schedule: Specify a daily or weekly refresh schedule.
Expire Settings
- Never expire: Prevent reflections from expiring. By default, reflections expire after the configured time limit.
- Expire after: The time limit after which reflections are removed from Dremio, specified in hours, days, or weeks. Ignored if Never expire is selected.
Metadata
Use the following settings to specify how Dremio handles metadata for this source.
Dataset Handling
- Remove dataset definitions if underlying data is unavailable: Selected by default. Dremio removes a dataset's definition when the underlying data is no longer available.
Metadata Refresh
These are the optional Metadata Refresh parameters:
Dataset Discovery
The refresh interval for fetching top-level source object names such as databases and tables.
- Fetch every: Set how often Dremio fetches object names, in minutes, hours, days, or weeks. The default is 1 hour.
Dataset Details
The metadata that Dremio needs for query planning, such as information needed for fields, types, shards, statistics, and locality.
- Fetch mode: Dremio fetches details only for datasets that have previously been queried in the source. By default, this is set to Only Queried Datasets.
- Fetch every: Set how often Dremio fetches dataset details, in minutes, hours, days, or weeks. The default is 1 hour.
- Expire after: Set the time limit after which dataset details expire, in minutes, hours, days, or weeks. The default is 3 hours.
Privileges
You can grant privileges to specific users or roles. See Privileges for additional information.
To grant access to a user or role:
- For Privileges, enter the user name or role name that you want to grant access to and click the Add to Privileges button. The added user or role is displayed in the USERS/ROLES table.
- For the users or roles in the USERS/ROLES table, toggle the checkmark for each privilege you want to grant on the Dremio source that is being created.
- Click Save after setting the configuration.
Update a Databricks Unity Catalog Source
To update a Unity Catalog source:
- On the Datasets page, under Lakehouse Catalogs in the panel on the left, find the name of the source you want to edit.
- Right-click the source name and select Settings from the list of actions. Alternatively, click the source name, and then click the Settings icon at the top-right corner of the page.
- In the Source Settings dialog, edit the settings you wish to update. Dremio does not support updating the source name. For information about the settings options, see Configure Databricks Unity Catalog as a Source.
- Click Save.
Delete a Databricks Unity Catalog Source
To delete a Unity Catalog source:
- On the Datasets page, click Sources > Lakehouse Catalogs in the panel on the left.
- In the list of data sources, hover over the name of the source you want to remove and right-click.
- From the list of actions, click Delete.
- In the Delete Source dialog, click Delete to confirm that you want to remove the source.
If the source is in a bad state (for example, Dremio cannot authenticate to the source or the source is otherwise unavailable), only users who belong to the ADMIN role can delete the source.