Snowflake Open Catalog
Dremio supports Snowflake Open Catalog as an Iceberg catalog source. With this source connector, you can connect to and read from internal and external Snowflake Open Catalogs and write to external Snowflake Open Catalogs.
Prerequisites
You will need the catalog's service URI, Client ID, and Client Secret from the Snowflake setup. For a walkthrough of the Snowflake setup, refer to Query a table in Snowflake Open Catalog using a third-party engine.
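If you want to sanity-check these values before configuring the source in Dremio, you can request an OAuth token directly from the catalog. The following is a minimal sketch using Python's requests library; the token endpoint path (/v1/oauth/tokens) and the PRINCIPAL_ROLE:ALL scope follow the Iceberg REST catalog convention that Snowflake Open Catalog uses, and all angle-bracket values are placeholders to replace with your own.

```python
# Minimal sketch: verify the Client ID and Client Secret against the
# catalog's OAuth token endpoint. Placeholder values are assumptions.
import requests

CATALOG_URI = "https://<account>.snowflakecomputing.com/polaris/api/catalog"  # your catalog service URI
CLIENT_ID = "<your_client_id>"
CLIENT_SECRET = "<your_client_secret>"

resp = requests.post(
    f"{CATALOG_URI}/v1/oauth/tokens",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "PRINCIPAL_ROLE:ALL",
    },
)
resp.raise_for_status()
# A 200 response with an access_token confirms the credentials are valid.
print("Token acquired:", resp.json()["access_token"][:16], "...")
```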
Configure Snowflake Open Catalog as a Source
To add a Snowflake Open Catalog source:
- On the Datasets page, to the right of Sources in the left panel, click the add source icon.
- In the Add Data Source dialog, under Lakehouse Catalogs, select Snowflake Open Catalog.
General
To configure the source connection:
- For Name, enter a name for the source. The name you enter must be unique in the organization. Consider a name that is easy for users to reference. This name cannot be edited once the source is created. The name cannot exceed 255 characters and must contain only the following characters: 0-9, A-Z, a-z, underscore (_), or hyphen (-).
- Enter the name of the Snowflake Open Catalog.
- For Endpoint URI, specify the catalog service URI.
- In the Authentication section, enter the Client ID and Client Secret that were created when the service connection for the Snowflake Open Catalog was configured.
- By default, Use vended credentials is enabled. This allows Dremio to connect to the catalog and receive temporary credentials to the underlying storage location. If this is enabled, you do not need to add storage authentication in Advanced Options.
- (Optional) For Allowed Namespaces, add each namespace, and select the corresponding option if you want to include a namespace's entire subtree. Tables are organized into namespaces, which can be at the top level or nested within one another. Namespace names cannot contain periods or spaces. (A short connection sketch showing namespaces follows this list.)
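For a concrete picture of how the connection values fit together, and of what namespaces look like, here is a minimal sketch that connects to the same catalog from PyIceberg, which also speaks the Iceberg REST protocol. This is illustrative only and not part of the Dremio setup; the placeholder values and the PRINCIPAL_ROLE:ALL scope are assumptions to adjust for your environment.

```python
from pyiceberg.catalog import load_catalog

# Connect to Snowflake Open Catalog over the Iceberg REST protocol.
# All angle-bracket values are placeholders (assumptions).
catalog = load_catalog(
    "open_catalog",
    **{
        "type": "rest",
        "uri": "https://<account>.snowflakecomputing.com/polaris/api/catalog",  # Endpoint URI
        "credential": "<client_id>:<client_secret>",  # from the service connection
        "scope": "PRINCIPAL_ROLE:ALL",
        "warehouse": "<catalog_name>",  # name of the Open Catalog
    },
)

# Namespaces come back as tuples; nested namespaces have multiple parts,
# e.g. ('analytics',) at the top level and ('analytics', 'sales') nested.
print(catalog.list_namespaces())
```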
Advanced Options
Storage Authentication
If you disabled vended credentials in the General tab, you must provide storage authentication manually.
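In effect, the properties in this section supply by hand what the vended-credentials flow would otherwise return at table-load time. For context, the sketch below shows roughly what that exchange looks like at the REST level; the X-Iceberg-Access-Delegation header comes from the Iceberg REST specification, and the placeholder names are assumptions. Dremio performs the equivalent internally when Use vended credentials is enabled, so you never need to run this yourself.

```python
# Illustration of the vended-credentials flow that Dremio handles internally
# when "Use vended credentials" is enabled. Placeholder values are assumptions.
import requests

CATALOG_URI = "https://<account>.snowflakecomputing.com/polaris/api/catalog"
TOKEN = "<oauth_access_token>"

resp = requests.get(
    f"{CATALOG_URI}/v1/<catalog_name>/namespaces/<namespace>/tables/<table>",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        # Iceberg REST header asking the catalog to return short-lived
        # storage credentials along with the table metadata.
        "X-Iceberg-Access-Delegation": "vended-credentials",
    },
)
resp.raise_for_status()
# Temporary storage credentials, when vended, arrive in the "config" map.
print(resp.json().get("config", {}))
```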
Dremio supports Amazon S3, Azure Storage, and Google Cloud Storage (GCS) as object storage services. For acceptable storage authentication configurations, see the following catalog properties and credentials for each service option.
S3 Access Key

| Name | Type | Value | Description |
| --- | --- | --- | --- |
| fs.s3a.aws.credentials.provider | property | org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider | Required value for a Snowflake Open Catalog source |
| fs.s3a.access.key | credential | <your_access_key> | AWS access key ID used by the S3A file system |
| fs.s3a.secret.key | credential | <your_secret_key> | AWS secret key used by the S3A file system |

S3 Assumed Role

| Name | Type | Value | Description |
| --- | --- | --- | --- |
| fs.s3a.assumed.role.arn | property | arn:aws:iam::*******:role/OrganizationAccountAccessRole | AWS ARN for the role to be assumed |
| fs.s3a.aws.credentials.provider | property | com.dremio.plugins.s3.store.STSCredentialProviderV1 | Required value for a Snowflake Open Catalog source |
| fs.s3a.assumed.role.credentials.provider | property | org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider | Use only if the credential provider is AssumedRoleCredentialProvider; lists credential providers to authenticate with the STS endpoint and retrieve short-lived role credentials |
| fs.s3a.access.key | credential | <your_access_key> | AWS access key ID used by the S3A file system |
| fs.s3a.secret.key | credential | <your_secret_key> | AWS secret key used by the S3A file system |

A standalone sketch of this assumed-role credential flow appears after these tables.

Azure with Entra ID

| Name | Type | Value | Description |
| --- | --- | --- | --- |
| fs.azure.account.auth.type | property | OAuth | Authentication type |
| fs.azure.account.oauth2.client.id | property | <your_client_ID> | Client ID from App Registration within Azure Portal |
| fs.azure.account.oauth2.client.endpoint | property | https://login.microsoftonline.com/<ENTRA_ID>/oauth2/token | Token endpoint containing the Microsoft Entra ID from the Azure Portal |
| fs.azure.account.oauth2.client.secret | credential | <your_client_secret> | Client secret from App Registration within Azure Portal |

Azure Shared Key

| Name | Type | Value | Description |
| --- | --- | --- | --- |
| fs.azure.account.key | credential | <your_account_key> | Storage account key |

GCS Default Credentials

| Name | Type | Value | Description |
| --- | --- | --- | --- |
| dremio.gcs.use_keyfile | property | false | Required value for a Snowflake Open Catalog source |

GCS KeyFile

| Name | Type | Value | Description |
| --- | --- | --- | --- |
| dremio.gcs.use_keyfile | property | true | Required value for a Snowflake Open Catalog source |
| dremio.gcs.clientId | property | <your_client_ID> | Client ID from GCS |
| dremio.gcs.projectId | property | <your_project_ID> | Project ID from GCS |
| dremio.gcs.clientEmail | property | <your_client_email> | Client email from GCS |
| dremio.gcs.privateKeyId | property | <your_private_key_ID> | Private key ID from GCS |
| dremio.gcs.privateKey | credential | <your_private_key> | Private key from GCS |
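To make the S3 Assumed Role option more concrete, the following standalone sketch shows the underlying credential flow: static keys authenticate to AWS STS, which returns short-lived credentials for the role ARN. Dremio's STSCredentialProviderV1 performs the equivalent internally; this boto3 example is illustrative only, and the session name is an arbitrary placeholder.

```python
# Sketch of the assumed-role flow behind the "S3 Assumed Role" option.
# Static keys authenticate to STS; STS returns temporary role credentials.
import boto3

sts = boto3.client(
    "sts",
    aws_access_key_id="<your_access_key>",      # fs.s3a.access.key
    aws_secret_access_key="<your_secret_key>",  # fs.s3a.secret.key
)
creds = sts.assume_role(
    RoleArn="arn:aws:iam::<account_id>:role/OrganizationAccountAccessRole",  # fs.s3a.assumed.role.arn
    RoleSessionName="dremio-open-catalog-demo",  # arbitrary placeholder
)["Credentials"]

# The short-lived key, secret, and session token would be used for S3 access.
print("Temporary key:", creds["AccessKeyId"], "expires:", creds["Expiration"])
```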
Cache Options
- Enable local caching when possible: Selected by default. Along with asynchronous access for cloud caching, local caching can improve query performance.
- Max percent of total available cache space to use when possible: Specifies the disk quota, as a percentage, that a source can use on any single executor node only when local caching is enabled. The default is 100 percent of the total disk space available on the mount point provided for caching. You can either manually enter a percentage in the value field or use the arrows to the far right to adjust the percentage.
Reflection Refresh
You can set the policy that controls how often Reflections are scheduled to be refreshed automatically, as well as the time limit after which Reflections expire and are removed.
Refresh Settings
- Never refresh: Prevent automatic Reflection refresh. By default, Reflections refresh automatically.
- Refresh every: Set the refresh interval in hours, days, or weeks. Ignored if Never refresh is selected.
- Set refresh schedule: Specify a daily or weekly refresh schedule.
Expire Settings
- Never expire: Prevent Reflections from expiring. By default, Reflections expire after the configured time limit.
- Expire after: The time limit after which Reflections are removed from Dremio, specified in hours, days, or weeks. Ignored if Never expire is selected.
Metadata
Specify metadata options with the following settings.
Dataset Handling
- Remove dataset definitions if underlying data is unavailable: Selected by default.
Metadata Refresh
These are the optional Metadata Refresh parameters:
Dataset Discovery
The refresh interval for fetching top-level source object names such as databases and tables.
- Fetch every: You can choose to set the frequency to fetch object names in minutes, hours, days, or weeks. The default frequency to fetch object names is 1 hour.
Dataset Details
The metadata that Dremio needs for query planning, such as information needed for fields, types, shards, statistics, and locality.
- Fetch mode: You can choose to fetch details only for queried datasets, in which case Dremio updates details only for objects in the source that have previously been queried. By default, this is set to Only Queried Datasets.
- Fetch every: You can choose to set the frequency to fetch dataset details in minutes, hours, days, or weeks. The default frequency to fetch dataset details is 1 hour.
- Expire after: You can choose to set the expiry time of dataset details in minutes, hours, days, or weeks. The default expiry time of dataset details is 3 hours.
Privileges
You can grant privileges to specific users or roles. See Privileges for more information.
To grant access to a user or role:
- For Privileges, enter the user name or role name to which you want to grant access and click the Add to Privileges button. The added user or role is displayed in the USERS/ROLES table.
- For the users or roles in the USERS/ROLES table, toggle the checkmark for each privilege you want to grant on the Dremio source that is being created.
- Click Save after setting the configuration.
Update a Snowflake Open Catalog Source
To update a Snowflake Open Catalog source:
- On the Datasets page, under Lakehouse Catalogs in the panel on the left, find the name of the source you want to edit.
- Right-click the source name and select Settings from the list of actions. Alternatively, click the source name and then the Settings icon at the top right corner of the page.
- In the Source Settings dialog, edit the settings you want to update. Dremio does not support updating the source name. For information about the settings options, see Configure Snowflake Open Catalog as a Source.
- Click Save.
Delete a Snowflake Open Catalog Source
To delete a Snowflake Open Catalog source:
- On the Datasets page, click Sources > Lakehouse Catalogs in the panel on the left.
- In the list of data sources, hover over the name of the source you want to remove and right-click.
- From the list of actions, click Delete.
- In the Delete Source dialog, click Delete to confirm that you want to remove the source.
If the source is in a bad state (for example, Dremio cannot authenticate to the source or the source is otherwise unavailable), only users who belong to the ADMIN role can delete the source.