Snowflake
Snowflake is a cloud data warehouse.
Prerequisite
- Ensure that your Dremio cluster is at version 23.1 or later.
Configuring Snowflake as a Source
- On the Datasets page, to the right of Sources in the left panel, click the add (+) icon.
- In the Add Data Source dialog, under Databases, select Snowflake.
General
- In the Name field, specify the name by which you want the Snowflake source to appear in the Databases section. The name cannot include the following special characters: /, :, [, or ].
- Under Connection, follow these steps:
Note: The optional connection parameters are case-sensitive. For example, if the name of a warehouse uses upper case only (e.g., WAREHOUSE1), specify it the same way in the Warehouse field.
- In the Host field, specify the hostname of the Snowflake source in this format: LOCATOR_ID.snowflakecomputing.com.
- (Optional) In the Database field, specify the default database to use.
- (Optional) In the Role field, specify the default access-control role to use.
- (Optional) In the Schema field, specify the default schema to use.
- (Optional) In the Warehouse field, specify the warehouse that will provide resources for executing DML statements and queries.
- Under Authentication, select either Login-password authentication or Key-pair authentication:
- Login-password authentication: In the Username field, specify the Snowflake username. Under Password, choose a method from the dropdown menu for providing the Snowflake password. If you choose Dremio, provide the Snowflake password in plain text in the provided field; Dremio stores the password. You can also choose one of the supported secrets managers to provide the Snowflake password:
- Azure Key Vault: Provide the URI for the Azure Key Vault secret that stores the Snowflake password. The URI format is https://<vault_name>.vault.azure.net/secrets/<secret_name> (for example, https://myvault.vault.azure.net/secrets/mysecret).
Note: To use Azure Key Vault as your application secret store, you must:
- Deploy Dremio on Azure.
- Complete the Requirements for Authenticating with Azure Key Vault.
It is not necessary to restart the Dremio coordinator when you rotate secrets stored in Azure Key Vault. Read Requirements for Secrets Rotation for more information.
- AWS Secrets Manager: Provide the Amazon Resource Name (ARN) for the AWS Secrets Manager secret that holds the Snowflake password. The ARN is available in the AWS web console and through the AWS command line tools.
- HashiCorp Vault: Choose the HashiCorp secrets engine you're using from the dropdown menu and enter the secret reference for the Snowflake password in the correct format in the provided field.
- Key-pair authentication (see Snowflake's key-pair documentation): In the Username field, specify the Snowflake username. Under Private Key and Private key passphrase, choose a method from each dropdown menu for providing the Snowflake private key and private key passphrase, respectively. If you choose Dremio, provide the private key and passphrase in plain text in the provided fields; Dremio stores them. You can also choose one of the supported secrets managers to provide the private key and passphrase:
- Azure Key Vault: Provide the URI for the Azure Key Vault secret that stores the Snowflake private key and private key passphrase. The URI format is https://<vault_name>.vault.azure.net/secrets/<secret_name> (for example, https://myvault.vault.azure.net/secrets/mysecret).
Note: To use Azure Key Vault as your application secret store, you must:
- Deploy Dremio on Azure.
- Complete the Requirements for Authenticating with Azure Key Vault.
It is not necessary to restart the Dremio coordinator when you rotate secrets stored in Azure Key Vault. Read Requirements for Secrets Rotation for more information.
- AWS Secrets Manager: Provide the Amazon Resource Name (ARN) for the AWS Secrets Manager secret that holds the Snowflake private key and private key passphrase. The ARN is available in the AWS web console and through the AWS command line tools.
- HashiCorp Vault: Choose the HashiCorp secrets engine you're using from the dropdown menu and enter the secret reference for the Snowflake private key and private key passphrase in the correct format in the provided field.
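Before saving the source, you can sanity-check the host and secret-URI formats described above. The following is a minimal sketch; `validate_source` is a hypothetical helper for illustration, not part of Dremio:

```python
import re
from typing import List, Optional

# Patterns derived from the formats documented above (illustrative, not exhaustive).
HOST_RE = re.compile(r"^[A-Za-z0-9_-]+(\.[A-Za-z0-9_-]+)*\.snowflakecomputing\.com$")
VAULT_URI_RE = re.compile(r"^https://[A-Za-z0-9-]+\.vault\.azure\.net/secrets/[A-Za-z0-9-]+$")
NAME_FORBIDDEN = set("/:[]")  # characters the source name cannot contain

def validate_source(name: str, host: str, vault_uri: Optional[str] = None) -> List[str]:
    """Return a list of problems; an empty list means the values look OK."""
    problems = []
    if NAME_FORBIDDEN & set(name):
        problems.append("name contains one of / : [ ]")
    if not HOST_RE.match(host):
        problems.append("host must look like LOCATOR_ID.snowflakecomputing.com")
    if vault_uri is not None and not VAULT_URI_RE.match(vault_uri):
        problems.append("vault URI must look like https://<vault_name>.vault.azure.net/secrets/<secret_name>")
    return problems

print(validate_source("my_snowflake", "xy12345.snowflakecomputing.com",
                      "https://myvault.vault.azure.net/secrets/mysecret"))  # []
```

Note that Dremio performs its own validation when you save the source; a check like this only catches obvious formatting mistakes early.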
Advanced Options
On the Advanced Options page, you can set values for these optional settings:
Option | Description |
---|---|
Maximum Idle Connections | The total number of connections allowed to be idle at a given time. The default maximum idle connections is 8. |
Connection Idle Time | The amount of time (in seconds) allowed for a connection to remain idle before the connection is terminated. The default connection idle time is 60 seconds. |
Query Timeout | The amount of time (in seconds) allowed to wait for the results of a query. If this time expires, the connection being used is returned to an idle state. |
Record Fetch Size | The maximum number of records to allow a single query to fetch. This setting prevents queries from using too many resources. |
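The options above can be modeled as a simple configuration object. This is an illustrative sketch, not a Dremio API; the `None` defaults for Query Timeout and Record Fetch Size are assumptions, since the table does not document defaults for them:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SnowflakeAdvancedOptions:
    """Illustrative mirror of the Advanced Options table, with documented defaults."""
    max_idle_connections: int = 8          # Maximum Idle Connections (default 8)
    connection_idle_time_s: int = 60       # Connection Idle Time in seconds (default 60)
    query_timeout_s: Optional[int] = None  # Query Timeout; None = unset (assumption)
    record_fetch_size: Optional[int] = None  # Record Fetch Size; None = unset (assumption)

opts = SnowflakeAdvancedOptions()
print(opts.max_idle_connections, opts.connection_idle_time_s)  # 8 60
```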
Reflection Refresh
On the Reflection Refresh page, set the policy that controls how often reflections are scheduled to be refreshed automatically, as well as the time limit after which reflections expire and are removed.
Option | Description |
---|---|
Never refresh | Select to prevent automatic reflection refresh; the default is to refresh automatically. |
Refresh every | How often to refresh reflections, specified in hours, days, or weeks. This option is ignored if Never refresh is selected. |
Never expire | Select to prevent reflections from expiring; the default is to expire them automatically after the time limit below. |
Expire after | The time limit after which reflections expire and are removed from Dremio, specified in hours, days, or weeks. This option is ignored if Never expire is selected. |
Metadata
On the Metadata page, you can configure settings to refresh metadata and handle datasets.
Dataset Handling
These are the optional Dataset Handling parameters.
Parameter | Description |
---|---|
Remove dataset definitions if underlying data is unavailable | By default, Dremio removes dataset definitions if underlying data is unavailable. This is useful when files are temporarily deleted and added back in the same location with new sets of files. |
Metadata Refresh
These are the optional Metadata Refresh parameters:
- Dataset Discovery: The refresh interval for fetching top-level source object names such as databases and tables. Set the time interval using this parameter.
Parameter | Description |
---|---|
(Optional) Fetch every | Set the frequency for fetching object names in minutes, hours, days, or weeks. The default is 1 hour. |
- Dataset Details: The metadata that Dremio needs for query planning, such as information required for fields, types, shards, statistics, and locality. These parameters control how dataset details are fetched.
Parameter | Description |
---|---|
Fetch mode | By default, Dremio fetches details only for queried datasets, updating details for objects previously queried in the source. Fetching from all datasets is deprecated. |
Fetch every | Set the frequency for fetching dataset details in minutes, hours, days, or weeks. The default is 1 hour. |
Expire after | Set the expiry time for dataset details in minutes, hours, days, or weeks. The default is 3 hours. |
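With the default cadence (fetch every 1 hour, expire after 3 hours), cached dataset details survive several missed refresh windows before expiring. A small sketch under that assumption; the variable names are illustrative, not Dremio configuration keys:

```python
from datetime import timedelta

# Defaults documented above (illustrative variable names, not Dremio keys).
fetch_every = timedelta(hours=1)   # default fetch interval for dataset details
expire_after = timedelta(hours=3)  # default expiry for dataset details

# Expiry should exceed the fetch interval; otherwise cached details could
# expire before the next scheduled refresh completes.
assert expire_after > fetch_every

print(expire_after // fetch_every)  # 3 refresh windows per expiry period
```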
Privileges
On the Privileges tab, you can grant privileges to specific users or roles. See Access Controls for additional information about privileges.
All privileges are optional.
- For Privileges, enter the user name or role name that you want to grant access to and click the Add to Privileges button. The added user or role is displayed in the USERS/ROLES table.
- For the users or roles in the USERS/ROLES table, toggle the checkmark for each privilege you want to grant on the Dremio source that is being created.
- Click Save after setting the configuration.
Updating a Snowflake Source
To update a Snowflake source:
- On the Datasets page, under Databases in the panel on the left, find the name of the source you want to update.
- Right-click the source name and select Settings from the list of actions. Alternatively, click the source name and then the settings (gear) icon at the top right corner of the page.
- In the Source Settings dialog, edit the settings you wish to update. Dremio does not support updating the source name. For information about the settings options, see Configuring Snowflake as a Source.
- Click Save.
Deleting a Snowflake Source
If the source is in a bad state (for example, Dremio cannot authenticate to the source or the source is otherwise unavailable), only users who belong to the ADMIN role can delete the source.
To delete a Snowflake source, perform these steps:
- On the Datasets page, click Sources > Databases in the panel on the left.
- In the list of data sources, hover over the name of the source you want to remove and right-click.
- From the list of actions, click Delete.
- In the Delete Source dialog, click Delete to confirm that you want to remove the source.
Deleting a source causes all downstream views that depend on objects in the source to break.
Upgrading from Dremio Hub's Community Snowflake Plugin
Removing a Snowflake source will drop all tables in the source. If you have any reflections configured on tables or table-level ACLs (customized privileges) in your Snowflake sources, copy the details of those items before you remove any sources. After upgrading and re-adding your sources, you will need to recreate those reflections and ACLs.
Views are not affected by removing and re-adding Snowflake sources, provided the sources are re-added with the same names.
The community Snowflake plugin from Dremio Hub is not compatible with Dremio version 23.0 and later. If you have Snowflake sources, use Dremio version 23.1 or later, which includes an official Snowflake plugin.
If you are upgrading an older version of Dremio to version 23.1 or later, you must do the following:
- Note the details of any reflections and ACLs configured on tables in Snowflake sources.
- Remove your Snowflake sources from Dremio.
- Remove the community Snowflake plugin and the existing Snowflake JDBC driver.
- Upgrade Dremio to version 23.1 or later.
- Add your Snowflake sources to Dremio with the same names.
- Recreate any table-level reflections and ACLs on your Snowflake sources.
Predicate Pushdowns
Dremio delegates the execution of these expressions and functions to the database being queried, often dramatically improving query performance. It can also offload entire SQL queries that include one or more of these expressions and functions.
||, AND, OR
+, -, /, *
<=, <, >, >=, =, <>, !=
ABS
ADD_MONTHS
AVG
BETWEEN
CASE
CAST
CEIL
CEILING
CHARACTER_LENGTH
CHAR_LENGTH
COALESCE
CONCAT
COUNT
COUNT_DISTINCT
COUNT_DISTINCT_MULTI
COUNT_FUNCTIONS
COUNT_MULTI
COUNT_STAR
DATE_ADD
DATE_SUB
DATE_TRUNC
DATE_TRUNC_DAY
DATE_TRUNC_HOUR
DATE_TRUNC_MINUTE
DATE_TRUNC_MONTH
DATE_TRUNC_QUARTER
DATE_TRUNC_WEEK
DATE_TRUNC_YEAR
DAYOFMONTH
DAYOFWEEK
DAYOFYEAR
EXTRACT
FLOOR
ILIKE
IN
IS DISTINCT FROM
IS NOT DISTINCT FROM
IS NOT NULL
IS NULL
LAST_DAY
LEFT
LENGTH
LIKE
LOCATE
LOWER
LPAD
LTRIM
MAX
MEDIAN
MIN
MOD
NOT
PERCENT_CONT
PERCENT_DISC
PERCENT_RANK
POSITION
REGEXP_LIKE
REPLACE
REVERSE
RIGHT
ROUND
RPAD
RTRIM
SIGN
SQRT
STDDEV
STDDEV_POP
STDDEV_SAMP
SUBSTR
SUBSTRING
SUM
TO_CHAR
TO_DATE
TRIM
TRUNC
TRUNCATE
UPPER
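As a rough way to reason about pushdown eligibility, you could check whether the functions referenced in a query appear in the list above. The following is a naive sketch, not a Dremio API; the tokenizer and the abbreviated function set are illustrative:

```python
import re

# Abbreviated subset of the pushdown-eligible functions listed above.
PUSHDOWN_FUNCTIONS = {
    "ABS", "ADD_MONTHS", "AVG", "CAST", "CEIL", "COALESCE", "CONCAT",
    "COUNT", "DATE_TRUNC", "EXTRACT", "FLOOR", "LOWER", "LPAD", "LTRIM",
    "MAX", "MEDIAN", "MIN", "MOD", "REPLACE", "ROUND", "SUBSTR", "SUM",
    "TO_CHAR", "TO_DATE", "TRIM", "UPPER",  # ...remainder of the list above
}

def non_pushdown_functions(sql: str) -> set:
    """Return function names referenced in `sql` that are NOT in the pushdown list.

    Uses a deliberately naive tokenizer: any identifier followed by '('.
    """
    called = {m.group(1).upper() for m in re.finditer(r"\b([A-Za-z_]+)\s*\(", sql)}
    return called - PUSHDOWN_FUNCTIONS

print(non_pushdown_functions("SELECT SUM(amount), SOUNDEX(name) FROM t"))  # {'SOUNDEX'}
```

In practice, Dremio's query planner decides what to push down; a check like this only indicates which functions in the list above a query relies on.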
Running Queries Directly on Snowflake Through Dremio
Dremio users can pass queries through Dremio to run directly on Snowflake. Doing so can sometimes decrease query execution times. For more information, see Querying Relational-Database Sources Directly.