Snowflake
Snowflake is a cloud data warehouse.
Configuring Snowflake as a Source
- In the bottom-left corner of the Datasets page, click Add Source.
- Under Databases in the Add Data Source dialog, select Snowflake.
General
Perform these steps in the General tab:
- For Name, specify the name by which you want the Snowflake source to appear in the Databases section. The name cannot include the following special characters: /, :, [, or ].
- For Host, specify the account URL of the Snowflake source in the format https://LOCATOR_ID.snowflakecomputing.com.
- For Port, enter the port number. The default port is 443.
The optional connection parameters are case-sensitive. For example, if the name of a warehouse uses upper case only (e.g., WAREHOUSE1), specify it the same way in the Warehouse field.
- (Optional) For Database, specify the default database to use.
- (Optional) For Role, specify the default access-control role to use.
- (Optional) For Schema, specify the default schema to use.
- (Optional) For Warehouse, specify the warehouse that will provide resources for executing DML statements and queries.
- Under Authentication, you must choose one of the following authentication methods:
- Login-password authentication:
- For Username, enter your Snowflake username.
- For Password, enter your Snowflake password.
- Key-pair authentication (see Snowflake's key-pair documentation):
- For Username, enter your Snowflake username.
- For Private Key, enter your generated Snowflake private key in Privacy Enhanced Mail (PEM) format (see the key-generation sketch after this list).
- (Optional) For Private key passphrase, enter the passphrase if you are using an encrypted private key.
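If you still need to create a key pair for key-pair authentication, the following is a minimal sketch using Python's cryptography package, assuming a 2048-bit unencrypted key and arbitrary file names; follow Snowflake's key-pair documentation for registering the public key with your Snowflake user.

```python
# Sketch: generate an RSA key pair for Snowflake key-pair authentication.
# Requires the "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# Generate a 2048-bit RSA private key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Serialize the private key as PKCS#8 PEM. This sketch leaves it unencrypted;
# use serialization.BestAvailableEncryption(b"passphrase") instead of
# NoEncryption() if you plan to fill in the Private key passphrase field.
private_pem = private_key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.NoEncryption(),
)

# Serialize the matching public key as PEM; its body is what you register
# with your Snowflake user (ALTER USER ... SET RSA_PUBLIC_KEY = '...').
public_pem = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)

with open("rsa_key.p8", "wb") as f:    # file names are arbitrary
    f.write(private_pem)
with open("rsa_key.pub", "wb") as f:
    f.write(public_pem)
```

Paste the contents of the private key file into the Private Key field in Dremio.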
Advanced
On the Advanced Options page, you can set values for the following optional settings:
Option | Description |
---|---|
Maximum Idle Connections | The total number of connections allowed to be idle at a given time. The default maximum idle connections is 8. |
Connection Idle Time | The amount of time (in seconds) allowed for a connection to remain idle before the connection is terminated. The default connection idle time is 60 seconds. |
Query Timeout | The amount of time (in seconds) allowed to wait for the results of a query. If this time expires, the connection being used is returned to an idle state. |
Reflection Refresh
On the Reflection Refresh page, set the policy that controls how often reflections are scheduled to be refreshed automatically, as well as the time limit after which reflections expire and are removed.
Option | Description |
---|---|
Never refresh | Select to prevent automatic reflection refresh; the default is to refresh automatically. |
Refresh every | How often to refresh reflections, specified in hours, days or weeks. This option is ignored if Never refresh is selected. |
Never expire | Select to prevent reflections from expiring; the default is to expire them automatically after the time limit set below. |
Expire after | The time limit after which reflections expire and are removed from Dremio, specified in hours, days or weeks. This option is ignored if Never expire is selected. |
Metadata Options
On the Metadata page, you can configure settings to refresh metadata and handle datasets.
Dataset Handling
These are the optional Dataset Handling parameters.
Parameter | Description |
---|---|
Remove dataset definitions if underlying data is unavailable | By default, Dremio removes dataset definitions if underlying data is unavailable. This is useful when files are temporarily deleted and then added back in the same location with new sets of files. |
Metadata Refresh
These are the optional Metadata Refresh parameters:
- Dataset Discovery: The refresh interval for fetching top-level source object names such as databases and tables. Set the time interval using this parameter.

  Parameter | Description |
  ---|---|
  Fetch every | (Optional) Set the frequency to fetch object names in minutes, hours, days, or weeks. The default frequency to fetch object names is 1 hour. |

- Dataset Details: The metadata that Dremio needs for query planning, such as information required for fields, types, shards, statistics, and locality. These are the parameters for fetching dataset details.

  Parameter | Description |
  ---|---|
  Fetch mode | You can choose to fetch only from queried datasets, which is the default. Dremio updates details for previously queried objects in a source. Fetching from all datasets is deprecated. |
  Fetch every | Set the frequency to fetch dataset details in minutes, hours, days, or weeks. The default frequency to fetch dataset details is 1 hour. |
  Expire after | Set the expiry time of dataset details in minutes, hours, days, or weeks. The default expiry time of dataset details is 3 hours. |
Privileges
On the Privileges page, you can grant privileges to specific users or roles. See Access Control for additional information about user privileges.
- (Optional) For Privileges, enter the user name or role name that you want to grant access to and click the Add to Privileges button. The added user or role is displayed in the Users table.
- (Optional) For the users or roles in the Users table, toggle the green checkmark for each privilege you want to grant on the Dremio source that is being created.
- Click Save after setting the configuration.
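If you prefer to script source creation instead of using the UI, the following is a minimal sketch of the same configuration sent to Dremio's Catalog REST API (POST /api/v3/catalog). The coordinator URL, credentials, the type string, and the key names inside config are illustrative assumptions that mirror the fields described above; verify them against the API reference for your Dremio version.

```python
# Sketch: create a Snowflake source via Dremio's REST API instead of the UI.
# The "type" value and the keys inside "config" are assumptions that mirror
# the UI fields above; check them against your Dremio version's API reference.
import requests

DREMIO_URL = "https://dremio.example.com:9047"   # hypothetical Dremio coordinator

# Log in to obtain an auth token.
login = requests.post(
    f"{DREMIO_URL}/apiv2/login",
    json={"userName": "your_user", "password": "your_password"},
)
login.raise_for_status()
headers = {"Authorization": f"_dremio{login.json()['token']}"}

source = {
    "entityType": "source",
    "name": "snowflake_src",                     # name shown under Databases
    "type": "SNOWFLAKE",                         # assumed connector type string
    "config": {                                  # assumed key names
        "host": "https://LOCATOR_ID.snowflakecomputing.com",
        "port": 443,
        "username": "your_user",
        "password": "your_password",
        "warehouse": "WAREHOUSE1",               # case-sensitive, as noted above
    },
}

resp = requests.post(f"{DREMIO_URL}/api/v3/catalog", json=source, headers=headers)
resp.raise_for_status()
print(resp.json()["id"])
```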
Editing a Snowflake Source
To edit a Snowflake source:
- On the Datasets page, click External Sources at the bottom-left of the page. A list of sources is displayed.
- Hover over the external source and click the Settings icon that appears next to the source.
- In the Source Settings dialog, you cannot edit the name. Editing other parameters is optional. For parameters and advanced options, see Configuring Snowflake as a Source.
- Click Save.
Removing a Snowflake Source
To remove a Snowflake source, perform these steps:
- On the Datasets page, click External Sources at the bottom-left of the page. A list of sources is displayed.
- Hover over the source and click the More (...) icon that appears next to the source.
- From the list of actions, click Remove Source. Confirm that you want to remove the source.

Caution: Removing a source causes all downstream views dependent on objects in this source to break.
Predicate Pushdowns
These operations and functions are pushed down to and performed by Snowflake warehouses (see the example query after this list):
||, AND, OR
+, -, /, *
<=, <, >, >=, =, <>, !=
ABS
ADD_MONTHS
AVG
BETWEEN
CASE
CAST
CEIL
CEILING
CHARACTER_LENGTH
CHAR_LENGTH
COALESCE
CONCAT
COUNT
COUNT_DISTINCT
COUNT_DISTINCT_MULTI
COUNT_FUNCTIONS
COUNT_MULTI
COUNT_STAR
DATE_ADD
DATE_SUB
DATE_TRUNC
DATE_TRUNC_DAY
DATE_TRUNC_HOUR
DATE_TRUNC_MINUTE
DATE_TRUNC_MONTH
DATE_TRUNC_QUARTER
DATE_TRUNC_WEEK
DATE_TRUNC_YEAR
DAYOFMONTH
DAYOFWEEK
DAYOFYEAR
EXTRACT
FLOOR
ILIKE
IN
IS DISTINCT FROM
IS NOT DISTINCT FROM
IS NOT NULL
IS NULL
LAST_DAY
LEFT
LENGTH
LIKE
LOCATE
LOWER
LPAD
LTRIM
MAX
MEDIAN
MIN
MOD
NOT
PERCENT_CONT
PERCENT_DISC
PERCENT_RANK
POSITION
REGEXP_LIKE
REPLACE
REVERSE
RIGHT
ROUND
RPAD
RTRIM
SIGN
SQRT
STDDEV
STDDEV_POP
STDDEV_SAMP
SUBSTR
SUBSTRING
SUM
TO_CHAR
TO_DATE
TRIM
TRUNC
TRUNCATE
UPPER
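As a hedged illustration of pushdowns, the sketch below queries a table exposed through the Snowflake source with a WHERE clause built only from operators and functions in the list above (UPPER, =, BETWEEN, LIKE, AND), so the filtering can be performed by the Snowflake warehouse rather than by Dremio. The ODBC DSN, credentials, and the snowflake_src.SALES.PUBLIC.ORDERS path are assumptions for the example.

```python
# Sketch: issue a query through Dremio whose predicates use only
# pushdown-eligible operators and functions from the list above.
# Assumes an ODBC DSN named "Dremio" is configured (for example, via the
# Arrow Flight SQL ODBC driver); the source, schema, and table names are
# hypothetical.
import pyodbc

conn = pyodbc.connect("DSN=Dremio;UID=your_user;PWD=your_password", autocommit=True)
cursor = conn.cursor()

sql = """
SELECT o_orderkey, o_totalprice, o_orderdate
FROM snowflake_src.SALES.PUBLIC.ORDERS
WHERE UPPER(o_orderstatus) = 'F'          -- UPPER and = are pushdown-eligible
  AND o_totalprice BETWEEN 1000 AND 5000  -- BETWEEN is pushdown-eligible
  AND o_clerk LIKE 'Clerk#%'              -- LIKE is pushdown-eligible
"""

cursor.execute(sql)
for row in cursor.fetchall():
    print(row)

conn.close()
```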