Azure Storage (Beta)
The Dremio source connector for Azure Storage is not generally available and can only be enabled by contacting Dremio Support.
The Dremio source connector for Azure Storage supports the following services:
- Azure Blob Storage: Microsoft's object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data, such as text or binary data.
- Azure Data Lake Storage Gen2: a set of capabilities dedicated to big data analytics that is built on top of Azure Blob Storage and converges the capabilities of Azure Blob Storage and Azure Data Lake Storage Gen1.
Adding an Azure Storage Source
To add an Azure Storage source to your project:
From the Datasets page, click (+) Add Source at the bottom of the Sources pane.
Alternatively, click Object Storage to display all object storage sources, then click the Add object storage button at the top right of the page.
In the Add Data Source dialog, under Object Storage, click Azure Storage.
The New Azure Storage Source dialog appears, which contains the following sections: General, Advanced Options, Reflection Refresh, Metadata, and Privileges.
Refer to the following for guidance on how to complete each section.
General
Section | Field/Option | Description |
---|---|---|
Connection | Name | Provide a name to use for this Azure Storage source. |
Connection | Account Name | The name of the Azure Storage account, as shown in the Azure portal. |
Connection | Encrypt connection | Enabled by default, this option encrypts network traffic with TLS. It is recommended that you leave this option enabled. |
Authentication | Account Kind | Select the Azure Storage version for this source connection. Default: StorageV2 |
Authentication | Shared Access Key | Select this option to authenticate using the storage account's shared access key from the Azure portal. |
Authentication | Azure Active Directory | Select this option to use Azure Active Directory credentials for authentication. |
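If you prefer to script source creation rather than use the dialog, the same General settings can be supplied through Dremio's Catalog REST API. The sketch below is illustrative only and assumes Shared Access Key authentication; the base URL, token handling, and config field names (accountName, accountKind, accessKey, and so on) are assumptions that may differ by Dremio edition and version, so verify them against your API documentation.

```python
# Minimal sketch: create an Azure Storage source through Dremio's Catalog REST API.
# The POST /api/v3/catalog endpoint is standard for Dremio Software; Dremio Cloud uses
# a project-scoped path. The config field names here are illustrative assumptions.
import requests

DREMIO_URL = "https://dremio.example.com"   # hypothetical endpoint; adjust for your deployment
TOKEN = "<personal-access-token>"           # placeholder; do not hard-code secrets

source = {
    "entityType": "source",
    "name": "azure_storage_demo",           # the Name field in the General section
    "type": "AZURE_STORAGE",
    "config": {
        "accountName": "mystorageaccount",  # Account Name from the Azure portal
        "accountKind": "STORAGE_V2",        # Account Kind (default StorageV2)
        "enableSSL": True,                  # Encrypt connection
        "credentialsType": "ACCESS_KEY",    # Shared Access Key authentication
        "accessKey": "<shared-access-key>",
    },
}

resp = requests.post(
    f"{DREMIO_URL}/api/v3/catalog",
    json=source,
    # The auth header format differs between Dremio Cloud and Dremio Software;
    # a bearer token is shown here as an example.
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
print(resp.json()["id"])
```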
Azure Active Directory Authentication
To configure the Azure Storage source to use Azure Active Directory for Authentication, provide the following values from the OAuth 2.0 application that you created in the Azure Portal for this source:
Application ID - The Application (Client) ID in Azure.
OAuth 2.0 Token Endpoint - The OAuth 2.0 token endpoint (v1.0).
Client Secret - The secret key generated for the application.
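If you create the source programmatically, as in the earlier sketch, the Azure Active Directory values map onto credential fields in the same config object. The field names below are illustrative assumptions rather than confirmed API names.

```python
# Illustrative only: Azure Active Directory credentials for the source config.
# Field names are assumptions; verify them against your Dremio API documentation.
aad_config = {
    "credentialsType": "AZURE_ACTIVE_DIRECTORY",
    "clientId": "<application-client-id>",    # Application ID
    "tokenEndpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",  # OAuth 2.0 token endpoint (v1.0)
    "clientSecret": "<client-secret>",        # Client Secret
}
```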
Advanced Options
Advanced Options include:
Advanced Option | Description |
---|---|
Enable asynchronous access when possible | Activated by default; clear the checkbox to deactivate. Enables cloud caching for Azure Storage, which supports simultaneous actions such as adding and editing a new source. |
Enable partition column inference | Controls how Dremio handles partition columns when a source dataset uses Parquet files and the data is partitioned on one or more columns. When this option is disabled, Dremio appends a column named dir<n> for each partition level and uses the subfolder names as the values in those columns. When it is enabled, Dremio detects the name of each partition column from the subfolder names, appends a column that uses that name, and uses the detected values in the appended column (see the sketch after this table). |
Root Path | The root path for the Azure Storage location. The default root path is /. |
Default CTAS Format | Choose the default format for tables you create in Dremio, either Iceberg or Parquet. |
Advanced Properties | Provide the custom key-value pairs for the connection relevant to the source. |
Blob Containers & Filesystem Allowlist | Add approved blob containers or filesystems in the text field; you can add multiple entries this way. When you use this option to list specific entries, only those containers and filesystems are visible, rather than everything available in the source. |
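To make the partition column inference behavior concrete, the short sketch below shows how Hive-style subfolder names in a hypothetical Parquet layout translate into columns: with inference disabled, generic dir<n> columns carry the raw folder names; with inference enabled, the column names and values come from the name=value segments. This illustrates the naming convention only; it is not Dremio code.

```python
# Illustration of partition column inference for a hypothetical Parquet layout.
# Example paths use Hive-style partition folders: <column>=<value>.
paths = [
    "sales/year=2023/region=emea/part-0.parquet",
    "sales/year=2024/region=amer/part-0.parquet",
]

for path in paths:
    partitions = path.split("/")[1:-1]  # the partition folders between table root and file
    # Inference disabled: generic dir<n> columns hold the raw folder names.
    disabled = {f"dir{i}": seg for i, seg in enumerate(partitions)}
    # Inference enabled: column names and values are taken from the folder names.
    enabled = dict(seg.split("=", 1) for seg in partitions)
    print(disabled, enabled)
    # e.g. {'dir0': 'year=2023', 'dir1': 'region=emea'} {'year': '2023', 'region': 'emea'}
```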
Under Cache Options, review the following table and edit the options to meet your needs.
Cache Options | Description |
---|---|
Enable local caching when possible | Selected by default, along with asynchronous access for cloud caching. Uncheck the checkbox to disable this option. For more information about local caching, see the note below this table. |
Max percent of total available cache space to use when possible | Specifies the disk quota, as a percentage, that a source can use on any single node when local caching is enabled. The default is 100 percent of the total disk space available. You can enter a percentage manually in the value field or use the arrows at the far right to adjust it.
Columnar Cloud Cache (C3) enables Dremio to achieve NVMe-level I/O performance on S3/ADLS by leveraging the NVMe/SSD built into cloud compute instances. C3 caches only the data required to satisfy your workloads and can even cache individual microblocks within datasets. If your table has 1,000 columns and you only query a subset of those columns and filter for data within a certain timeframe, C3 will cache only that portion of your table. By selectively caching data, C3 eliminates over 90% of S3/ADLS I/O costs, which can make up 10-15% of the costs for each query you run.
Reflection Refresh
These settings define how often reflections are refreshed and how long data can be served before expiration. To learn more about reflections, refer to Accelerating Queries with Reflections.
All reflection parameters are optional.
You can set the following refresh policies for reflections:
- Refresh period: Manage the refresh period by either enabling the option to never refresh or setting a refresh frequency in hours, days, or weeks. The default frequency to refresh reflections is every hour.
- Expiration period: Set the expiration period for the length of time that data can be served by either enabling the option to never expire or setting an expiration time in hours, days, or weeks. The default expiration time is set to three hours.
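If you manage sources through the API instead of the dialog, these refresh and expiration periods are expressed in milliseconds on the source object. The sketch below uses field names commonly seen in Catalog API payloads (accelerationRefreshPeriodMs, accelerationGracePeriodMs); treat them as assumptions to confirm for your Dremio version.

```python
# Sketch: reflection refresh policy in milliseconds, as it might appear on a
# source object in the Catalog API. Field names are assumptions to verify.
HOUR_MS = 60 * 60 * 1000

reflection_policy = {
    "accelerationRefreshPeriodMs": 1 * HOUR_MS,  # default: refresh every hour
    "accelerationGracePeriodMs": 3 * HOUR_MS,    # default: expire after three hours
    "accelerationNeverRefresh": False,           # set True to never refresh
    "accelerationNeverExpire": False,            # set True to never expire
}
```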
Metadata
Metadata settings include Dataset Handling options and Metadata Refresh options. All metadata parameters are optional.
Dataset Handling
- Remove dataset definitions if underlying data is unavailable (default). When selected, datasets are automatically removed if their underlying files or folders are removed from Azure Storage, or if the folder or source is not accessible. If this option is not selected, Dremio does not remove dataset definitions when underlying files or folders are removed from Azure Storage, which may be useful if files are temporarily deleted and replaced with a new set of files.
- Automatically format files into tables when users issue queries. When selected, Dremio automatically promotes a folder to a table using default options. If you have CSV files, especially with non-default formatting, it might be useful to leave this option unselected.
Metadata Refresh
The Metadata Refresh parameters include Dataset Discovery and Dataset Details.
Dataset Discovery: The refresh interval for fetching top-level source object names such as databases and tables. Use this parameter to set the time interval. You can set the frequency to fetch object names in minutes, hours, days, or weeks. The default frequency to fetch object names is one day.
Dataset Details: The metadata that Dremio needs for query planning, such as information about fields, types, shards, statistics, and locality. The following table describes the parameters that control how dataset details are fetched.
Parameter | Description |
---|---|
Fetch mode | By default, Dremio fetches details only from queried datasets: it updates details for objects that have previously been queried in the source. Fetching from all datasets is deprecated.
Fetch every | You can choose to set the frequency to fetch dataset details in minutes, hours, days, or weeks. The default frequency to fetch dataset details is one day. |
Expire after | You can choose to set the expiry time of dataset details in minutes, hours, days, or weeks. The default expiry time of dataset details is three days. |
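For completeness, here is how these metadata settings are commonly expressed, in milliseconds, in a source's metadataPolicy when working with the Catalog API. The field names are assumptions based on typical payloads; verify them for your Dremio version.

```python
# Sketch: metadataPolicy for a source, with intervals in milliseconds.
# Field names and values are assumptions to verify against your Dremio version.
HOUR_MS = 60 * 60 * 1000
DAY_MS = 24 * HOUR_MS

metadata_policy = {
    "datasetUpdateMode": "PREFETCH_QUERIED",  # Fetch mode: only queried datasets (default)
    "namesRefreshMs": 1 * DAY_MS,             # Dataset Discovery: fetch object names daily
    "datasetRefreshAfterMs": 1 * DAY_MS,      # Fetch every: dataset details, daily
    "datasetExpireAfterMs": 3 * DAY_MS,       # Expire after: dataset details, three days
    "deleteUnavailableDatasets": True,        # remove definitions if data is unavailable
    "autoPromoteDatasets": False,             # format files into tables on query
}
```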
Privileges
This section lets you grant privileges on the source to specific users or roles. To learn more about how Dremio allows for the implementation of granular-level privileges, see Privileges.
All privileges parameters are optional.
To add privileges for a user or role:
- In the Add User/Role field, enter the user or role to which you want to apply privileges, then click Add to Privileges. The user or role is added to the Users table.
To set privileges for a user or role:
- In the Users table, identify the user or role to set privileges for and click under the appropriate column (Select, Alter, Create Table, etc.) to either enable or disable that privilege. A green checkmark indicates that the privilege is enabled.
- Click Save.
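Privileges can also be granted with SQL. The snippet below is a sketch: the GRANT statement follows Dremio's documented GRANT ... ON SOURCE ... TO USER pattern, but the source name, user, base URL, and token are placeholders to adapt to your environment, and the /api/v3/sql path may differ by Dremio edition.

```python
# Sketch: grant SELECT on the Azure Storage source to a user by submitting SQL
# through Dremio's SQL REST API. URL, token handling, and names are placeholders.
import requests

DREMIO_URL = "https://dremio.example.com"   # hypothetical endpoint
TOKEN = "<personal-access-token>"

sql = 'GRANT SELECT ON SOURCE "azure_storage_demo" TO USER "analyst@example.com"'

resp = requests.post(
    f"{DREMIO_URL}/api/v3/sql",
    json={"sql": sql},
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
print(resp.json().get("id"))  # job id for the submitted statement
```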