Version: 24.3.x

Azure Data Lake Storage Gen1

This topic describes how to configure Azure Data Lake Storage (ADLS) Gen1 as a source in Dremio. ADLS Gen1 is storage optimized for big data analytics workloads, and Dremio supports its Data Lake Gen1 file systems as datasets.

caution

On February 29, 2024, ADLS Gen1 was retired. For more information, see the official Microsoft announcement.

For information about Dremio support for ADLS Gen2, see Azure Storage.

Compatibility

ADLS Gen1 is supported as a data source in Dremio Software on-premises deployments.

note

As of Dremio Software versions 21.3.0, 22.0.3, and 23.0.0, ADLS Gen1 is supported as a data source in AWS Edition. To enable ADLS Gen1 as a source in AWS Edition, enable the plugins.adl.awse.enabled support key.

Azure Configuration

Dremio connects to ADLS using a pre-generated authentication key.

note

Dremio supports off-heap memory buffers for reading Parquet files from ADLS.

Registering an Application with Azure Active Directory

  1. Go to the Azure Portal and select Azure Active Directory in the left navigation bar, then select App Registrations.

  2. Click Endpoints and note the OAUTH 2.0 TOKEN ENDPOINT value. This will be the OAuth 2.0 Token Endpoint for your ADLS source in Dremio.

  3. Click New application registration. Enter a Name, choose Web app / API as the type, and enter a Sign-on URL. This can be any valid URL.

  4. Once the app is created, click the registered app and note its Application ID. This will be the Application ID for your ADLS source in Dremio.

  5. While in the registered app, select Keys under API Access. Enter a Description, select an expiration, click Save, and note the key Value. This will be the Password for your ADLS source in Dremio. The sketch after these steps shows one way to verify the credentials.
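
If you want to sanity-check these credentials before configuring the source, you can request a token directly from the endpoint noted in step 2. Below is a minimal Python sketch using the requests library; the tenant ID, application ID, and key value are placeholders you must substitute, and https://datalake.azure.net/ is the standard ADLS Gen1 resource URI.

```python
import requests

# Placeholders -- substitute the values you noted in steps 2, 4, and 5.
TOKEN_ENDPOINT = "https://login.microsoftonline.com/<tenant-id>/oauth2/token"
APPLICATION_ID = "<application-id>"
KEY_VALUE = "<key-value>"

# Client-credentials grant against the ADLS Gen1 resource URI.
response = requests.post(
    TOKEN_ENDPOINT,
    data={
        "grant_type": "client_credentials",
        "client_id": APPLICATION_ID,
        "client_secret": KEY_VALUE,
        "resource": "https://datalake.azure.net/",
    },
)
response.raise_for_status()
print("Token acquired, expires in", response.json().get("expires_in"), "seconds")
```

A 200 response containing an access_token confirms the registration and key are valid; an AADSTS error usually points at a wrong tenant in the endpoint URL or an expired key.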

Granting ADLS Access

  1. Go to your desired Data Lake Store resource and note its name. This will be the Resource Name for your ADLS source in Dremio.

  2. Click Access control (IAM), then click Add. Select a Role with at least Read access (for example, the Owner role) and find the name of the application you registered above. Click Save.

  3. For the same Data Lake Store resource, go to Data explorer and navigate to the top-level directory that Dremio will use. Click Access and grant the registered application Read and Execute permissions, plus Write if the store will also serve as Dremio's distributed store. The sketch after these steps shows one way to verify access.
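
To confirm that the role assignment and directory ACLs actually grant access, you can list the directory with the azure-datalake-store Python SDK. This is a sketch with placeholder values; lib.auth performs the same client-credentials flow shown earlier.

```python
# pip install azure-datalake-store
from azure.datalake.store import core, lib

# Placeholders -- use the tenant, application, and key from the app registration.
token = lib.auth(
    tenant_id="<tenant-id>",
    client_id="<application-id>",
    client_secret="<key-value>",
)
adl = core.AzureDLFileSystem(token, store_name="<resource-name>")

# Succeeds only if the Read and Execute permissions above are in place.
print(adl.ls("/"))
```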

Dremio Configuration

General

| Dremio Field | Azure Property |
| --- | --- |
| Resource Name | Name of the resource |
| Application ID | Application ID of the registered application under Azure Active Directory |
| OAuth 2.0 Token Endpoint | Azure Active Directory OAuth 2.0 Token Endpoint for registered applications |
| Access key value | Generated password value for the registered application |
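
Sources can also be created programmatically by POSTing to Dremio's /api/v3/catalog endpoint. The sketch below maps the table above onto a source definition; the config field names (mode, accountName, clientId, clientKeyRefreshUrl, clientKeyPassword) are assumptions based on the ADL source type and should be verified against the API reference for your Dremio version.

```python
import requests

DREMIO = "http://localhost:9047"  # assumed coordinator URL
HEADERS = {
    "Authorization": "_dremio<auth-token>",  # token from POST /apiv2/login
    "Content-Type": "application/json",
}

source = {
    "entityType": "source",
    "name": "adls_gen1",
    "type": "ADL",
    "config": {  # field names are assumptions -- check your version's API docs
        "mode": "CLIENT_KEY",
        "accountName": "<resource-name>",                  # Resource Name
        "clientId": "<application-id>",                    # Application ID
        "clientKeyRefreshUrl": "<oauth2-token-endpoint>",  # OAuth 2.0 Token Endpoint
        "clientKeyPassword": "<key-value>",                # Access key value
    },
}

response = requests.post(f"{DREMIO}/api/v3/catalog", json=source, headers=HEADERS)
response.raise_for_status()
print("Created source with id", response.json()["id"])
```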

REQUIRED: After a successful first connection in Dremio, do the following:

  1. Ensure that the source is successfully created and tested.

  2. Add the fs.adl.impl.disable.cache connection property and set it to false.
    This property is set to optimize performance. Do not set it before the source is created and tested; it affects how credentials are used and can cause issues if you are reconfiguring the connection (per DX-13859). The sketch after these steps shows one way to set the property through the REST API.

  3. Restart your cluster after saving the source with this property.
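
If you manage sources through the REST API rather than the UI, the same property can be added by reading the source definition back, appending to its connection properties, and writing it back with PUT. A hedged sketch, assuming the propertyList field used by file-system source configs:

```python
import requests

DREMIO = "http://localhost:9047"
HEADERS = {
    "Authorization": "_dremio<auth-token>",
    "Content-Type": "application/json",
}
SOURCE_ID = "<source-id>"  # id returned when the source was created

# Fetch the current definition, append the property, and write it back.
src = requests.get(f"{DREMIO}/api/v3/catalog/{SOURCE_ID}", headers=HEADERS).json()
props = src["config"].setdefault("propertyList", [])
props.append({"name": "fs.adl.impl.disable.cache", "value": "false"})

response = requests.put(f"{DREMIO}/api/v3/catalog/{SOURCE_ID}", json=src, headers=HEADERS)
response.raise_for_status()
```

Remember that the cluster still needs a restart after the property is saved.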

Advanced Options

Advanced options include:

  • Enable asynchronous access when possible (enabled by default).
  • Enable exports into the source (CTAS and DROP).
  • Root Path -- Root path for the source.
  • Connection Properties -- A list of additional connection properties.
  • Cache Options
    • Enable local caching when possible
    • Max percent of total available cache space to use when possible
![Advanced Options](/images/adls-gen1-adv-options.png)

Reflection Refresh

![Reflection Refresh](/images/hdfs-refresh-policy.png)
  • Never refresh -- Select to disable scheduled refresh; otherwise, specify the refresh interval in hours, days, or weeks.
  • Never expire -- Select to disable expiration; otherwise, specify the expiration interval in hours, days, or weeks.

Metadata

![Metadata](/images/s3-metadata.png)

Dataset Handling

  • Remove dataset definitions if underlying data is unavailable (default).
    If this box is not checked and the underlying files under a folder are removed, or the folder or source becomes inaccessible, Dremio does not remove the dataset definitions. Leaving the box unchecked is useful when files are temporarily deleted and later replaced with a new set of files.
  • Automatically format files into tables when users issue queries. If this box is checked and a query runs against an un-promoted table or folder, Dremio automatically promotes it using default options. If you have CSV files, especially ones with non-default formatting options, you might want to leave this box unchecked.

Metadata Refresh

  • Dataset Discovery -- Refresh interval for top-level source object names such as names of DBs and tables.
    • Fetch every -- Specify fetch time based on minutes, hours, days, or weeks. Default: 1 hour
  • Dataset Details -- The metadata that Dremio needs for query planning such as information needed for fields, types, shards, statistics, and locality.
    • Fetch mode -- Specify either Only Queried Datasets, All Datasets, or As Needed. Default: Only Queried Datasets
      • Only Queried Datasets -- Dremio updates details for previously queried objects in a source.
        This mode increases query performance because less work is needed at query time for these datasets.
      • All Datasets -- Dremio updates details for all datasets in a source. This mode increases query performance because less work is needed at query time.
      • As Needed -- Dremio updates details for a dataset at query time. This mode minimizes metadata queries on a source when it is not being used, but might lead to longer planning times.
    • Fetch every -- Specify fetch time based on minutes, hours, days, or weeks. Default: 1 hour
    • Expire after -- Specify expiration time based on minutes, hours, days, or weeks. Default: 3 hours
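
For reference, the Dataset Handling and Metadata Refresh settings above correspond to the source's metadataPolicy object in the REST API. The field names in this sketch are assumptions to be checked against your version's catalog API reference; the values shown encode the defaults described above.

```python
# Sketch of a metadataPolicy payload; intervals are in milliseconds.
# Field names are assumptions -- verify against your Dremio version's API docs.
metadata_policy = {
    "namesRefreshMs": 3600000,                # Dataset Discovery: fetch every 1 hour
    "datasetRefreshAfterMs": 3600000,         # Dataset Details: fetch every 1 hour
    "datasetExpireAfterMs": 10800000,         # Dataset Details: expire after 3 hours
    "datasetUpdateMode": "PREFETCH_QUERIED",  # Fetch mode: Only Queried Datasets
    "deleteUnavailableDatasets": True,        # Remove dataset definitions if data unavailable
    "autoPromoteDatasets": False,             # Automatically format files into tables
}
```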

Sharing

![Sharing](/images/hdfs-sharing.png)
You can specify which users can edit. Options include:
  • All users can edit.
  • Specific users can edit.

Configuring Cloud Cache

Cloud caching is available in Dremio Enterprise edition. See Configuring Cloud Cache for more information.