    4.3 Release Notes

    What’s New

    AWS Edition

    • Dremio AWS Edition provides an improved experience with projects.

    • The default time period after which Dremio automatically stops engines that haven’t received queries has been increased to 15 minutes.

    • Added project ID and name tags to EC2 instances.

    • S3 Sample Data is now automatically added as a data source.

    Other Enhancements

    • Hive data sources using Apache Parquet now support the following:

      • Asynchronous access, enabled by default

      • Columnar Cloud Cache

      • Complex data types, enabled by default and with a maximum of 16 nested levels

    • Cache options for Hive data sources appear on the Advanced Options tab of the New Hive Source dialog.

    • The Node Activity page identifies node types.

    • Added a JDBC parameter that allows users to specify which timezone to use when converting values.

    • The sort operator supports micro-spilling by default.

    • Query profiles display metrics for minimum, maximum, and average I/O times.

    • Added OAuth exceptions to log messages.

    • Added log messages for cache management.

    • Added a log message for slow I/O writes.

    • Added log messages for slow spilling.

    • Improved error messages for the Elasticsearch plugin to clarify that Dremio doesn’t support null values in Elasticsearch lists.

    • Improved error messages for the MongoDB plugin.

    • Updated error messages for the JDBC plugin to differentiate between errors in Dremio and the data source.

    • Added remaining task IDs to error message for slow slicing threads.

    • The JDBC driver accepts the calendar and timeZone connection parameters.

    • Dremio no longer supports versions of MongoDB prior to 3.6.
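    The new timeZone connection parameter mentioned above can be supplied when building the JDBC connection URL. The sketch below shows one way to do this; the host, port, credentials, and the exact parameter syntax are illustrative assumptions, so check the Dremio JDBC driver documentation for the precise form:

```java
import java.util.Properties;

public class DremioJdbcTimezone {
    // Builds a Dremio JDBC URL with the timeZone connection parameter.
    // The "direct=host:port" form and the semicolon-delimited parameter
    // syntax are assumptions for illustration; verify against your
    // driver version's documentation.
    static String buildUrl(String host, int port, String timeZone) {
        return "jdbc:dremio:direct=" + host + ":" + port
                + ";timeZone=" + timeZone;
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("user", "dremio_user");     // placeholder credentials
        props.setProperty("password", "dremio_pass");

        String url = buildUrl("localhost", 31010, "America/New_York");
        System.out.println(url);
        // A real connection would then be opened with, e.g.:
        // Connection conn = DriverManager.getConnection(url, props);
    }
}
```

    With the parameter set, the driver uses the given IANA time zone when converting date/time values, rather than the JVM default.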

    Changed Behavior

    • Updated the IAM permissions for the IAM role required for AWS Edition deployments.

    Fixed Issues in 4.3.0

    catalog/{id} REST API doesn’t return catalog entity IDs
    Fixed memory leak in column readers.

    Memory leak in executor nodes
    Fixed memory leak affecting reads of files in Apache Parquet format.

    A single Hive data source does not support multiple Azure accounts
    Fixed issue that prevented a single Hive data source from supporting multiple Azure accounts.

    No space left error when running a query in AWS Edition
    Fixed by reconfiguring the AWS Edition query results store.

    IllegalArgumentException when calling trunc mathematical function on Redshift data sources for the first time
    Fixed issue with Redshift metadata.

    Query fails with OutOfMemoryError exception
    Fixed issue that left query fragments in a pending state.

    Queries to Tableau data sources time out in AWS Edition
    Fixed issue that caused timeouts and other miscellaneous issues for AWS Edition deployments.

    Dremio resources not released on shutdown
    Fixed issue that caused the Dremio daemon to wait indefinitely if an error occurred while Dremio was shutting down.

    Cannot search spaces, folders, and virtual data sets after restoring from a backup
    Fixed issue that prevented a full reindex of the KVStore after performing a backup.

    An Unexpected Error Occurred dialog appears when executing a new query on the Data Reflection page

    date_trunc Gandiva function fails
    Fixed issue that caused Gandiva function names to be treated as case-sensitive.

    Errors when provisioning Elastic Engines in AWS Edition
    Fixed provisioning scripts.

    Data in the NYC taxi sample data set is misaligned in AWS Edition
    Fixed schema issue.

    Invalid region error message when provisioning Elastic Engines for AWS Edition
    Fixed provisioning scripts.

    4.3.1 Release Notes

    Fixed Issues in 4.3.1

    Network timeout occurs while trying to connect to an S3 data source
    Added a retry policy for verifying credentials in the S3 plugin.

    Unable to delete and restore Dremio projects in a different AWS availability zone in AWS Edition
    Fixed by adding ability to delete and restore backups across AWS availability zones.

    Greater than operator (>) doesn’t work correctly in conditional expressions with both positive and negative numeric values
    Fixed bug in ReduceTrigFunctionsRule.java.

    to_char type conversion function returns incorrect output with Oracle, PostgreSQL, and Redshift data sources
    Fixed by adding support for pushing down to_char functions with DateTime data types.

    Unable to start Dremio after upgrading to Dremio 4.2.1
    Fixed issue with Kerberized HDFS connectivity.

    Query fails with the following error: Node [rel#18082411:Subset#253.PHYSICAL.SINGLETON([]).[]] could not be implemented; planner state
    Fixed by updating Calcite.

    Preview of a query using the FLATTEN nested data function never completes
    Fixed issue with incoming record count in FLATTEN function.

    Dremio 4.1 doesn’t enforce the 32K limit for fields in a record
    Dremio now enforces the 32K limit for fields in a record.