Integrating with Amazon S3 and Azure Blob Storage

You can use Fluent Bit to collect Dremio logs and upload them to Amazon S3 or Azure Blob Storage.

Retrieving Logs from a Dremio Cluster

To upload logs to Amazon S3 or Azure Blob Storage, complete these steps:

  1. Install Fluent Bit on each node of your Dremio cluster.
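
If you have not installed Fluent Bit yet, the project provides packages for most Linux distributions as well as a one-line install script. The command below is a minimal sketch for a Linux host; check the Fluent Bit installation documentation for your platform and preferred method.

Example of installing Fluent Bit (one possible method)
curl https://raw.githubusercontent.com/fluent/fluent-bit/master/install.sh | sh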

  2. In the <FLUENT_BIT_HOME>/etc/fluent-bit directory, create a configuration file such as dremio_fluent-bit.conf, and add the Input sections shown in the following example:

Example for dremio_fluent-bit.conf

# To learn more about Dremio log files please see the documentation located at
# https://docs.dremio.com/current/sonar/monitoring/#logs

[INPUT]
    # Audit log.
    name             tail
    path             <DREMIO_LOG_PATH>/json/audit.json
    tag              dremio.audit
    multiline.parser java

[INPUT]
    # HTTP access log for the Dremio web server. This log is generated by coordinator nodes only.
    name             tail
    path             <DREMIO_LOG_PATH>/access.log
    tag              dremio.access
    multiline.parser java

[INPUT]
    # Garbage collection log.
    name             tail
    path             <DREMIO_LOG_PATH>/server.gc
    tag              dremio.server_gc
    multiline.parser java

[INPUT]
    # Server log.
    name             tail
    path             <DREMIO_LOG_PATH>/json/server.json
    tag              dremio.server
    multiline.parser java

[INPUT]
    # Log for the Dremio daemon's standard output.
    name             tail
    path             <DREMIO_LOG_PATH>/server.out
    tag              dremio.server_out
    multiline.parser java

[INPUT]
    # Metadata refresh log.
    name             tail
    path             <DREMIO_LOG_PATH>/metadata_refresh.log
    tag              dremio.metadata_refresh
    multiline.parser java

[INPUT]
    # Tracker log.
    name             tail
    path             <DREMIO_LOG_PATH>/tracker.json
    tag              dremio.tracker
    multiline.parser java

[INPUT]
    # Query log.
    name             tail
    path             <DREMIO_LOG_PATH>/queries.json
    tag              dremio.query
    multiline.parser java
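
Optionally, before wiring up cloud storage, you can confirm that these Input sections pick up your log files by adding a temporary stdout output and watching the console. This is a minimal test sketch; remove it once you see events flowing.

Example of a temporary test output
[OUTPUT]
    # Prints every collected record to the console for verification.
    name  stdout
    match *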

  3. Configure the Output section for Amazon S3 or Azure Blob Storage.

a. To upload to Amazon S3, append the Output section shown in the following example and configure your AWS credentials (one approach is shown after the example).

Example for Amazon S3

[OUTPUT]
    name          s3
    match         *
    bucket        <BUCKET>
    region        <AWS_REGION>
    s3_key_format /$TAG[0]/$TAG[1]/%Y/%m/%d/%H%M%S-$UUID
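
The s3 output plugin resolves AWS credentials through the standard AWS credential chain (environment variables, the shared credentials file, or an attached instance role). As one possible approach, you can export credentials in the environment that launches Fluent Bit; the placeholder values are yours to supply.

Example of providing AWS credentials via environment variables
export AWS_ACCESS_KEY_ID=<ACCESS_KEY_ID>
export AWS_SECRET_ACCESS_KEY=<SECRET_ACCESS_KEY>

In s3_key_format, $TAG[0] and $TAG[1] expand to the parts of the record tag split on ".", so records tagged dremio.audit are written under /dremio/audit/... in the bucket.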

b. To upload to Azure Blob Storage, append the Output section shown in the following example:

Example for Azure Blob Storage

[OUTPUT]
    name                  azure_blob
    match                 *
    account_name          <ACCOUNT_NAME>
    shared_key            <SHARED_ACCESS_KEY>
    path                  dremio_logs
    container_name        fluentbit-upload
    auto_create_container on
    tls                   on
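
The <SHARED_ACCESS_KEY> placeholder is one of the storage account's access keys. If you use the Azure CLI, you can list the keys with a command such as the following; the resource group name is a placeholder you must supply.

Example of listing storage account keys with the Azure CLI
az storage account keys list --account-name <ACCOUNT_NAME> --resource-group <RESOURCE_GROUP>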

  4. Start Fluent Bit. See the following example:
Example of starting Fluent Bit
fluent-bit -c dremio_fluent-bit.conf
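
Alternatively, if you installed Fluent Bit from the official packages, you can run it as a systemd service so it restarts on reboot. This sketch assumes the package installed a fluent-bit unit, which by default reads /etc/fluent-bit/fluent-bit.conf, so place or include your configuration there.

Example of running Fluent Bit as a service
sudo systemctl enable --now fluent-bit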

Retrieving Logs from Kubernetes

When Dremio is deployed on a Kubernetes cluster, all logs are written together to the container's console (standard output).

To upload these logs to Amazon S3 or Azure Blob Storage, complete these steps:

  1. Install Fluent Bit on your Kubernetes cluster.
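
A common way to install Fluent Bit on Kubernetes is the official Helm chart. The following commands are a sketch that assumes Helm is already available on your workstation.

Example of adding the Fluent Bit Helm repository
helm repo add fluent https://fluent.github.io/helm-charts
helm repo update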

  2. Create a Helm values override file, such as dremio-fluentbit.values.yaml.
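
The chart's default values already tail container logs from /var/log/containers/. If you want to limit collection to Dremio pods, you can also override the inputs section in the same values file; the dremio* glob below is an illustrative assumption, not a chart default. In a real values file, this inputs block and the outputs block from the next step live under the same config: key.

Example of an optional inputs override
config:
  inputs: |
    [INPUT]
        name             tail
        path             /var/log/containers/dremio*.log
        tag              kube.*
        multiline.parser docker, cri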

  3. Configure the Output section for Amazon S3 or Azure Blob Storage.

a. To upload to Amazon S3, append the Output section shown in the following example and configure your AWS credentials (one option for EKS is shown after the snippet).

config:
  outputs: |
    [OUTPUT]
        name          s3
        match         *
        bucket        <BUCKET>
        region        <AWS_REGION>
        s3_key_format /$TAG[0]/$TAG[1]/%Y/%m/%d/%H%M%S-$UUID
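
On Amazon EKS, one way to supply AWS credentials without static keys is IAM Roles for Service Accounts (IRSA): annotate the Fluent Bit service account in the same values file. The account ID and role name below are placeholders, and the IAM role must grant s3:PutObject on the target bucket.

Example of a service account annotation for IRSA
serviceAccount:
  annotations:
    eks.amazonaws.com/role-arn: arn:aws:iam::<ACCOUNT_ID>:role/<FLUENT_BIT_ROLE>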

b. To upload to Azure Blob Storage, append the Output section shown in the following example:

config:
  outputs: |
    [OUTPUT]
        name                  azure_blob
        match                 *
        account_name          <ACCOUNT_NAME>
        shared_key            <SHARED_ACCESS_KEY>
        path                  dremio_logs
        container_name        fluentbit-upload
        auto_create_container on
        tls                   on
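
To avoid committing the shared key to the values file, you can inject it from a Kubernetes Secret and reference it through Fluent Bit's ${...} environment variable substitution, replacing the shared_key line above with shared_key ${AZURE_SHARED_KEY}. This sketch assumes a Secret named fluent-bit-azure with a shared-key entry, and a chart version that supports the env value.

Example of sourcing the shared key from a Secret
env:
  - name: AZURE_SHARED_KEY
    valueFrom:
      secretKeyRef:
        name: fluent-bit-azure
        key: shared-key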

  4. Deploy or upgrade the Fluent Bit release by running the following command:
helm upgrade --install fluent-bit fluent/fluent-bit -f dremio-fluentbit.values.yaml
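
After the release is deployed, you can check that the Fluent Bit pods are running and shipping records by tailing their logs; this assumes the chart's default app.kubernetes.io/name=fluent-bit label.

Example of verifying the Fluent Bit pods
kubectl logs -l app.kubernetes.io/name=fluent-bit --tail=50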