Configure Model Providers
Model providers connect Dremio to the large language models (LLMs) that power AI features, including the AI agent and AI functions. A model provider is the external service that hosts the model (for example, Anthropic, OpenAI, or Amazon Bedrock); the model is the specific LLM within that service (for example, claude-opus-4-6 or gpt-5.1). Every organization includes the Dremio-Provided LLM by default, but you can configure your own.
Supported Model Providers
| Service | Connection Method(s) |
|---|---|
| OpenAI | Organization ID + API key + region (US, EU, or Global). US and EU route LLM calls to region-specific infrastructure; Global routes to OpenAI's default infrastructure without data residency guarantees. Region selection adheres to OpenAI Regional Data Controls. |
| Anthropic | API key |
| Google Gemini | API key without HTTP Referer restrictions. Dremio does not send an HTTP Referer header, so referer-restricted keys fail with the error Requests from referer <empty> are blocked. |
| Amazon Bedrock | Choose one of the following: access keys (access key ID + secret access key), or an IAM role (role ARN configured with the Dremio trust policy) |
| Azure OpenAI | Resource name + directory ID + application ID + client secret value |
| OpenAI-Compatible API | API endpoint URL, and optionally organization ID and API key |
Add a Model Provider
Before you add a model provider, ensure you have:
- Admin access to your Dremio organization
- API credentials for your chosen model provider (see Supported Model Providers). If you haven't chosen a provider yet, see Model Selection Guidelines.
Select the tab for your model provider service and follow the steps to connect it to Dremio.
- OpenAI, Anthropic, Gemini, Azure OpenAI
- Amazon Bedrock
- OpenAI-Compatible API
To add a model provider for OpenAI, Anthropic, Gemini, or Azure OpenAI:
- In the Dremio console, click the settings icon in the side navigation bar and select Organization.
- Select AI Configurations from the organization settings sidebar.
- Click Add model provider.
- In the Add model provider dialog, select the Service for your provider (for example, OpenAI, Anthropic, or Google Gemini).
- Enter a Name for the model provider.
- Enter the required connection credentials for your provider. See Supported Model Providers for the required fields.
- For Default Model ID, enter the model to use for AI agent interactions. For Azure OpenAI, enter the deployment name you configured in the Azure portal (for example, my-gpt4-deployment), not the underlying model name (for example, gpt-4). Dremio does not verify this value; if the deployment name is incorrect or inaccessible, errors appear at runtime. A sketch for checking a deployment name follows these steps.
- (Optional) For Allowed Model IDs for AI Functions, enter the model IDs you want to make available for AI functions.
- Click Add.
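To confirm you have a deployment name rather than the underlying model name, you can send a test request directly to your Azure OpenAI resource. The following is a minimal sketch, assuming you have an API key for the resource available for testing; the resource name, deployment name, and API version are placeholders. Dremio itself authenticates with the directory ID, application ID, and client secret you configure, not this API key.

```python
# Minimal sketch: confirm an Azure OpenAI *deployment name* (not the underlying
# model name) responds to chat completion requests before entering it in Dremio.
import os
import requests

RESOURCE = "my-resource"            # placeholder Azure OpenAI resource name
DEPLOYMENT = "my-gpt4-deployment"   # placeholder deployment name from the Azure portal
API_VERSION = "2024-02-01"          # adjust to an API version your resource supports

url = (
    f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
    f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
)
response = requests.post(
    url,
    headers={"api-key": os.environ["AZURE_OPENAI_API_KEY"]},
    json={"messages": [{"role": "user", "content": "ping"}], "max_tokens": 5},
    timeout=30,
)
# A 404 here usually means the deployment name is wrong; because Dremio does not
# verify the value, the same mistake would otherwise surface only at runtime.
print(response.status_code, response.json())
```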
Amazon Bedrock provides access to foundation models from Anthropic and other providers through a managed AWS service. Connecting Bedrock to Dremio lets you use these models to power the AI agent and AI functions without routing data through a third-party API.
Complete the steps for your chosen authentication method.
- Access Keys
- IAM Role
Access keys are the simpler option for general use.
Prerequisites
Before configuring Amazon Bedrock in Dremio, ensure you:
- have an Amazon Bedrock model ID that complies with Anthropic requirements for supported regions and models for inference profiles. The model ID must begin with a regional prefix (for example, us.anthropic.claude-opus-4-6-20260101-v1:0). A sketch for checking a model ID follows these prerequisites.

  Note: Anthropic models require one-time activation. To use Anthropic models (such as Claude), you must submit the Anthropic use case form in the Amazon Bedrock console once per AWS account before those models are available. See Manage access to Amazon Bedrock foundation models in the AWS documentation.

- obtain an AWS access key ID and secret access key for an IAM user with the AmazonBedrockLimitedAccess permission. See Creating access keys for an IAM user in the AWS documentation.
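Before adding the provider in Dremio, you can confirm the model ID is invocable with your credentials and region. The following is a minimal sketch using the Bedrock Converse API via boto3 (a recent boto3 version is assumed); the region and model ID are the placeholders used in this guide.

```python
# Minimal sketch: confirm a region-prefixed Bedrock model ID (inference profile)
# can be invoked with your AWS credentials before configuring it in Dremio.
import boto3

REGION = "us-east-1"  # must match the region you select in Dremio
MODEL_ID = "us.anthropic.claude-opus-4-6-20260101-v1:0"  # placeholder model ID

bedrock = boto3.client("bedrock-runtime", region_name=REGION)
response = bedrock.converse(
    modelId=MODEL_ID,
    messages=[{"role": "user", "content": [{"text": "ping"}]}],
    inferenceConfig={"maxTokens": 10},
)
# An AccessDeniedException here typically means the Anthropic use case form has
# not been submitted for this account, or the model is not enabled in this region.
print(response["output"]["message"]["content"][0]["text"])
```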
Set Up Access Key Authentication
To set up access key authentication:
- In the Dremio console, click the settings icon in the side navigation bar and select Organization.
- Select AI Configurations from the organization settings sidebar.
- Click Add model provider.
- In the Add model provider dialog, select Amazon Bedrock as the model provider service.
- For Name, enter a name for the model provider.
- For Region, select your Bedrock region (for example, us-east-1).
- For Authentication Method, select Access Key.
- For Access Key ID, enter your access key ID.
- For Secret Access Key, enter your secret access key.
- (Optional) For Guardrail Identifier and Guardrail Version, enter the Amazon Bedrock guardrail identifier and version that Dremio applies to requests sent to models through Bedrock. If specified, Dremio enforces the guardrail at runtime to help meet governance requirements, such as personally identifiable information (PII) protection and policy controls. A sketch for exercising a guardrail directly follows these steps.
- For Default Model ID, enter the model ID you want to use as the default. Dremio does not verify this value; if the model ID is incorrect or unavailable in your region, errors appear at runtime.
- (Optional) For Allowed Model IDs for AI Functions, enter the model IDs you want to make available for AI functions.
- Click Add.
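If you configured a guardrail, you can exercise it outside of Dremio to confirm the identifier and version behave as expected. The following is a minimal sketch using the Bedrock Converse API; the guardrail identifier, guardrail version, and model ID are placeholders.

```python
# Minimal sketch: apply a Bedrock guardrail to a direct Converse call so you can
# confirm it blocks or masks content as expected before Dremio enforces it.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
response = bedrock.converse(
    modelId="us.anthropic.claude-opus-4-6-20260101-v1:0",   # placeholder model ID
    messages=[{"role": "user", "content": [{"text": "My SSN is 123-45-6789"}]}],
    guardrailConfig={
        "guardrailIdentifier": "abcd1234efgh",  # placeholder guardrail identifier
        "guardrailVersion": "1",
    },
)
# stopReason is "guardrail_intervened" when the guardrail blocks or masks content.
print(response["stopReason"])
```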
IAM roles are best for environments with stricter credential policies.
Prerequisites
Before configuring Amazon Bedrock in Dremio, ensure you:
- have an Amazon Bedrock model ID that complies with Anthropic requirements for supported regions and models for inference profiles. The model ID must begin with a regional prefix (for example, us.anthropic.claude-opus-4-6-20260101-v1:0).

  Note: Anthropic models require one-time activation. To use Anthropic models (such as Claude), you must submit the Anthropic use case form in the Amazon Bedrock console once per AWS account before those models are available. See Manage access to Amazon Bedrock foundation models in the AWS documentation.
- create an IAM role with the Dremio trust policy:

  - Navigate to IAM Console > Roles > Create role.
  - On the Select trusted entity page, under Trusted entity type, select Custom trust policy.
  - Delete the current JSON policy and paste in the custom trust policy template below.

    ```json
    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "AllowTagSessionFromCallerRole",
          "Effect": "Allow",
          "Principal": {
            "AWS": "arn:aws:iam::<dremio_trust_account>:root"
          },
          "Action": "sts:TagSession"
        },
        {
          "Sid": "AllowAssumeRoleWithExternalId",
          "Effect": "Allow",
          "Principal": {
            "AWS": "arn:aws:iam::<dremio_trust_account>:root"
          },
          "Action": "sts:AssumeRole",
          "Condition": {
            "StringEquals": {
              "sts:ExternalId": "<organization ID>"
            }
          }
        }
      ]
    }
    ```

  - Replace <dremio_trust_account> with the Dremio trust account ID for your deployment:

    | Deployment | Dremio Trust Account ID |
    |---|---|
    | US | 894535543691 |
    | EU | 314270375726 |

  - Replace <organization ID> with your Dremio organization ID (found in Settings > Organization).
  - Click Next. On the Add permissions page, search for and select AmazonBedrockLimitedAccess. Click Next.
  - Enter a role name (for example, DremioBedrockRole) and click Create role.
  - Note the Role ARN from the role summary page (for example, arn:aws:iam::<your-account-id>:role/DremioBedrockRole).
Set Up IAM Role Authentication
To set up IAM role authentication:
- In the Dremio console, click the settings icon in the side navigation bar and select Organization.
- Select AI Configurations from the organization settings sidebar.
- Click Add model provider.
- In the Add model provider dialog, select Amazon Bedrock as the model provider service.
- For Name, enter a name for the model provider.
- For Region, select your Bedrock region (for example, us-east-1).
- For Authentication Method, select IAM Role.
- For IAM Role ARN, enter the ARN from the prerequisite step (for example, arn:aws:iam::<your-account-id>:role/DremioBedrockRole). A sketch for checking the role's trust policy follows these steps.
- (Optional) For Guardrail Identifier and Guardrail Version, enter the Amazon Bedrock guardrail identifier and version that Dremio applies to requests sent to models through Bedrock. If specified, Dremio enforces the guardrail at runtime to help meet governance requirements, such as personally identifiable information (PII) protection and policy controls.
- For Default Model ID, enter the model ID you want to use as the default. Dremio does not verify this value; if the model ID is incorrect or unavailable in your region, errors appear at runtime.
- (Optional) For Allowed Model IDs for AI Functions, enter the model IDs you want to make available for AI functions.
- Click Add.
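If Dremio reports a trust policy or external ID mismatch after you add the provider, you can read the role's trust policy and compare it against the values above. The following is a minimal sketch using boto3; the role name is a placeholder.

```python
# Minimal sketch: print the IAM role's trust policy so you can verify the Dremio
# trust account and the sts:ExternalId (your organization ID).
import json
import urllib.parse
import boto3

iam = boto3.client("iam")
role = iam.get_role(RoleName="DremioBedrockRole")  # placeholder role name
trust_policy = role["Role"]["AssumeRolePolicyDocument"]

# boto3 normally decodes the policy into a dict; fall back to manual decoding
# if it arrives as a URL-encoded JSON string.
if isinstance(trust_policy, str):
    trust_policy = json.loads(urllib.parse.unquote(trust_policy))

print(json.dumps(trust_policy, indent=2))
# Confirm the Principal is the Dremio trust account for your deployment
# (894535543691 for US, 314270375726 for EU) and that sts:ExternalId equals
# your Dremio organization ID.
```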
Troubleshoot
| Issue | Cause | Solution |
|---|---|---|
| Access denied (access keys) | Incorrect credentials, missing permissions, or wrong region | Verify the access key is correct, the user has AmazonBedrockLimitedAccess permission, the region matches the model, and you have accepted the Anthropic terms (if using Claude) |
| Access denied (IAM role) | Trust policy or external ID mismatch | Check that the trust policy and external ID in the IAM role match your Dremio organization ID |
| "Role Not Found" | Incorrect or nonexistent Role ARN | Verify the Role ARN is correct and the role exists in the specified AWS account |
| Model access denied | Model not available in region, or Anthropic terms not accepted | Check model availability for your region and submit the Anthropic use case form if using Claude models |
| 429 Too Many Tokens | Account quota exceeded | See Amazon Bedrock service quotas to request a quota increase |
The OpenAI-Compatible API provider lets you connect Dremio to any service that implements the OpenAI API specification — including self-hosted models and third-party providers not listed above — without waiting for Dremio to certify specific models.
To configure an OpenAI-compatible model provider:
- In the Dremio console, click the settings icon in the side navigation bar and select Organization.
- Select AI Configurations from the organization settings sidebar.
- Click Add model provider.
- In the Add model provider dialog, select OpenAI-Compatible API as the model provider service.
- For Name, enter a name for the model provider.
- For API endpoint URL, enter the base URL of your OpenAI-compatible endpoint (for example, https://api.example.com/v1). A sketch for testing an endpoint follows these steps.
- (Optional) For Organization ID, enter your provider's organization ID.
- (Optional) For API Key, enter your API key.
- For Default Model ID, enter the model ID to use as the default.
- (Optional) For Allowed Model IDs for AI Functions, enter the model IDs to make available for AI functions.
- (Optional) Expand Advanced Configuration and enter JSON in Model parameters (JSON) to customize model behavior. See Model Parameters below.
- Click Add.
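To confirm the endpoint and default model ID work before users depend on them, you can send a test request directly to the endpoint. The following is a minimal sketch against the standard OpenAI chat completions route; the endpoint URL, model ID, and API key environment variable are placeholders.

```python
# Minimal sketch: send a chat completion request to an OpenAI-compatible endpoint
# using the same base URL and model ID you plan to configure in Dremio.
import os
import requests

BASE_URL = "https://api.example.com/v1"  # same value as API endpoint URL in Dremio
MODEL_ID = "my-model"                    # placeholder model ID

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {os.environ.get('PROVIDER_API_KEY', '')}"},
    json={
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 5,
    },
    timeout=30,
)
print(response.status_code, response.json())
```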
Model Parameters
The Model parameters (JSON) field under Advanced Configuration controls how models behave when called by Dremio. You can set defaults that apply to all models, and override them for specific models by exact name or regex pattern.
| Field | Description |
|---|---|
| parameters | Default parameters applied to all models |
| per_model | Parameters for a specific model, matched by exact model name |
| per_model_pattern | Parameters for models whose name matches a regex pattern |
Supported Parameters
| Parameter | Type | Description |
|---|---|---|
| temperature | Number | Controls response randomness. Range: 0.0–1.0. Lower values produce more deterministic output. |
| max_tokens | Integer | Maximum number of tokens to generate; synonym for max_completion_tokens. Use this for inference servers that do not accept max_completion_tokens. |
| max_completion_tokens | Integer | Maximum number of tokens to generate; synonym for max_tokens. Use this for inference servers that do not accept max_tokens. |
| top_p | Number | Nucleus sampling parameter. Range: 0.0–1.0. |
| send_user_identifier | Boolean | Whether Dremio sends a user identifier to the provider with each request. Defaults to true. Set to false for providers that do not accept the user field (for example, Mistral). |
When the same parameter is defined in multiple places, the most specific definition takes effect: per_model overrides per_model_pattern, which overrides parameters.
For example:

```json
{
  "parameters": {
    "temperature": 0.2,
    "max_tokens": 8192
  },
  "per_model_pattern": {
    ".*mistral.*": {
      "send_user_identifier": false
    }
  },
  "per_model": {
    "gpt-4": {
      "top_p": 0.95,
      "max_tokens": 4096
    }
  }
}
```
Model Allowlist
Admins can configure an allowlist of models available for use in AI functions. Any model on the allowlist can be referenced by name in AI_GENERATE and other AI function calls. Models not on the allowlist are not available to users, even if the model provider is otherwise configured.
Default Model Provider
Admins set the default model provider through Admin > Organization > AI Configurations. The default is used for all AI agent interactions and AI function calls; changes take effect for all users immediately. By default, CALL MODEL is granted to all users for all new model providers, so users can continue using the AI agent without interruption if the default changes.
Edit a Model Provider
You can edit a model provider, such as to rotate API credentials or change the default model ID. To change the default model provider, you must have MODIFY privilege on both the current default model provider and the new proposed default.
To edit a model provider:
- In the Dremio console, click the settings icon in the side navigation bar and select Organization.
- Select AI Configurations from the organization settings sidebar.
- Click Edit next to the model provider.
Delete a Model Provider
You must have MODIFY privilege on the model provider in order to delete it. If the provider is currently set as the default, assign a new default before deleting it. If no other configured providers exist, the Dremio-Provided LLM automatically becomes the default when you delete the provider.
To delete a model provider:
- In the Dremio console, click the settings icon in the side navigation bar and select Organization.
- Select AI Configurations from the organization settings sidebar.
- Click Delete next to the model provider.
Model Selection Guidelines
Dremio includes the Dremio-Provided LLM, which requires no configuration. LLM calls route to US or EU endpoints based on your organization's control plane. This provider cannot be deleted, but you can add your own.
Dremio recommends using a mid-tier or higher LLM for the AI agent. An underpowered model generates invalid SQL, fails to recover from errors, and produces unreliable responses.
| Tier | Examples | Notes |
|---|---|---|
| Mid | Claude Sonnet 4.5, GPT-5.1 | Best balance of quality, latency, and cost for most organizations. Handles multi-step queries, joins, visualizations, and error recovery well. |
| High | Claude Opus 4.6, GPT-5.2 | Most accurate SQL generation and strongest reasoning on complex analytical questions. Higher cost than mid-tier. |
When selecting a model, consider:
- Data residency: OpenAI model selection is region-aware. Selecting US, EU, or Global determines where LLM calls are routed. For EU deployments, select an EU-compliant endpoint to meet data residency requirements.
- Cost and throughput: Higher-capability models generally have higher token costs. If you have high-volume AI function usage, evaluate the cost implications of your default model selection.
- Model availability: Some models may be restricted in certain regions even if your provider offers them. Refer to your provider's documentation to confirm model availability in your region.
The AI agent requires a text-generation (chat/completion) model with a context window of at least 128K tokens. Embedding, image generation, audio, video, computer-use, and lightweight (Nano, Mini, Flash-class) models are not supported for the AI agent.
When you have selected your model provider, see Add a Model Provider for the steps.
Related Topics
- Analyze Using AI Agent – Learn how users interact with the AI agent once a model provider is configured.
- Data Privacy – Learn more about Dremio's data privacy practices.