Configure Model Providers

When deploying Dremio, you configure model providers for your organization's AI features. After you configure at least one model provider, you must set a default model provider, and you can optionally set an allowlist of available models. Dremio uses the default provider for all AI Agent interactions, while the allowlisted models are available to anyone writing AI functions. By default, the CALL MODEL privilege is granted to all users on all new model providers, so users can continue to use the AI Agent without interruption if the default changes.

Dremio-Provided LLM

Dremio provides all organizations with an out-of-the-box model provider so that all users can begin engaging with the AI Agent and AI functions without any other configuration. For US deployments, LLM calls are routed to a US endpoint; for EU deployments, LLM calls are routed to an EU endpoint. Once you have added your own model provider and set it as the new default, the Dremio-Provided LLM is no longer used. If you delete all other model providers, the Dremio-Provided LLM becomes the organization's default model provider again. This model provider cannot be deleted.

Supported Model Providers

Dremio supports configuration of the following model providers. For the best performance and experience, Dremio recommends using a medium- or high-tier, enterprise-grade reasoning model.

Service | Connection Method(s)
OpenAI | Organization ID + API Key + Region (US, EU, or Global). Region selection adheres to OpenAI Regional Data Controls.
Anthropic | API Key
Google Gemini | API Key
Amazon Bedrock | Choose one of the following:
  • Region + Access Key ID + Secret Access Key
  • Region + IAM Role ARN
Azure OpenAI | Resource Name + Directory ID + Application ID + Client Secret Value
OpenAI-Compatible API | API endpoint URL, and optionally Organization ID and API Key
Note

Dremio does not validate the model associated with an AWS Bedrock model ID or Azure OpenAI deployment name upon model provider creation. Using an unsupported model can lead to runtime errors or a higher rate of inaccurate responses.

Default Model Provider

To delete the default model provider, you must first assign a new default, unless you are deleting the last remaining model provider that you have configured. To change the default model provider, you must have the MODIFY privilege on both the current default and the proposed new default model provider.

Add Model Provider

For steps on adding an AWS Bedrock model provider, see Configure AWS Bedrock as a Model Provider.

For steps on adding an OpenAI-Compatible API model provider, see Configure an OpenAI-Compatible API Model Provider.

For all other model providers, follow these steps to add a model provider in the Dremio console:

  1. Click the Admin icon in the side navigation bar and select Organization.
  2. Select AI Configurations from the organization settings sidebar.
  3. Click Add model provider.

Configure an OpenAI-Compatible Model Provider

The OpenAI-Compatible API provider lets you connect Dremio to any service that implements the OpenAI API specification, including self-hosted models and third-party providers not listed above, without waiting for Dremio to certify specific models.

To configure an OpenAI-compatible model provider:

  1. Click the Admin icon in the side navigation bar and select Organization.
  2. Select AI Configurations from the organization settings sidebar.
  3. Click Add model provider.
  4. In the Add model provider dialog, select OpenAI-Compatible API as the model provider service.
  5. For Name, enter a name for the model provider.
  6. For API endpoint URL, enter the base URL of your OpenAI-compatible endpoint (e.g., https://api.example.com/v1).
  7. (Optional) For Organization ID, enter your provider's organization ID.
  8. (Optional) For API Key, enter your API key.
  9. For Default Model ID, enter the model ID to use as the default.
  10. (Optional) For Allowed Model IDs for AI Functions, enter the model IDs to make available for AI functions.
  11. (Optional) Expand Advanced Configuration and enter JSON in Model parameters (JSON) to customize model behavior. See Model Parameters.
  12. Click Add.
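Before adding the provider, you may want to confirm that your endpoint answers the standard OpenAI-style chat-completions call at `<API endpoint URL>/chat/completions`. The sketch below builds such a request in Python; the URL, key, and model ID are placeholder assumptions, not values Dremio supplies.

```python
import json
import urllib.request

# Placeholder values -- substitute your own endpoint, key, and model ID.
BASE_URL = "https://api.example.com/v1"
API_KEY = "sk-example"

def build_chat_request(base_url, api_key, model_id, prompt):
    """Build a standard OpenAI-style chat-completions request."""
    payload = {
        "model": model_id,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(BASE_URL, API_KEY, "my-model", "ping")
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

If this request succeeds against your endpoint, the same base URL and API key should work in the Add model provider dialog.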

Model Parameters

The Model parameters (JSON) field under Advanced Configuration controls how models behave when called by Dremio. You can set defaults that apply to all models, and override them for specific models by exact name or regex pattern.

Field | Description
parameters | Default parameters applied to all models
per_model | Parameters for a specific model, matched by exact model name
per_model_pattern | Parameters for models whose names match a regex pattern

Supported Parameters

Parameter | Type | Description
temperature | Number | Controls response randomness. Range: 0.0–1.0. Lower values produce more deterministic output.
max_tokens | Integer | Maximum number of tokens to generate; synonym for max_completion_tokens. Use this for inference servers that do not accept max_completion_tokens.
max_completion_tokens | Integer | Maximum number of tokens to generate; synonym for max_tokens. Use this for inference servers that do not accept max_tokens.
top_p | Number | Nucleus sampling parameter. Range: 0.0–1.0.
send_user_identifier | Boolean | Whether Dremio sends a user identifier to the provider with each request. Defaults to true. Set to false for providers that do not accept the user field (for example, Mistral).

When the same parameter is defined in multiple places, the most specific definition takes effect: per_model overrides per_model_pattern, which overrides parameters.

Model parameters example
{
  "parameters": {
    "temperature": 0.2,
    "max_tokens": 8192
  },
  "per_model_pattern": {
    ".*mistral.*": {
      "send_user_identifier": false
    }
  },
  "per_model": {
    "gpt-4": {
      "top_p": 0.95,
      "max_tokens": 4096
    }
  }
}
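The precedence rules above can be sketched in a few lines of Python. This is an illustration of the documented merge order, not Dremio's implementation; in particular, the exact regex-matching semantics (full versus partial match) are an assumption here.

```python
import re

def resolve_params(config, model_id):
    """Merge model parameters in the documented precedence order:
    per_model overrides per_model_pattern, which overrides parameters."""
    resolved = dict(config.get("parameters", {}))
    for pattern, params in config.get("per_model_pattern", {}).items():
        # Assumes full-string regex matching of the model ID.
        if re.fullmatch(pattern, model_id):
            resolved.update(params)
    resolved.update(config.get("per_model", {}).get(model_id, {}))
    return resolved

# The example configuration from above.
config = {
    "parameters": {"temperature": 0.2, "max_tokens": 8192},
    "per_model_pattern": {".*mistral.*": {"send_user_identifier": False}},
    "per_model": {"gpt-4": {"top_p": 0.95, "max_tokens": 4096}},
}
```

With this configuration, a model named mistral-large inherits the default temperature and max_tokens but has send_user_identifier disabled, while gpt-4 keeps the default temperature and has its max_tokens lowered to 4096.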