Configure Model Providers (Enterprise)
Starting with Dremio Software 26.1, you can configure model providers for AI functionality when deploying Dremio clusters on Kubernetes. After you configure at least one model provider, you must set a default model provider and can optionally define an allowlist of available models. Dremio uses the default provider for all AI Agent interactions, while allowlisted models can be used by anyone writing AI functions.
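The relationship between the default provider and the model allowlist described above can be sketched conceptually. This is an illustrative model only; the class and method names are hypothetical and are not Dremio APIs:

```python
class ModelProviderRegistry:
    """Conceptual sketch of default-provider and allowlist semantics.

    Hypothetical names for illustration; not a Dremio API.
    """

    def __init__(self):
        self.providers = {}           # provider name -> set of models it offers
        self.default_provider = None  # used for all AI Agent interactions
        self.allowlist = set()        # models usable by anyone writing AI functions

    def add_provider(self, name, models):
        self.providers[name] = set(models)
        if self.default_provider is None:
            # The first configured provider becomes the default.
            self.default_provider = name

    def set_default(self, name):
        if name not in self.providers:
            raise ValueError(f"unknown provider: {name}")
        self.default_provider = name

    def allow_model(self, model):
        # Only models offered by a configured provider can be allowlisted.
        if not any(model in offered for offered in self.providers.values()):
            raise ValueError(f"no configured provider offers {model}")
        self.allowlist.add(model)

    def usable_in_ai_function(self, model):
        # AI functions may use any allowlisted model.
        return model in self.allowlist
```

The sketch captures the two roles a configured provider can play: supplying the default model for the AI Agent, and contributing models to the allowlist for AI functions.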
Supported Model Providers
Dremio supports configuration of the following model providers and models. Dremio recommends using enterprise-grade reasoning models for the best performance and experience.
| Category | Models | Connection Method(s) |
|---|---|---|
| OpenAI | | |
| Anthropic | | |
| Google Gemini | | |
| AWS Bedrock | | |
| Azure OpenAI | | Combination of |
Add Model Provider
For steps on adding an AWS Bedrock model provider, see Configure AWS Bedrock as a Model Provider.
For all other model providers, follow these steps to add a model provider in the Dremio console:
- Click the Settings icon in the side navigation bar to go to the Settings page.
- Select Preferences in the settings sidebar.
- Enable the AI Features flag.
- Click Add model provider.
Default Model Provider
To delete the default model provider, you must first assign a new default, unless you are deleting the last remaining model provider. To change the default model provider, you must have the MODIFY privilege on both the current default and the proposed new default.
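The delete and update rules above can be sketched as follows. The function names and privilege representation are hypothetical, not Dremio's implementation:

```python
def can_change_default(user_privileges, current_default, new_default):
    """Changing the default requires MODIFY on both the current and the
    proposed default provider (hypothetical sketch, not Dremio code)."""
    return ("MODIFY" in user_privileges.get(current_default, set())
            and "MODIFY" in user_privileges.get(new_default, set()))


def delete_provider(providers, default, name):
    """Deleting the default provider requires assigning a new default first,
    unless it is the last remaining provider."""
    if name == default and len(providers) > 1:
        raise RuntimeError("assign a new default before deleting this provider")
    providers.remove(name)
```

For example, deleting the current default while another provider still exists fails, while deleting it as the last remaining provider succeeds.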