Anthropic’s Claude family of large language models (LLMs) is known for advanced reasoning capabilities and built-in safety features. Use Anthropic as your model provider when you need Claude’s unique capabilities for document analysis and processing.

Anthropic doesn’t provide embedding models. Configure a different provider (OpenAI, Azure, or AWS Bedrock) for the model that uses the default-embedding label.

Environment variables

Set the Anthropic API key in your Docker Compose file or on the command line.

services:
  ai-assistant:
    environment:
      - ANTHROPIC_API_KEY=your-anthropic-api-key
    ...

Refer to the configuration options guide for all the configurable environment variables.
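Because the apiKey field in the service configuration file is optional when the environment variable is set, the two sources interact. The sketch below illustrates one plausible precedence (an explicit apiKey winning over ANTHROPIC_API_KEY); resolve_api_key is a hypothetical helper, not part of the product:

```python
import os

def resolve_api_key(config_value=None):
    # Assumed precedence (illustrative only): an explicit apiKey from the
    # config file wins; otherwise fall back to the ANTHROPIC_API_KEY
    # environment variable.
    key = config_value or os.environ.get("ANTHROPIC_API_KEY")
    if not key:
        raise RuntimeError("No Anthropic API key configured")
    return key
```

With the variable exported, resolve_api_key() returns its value; passing a value read from the config file overrides it.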

Service configuration file

To use Anthropic with AI Assistant, create a service configuration file, as explained in the model-provider configuration guide.

Since Anthropic doesn’t provide embedding models, you’ll need to configure a separate provider for embeddings. The following example uses OpenAI for embeddings:

version: "2"
providers:
  - name: "anthropic"
    apiKey: "your-anthropic-api-key" # Optional if set via environment variable.
  - name: "openai"
    apiKey: "your-openai-api-key" # Optional if set via environment variable.
models:
  - model: "anthropic:claude-sonnet-4-20250514"
    labels: ["default-llm"]
  - model: "openai:text-embedding-3-small"
    labels: ["default-embedding"]
  • providers:
    • name — The name of the provider. Set this to anthropic.
    • apiKey — The API key for the Anthropic service. This is optional if you’ve set ANTHROPIC_API_KEY as an environment variable.
  • models:
    • model — The model in provider:model format.
    • labels — Use default-llm for the Claude model and default-embedding for the embedding model.
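The provider:model convention and the two required labels lend themselves to a quick sanity check. The snippet below mirrors the example configuration; the validation logic is illustrative, not something the product ships:

```python
def validate_models(models, provider_names):
    """Check that each model uses provider:model format with a declared
    provider, and that each default label is assigned exactly once."""
    labels_seen = []
    for entry in models:
        provider, sep, model_name = entry["model"].partition(":")
        if not sep or not model_name:
            raise ValueError(f"{entry['model']!r} is not in provider:model format")
        if provider not in provider_names:
            raise ValueError(f"unknown provider {provider!r}")
        labels_seen.extend(entry.get("labels", []))
    for required in ("default-llm", "default-embedding"):
        if labels_seen.count(required) != 1:
            raise ValueError(f"label {required!r} must appear exactly once")

# Mirrors the models section of the example configuration above.
validate_models(
    [
        {"model": "anthropic:claude-sonnet-4-20250514", "labels": ["default-llm"]},
        {"model": "openai:text-embedding-3-small", "labels": ["default-embedding"]},
    ],
    {"anthropic", "openai"},
)
```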

Anthropic offers several Claude models with different capabilities and pricing. Choose a model based on your performance and cost requirements:

Model             | Description                          | Use case
claude-haiku-4-5  | Fast and cost-effective              | High-volume, simpler tasks
claude-sonnet-4-5 | Balanced model with strong reasoning | General document analysis and Q&A
claude-opus-4-5   | Most capable model for complex tasks | Complex document workflows, slower

For the most current model names and capabilities, see the Anthropic model documentation.

Example configurations

Since Anthropic doesn’t provide embedding models, you’ll need to pair Anthropic’s chat models with embeddings from another provider.

Anthropic chat with OpenAI embeddings

version: "2"
providers:
  - name: "anthropic"
  - name: "openai"
models:
  - model: "anthropic:claude-sonnet-4-5"
    labels: ["default-llm"]
  - model: "openai:text-embedding-3-small"
    labels: ["default-embedding"]

Anthropic chat with Azure OpenAI embeddings

version: "2"
providers:
  - name: "anthropic"
  - name: "azure"
    instanceName: "your-azure-instance"
models:
  - model: "anthropic:claude-sonnet-4-5"
    labels: ["default-llm"]
  - model: "azure:your-embedding-deployment-name"
    labels: ["default-embedding"]

Anthropic chat with AWS Bedrock embeddings

version: "2"
providers:
  - name: "anthropic"
  - name: "bedrock"
    region: "us-west-2"
models:
  - model: "anthropic:claude-sonnet-4-5"
    labels: ["default-llm"]
  - model: "bedrock:amazon.titan-embed-text-v1"
    labels: ["default-embedding"]

Pricing

Anthropic uses a pay-per-token pricing model. Costs vary based on the model selected and the volume of input and output tokens processed. For current pricing, refer to the Anthropic pricing page.
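Pay-per-token billing reduces to simple arithmetic over per-million-token rates. The function below shows the calculation; the $3/$15 rates in the example are placeholders, so take real prices from the Anthropic pricing page:

```python
def estimate_cost(input_tokens, output_tokens,
                  input_price_per_mtok, output_price_per_mtok):
    """Estimated cost in USD for one request, given prices quoted
    per million tokens (MTok)."""
    return (input_tokens / 1_000_000) * input_price_per_mtok + \
           (output_tokens / 1_000_000) * output_price_per_mtok

# Placeholder rates of $3 (input) and $15 (output) per MTok:
# 10,000 input + 2,000 output tokens -> $0.03 + $0.03 = $0.06
cost = estimate_cost(10_000, 2_000, 3.0, 15.0)
```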