# Anthropic model provider
Anthropic’s Claude family of large language models (LLMs) is known for advanced reasoning capabilities and built-in safety features. Use Anthropic as your model provider when you need Claude’s unique capabilities for document analysis and processing.
Anthropic doesn’t provide embedding models. Configure a different provider (OpenAI, Azure, or AWS Bedrock) for the model that uses the `default-embedding` label.
## Environment variables
Set the Anthropic API key in your Docker Compose file or on the command line.
- `ANTHROPIC_API_KEY` — The API key for the Anthropic service. Create an API key in the Anthropic Console:
```yaml
services:
  ai-assistant:
    environment:
      - ANTHROPIC_API_KEY=your-anthropic-api-key
...
```

Refer to the configuration options guide for all the configurable environment variables.
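Inside the service, the key is read from the environment at startup. A minimal sketch of that pattern (the function name is illustrative; only the `ANTHROPIC_API_KEY` variable name comes from this guide), which fails fast with a clear message when the key is missing:

```python
import os

def get_anthropic_key() -> str:
    """Read the Anthropic API key from the environment, failing fast if unset."""
    key = os.environ.get("ANTHROPIC_API_KEY", "").strip()
    if not key:
        raise RuntimeError(
            "ANTHROPIC_API_KEY is not set; add it to the Docker Compose "
            "environment or export it before starting the service."
        )
    return key
```

Failing at startup rather than on the first model call makes a missing or blank key much easier to diagnose.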
## Service configuration file
To use Anthropic with AI Assistant, create a service configuration file, as explained in the model-provider configuration guide.
Since Anthropic doesn’t provide embedding models, you’ll need to configure a separate provider for embeddings. The following example uses OpenAI for embeddings:
```yaml
version: "2"
providers:
  - name: "anthropic"
    apiKey: "your-anthropic-api-key" # Optional if set via environment variable.
  - name: "openai"
    apiKey: "your-openai-api-key" # Optional if set via environment variable.
models:
  - model: "anthropic:claude-sonnet-4-20250514"
    labels: ["default-llm"]
  - model: "openai:text-embedding-3-small"
    labels: ["default-embedding"]
```

- `providers`:
  - `name` — The name of the provider. Set this to `anthropic`.
  - `apiKey` — The API key for the Anthropic service. This is optional if you’ve set `ANTHROPIC_API_KEY` as an environment variable.
- `models`:
  - `model` — The model in `provider:model` format.
  - `labels` — Use `default-llm` for the Claude model and `default-embedding` for the embedding model.
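The two rules above — models in `provider:model` format, and both default labels assigned — can be checked programmatically before deploying. A sketch that validates a config already parsed into Python dicts (the dict structure mirrors the YAML above; the validation function itself is an assumption, not part of the product):

```python
def validate_models(models: list[dict]) -> None:
    """Check that every model is in provider:model format and that the
    default-llm and default-embedding labels are each assigned once overall."""
    seen_labels = set()
    for entry in models:
        provider, sep, model = entry["model"].partition(":")
        if not (provider and sep and model):
            raise ValueError(f"'{entry['model']}' is not in provider:model format")
        seen_labels.update(entry.get("labels", []))
    for required in ("default-llm", "default-embedding"):
        if required not in seen_labels:
            raise ValueError(f"no model carries the required '{required}' label")

# Mirrors the YAML example above.
validate_models([
    {"model": "anthropic:claude-sonnet-4-20250514", "labels": ["default-llm"]},
    {"model": "openai:text-embedding-3-small", "labels": ["default-embedding"]},
])
```

A check like this catches the most common misconfiguration with Anthropic: forgetting the separate embedding provider, which leaves `default-embedding` unassigned.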
## Recommended models
Anthropic offers several Claude models with different capabilities and pricing. Choose a model based on your performance and cost requirements:
| Model | Description | Use case |
|---|---|---|
| `claude-haiku-4-5` | Fast and cost-effective | High-volume, simpler tasks |
| `claude-sonnet-4-5` | Balanced model with strong reasoning | General document analysis and Q&A |
| `claude-opus-4-5` | Most capable model for complex tasks | Complex document workflows where speed is less critical |
For the most current model names and capabilities, see the Anthropic model documentation.
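If your application switches between these tiers at runtime, the choice reduces to a small lookup. A hypothetical helper (the tier names are illustrative; the model IDs come from the table above):

```python
# Illustrative tier -> model mapping, taken from the table above.
CLAUDE_TIERS = {
    "fast": "claude-haiku-4-5",       # high-volume, simpler tasks
    "balanced": "claude-sonnet-4-5",  # general document analysis and Q&A
    "capable": "claude-opus-4-5",     # complex workflows, slower
}

def pick_model(tier: str = "balanced") -> str:
    """Return the Claude model ID for a given tier, defaulting to balanced."""
    try:
        return CLAUDE_TIERS[tier]
    except KeyError:
        raise ValueError(f"unknown tier '{tier}'; expected one of {sorted(CLAUDE_TIERS)}")
```

Centralizing the mapping keeps model-name updates (for example, when Anthropic publishes a new version) to a single place in your code.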
## Example configurations
Since Anthropic doesn’t provide embedding models, you’ll need to pair Anthropic’s chat models with embeddings from another provider.
### Anthropic chat with OpenAI embeddings
```yaml
version: "2"
providers:
  - name: "anthropic"
  - name: "openai"
models:
  - model: "anthropic:claude-sonnet-4-5"
    labels: ["default-llm"]
  - model: "openai:text-embedding-3-small"
    labels: ["default-embedding"]
```

### Anthropic chat with Azure OpenAI embeddings
```yaml
version: "2"
providers:
  - name: "anthropic"
  - name: "azure"
    instanceName: "your-azure-instance"
models:
  - model: "anthropic:claude-sonnet-4-5"
    labels: ["default-llm"]
  - model: "azure:your-embedding-deployment-name"
    labels: ["default-embedding"]
```

### Anthropic chat with AWS Bedrock embeddings
```yaml
version: "2"
providers:
  - name: "anthropic"
  - name: "bedrock"
    region: "us-west-2"
models:
  - model: "anthropic:claude-sonnet-4-5"
    labels: ["default-llm"]
  - model: "bedrock:amazon.titan-embed-text-v1"
    labels: ["default-embedding"]
```

## Pricing
Anthropic uses a pay-per-token pricing model. Costs vary based on the model selected and the volume of input and output tokens processed. For current pricing, refer to the Anthropic pricing page.
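Pay-per-token billing means a request's cost is just token counts multiplied by per-model rates. A sketch of the arithmetic using placeholder rates (the dollar figures below are illustrative only, not Anthropic's actual prices — always check the pricing page):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_rate_per_mtok: float, output_rate_per_mtok: float) -> float:
    """Estimate request cost in USD from token counts and per-million-token rates."""
    return (input_tokens * input_rate_per_mtok
            + output_tokens * output_rate_per_mtok) / 1_000_000

# Placeholder rates of $3 input / $15 output per million tokens (illustrative only):
# 50k input tokens and 10k output tokens -> (150_000 + 150_000) / 1e6 = $0.30
cost = estimate_cost(50_000, 10_000, 3.0, 15.0)
```

Output tokens are typically billed at a higher rate than input tokens, so long generations dominate cost even when prompts are large.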