Anthropic model provider
Anthropic’s Claude family of large language models (LLMs) is known for advanced reasoning capabilities and built-in safety features. Use Anthropic as your model provider when you need Claude’s strengths in document analysis and processing.
Anthropic doesn’t provide embedding models. Configure a different provider (OpenAI, Azure, or AWS Bedrock) for the textEmbeddings service.
Environment variables
Set the Anthropic API key in your Docker Compose file or command line.
`ANTHROPIC_API_KEY` — The API key for the Anthropic service. Create an API key in the Anthropic Console:
```yaml
services:
  ai-assistant:
    environment:
      - ANTHROPIC_API_KEY=your-anthropic-api-key
```

Refer to the configuration options guide for all the configurable environment variables.
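If the container starts but chat requests fail with authentication errors, a quick sanity check is to confirm the variable is actually visible to the process. This is a minimal illustrative sketch, not part of AI Assistant; it reads `ANTHROPIC_API_KEY` the same way any process inside the container would:

```python
import os

def resolve_anthropic_key() -> str:
    # Read the key from the environment; ANTHROPIC_API_KEY is the
    # variable name documented above.
    key = os.environ.get("ANTHROPIC_API_KEY", "")
    if not key:
        # Failing fast at startup beats a confusing 401 later.
        raise RuntimeError("ANTHROPIC_API_KEY is not set for this process")
    return key
```

Running this inside the container (for example, with `docker compose exec`) tells you immediately whether the Compose configuration delivered the key.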
Service configuration file
To use Anthropic with AI Assistant, create a service configuration file, as explained in the model-provider configuration guide.
Since Anthropic doesn’t provide embedding models, you’ll need to configure a separate provider for the textEmbeddings service. The following example uses OpenAI for embeddings:
```yaml
version: '1'
aiServices:
  chat:
    provider:
      name: 'anthropic'
      apiKey: 'your-anthropic-api-key' # Optional if set via environment variable.
    model: 'claude-sonnet-4-20250514'
  textEmbeddings:
    provider:
      name: 'openai'
      apiKey: 'your-openai-api-key' # Optional if set via environment variable.
    model: 'text-embedding-3-small'
```

`provider:name` — The name of the provider. Set this to `anthropic`.
`apiKey` — The API key for the Anthropic service. This is optional if you’ve set `ANTHROPIC_API_KEY` as an environment variable.
`model` — The name of the Claude model to use. See the recommended models section below.
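The interaction between the config-file `apiKey` and the `ANTHROPIC_API_KEY` environment variable can be sketched as a simple fallback. This assumes, as the comment in the example implies, that a key in the file takes precedence; `resolve_api_key` is a hypothetical helper for illustration, not part of AI Assistant:

```python
import os

def resolve_api_key(config_api_key=None):
    # Hypothetical sketch: an apiKey from the service configuration file
    # wins; otherwise fall back to the ANTHROPIC_API_KEY environment variable.
    return config_api_key or os.environ.get("ANTHROPIC_API_KEY")
```

Either source works on its own; setting both is harmless but the file value is the one used under this assumption.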
Recommended models
Anthropic offers several Claude models with different capabilities and pricing. Choose a model based on your performance and cost requirements:
| Model | Description | Use case |
|---|---|---|
| `claude-haiku-4-5` | Fast and cost-effective | High-volume, simpler tasks |
| `claude-sonnet-4-5` | Balanced model with strong reasoning | General document analysis and Q&A |
| `claude-opus-4-5` | Most capable model for complex tasks | Complex document workflows (slower) |
For the most current model names and capabilities, see the Anthropic model documentation.
Example configurations
Since Anthropic doesn’t provide embedding models, you’ll need to pair Anthropic’s chat models with embeddings from another provider.
Anthropic chat with OpenAI embeddings
```yaml
version: '1'
aiServices:
  chat:
    provider:
      name: 'anthropic'
    model: 'claude-sonnet-4-5'
  textEmbeddings:
    provider:
      name: 'openai'
    model: 'text-embedding-3-small'
```

Anthropic chat with Azure OpenAI embeddings
```yaml
version: '1'
aiServices:
  chat:
    provider:
      name: 'anthropic'
    model: 'claude-sonnet-4-5'
  textEmbeddings:
    provider:
      name: 'azure'
      instanceName: 'your-azure-instance'
    model: 'your-embedding-deployment-name'
```

Anthropic chat with AWS Bedrock embeddings
```yaml
version: '1'
aiServices:
  chat:
    provider:
      name: 'anthropic'
    model: 'claude-sonnet-4-5'
  textEmbeddings:
    provider:
      name: 'bedrock'
      region: 'us-west-2'
    model: 'amazon.titan-embed-text-v1'
```

Pricing
Anthropic uses a pay-per-token pricing model. Costs vary based on the model selected and the volume of input and output tokens processed. For current pricing, refer to the Anthropic pricing page.
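As a worked illustration of pay-per-token billing, the sketch below estimates the cost of a batch of requests. The rates are made up purely for the arithmetic; substitute the real numbers from Anthropic’s pricing page:

```python
# Hypothetical rates, for illustration only -- not Anthropic's actual prices.
INPUT_USD_PER_MTOK = 3.00    # USD per million input tokens
OUTPUT_USD_PER_MTOK = 15.00  # USD per million output tokens

def estimate_cost(input_tokens, output_tokens):
    """Estimated USD cost for one batch of requests at the rates above."""
    return (input_tokens * INPUT_USD_PER_MTOK
            + output_tokens * OUTPUT_USD_PER_MTOK) / 1_000_000

# Example: 400,000 input tokens and 100,000 output tokens at these rates:
# 0.4 * 3.00 + 0.1 * 15.00 = 1.20 + 1.50 = 2.70 USD
```

Because output tokens are typically priced several times higher than input tokens, prompt-heavy workloads and generation-heavy workloads can have very different cost profiles even at the same total token count.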