Quick Start
Get Mistral AI working in 3 steps.

Tip: You can also set provider="@mistral-ai" in Portkey() and use just model="mistral-large-latest" in the request.

Add Provider in Model Catalog
- Go to Model Catalog → Add Provider
- Select Mistral AI
- Choose existing credentials or create new by entering your Mistral AI API key
- Name your provider (e.g., mistral-ai-prod)
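Once the provider is set up, a minimal request can look like the sketch below. The "@mistral-ai" slug is whatever name you gave your provider in the Model Catalog, and the Portkey API key is assumed to be in the PORTKEY_API_KEY environment variable:

```python
# Minimal sketch: a chat completion routed to Mistral AI through Portkey.
# Assumes PORTKEY_API_KEY is set and "@mistral-ai" matches your provider slug.
import os

messages = [{"role": "user", "content": "Say hello in one sentence."}]

# Only hit the API when a key is actually configured.
if os.environ.get("PORTKEY_API_KEY"):
    from portkey_ai import Portkey  # pip install portkey-ai

    client = Portkey(
        api_key=os.environ["PORTKEY_API_KEY"],
        provider="@mistral-ai",  # your Model Catalog provider slug
    )
    completion = client.chat.completions.create(
        model="mistral-large-latest",
        messages=messages,
    )
    print(completion.choices[0].message.content)
```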
Complete Setup Guide →
See all setup options, code examples, and detailed instructions
Codestral Endpoint
Mistral AI provides a dedicated Codestral endpoint for code generation. Use the customHost property to route requests to it through Portkey.
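A hedged sketch of what this can look like in the Python SDK, where the property is exposed as custom_host (customHost in the JS SDK). The endpoint URL and model name below are Mistral's published values; verify them against the current Mistral documentation:

```python
# Sketch: pointing Portkey at Mistral's dedicated Codestral endpoint.
# Assumes PORTKEY_API_KEY is set and "@mistral-ai" is your provider slug.
import os

CODESTRAL_HOST = "https://codestral.mistral.ai/v1"  # Mistral's dedicated code endpoint

if os.environ.get("PORTKEY_API_KEY"):
    from portkey_ai import Portkey

    client = Portkey(
        api_key=os.environ["PORTKEY_API_KEY"],
        provider="@mistral-ai",
        custom_host=CODESTRAL_HOST,  # override the default Mistral base URL
    )
    completion = client.chat.completions.create(
        model="codestral-latest",
        messages=[
            {"role": "user", "content": "Write a Python function that reverses a string."}
        ],
    )
    print(completion.choices[0].message.content)
```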

Codestral vs Mistral API Endpoint
Here’s a guide for when to use the Codestral endpoint vs the standard Mistral API endpoint. In short: the dedicated endpoint (codestral.mistral.ai) is designed for code-centric, high-frequency workloads such as IDE completions, while the standard endpoint (api.mistral.ai) serves all Mistral models for general use.
Mistral Tool Calling
Tool calling lets models trigger external tools based on conversation context: you define the available functions, the model decides when to use them, and your application executes them and returns the results. Portkey supports Mistral tool calling and makes it interoperable across multiple providers. With Portkey Prompts, you can templatize both your prompts and your tool schemas.
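The flow above can be sketched as follows. The get_weather tool is a hypothetical example; the tools parameter follows the OpenAI-compatible function-calling schema that Portkey exposes:

```python
# Illustrative sketch of Mistral tool calling through Portkey.
# get_weather is a hypothetical tool used only for demonstration.
import json
import os

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

if os.environ.get("PORTKEY_API_KEY"):
    from portkey_ai import Portkey

    client = Portkey(api_key=os.environ["PORTKEY_API_KEY"], provider="@mistral-ai")
    response = client.chat.completions.create(
        model="mistral-large-latest",
        messages=[{"role": "user", "content": "What's the weather in Paris?"}],
        tools=tools,
    )
    # If the model chose to call a tool, inspect the call it produced.
    tool_calls = response.choices[0].message.tool_calls
    if tool_calls:
        call = tool_calls[0]
        print(call.function.name, json.loads(call.function.arguments))
```

Your application would then run the named function with the parsed arguments and send the result back in a follow-up "tool" message.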
Managing Mistral AI Prompts
Manage all your Mistral AI prompt templates in the Prompt Library. All current Mistral AI models are supported, and you can easily test different prompts. Use the portkey.prompts.completions.create interface to use a saved prompt in your application.
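A sketch of rendering a saved prompt template, assuming a prompt created in the Prompt Library. The prompt ID and the "topic" variable are placeholders for whatever your own template defines:

```python
# Sketch: running a saved Prompt Library template through Portkey.
# PROMPT_ID and the "topic" variable are placeholders for your template.
import os

PROMPT_ID = "YOUR_PROMPT_ID"  # placeholder: copy the ID from the Prompt Library
variables = {"topic": "vector databases"}  # placeholder template variables

if os.environ.get("PORTKEY_API_KEY"):
    from portkey_ai import Portkey

    client = Portkey(api_key=os.environ["PORTKEY_API_KEY"])
    completion = client.prompts.completions.create(
        prompt_id=PROMPT_ID,
        variables=variables,
    )
    print(completion)
```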
Next Steps
Add Metadata
Add metadata to your Mistral AI requests
Gateway Configs
Add gateway configs to your Mistral AI requests
Tracing
Trace your Mistral AI requests
Fallbacks
Setup fallback from OpenAI to Mistral AI
SDK Reference
Complete Portkey SDK documentation

