If you are running the open source Portkey Gateway, refer to this guide on how to connect Portkey with Ollama.
## Integration Steps

### 1. Expose your Ollama API

Expose your Ollama API using a tunneling service such as ngrok, or make it publicly accessible some other way. Skip this step if you're self-hosting the Gateway.
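Before setting up a tunnel, it helps to confirm the local Ollama server is actually up. A minimal sketch, assuming Ollama's default port (11434) and its standard `/api/tags` endpoint, which lists locally installed models:

```python
# Sanity-check that a local Ollama server is reachable before exposing it.
# Ollama listens on port 11434 by default; /api/tags lists installed models.
import urllib.error
import urllib.request


def ollama_reachable(base_url: str = "http://localhost:11434",
                     timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    print("Ollama up:", ollama_reachable())
```

Once this returns `True` locally, the tunneled URL (e.g. the `https://….ngrok-free.app` address ngrok prints) is what you'll register in the next step.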
### 2. Add to Model Catalog

- Go to Model Catalog → Add Provider
- Enable the "Local/Privately hosted provider" toggle
- Select Ollama as the provider type
- Enter your Ollama URL in Custom Host, e.g. `https://your-ollama.ngrok-free.app`
- Name your provider (e.g., `my-ollama`)
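The catalog entries above can be captured as plain values. This is an illustrative sketch, not an API schema: the field names simply mirror the UI labels, and the URL and slug are the example values from the bullets above.

```python
# The Model Catalog entries from step 2, captured as plain values.
# Field names are illustrative (they mirror the UI labels, not an API schema).
ollama_provider = {
    "local_or_privately_hosted": True,   # the "Local/Privately hosted provider" toggle
    "provider_type": "ollama",           # provider type selected in the catalog
    "custom_host": "https://your-ollama.ngrok-free.app",  # base URL only, no /v1 suffix
    "name": "my-ollama",                 # slug you will reference from your code
}
```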
See the Complete Setup Guide for all setup options.
### 3. Use in Your Application

**Important:** For the Ollama integration, pass only the base URL to `customHost`, without a version identifier such as `/v1`. Portkey handles the rest.

## Supported Models
Ollama supports a wide range of models, including:

- Llama 3, Llama 3.1, Llama 3.2
- Mistral, Mixtral
- Gemma, Gemma 2
- Phi-3
- Qwen 2
- and many more
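The `customHost` rule from step 3 (base URL only, no `/v1`) is easy to enforce in code, and the request body Portkey forwards is the standard OpenAI-compatible chat shape. A minimal sketch, where the model name `llama3` and the ngrok URL are just the example values used above:

```python
# Enforce the customHost rule from step 3 and build an
# OpenAI-compatible chat payload for a model served via Ollama.
import json


def normalize_custom_host(url: str) -> str:
    """Strip a trailing slash and a trailing /v1 from a base URL."""
    url = url.rstrip("/")
    if url.endswith("/v1"):
        url = url[: -len("/v1")]
    return url


def chat_payload(model: str, prompt: str) -> str:
    """Return the JSON body for a chat.completions-style request."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })


custom_host = normalize_custom_host("https://your-ollama.ngrok-free.app/v1/")
body = chat_payload("llama3", "Say hello")
```

Sending `body` through the Gateway with `custom_host` configured (via the SDK or request headers) is then a standard chat-completions call.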
## Next Steps

- **Gateway Configs**: add retries, timeouts, and fallbacks
- **Observability**: monitor your Ollama requests
- **Custom Host Guide**: learn more about custom host setup
- **BYOLLM Guide**: complete guide for private LLMs
- **SDK Reference**: complete Portkey SDK documentation

