## Integration Steps
### 1. Expose Your LocalAI Server
Ensure your LocalAI API is externally accessible. If it is running on `http://localhost`, use a tool like ngrok to create a public URL.
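Before exposing the server, it helps to confirm the local API answers. A minimal sketch, assuming LocalAI's default port 8080 and its OpenAI-compatible `/v1/models` endpoint (`models_url` and `list_local_models` are hypothetical helper names, not part of any SDK):

```python
import json
import urllib.request

# Build the models-listing URL from a base URL (hypothetical helper).
def models_url(base_url: str) -> str:
    return base_url.rstrip("/") + "/v1/models"

# Query the OpenAI-compatible /v1/models endpoint and return the model ids.
def list_local_models(base_url: str = "http://localhost:8080") -> list[str]:
    with urllib.request.urlopen(models_url(base_url), timeout=5) as resp:
        body = json.load(resp)
    return [m["id"] for m in body.get("data", [])]

# Once this succeeds locally, expose the server publicly, e.g.:
#   ngrok http 8080
```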
### 2. Add to Model Catalog
- Go to Model Catalog → Add Provider
- Enable the “Local/Privately hosted provider” toggle
- Select OpenAI as the provider type (LocalAI follows OpenAI API schema)
- Enter your LocalAI URL with `/v1` in **Custom Host**: `https://your-localai.ngrok-free.app/v1`
- Name your provider (e.g., `my-localai`)
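Since the Custom Host value must end in `/v1`, a quick way to normalize any base URL before pasting it in (a hypothetical helper, not part of the Portkey SDK):

```python
# Hypothetical helper that turns a LocalAI base URL into the value the
# Custom Host field expects, ensuring the /v1 suffix is present exactly once.
def to_custom_host(base_url: str) -> str:
    base = base_url.rstrip("/")
    return base if base.endswith("/v1") else base + "/v1"

print(to_custom_host("https://your-localai.ngrok-free.app"))
# → https://your-localai.ngrok-free.app/v1
```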
See the complete setup guide for all setup options.
### 3. Use in Your Application
**Important:**
- Don’t forget to include the version identifier (`/v1`) in the Custom Host URL
- Portkey supports all endpoints that adhere to the OpenAI specification
## Next Steps

- **Gateway Configs**: Add retries, timeouts, and fallbacks
- **Observability**: Monitor your LocalAI requests
- **Custom Host Guide**: Learn more about custom host setup
- **BYOLLM Guide**: Complete guide for private LLMs
- **SDK Reference**: Complete Portkey SDK documentation

