Portkey provides a robust gateway for integrating your locally hosted models served through LocalAI.

Integration Steps

1. Expose your LocalAI Server

Ensure your LocalAI API is externally accessible. If it is running on http://localhost, use a tool like ngrok to create a public URL.
ngrok http 8080
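Before (and after) exposing the server, you can confirm LocalAI is responding by hitting its OpenAI-compatible model-listing endpoint. A quick sketch, assuming LocalAI listens on port 8080 (its default) and using the example ngrok hostname from this guide:

```shell
# List the models LocalAI is serving locally (OpenAI-compatible endpoint)
curl http://localhost:8080/v1/models

# Once ngrok is running, the same request should succeed through the public URL
curl https://your-localai.ngrok-free.app/v1/models
```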
2. Add to Model Catalog

  1. Go to Model Catalog → Add Provider
  2. Enable “Local/Privately hosted provider” toggle
  3. Select OpenAI as the provider type (LocalAI follows OpenAI API schema)
  4. Enter your LocalAI URL with /v1 in Custom Host: https://your-localai.ngrok-free.app/v1
  5. Name your provider (e.g., my-localai)

See the Complete Setup Guide for all setup options.
3. Use in Your Application

from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # your Portkey API key
    provider="@my-localai"      # the provider slug you created, prefixed with @
)

response = portkey.chat.completions.create(
    model="ggml-koala-7b-model-q4_0-r2.bin",  # a model available on your LocalAI server
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)
Or pass the custom host directly:
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="openai",  # LocalAI follows the OpenAI API schema
    custom_host="https://your-localai.ngrok-free.app/v1"  # include the /v1 suffix
)
Important:
  • Don’t forget to include the version identifier (/v1) in the Custom Host URL
  • Portkey supports all endpoints that adhere to the OpenAI specification
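Because a missing /v1 suffix is an easy mistake, you may want to normalize the base URL before passing it as custom_host. A minimal sketch; this helper is hypothetical and not part of the Portkey SDK:

```python
def normalize_custom_host(url: str) -> str:
    """Ensure a Custom Host URL ends with the /v1 version identifier.

    Illustrative helper only; not part of the Portkey SDK.
    """
    url = url.rstrip("/")          # drop any trailing slash
    if not url.endswith("/v1"):    # append the version identifier if missing
        url += "/v1"
    return url

# Both forms normalize to the same URL
print(normalize_custom_host("https://your-localai.ngrok-free.app"))
print(normalize_custom_host("https://your-localai.ngrok-free.app/v1/"))
```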

Next Steps

For complete SDK documentation, see the SDK Reference.