Traceloop's OpenLLMetry provides automatic LLM observability without code changes. Combined with Portkey, you get comprehensive tracing plus gateway features like caching, fallbacks, and load balancing.

Why Traceloop + Portkey?

  - Non-Intrusive Monitoring: automatic instrumentation without changing application code
  - OpenTelemetry Native: built on industry-standard OpenTelemetry for maximum compatibility
  - Flexible Export: send traces to Portkey or any OpenTelemetry-compatible backend
  - Gateway Intelligence: Portkey adds caching, fallbacks, and load balancing to every request (see the config sketch below)
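
Gateway features are attached per request through a Portkey config. Here is a minimal sketch that enables simple caching plus fallback routing between two providers by passing the config in the x-portkey-config header; treat the exact config shape as an assumption to verify against Portkey's config docs, and note that the @anthropic-prod slug is a hypothetical second provider from your Model Catalog.

import json
from openai import OpenAI

# Sketch: a Portkey config enabling simple caching plus fallback routing.
# The "@anthropic-prod" slug is hypothetical; use a slug from your Model Catalog.
portkey_config = {
    "cache": {"mode": "simple"},
    "strategy": {"mode": "fallback"},
    "targets": [
        {"override_params": {"model": "@openai-prod/gpt-4.1"}},
        {"override_params": {"model": "@anthropic-prod/claude-sonnet-4"}},
    ],
}

client = OpenAI(
    api_key="YOUR_PORTKEY_API_KEY",
    base_url="https://api.portkey.ai/v1",
    default_headers={"x-portkey-config": json.dumps(portkey_config)},
)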

Quick Start

pip install openai traceloop-sdk

from traceloop.sdk import Traceloop
from openai import OpenAI

# Send traces to Portkey
Traceloop.init(
    disable_batch=True,                                  # send spans immediately instead of batching
    api_endpoint="https://api.portkey.ai/v1/logs/otel",  # Portkey's OTLP endpoint
    headers="x-portkey-api-key=YOUR_PORTKEY_API_KEY",    # authenticate with your Portkey API key
    telemetry_enabled=False                              # disable Traceloop's own anonymous telemetry
)

# Use Portkey gateway
client = OpenAI(
    api_key="YOUR_PORTKEY_API_KEY",
    base_url="https://api.portkey.ai/v1"
)

response = client.chat.completions.create(
    model="@openai-prod/gpt-4.1",  # Provider slug from Model Catalog
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)
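
Traceloop can also group several calls into one named trace. A small sketch using the SDK's @workflow decorator (from traceloop.sdk.decorators), reusing the client configured above; the greet_user function and its prompt are illustrative:

from traceloop.sdk.decorators import workflow

@workflow(name="greet_user")  # all spans created inside appear under this trace
def greet_user(name: str) -> str:
    response = client.chat.completions.create(
        model="@openai-prod/gpt-4.1",
        messages=[{"role": "user", "content": f"Greet {name} in one line."}],
    )
    return response.choices[0].message.content

print(greet_user("Ada"))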

Setup

  1. Add a provider in the Model Catalog to get a provider slug (e.g., @openai-prod)
  2. Create a Portkey API key in the Portkey dashboard
  3. Reference models as model="@provider-slug/model-name" in your requests
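
If you prefer keeping the endpoint and key out of code, the same initialization can be driven by environment variables. A minimal sketch, assuming the TRACELOOP_BASE_URL, TRACELOOP_HEADERS, and TRACELOOP_TELEMETRY variables documented by the Traceloop SDK (verify the names against your SDK version):

import os

# Assumed env-var names from the Traceloop SDK docs; set them before init.
os.environ["TRACELOOP_BASE_URL"] = "https://api.portkey.ai/v1/logs/otel"
os.environ["TRACELOOP_HEADERS"] = "x-portkey-api-key=YOUR_PORTKEY_API_KEY"
os.environ["TRACELOOP_TELEMETRY"] = "false"  # opt out of Traceloop's own telemetry

from traceloop.sdk import Traceloop

Traceloop.init(disable_batch=True)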

See Your Traces in Action

Once configured, view your Traceloop instrumentation combined with Portkey gateway intelligence in the Portkey dashboard:
[Screenshot: OpenTelemetry traces in Portkey]