Quick Start

Get started with Upstage AI in under 2 minutes:
# 1. Install: pip install portkey-ai
# 2. Add @upstage provider in model catalog
# 3. Use it:

from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

response = portkey.chat.completions.create(
    model="@upstage/solar-pro",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)

Add Provider in Model Catalog

Before making requests, add Upstage to your Model Catalog:
  1. Go to Model Catalog → Add Provider
  2. Select Upstage
  3. Enter your Upstage API key
  4. Name your provider (e.g., upstage); this name is what you reference in code, as shown below
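
Once saved, the provider can be referenced in either of two ways: by prefixing the model name with the provider slug, or by setting provider on the client and using the bare model name. A minimal sketch, assuming the provider was named upstage as in step 4:
from portkey_ai import Portkey

# Option A: prefix the model with the provider slug
portkey = Portkey(api_key="PORTKEY_API_KEY")
response = portkey.chat.completions.create(
    model="@upstage/solar-pro",
    messages=[{"role": "user", "content": "Hello!"}]
)

# Option B: set the provider on the client and use the bare model name
portkey = Portkey(api_key="PORTKEY_API_KEY", provider="@upstage")
response = portkey.chat.completions.create(
    model="solar-pro",
    messages=[{"role": "user", "content": "Hello!"}]
)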

Complete Setup Guide

See all setup options and detailed configuration instructions

Upstage Documentation

Explore the official Upstage documentation

Upstage Capabilities

Streaming

Stream responses for real-time output:
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY", provider="@upstage")

stream = portkey.chat.completions.create(
    model="solar-pro",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True
)

for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="", flush=True)

Function Calling

Use Upstage’s function calling capabilities:
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY", provider="@upstage")

tools = [{
    "type": "function",
    "function": {
        "name": "getWeather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and state"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
            },
            "required": ["location"]
        }
    }
}]

response = portkey.chat.completions.create(
    model="solar-pro",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What's the weather like in Delhi - respond in JSON"}
    ],
    tools=tools,
    tool_choice="auto"
)

print(response.choices[0].message)
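
If the model decides to use the tool, the response carries tool_calls instead of plain text, and you send the tool's output back in a follow-up request. A minimal sketch of that round trip, continuing from the example above; the hard-coded weather result is a hypothetical stand-in for a real lookup:
import json

message = response.choices[0].message

if message.tool_calls:
    call = message.tool_calls[0]
    args = json.loads(call.function.arguments)

    # Hypothetical stand-in for a real weather lookup
    weather = {"location": args["location"], "temperature": 31, "unit": args.get("unit", "celsius")}

    follow_up = portkey.chat.completions.create(
        model="solar-pro",
        messages=[
            {"role": "user", "content": "What's the weather like in Delhi - respond in JSON"},
            message,  # the assistant turn that requested the tool call
            {"role": "tool", "tool_call_id": call.id, "content": json.dumps(weather)}
        ],
        tools=tools
    )
    print(follow_up.choices[0].message.content)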

Embeddings

Generate embeddings for text:
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY", provider="@upstage")

response = portkey.embeddings.create(
    input="Your text string goes here",
    model="embedding-query"
)

print(response.data[0].embedding)
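
The two embedding models listed under Supported Models below are typically paired for retrieval: embedding-passage for documents and embedding-query for search queries. A minimal sketch that ranks documents against a query with cosine similarity, assuming the endpoint accepts a list input as with OpenAI-compatible embeddings (the sample texts are illustrative):
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY", provider="@upstage")

docs = ["Portkey routes requests to Upstage.", "Solar Pro is a chat model."]

# Embed documents with embedding-passage and the query with embedding-query
doc_vectors = [d.embedding for d in portkey.embeddings.create(input=docs, model="embedding-passage").data]
query_vector = portkey.embeddings.create(input="Which model handles chat?", model="embedding-query").data[0].embedding

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b)

# Rank documents by similarity to the query
for doc, vec in sorted(zip(docs, doc_vectors), key=lambda p: -cosine(query_vector, p[1])):
    print(f"{cosine(query_vector, vec):.3f}  {doc}")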

Supported Models

Chat Models:
  • solar-pro
  • solar-mini
  • solar-mini-ja
Embedding Models:
  • embedding-passage
  • embedding-query

Supported Endpoints and Parameters

Each endpoint supports the following parameters:
  • /chat/completions: messages, max_tokens, temperature, top_p, stream, presence_penalty, frequency_penalty, tools, tool_choice
  • /embeddings: model, input, encoding_format, dimensions, user
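
A minimal sketch combining several of the /chat/completions parameters above in one request (the values are illustrative, not recommendations):
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY", provider="@upstage")

response = portkey.chat.completions.create(
    model="solar-pro",
    messages=[{"role": "user", "content": "Summarize Portkey in one sentence."}],
    max_tokens=100,        # cap the response length
    temperature=0.7,       # sampling temperature
    top_p=0.9,             # nucleus sampling
    presence_penalty=0.1,  # discourage repeating topics
    frequency_penalty=0.2  # discourage repeating tokens
)

print(response.choices[0].message.content)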

Next Steps

For complete SDK documentation:

SDK Reference

Complete Portkey SDK documentation