Portkey provides a robust and secure gateway for integrating various Large Language Models (LLMs) into applications, including DeepSeek's models. With Portkey, you can take advantage of features such as a fast AI gateway, observability, prompt management, and more, while managing your DeepSeek API keys securely through the Model Catalog.

Quick Start

Get DeepSeek working in 3 steps:
from portkey_ai import Portkey

# 1. Install: pip install portkey-ai
# 2. Add @deepseek provider in model catalog
# 3. Use it:

portkey = Portkey(api_key="PORTKEY_API_KEY")

response = portkey.chat.completions.create(
    model="@deepseek/deepseek-chat",
    messages=[{"role": "user", "content": "Say this is a test"}]
)

print(response.choices[0].message.content)
Tip: You can also set provider="@deepseek" in Portkey() and use just model="deepseek-chat" in the request.

Add Provider in Model Catalog

  1. Go to Model Catalog → Add Provider
  2. Select DeepSeek
  3. Choose existing credentials or create new by entering your DeepSeek API key
  4. Name your provider (e.g., deepseek-prod)

Complete Setup Guide →

See all setup options, code examples, and detailed instructions

Advanced Features

Multi-round Conversations

DeepSeek supports multi-turn conversations where context is maintained across messages:
from portkey_ai import Portkey

client = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="@deepseek"
)

# Round 1
messages = [{"role": "user", "content": "What's the highest mountain in the world?"}]
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=messages
)

messages.append(response.choices[0].message)
print(f"Messages Round 1: {messages}")

# Round 2
messages.append({"role": "user", "content": "What is the second?"})
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=messages
)

messages.append(response.choices[0].message)
print(f"Messages Round 2: {messages}")

JSON Output

Force structured JSON responses from DeepSeek models:
import json
from portkey_ai import Portkey

client = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="@deepseek"
)

system_prompt = """
The user will provide some exam text. Please parse the "question" and "answer" and output them in JSON format.

EXAMPLE INPUT:
Which is the highest mountain in the world? Mount Everest.

EXAMPLE JSON OUTPUT:
{
    "question": "Which is the highest mountain in the world?",
    "answer": "Mount Everest"
}
"""

user_prompt = "Which is the longest river in the world? The Nile River."

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": user_prompt}
]

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=messages,
    response_format={"type": "json_object"}
)

print(json.loads(response.choices[0].message.content))

Managing DeepSeek Prompts

Manage all your DeepSeek prompt templates in the Prompt Library. All current DeepSeek models are supported, and you can easily test different prompts. Use the portkey.prompts.completions.create interface to call a saved prompt from your application.

Supported Endpoints

  • Chat Completions
  • Streaming Chat Completions

Next Steps

For complete SDK documentation:

SDK Reference

Complete Portkey SDK documentation