Integrate DeepSeek models with Portkey’s AI Gateway
Portkey provides a robust and secure gateway for integrating various Large Language Models (LLMs) into applications, including DeepSeek's models. With Portkey, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, while securely managing API keys through the Model Catalog.
```python
# 1. Install: pip install portkey-ai
# 2. Add a @deepseek provider in the Model Catalog
# 3. Use it:
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

response = portkey.chat.completions.create(
    model="@deepseek/deepseek-chat",
    messages=[{"role": "user", "content": "Say this is a test"}],
)
print(response.choices[0].message.content)
```
Tip: You can also set provider="@deepseek" in Portkey() and use just model="deepseek-chat" in the request.
Force structured JSON responses from DeepSeek models:
```python
import json

from portkey_ai import Portkey

client = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="@deepseek",
)

system_prompt = """
The user will provide some exam text. Please parse the "question" and "answer" and output them in JSON format.

EXAMPLE INPUT:
Which is the highest mountain in the world? Mount Everest.

EXAMPLE JSON OUTPUT:
{
    "question": "Which is the highest mountain in the world?",
    "answer": "Mount Everest"
}
"""

user_prompt = "Which is the longest river in the world? The Nile River."

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": user_prompt},
]

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=messages,
    response_format={"type": "json_object"},
)

print(json.loads(response.choices[0].message.content))
```
Manage all prompt templates for DeepSeek in the Prompt Library. All current DeepSeek models are supported, and you can easily test different prompts. Use the portkey.prompts.completions.create interface to call a saved prompt from your application.
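A minimal sketch of calling a saved prompt, assuming a hypothetical prompt ID and variable name (the actual ID and variables are shown for your prompt in the Prompt Library; the PORTKEY_API_KEY environment variable is this example's convention for supplying the key):

```python
import os

# Hypothetical placeholders: replace PROMPT_ID with the ID of your saved
# prompt, and use the variable names your template actually defines.
PROMPT_ID = "YOUR_PROMPT_ID"
variables = {"user_input": "Say this is a test"}

# Only call the gateway when an API key is configured in the environment.
if os.environ.get("PORTKEY_API_KEY"):
    from portkey_ai import Portkey

    portkey = Portkey(api_key=os.environ["PORTKEY_API_KEY"])
    response = portkey.prompts.completions.create(
        prompt_id=PROMPT_ID,
        variables=variables,
    )
    print(response.choices[0].message.content)
```

The template's variables are filled in server-side, so the application only passes the prompt ID and the variable values rather than the full message list.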