Quick Start
Get started with LemonFox AI in under 2 minutes.

Add Provider in Model Catalog
Before making requests, add LemonFox AI to your Model Catalog:
- Go to Model Catalog → Add Provider
- Select LemonFox AI
- Enter your LemonFox API key
- Name your provider (e.g., lemonfox-ai)
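Once the provider is saved, chat requests are routed through the gateway with your Portkey key plus the provider name you chose above. Below is a minimal sketch using only the Python standard library; the gateway URL, header names, and the `llama-3.1-8b` model slug are assumptions here — copy the exact values from the code snippets in your Portkey dashboard:

```python
import json
import urllib.request

# Assumed values -- substitute the ones shown in your Portkey dashboard.
GATEWAY_URL = "https://api.portkey.ai/v1/chat/completions"
HEADERS = {
    "Content-Type": "application/json",
    "x-portkey-api-key": "YOUR_PORTKEY_API_KEY",  # your Portkey key
    "x-portkey-provider": "@lemonfox-ai",         # provider name from the Model Catalog
}

# Standard OpenAI-style chat payload; the model slug is a placeholder.
payload = {
    "model": "llama-3.1-8b",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "max_tokens": 64,
    "temperature": 0.7,
}

def send(body: dict) -> dict:
    """POST the payload through the gateway and return the parsed JSON reply."""
    req = urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(body).encode("utf-8"),
        headers=HEADERS,
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# reply = send(payload)
# print(reply["choices"][0]["message"]["content"])
```

The same headers and base URL are reused for the image and audio endpoints shown later; only the path and payload change.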
Complete Setup Guide
See all setup options and detailed configuration instructions
LemonFox AI Documentation
Explore the official LemonFox AI documentation
LemonFox Capabilities
Streaming
Stream responses for real-time output:
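Setting `"stream": true` in the chat payload makes the gateway return server-sent events, one JSON chunk per `data:` line, following the OpenAI streaming convention. A small parser sketch — the chunk shape assumed here is that convention, so verify it against an actual response:

```python
import json

def iter_stream_chunks(lines):
    """Yield content deltas from SSE lines ("data: {...}").

    `lines` is any iterable of decoded text lines, e.g. the body of a
    /chat/completions request sent with "stream": true.
    """
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue                      # skip blank/keep-alive lines
        data = line[len("data:"):].strip()
        if data == "[DONE]":              # sentinel that ends the stream
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Canned SSE lines in the assumed chunk format:
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
print("".join(iter_stream_chunks(sample)))  # prints "Hello"
```

Printing each delta as it arrives, instead of joining at the end, is what gives the real-time output.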
Image Generation
Generate images with Stable Diffusion XL:
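An illustrative payload for `/images/generations`, built from the parameters listed in the table under Supported Endpoints and Parameters; the prompt text and values are examples only, and the `"url"`/`"b64_json"` options follow the OpenAI convention rather than anything confirmed here. Routing and headers work as in the chat example:

```python
# Illustrative /images/generations payload; all values are examples.
image_payload = {
    "prompt": "a lighthouse at dusk, oil painting",
    "negative_prompt": "blurry, low quality",  # what the model should avoid
    "size": "1024x1024",                       # width x height in pixels
    "n": 1,                                    # number of images to generate
    "response_format": "url",                  # assumed: "url" or "b64_json"
}
```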
Speech-to-Text
Transcribe audio with Whisper:
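Unlike the JSON endpoints above, `/audio/transcriptions` takes `multipart/form-data`. A standard-library sketch of how such a body can be assembled — the field names come from the parameter table later on this page, and the endpoint's exact expectations should be checked against the official docs:

```python
import io
import uuid

def build_multipart(fields: dict, file_name: str, file_bytes: bytes):
    """Build a multipart/form-data body for /audio/transcriptions.

    `fields` carries the string parameters (language, prompt,
    response_format, translate); the audio itself goes in the
    `file` part. Returns (boundary, body).
    """
    boundary = uuid.uuid4().hex
    buf = io.BytesIO()
    for name, value in fields.items():
        buf.write(
            (f'--{boundary}\r\n'
             f'Content-Disposition: form-data; name="{name}"\r\n\r\n'
             f'{value}\r\n').encode()
        )
    buf.write(
        (f'--{boundary}\r\n'
         f'Content-Disposition: form-data; name="file"; filename="{file_name}"\r\n'
         f'Content-Type: application/octet-stream\r\n\r\n').encode()
    )
    buf.write(file_bytes)
    buf.write(f'\r\n--{boundary}--\r\n'.encode())
    return boundary, buf.getvalue()

boundary, body = build_multipart(
    {"language": "en", "response_format": "json"},
    "meeting.mp3",
    b"...audio bytes...",  # placeholder; read your real file in binary mode
)
# POST `body` with header:
#   Content-Type: multipart/form-data; boundary=<boundary>
```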
Supported Models
Chat Models:
- Mixtral
- Llama 3.1 8B
- Llama 3.1 70B

Audio Models:
- Whisper large-v3

Image Models:
- Stable Diffusion XL (SDXL)
Supported Endpoints and Parameters
| Endpoint | Supported Parameters |
|---|---|
| /chat/completions | messages, max_tokens, temperature, top_p, stream, presence_penalty, frequency_penalty |
| /images/generations | prompt, response_format, negative_prompt, size, n |
| /audio/transcriptions | file, language, prompt, response_format, translate |
Next Steps
Gateway Configs
Add fallbacks, load balancing, and more
Observability
Monitor and trace your LemonFox requests
Prompt Library
Manage and version your prompts
Metadata
Add custom metadata to requests
SDK Reference
Complete Portkey SDK documentation

