Quick Start
Get started with Inference.net in under 2 minutes:

Add Provider in Model Catalog

Before making requests, add Inference.net to your Model Catalog:

- Go to Model Catalog → Add Provider
- Select Inference.net
- Enter your Inference.net API key
- Name your provider (e.g., inference-net)
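Once the provider is added, requests route through the Portkey gateway. As a minimal sketch, the snippet below assembles one chat-completion request against Portkey's REST API using only the standard library; the provider name (inference-net) and the model slug are placeholders — substitute the values from your own Model Catalog.

```python
# Sketch: one chat-completion request through the Portkey gateway to the
# Inference.net provider added above. The provider name ("inference-net")
# and model slug are placeholders from the steps in this guide.
import json
import urllib.request

PORTKEY_URL = "https://api.portkey.ai/v1/chat/completions"

def build_request(prompt: str, portkey_api_key: str) -> urllib.request.Request:
    """Assemble the gateway request; the virtual-key header routes the
    call to the Inference.net provider from your Model Catalog."""
    body = json.dumps({
        "model": "meta-llama/llama-3-8b-instruct",  # assumed model slug
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        PORTKEY_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "x-portkey-api-key": portkey_api_key,       # your Portkey API key
            "x-portkey-virtual-key": "inference-net",   # provider name chosen above
        },
        method="POST",
    )

# To send it (requires a real Portkey API key):
# with urllib.request.urlopen(build_request("Hello!", "YOUR_PORTKEY_API_KEY")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same call can be made with the Portkey SDK instead of raw HTTP; see the SDK Reference below.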
Complete Setup Guide
See all setup options and detailed configuration instructions
Supported Models
Inference.net provides distributed GPU compute for a range of open-source models, including:

- Llama 3
- Mistral
- And other popular open-source models
Next Steps
Gateway Configs
Add fallbacks, load balancing, and more
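As one hedged example of what a gateway config can do, the fragment below follows Portkey's config schema for a fallback strategy: if the Inference.net provider fails, the request retries against a second provider. Both provider names here are placeholders for entries in your own Model Catalog.

```json
{
  "strategy": { "mode": "fallback" },
  "targets": [
    { "virtual_key": "inference-net" },
    { "virtual_key": "backup-provider" }
  ]
}
```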
Observability
Monitor and trace your Inference.net requests
Prompt Library
Manage and version your prompts
Metadata
Add custom metadata to requests
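Portkey accepts request metadata as a JSON string in the x-portkey-metadata header. A small sketch — the metadata keys here are illustrative, not required:

```python
import json

def metadata_header(meta: dict) -> dict:
    """Serialize custom metadata into the header Portkey reads it from.
    Merge the result into the headers of any gateway request."""
    return {"x-portkey-metadata": json.dumps(meta)}

headers = metadata_header({"_user": "user-123", "environment": "staging"})
```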
SDK Reference
Complete Portkey SDK documentation

