LLMG Documentation
LLMG is a high-performance LLM gateway written in Rust. It provides a single, unified OpenAI-compatible API that routes requests to 70+ LLM providers — including OpenAI, Anthropic, Azure, Google, AWS Bedrock, Groq, DeepSeek, Mistral, and many more.
Use it as a Rust library in your applications or deploy it as an HTTP gateway in front of your LLM infrastructure.
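Because the gateway speaks the OpenAI chat-completions wire format, a request to it looks like any OpenAI request. The sketch below builds such a request with the Python standard library; the gateway URL, port, and model name are illustrative assumptions, not LLMG defaults — see the Quick Start and Gateway Configuration pages for your actual values.

```python
import json
from urllib import request

# Hypothetical gateway address and model name -- adjust for your deployment.
GATEWAY_URL = "http://localhost:3000/v1/chat/completions"

payload = {
    "model": "gpt-4o",  # the gateway routes this to the matching provider
    "messages": [{"role": "user", "content": "Say hello"}],
}
body = json.dumps(payload).encode("utf-8")

# Build the POST request; actually sending it requires a running gateway.
req = request.Request(
    GATEWAY_URL,
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.get_method(), req.full_url)

# To send, uncomment once the gateway is up:
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the body is standard OpenAI JSON, existing OpenAI SDKs can also be pointed at the gateway by overriding their base URL.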
Get Started
- Introduction — What LLMG is and why it exists
- Quick Start — Send your first request in under a minute
- Installation — Library and gateway setup
Learn More
- Gateway Configuration — Env vars, config files, and model aliases
- Docker Deployment — Run the gateway with Docker or Docker Compose
- Provider Overview — Browse all 70+ supported providers
- Library Usage — Use LLMG as a Rust crate directly