# Groq

## Configuration

| Variable | Required |
|---|---|
| `GROQ_API_KEY` | Yes |
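A typical setup exports the key before starting the gateway or running library code; the key value below is a placeholder, not a real credential:

```sh
# Export the Groq API key so the gateway/library can read it from the
# environment. "gsk_your_key_here" is a placeholder; use the key from
# your Groq console.
export GROQ_API_KEY="gsk_your_key_here"
```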
## Gateway

```sh
curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "groq/llama3-70b-8192", "messages": [{"role": "user", "content": "Hello!"}]}'
```

## Library

```rust
use llmg_providers::groq::GroqClient;
use llmg_core::provider::Provider;

let client = GroqClient::from_env()?;
// or
let client = GroqClient::new("gsk_...");
```

## Features

- Chat completions (OpenAI-compatible)
- Embeddings
- Ultra-low latency inference
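Because chat completions are OpenAI-compatible, gateway responses follow the standard chat-completions shape. A minimal sketch of extracting the assistant reply with `jq` (the sample body below is illustrative, not a captured Groq response):

```sh
# Illustrative OpenAI-compatible response body (not a real Groq reply).
response='{"choices":[{"message":{"role":"assistant","content":"Hi there!"}}]}'

# Pull out the assistant's message content.
printf '%s' "$response" | jq -r '.choices[0].message.content'
# → Hi there!
```

In practice you would pipe the curl command from the Gateway section into the same `jq` filter.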