
Groq

| Variable | Required |
| --- | --- |
| GROQ_API_KEY | Yes |
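A minimal shell sketch for setting the required variable before starting the gateway (the key value here is a placeholder, not a real key):

```shell
# Export the Groq API key so the gateway (and GroqClient::from_env)
# can pick it up from the environment.
export GROQ_API_KEY="gsk_..."
```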
```sh
curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "groq/llama3-70b-8192", "messages": [{"role": "user", "content": "Hello!"}]}'
```
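The request body sent by curl can also be assembled programmatically. A minimal Rust sketch that builds the same OpenAI-compatible JSON payload; the `chat_request_body` helper is hypothetical for illustration, not part of llmg:

```rust
// Hypothetical helper: build the chat-completions request body the
// gateway expects at POST /v1/chat/completions.
fn chat_request_body(model: &str, user_message: &str) -> String {
    // Escape the JSON-special characters that can appear in a plain message.
    let escaped = user_message.replace('\\', "\\\\").replace('"', "\\\"");
    format!(
        r#"{{"model": "{model}", "messages": [{{"role": "user", "content": "{escaped}"}}]}}"#
    )
}

fn main() {
    // Same payload as the curl example above.
    let body = chat_request_body("groq/llama3-70b-8192", "Hello!");
    println!("{body}");
}
```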
```rust
use llmg_core::provider::Provider;
use llmg_providers::groq::GroqClient;

// Read the API key from the GROQ_API_KEY environment variable...
let client = GroqClient::from_env()?;
// ...or pass the key explicitly.
let client = GroqClient::new("gsk_...");
```
  • Chat completions (OpenAI-compatible)
  • Embeddings
  • Ultra-low latency inference