
Ollama

Variable           Default                  Required
OLLAMA_BASE_URL    http://localhost:11434   No

No API key is required — Ollama runs locally.

Example request:
curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "ollama/llama3", "messages": [{"role": "user", "content": "Hello!"}]}'
Or use the Rust client directly:

use llmg_core::provider::Provider;
use llmg_providers::ollama::OllamaClient;

// Reads OLLAMA_BASE_URL if set, falling back to http://localhost:11434.
let client = OllamaClient::from_env();

// Or set the base URL explicitly, e.g. for a remote instance:
let client = OllamaClient::new().with_base_url("http://my-server:11434");
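A rough usage sketch follows. The chat method name, its argument shape, and the async runtime are assumptions for illustration, not confirmed llmg API; check the llmg_core::provider::Provider trait for the real signatures.

use llmg_core::provider::Provider;
use llmg_providers::ollama::OllamaClient;

// Hypothetical sketch: `chat` and its signature are assumed names.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = OllamaClient::from_env();
    let reply = client.chat("llama3", "Hello!").await?; // assumed signature
    println!("{reply}");
    Ok(())
}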
Supported features:

  • Chat completions (auto-converts to Ollama format)
  • Embeddings (see the sketch after this list)
  • Custom base URL for remote Ollama instances
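For the Embeddings capability, a minimal sketch assuming an embed method; the method name, its parameters, and the return type are guesses, not confirmed llmg API, and nomic-embed-text is simply a commonly used Ollama embedding model.

use llmg_providers::ollama::OllamaClient;

// Hypothetical sketch: `embed` and its return type are assumed names.
async fn embed_example() -> Result<(), Box<dyn std::error::Error>> {
    let client = OllamaClient::from_env();
    let vector: Vec<f32> = client.embed("nomic-embed-text", "Hello!").await?;
    println!("dimensions: {}", vector.len());
    Ok(())
}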

Note: OllamaClient::from_env() returns Self directly (not Result) since no API key is required. This differs from cloud providers such as OpenAiClient, whose from_env() returns Result<Self, LlmError>.
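To make the contrast concrete, a minimal sketch; the llmg_providers::openai module path is assumed by analogy with the Ollama one, and the return types are as stated in the note above.

use llmg_providers::ollama::OllamaClient;
use llmg_providers::openai::OpenAiClient; // module path assumed by analogy

fn build_clients() -> Result<(), Box<dyn std::error::Error>> {
    let ollama = OllamaClient::from_env();  // infallible: returns Self
    let openai = OpenAiClient::from_env()?; // fallible: Result<Self, LlmError>
    let _ = (ollama, openai);
    Ok(())
}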