# llm
A multi-provider LLM client for Carp.
Supports Anthropic, OpenAI, Ollama, and Google Gemini behind a single common API.
Features chat, streaming, tool use, and structured output (JSON mode and JSON schema) across all providers.
## Quick start (Ollama, no API key)

```
(load "llm.carp")

(defn main []
  (let [config (LLM.ollama "http://localhost:11434")
        req (LLM.chat-request "llama3" [(Message.user "hello")] 256 0.7)]
    (match (LLM.chat &config &req)
      (Result.Success r) (println* (LLMResponse.content &r))
      (Result.Error e) (IO.errorln &(LLMError.str &e)))))
```
## Switching providers

```
(LLM.anthropic "sk-ant-...")
(LLM.openai "sk-...")
(LLM.ollama "http://localhost:11434")
(LLM.gemini "AIza...")
```
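Provider configs are interchangeable behind the common API, so the same request value can be sent through any of them. A minimal sketch, assuming `config` and `req` are built as in the quick start above:

```
;; Swap only the config; the request is provider-agnostic.
(let [config (LLM.anthropic "sk-ant-...")]
  (LLM.chat &config &req))
```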
## Streaming

```
(match (LLM.chat-stream &config &req)
  (Result.Success stream)
    (let-do [done false]
      (while (not done)
        (match (LlmStream.poll &stream)
          (Maybe.Nothing) (set! done true)   ;; stream exhausted
          (Maybe.Just tok) (IO.print &tok)))
      (LlmStream.close stream))
  (Result.Error e) (IO.errorln &(LLMError.str &e)))
```
## Tool use

```
;; msgs is a previously built array of messages, e.g. [(Message.user "weather in Oslo?")]
(let [schema (JSON.obj [(JSON.entry @"type" (JSON.Str @"object"))])
      tools [(ToolDef.init @"get_weather" @"Get weather" schema)]
      req (LLM.chat-request-with-tools "gpt-4" msgs 256 0.7 tools)]
  (LLM.chat &config &req))
```
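When the model chooses to call a tool, the call arrives on the response. The sketch below is hypothetical: `LLMResponse.tool-calls`, `ToolCall.name`, and `ToolCall.arguments` are assumed accessor names and may differ from the library's actual API.

```
;; Hypothetical sketch; tool-calls, name, and arguments are assumed names.
(match (LLM.chat &config &req)
  (Result.Success r)
    (foreach [call (LLMResponse.tool-calls &r)]
      (println* (ToolCall.name call) ": " (ToolCall.arguments call)))
  (Result.Error e) (IO.errorln &(LLMError.str &e)))
```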
## JSON output

```
;; Any valid JSON
(LLM.chat-request-json model msgs max-tokens temp)

;; Schema-constrained
(LLM.chat-request-with-schema model msgs max-tokens temp schema)
```
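A fuller sketch of schema-constrained output, built with the same `JSON.obj`/`JSON.entry` helpers shown under tool use; it assumes `config` and `msgs` are defined as in the earlier examples:

```
;; Sketch: constrain output to an object with a single "city" string field.
(let [schema (JSON.obj [(JSON.entry @"type" (JSON.Str @"object"))
                        (JSON.entry @"properties"
                          (JSON.obj [(JSON.entry @"city"
                            (JSON.obj [(JSON.entry @"type" (JSON.Str @"string"))]))]))])
      req (LLM.chat-request-with-schema "gpt-4" msgs 256 0.7 schema)]
  (LLM.chat &config &req))
```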
## Requirements
Requires OpenSSL for HTTPS providers (Anthropic, OpenAI, Gemini). Ollama over plain HTTP works without OpenSSL.
## Provider quirks
- Anthropic: no native JSON mode. `chat-request-json` and `chat-request-with-schema` fall back to a system-prompt instruction; best-effort, not guaranteed.
- Gemini: uses the `/v1beta` endpoint for tool support. Streaming uses `:streamGenerateContent?alt=sse`. No call IDs on tool calls.
- Ollama: runs over plain HTTP. Uses NDJSON for streaming (one JSON object per line) instead of SSE.