LLMRequest
a chat request to an LLM provider.
Build with LLM.chat-request, LLM.chat-request-with-tools,
LLM.chat-request-json, or LLM.chat-request-with-schema rather than
constructing directly.
model: provider-specific model identifier
messages: the conversation history
max-tokens: maximum output tokens
temperature: sampling temperature (0.0 to 2.0)
tools: tool definitions (empty array if none)
format-mode: format-text, format-json, or format-json-schema
format-schema: JSON schema for structured output (used when mode is format-json-schema)
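A hedged sketch of direct construction with `init`, mainly to show what each positional argument means; in real code prefer `LLM.chat-request` and friends as noted above. `Message.user` and `JSON.null` are assumed helpers not confirmed by this document, and `0` stands in for the Int value of the format-text mode, which this document does not give.

```carp
;; Sketch only -- prefer LLM.chat-request and friends.
;; `Message.user` and `JSON.null` are assumed helpers, not part of this doc;
;; 0 stands in for the format-text mode's Int value.
(def req
  (LLMRequest.init @"gpt-4o"                  ; model
                   [(Message.user @"Hello!")] ; messages
                   1024                       ; max-tokens
                   0.7                        ; temperature
                   []                         ; tools (none)
                   0                          ; format-mode: format-text
                   (JSON.null)))              ; format-schema (unused in text mode)
```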
delete
(Fn [LLMRequest] ())
deletes an LLMRequest. Should usually not be called manually.
format-mode
(Fn [(Ref LLMRequest a)] (Ref Int a))
gets the format-mode property of an LLMRequest.
format-schema
(Fn [(Ref LLMRequest a)] (Ref JSON a))
gets the format-schema property of an LLMRequest.
init
(Fn [String, (Array Message), Int, Double, (Array ToolDef), Int, JSON] LLMRequest)
creates an LLMRequest.
max-tokens
(Fn [(Ref LLMRequest a)] (Ref Int a))
gets the max-tokens property of an LLMRequest.
messages
(Fn [(Ref LLMRequest a)] (Ref (Array Message) a))
gets the messages property of an LLMRequest.
set-format-mode
(Fn [LLMRequest, Int] LLMRequest)
sets the format-mode property of an LLMRequest.
set-format-mode!
(Fn [(Ref LLMRequest a), Int] ())
sets the format-mode property of an LLMRequest in place.
set-format-schema
(Fn [LLMRequest, JSON] LLMRequest)
sets the format-schema property of an LLMRequest.
set-format-schema!
(Fn [(Ref LLMRequest a), JSON] ())
sets the format-schema property of an LLMRequest in place.
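Switching a request over to schema-constrained output combines the two setters above. A sketch using the consuming variants; `format-json-schema` is the mode this document names, but its concrete Int constant is assumed here, not confirmed:

```carp
;; Sketch: constrain output to a JSON schema.
;; `format-json-schema` is assumed to be bound to the mode's Int value.
(defn with-schema [req schema]
  (LLMRequest.set-format-schema
    (LLMRequest.set-format-mode req format-json-schema)
    schema))
```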
set-max-tokens
(Fn [LLMRequest, Int] LLMRequest)
sets the max-tokens property of an LLMRequest.
set-max-tokens!
(Fn [(Ref LLMRequest a), Int] ())
sets the max-tokens property of an LLMRequest in place.
set-messages
(Fn [LLMRequest, (Array Message)] LLMRequest)
sets the messages property of an LLMRequest.
set-messages!
(Fn [(Ref LLMRequest a), (Array Message)] ())
sets the messages property of an LLMRequest in place.
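Each property has two setters: a consuming one that takes the LLMRequest by value and returns the updated request, and a `!` variant that mutates through a Ref and returns `()`. A sketch of both styles, using only functions documented here:

```carp
;; Consuming style: each setter takes ownership and returns
;; the updated request, so calls chain naturally.
(defn tune [req]
  (LLMRequest.set-temperature
    (LLMRequest.set-max-tokens req 2048)
    0.2))

;; In-place style: the `!` variants take a (Ref LLMRequest)
;; and mutate it without copying.
(defn tune! [req-ref]
  (do
    (LLMRequest.set-max-tokens! req-ref 2048)
    (LLMRequest.set-temperature! req-ref 0.2)))
```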
set-model!
(Fn [(Ref LLMRequest a), String] ())
sets the model property of an LLMRequest in place.
set-temperature
(Fn [LLMRequest, Double] LLMRequest)
sets the temperature property of an LLMRequest.
set-temperature!
(Fn [(Ref LLMRequest a), Double] ())
sets the temperature property of an LLMRequest in place.
set-tools
(Fn [LLMRequest, (Array ToolDef)] LLMRequest)
sets the tools property of an LLMRequest.
set-tools!
(Fn [(Ref LLMRequest a), (Array ToolDef)] ())
sets the tools property of an LLMRequest in place.
temperature
(Fn [(Ref LLMRequest a)] (Ref Double a))
gets the temperature property of an LLMRequest.
tools
(Fn [(Ref LLMRequest a)] (Ref (Array ToolDef) a))
gets the tools property of an LLMRequest.
update-format-mode
(Fn [LLMRequest, (Ref (Fn [Int] Int a) b)] LLMRequest)
updates the format-mode property of an LLMRequest using a function f.
update-format-schema
(Fn [LLMRequest, (Ref (Fn [JSON] JSON a) b)] LLMRequest)
updates the format-schema property of an LLMRequest using a function f.
update-max-tokens
(Fn [LLMRequest, (Ref (Fn [Int] Int a) b)] LLMRequest)
updates the max-tokens property of an LLMRequest using a function f.
update-messages
(Fn [LLMRequest, (Ref (Fn [(Array Message)] (Array Message) a) b)] LLMRequest)
updates the messages property of an LLMRequest using a function f.
update-model
(Fn [LLMRequest, (Ref (Fn [String] String a) b)] LLMRequest)
updates the model property of an LLMRequest using a function f.
update-temperature
(Fn [LLMRequest, (Ref (Fn [Double] Double a) b)] LLMRequest)
updates the temperature property of an LLMRequest using a function f.
update-tools
(Fn [LLMRequest, (Ref (Fn [(Array ToolDef)] (Array ToolDef) a) b)] LLMRequest)
updates the tools property of an LLMRequest using a function f.
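The update-* family applies a transform to one field, matching the `(Ref (Fn [T] T a) b)` parameter: pass the function as a reference with `&`. A sketch; the closure capture of `msg` in the second example is a sketch of typical Carp usage, and `Array.push-back` is the standard library's array append:

```carp
;; Double the token budget via update-max-tokens.
(defn grow-budget [req]
  (LLMRequest.update-max-tokens req &(fn [t] (* t 2))))

;; Append a message to the conversation via update-messages.
(defn add-message [req msg]
  (LLMRequest.update-messages req &(fn [msgs] (Array.push-back msgs msg))))
```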