LLMParams
@Serializable
Represents configuration parameters for controlling the behavior of a language model.
Constructors
constructor(temperature: Double? = null, speculation: String? = null, schema: LLMParams.Schema? = null, toolChoice: LLMParams.ToolChoice? = null)
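A minimal construction sketch using only the parameters from the signature above; the import path is an assumption and may differ in your project:

import ai.koog.prompt.params.LLMParams // assumed package; adjust to your setup

val params = LLMParams(
    temperature = 0.2,                    // lower value: more focused, deterministic output
    speculation = "{\"status\": \"ok\"}", // optional speculative draft of the expected result
    // schema and toolChoice keep their default of null
)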
Properties
schema
speculation
Reserved for a speculative proposition of what the result might look like. It is supported only by a limited number of models, but can greatly improve the speed and accuracy of results. For example, in OpenAI this feature is called PredictedOutput.
temperature
Controls the randomness of the output. Higher values encourage more diverse results, while lower values produce more focused, deterministic outputs. The value is optional and defaults to null.
toolChoice
Controls the tool-calling behavior of the LLM.
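For illustration, a hedged sketch of switching tool-calling behavior; the variant names used here (Auto, Required) are assumptions about LLMParams.ToolChoice and should be checked against the actual API:

// Variant names below are assumed, not confirmed by this page.
val autoTools = LLMParams(toolChoice = LLMParams.ToolChoice.Auto)         // model decides whether to call tools
val requiredTools = LLMParams(toolChoice = LLMParams.ToolChoice.Required) // model must call a tool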