LLMParamsUpdateContext

Represents a mutable context for updating the parameters of an LLM (Large Language Model). The class is used internally to facilitate changes to parameters such as temperature, speculation, schema, and tool choice before they are converted back into an immutable LLMParams instance.
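The snippet below is a minimal, self-contained sketch of this mutable-context pattern. The names and shapes used here (SimpleLLMParams, SimpleLLMParamsUpdateContext, toParams, update) are illustrative assumptions and not the library's actual API.

```kotlin
// Illustrative sketch only: a simplified, stand-alone version of the pattern
// described above. Names and types are assumptions, not the library's exact API.

// Immutable parameter holder.
data class SimpleLLMParams(
    val temperature: Double? = null,
    val speculation: String? = null,
    val schema: String? = null,      // e.g. a JSON schema, simplified to a String here
    val toolChoice: String? = null   // e.g. "auto", "none", "required"
)

// Mutable update context: properties can be reassigned freely before
// being frozen back into an immutable SimpleLLMParams.
class SimpleLLMParamsUpdateContext(params: SimpleLLMParams) {
    var temperature: Double? = params.temperature
    var speculation: String? = params.speculation
    var schema: String? = params.schema
    var toolChoice: String? = params.toolChoice

    // Convert the (possibly modified) context back into an immutable instance.
    fun toParams(): SimpleLLMParams =
        SimpleLLMParams(temperature, speculation, schema, toolChoice)
}

// Typical usage: copy-and-modify an existing immutable params value.
fun SimpleLLMParams.update(block: SimpleLLMParamsUpdateContext.() -> Unit): SimpleLLMParams =
    SimpleLLMParamsUpdateContext(this).apply(block).toParams()

fun main() {
    val base = SimpleLLMParams(temperature = 0.2)
    val updated = base.update {
        temperature = 0.8      // more diverse output
        toolChoice = "auto"    // let the model decide when to call tools
    }
    println(updated)
}
```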

Properties

schema

A schema configuration that describes the structure of the output. This can include JSON-based schema definitions used to constrain the generated output. This property is mutable so the schema can be updated within the context.
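As a hedged illustration, reusing the simplified sketch from the class description above (where the schema is modeled as a raw JSON Schema string rather than a dedicated schema type):

```kotlin
// A JSON Schema describing the desired output shape (illustrative only).
val personSchema = """
    {
      "type": "object",
      "properties": {
        "name": { "type": "string" },
        "age":  { "type": "integer" }
      },
      "required": ["name", "age"]
    }
""".trimIndent()

// Attach the schema via the mutable update context.
val structured = SimpleLLMParams().update {
    schema = personSchema   // constrain the model to emit JSON matching this schema
}
```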

speculation

A speculation string that influences model behavior and is intended to improve response speed and accuracy. This property is mutable so the speculation setting can be modified within the context.
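A short sketch of setting this value, again reusing the simplified types from the class description above; the exact format and semantics of the speculation string are provider-specific and assumed here:

```kotlin
// Provide a speculation string through the mutable update context
// (format and interpretation depend on the underlying provider).
val speculative = SimpleLLMParams().update {
    speculation = "A short draft of the expected response."
}
```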

temperature

The temperature value that adjusts randomness in the model's output. Higher values produce more diverse results, while lower values yield more deterministic responses. This property is mutable so it can be updated during the context's lifecycle.
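Using the simplified sketch from the class description above, the trade-off looks like this:

```kotlin
// Lower temperatures favor reproducible answers; higher temperatures favor variety.
val deterministic = SimpleLLMParams().update { temperature = 0.0 }  // near-deterministic
val creative      = SimpleLLMParams().update { temperature = 1.0 }  // more varied output
```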

toolChoice

Defines how the LLM may use tools, allowing choices such as automatic tool invocation or restricted tool interaction. This property is mutable so tool usage can be reconfigured.
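A hedged illustration of the common tool-choice modes, reusing the simplified sketch from the class description above (where tool choice is modeled as a plain string; the real library uses a dedicated tool-choice type):

```kotlin
val autoTools   = SimpleLLMParams().update { toolChoice = "auto" }      // model decides when to call tools
val noTools     = SimpleLLMParams().update { toolChoice = "none" }      // tool calls are disabled
val forcedTools = SimpleLLMParams().update { toolChoice = "required" }  // model must call a tool
```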

Functions


Converts the parameters currently held in this context into an immutable LLMParams instance.
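In terms of the simplified sketch from the class description above, the conversion freezes the mutable fields back into an immutable value, so later changes to the context do not affect parameters that have already been built:

```kotlin
val ctx = SimpleLLMParamsUpdateContext(SimpleLLMParams(temperature = 0.3))
ctx.temperature = 0.9
val frozen = ctx.toParams()   // captures temperature = 0.9
ctx.temperature = 0.1         // does not affect `frozen`
```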