SingleLLMPromptExecutor
Deprecated
Please use MultiLLMPromptExecutor instead
Replace with
import ai.koog.prompt.executor.llms.MultiLLMPromptExecutor
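A possible migration path, sketched below. The client class names, the `LLMProvider` keys, and the pair-based constructor are assumptions about the surrounding Koog API, not taken from this page; check the current MultiLLMPromptExecutor signature before using.

```kotlin
// Hypothetical migration sketch — client and provider names are assumed.
import ai.koog.prompt.executor.llms.MultiLLMPromptExecutor

// Before (deprecated): a single client wrapped in SingleLLMPromptExecutor.
// val executor = SingleLLMPromptExecutor(openAIClient)

// After: MultiLLMPromptExecutor routes each request to the client
// registered for the requested model's provider.
// val executor = MultiLLMPromptExecutor(
//     LLMProvider.OpenAI to openAIClient,
//     LLMProvider.Anthropic to anthropicClient,
// )
```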
Executes prompts using a direct client for communication with large language model (LLM) providers.
This class provides functionality to execute prompts with optional tools and retrieve either a list of responses or a streaming flow of response chunks from the LLM provider. It delegates the actual LLM interaction to the provided implementation of LLMClient.
Parameters
The client used for direct communication with the LLM provider.
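As a rough usage sketch: construct the executor around a concrete LLMClient and pass a prompt plus model to `execute`. The import paths, the `OpenAILLMClient` constructor, and the model constant below are assumptions about the wider Koog API rather than details stated on this page.

```kotlin
// Hedged sketch — names outside SingleLLMPromptExecutor are assumed.
import ai.koog.prompt.executor.llms.SingleLLMPromptExecutor

suspend fun demo() {
    // The executor delegates all LLM interaction to the provided client.
    // val client = OpenAILLMClient(System.getenv("OPENAI_API_KEY"))
    // val executor = SingleLLMPromptExecutor(client)

    // execute() returns a list of responses from the model.
    // val responses = executor.execute(
    //     prompt("demo") { user("Summarize Kotlin coroutines in one line.") },
    //     OpenAIModels.Chat.GPT4o,
    // )
    // responses.forEach { println(it) }
}
```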
Functions
Executes a given prompt using the specified LLM and tools, returning a list of responses from the model.
Receives multiple independent choices from the LLM. This method is implemented only for providers that support returning multiple choices.
Executes a given prompt using the specified LLM and returns a stream of output as a flow of StreamFrame objects.
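Since the streaming variant returns a Flow of StreamFrame objects, it can be consumed with the usual kotlinx.coroutines Flow operators. The method name and frame handling below are a sketch based on the description above; the exact StreamFrame subtypes are assumptions.

```kotlin
// Hedged sketch — executeStreaming and the frame shape are assumed from
// the summary above, not verified against the current Koog API.
suspend fun streamDemo() {
    // executor.executeStreaming(prompt, model).collect { frame ->
    //     // Print each chunk as it arrives; other frame kinds (e.g. tool
    //     // calls or end-of-stream markers) would be handled here too.
    //     println(frame)
    // }
}
```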
Executes a prompt with structured output, enhancing it with schema instructions or a native structured output parameter, and parses the response into the defined structure.
Basic JSON schema generator required for the given model. Returns BasicJsonSchemaGenerator by default.
Standard JSON schema generator required for the given model. Returns StandardJsonSchemaGenerator by default.
Moderates the content of a given message with attachments using a specified LLM.
Parses a structured response from the assistant message using the provided structured output configuration and language model. If a fixing parser is specified in the configuration, it will be used; otherwise, the structure will be parsed directly.