AIAgentLLMWriteSession

A session for managing interactions with a large language model (LLM) and tools in an agent environment. This class provides functionality for executing LLM requests, managing tools, and customizing prompts dynamically within a specific session context.
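
A write session is normally obtained from the agent's LLM context rather than constructed directly. The sketch below assumes a surrounding agent node exposing an llm context with a writeSession block whose receiver is this class; the exact entry point may differ in your setup.

    llm.writeSession {
        // this is an AIAgentLLMWriteSession: the prompt, model and tools are available here
        val availableTools: List<ToolDescriptor> = tools
        val reply: Message.Response = requestLLM()
    }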

Properties

open override var model: LLModel

Represents the LLModel used within the session. The property is backed by an ActiveProperty, which ensures its value is read and updated in accordance with the session's active state, as determined by the isActive predicate.

open override var prompt: Prompt

Represents the prompt object used within the session. The prompt can be accessed or modified only when the session is in an active state, as determined by the isActive predicate.

open override var tools: List<ToolDescriptor>

Represents a collection of tools that are available for the session. The tools can be accessed or modified only if the session is in an active state.

Functions

inline suspend fun <ToolT : Tool<*, *>> callTool(args: Tool.Args): SafeTool.Result<out ToolResult>

Invokes a tool of the specified type with the provided arguments.

Executes the specified tool with the given arguments and returns the result within a SafeTool.Result wrapper.

inline suspend fun <TArgs : Tool.Args> callTool(toolName: String, args: TArgs): SafeTool.Result<out ToolResult>

Executes a tool by its name with the provided arguments.

inline suspend fun <TArgs : Tool.Args, TResult : ToolResult> callTool(toolClass: KClass<out Tool<TArgs, TResult>>, args: TArgs): SafeTool.Result<TResult>

Executes a tool operation based on the provided tool class and arguments.
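
A sketch of the name-based and class-based overloads; SearchTool and its nested Args type are hypothetical placeholders for a tool registered under the name "search" in your tool registry.

    // SearchTool and SearchTool.Args are hypothetical; substitute a tool from your registry.
    val byName = callTool("search", SearchTool.Args(query = "kotlin flows"))
    val byClass = callTool(SearchTool::class, SearchTool.Args(query = "kotlin flows"))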

inline suspend fun <TResult> AIAgentLLMWriteSession.callTool(toolFunction: KFunction<TResult>, vararg args: Any?): SafeToolFromCallable.Result<TResult>

Invokes a specified tool function within the AI Agent's write session context.

inline suspend fun <TArgs : Tool.Args> callToolRaw(toolName: String, args: TArgs): String

Executes a tool identified by its name with the provided arguments and returns the raw string result.

fun changeLLMParams(newParams: LLMParams)

Updates the language model's parameters used in the current session prompt.

fun changeModel(newModel: LLModel)

Updates the underlying model in the current prompt with the specified new model.
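
A sketch of adjusting the model and its parameters mid-session; fasterModel is a hypothetical LLModel defined elsewhere, and the temperature argument is an assumption about the LLMParams constructor.

    changeModel(fasterModel)                         // fasterModel: LLModel, defined elsewhere
    changeLLMParams(LLMParams(temperature = 0.2))    // assumes LLMParams accepts a temperature value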

override fun close()

Executes a parallelized tool call using the provided data flow and tool function within the session.

inline fun <DataArgs, TResult> AIAgentLLMWriteSession.emitParallelToolCallsRaw(flow: Flow<DataArgs>, toolFunction: KFunction<TResult>, concurrency: Int = 16): Flow<String>

Executes parallel tool calls in a raw format using the provided flow of data arguments.

inline fun <ToolT : Tool<*, *>> findTool(): SafeTool<*, *>

Finds and retrieves a tool of the specified type from the current stage of the tool registry. If no tool of the given type is found, an exception is thrown.

Finds and retrieves a tool of the specified type from the tool registry.

Finds a specific tool within the tool registry using the given tool function and returns it as a safe tool.

inline fun <TArgs : Tool.Args> findToolByName(toolName: String): SafeTool<TArgs, *>

Finds a tool by its name and ensures its arguments are compatible with the specified type.
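
A sketch of resolving a tool once and keeping the typed handle around; the "search" name and SearchTool.Args are hypothetical.

    // SearchTool.Args is a hypothetical Tool.Args type matching the registered "search" tool.
    val search: SafeTool<SearchTool.Args, *> = findToolByName<SearchTool.Args>("search")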

Finds and retrieves a tool by its name and argument/result types.

suspend fun AIAgentLLMWriteSession.replaceHistoryWithTLDR(strategy: HistoryCompressionStrategy = HistoryCompressionStrategy.WholeHistory, preserveMemory: Boolean = true)

Rewrites the LLM message history, leaving only the user message and the resulting TLDR.
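
For example, after a long tool-calling loop the accumulated history can be collapsed before the next request; the arguments below simply spell out the documented defaults.

    replaceHistoryWithTLDR(
        strategy = HistoryCompressionStrategy.WholeHistory,
        preserveMemory = true,
    )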

open suspend override fun requestLLM(): Message.Response

Makes an asynchronous request to a Large Language Model (LLM) and updates the current prompt with the response received from the LLM.

open suspend override fun requestLLMForceOneTool(tool: ToolDescriptor): Message.Response

Requests a response from the LLM (Large Language Model) while forcing it to use the specified tool.

open suspend override fun requestLLMForceOneTool(tool: Tool<*, *>): Message.Response

Requests the execution of a single specified tool, enforcing its use, and updates the prompt based on the generated response.

open suspend override fun requestLLMMultiple(): List<Message.Response>

Requests multiple responses from the LLM and updates the prompt with the received responses.

open suspend override fun requestLLMOnlyCallingTools(): Message.Response

Requests a response from the large language model (LLM), restricting it to tool calls only, and updates the current prompt with the received message.

suspend fun requestLLMStreaming(definition: StructuredDataDefinition? = null): Flow<String>

Streams the result of a request to a language model.
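
A sketch of consuming the stream directly, with no structured data definition.

    requestLLMStreaming().collect { chunk ->
        print(chunk)   // handle each streamed fragment as it arrives
    }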

open suspend override fun <T> requestLLMStructured(structure: StructuredData<T>, retries: Int, fixingModel: LLModel): Result<StructuredResponse<T>>

Requests the LLM to generate structured output based on the provided structure definition. The response is post-processed to update the prompt with the raw response.
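
A sketch, assuming a planStructure of type StructuredData<Plan> has already been defined for the target type.

    // planStructure (a StructuredData<Plan>) is assumed to be defined elsewhere.
    val plan: StructuredResponse<Plan> =
        requestLLMStructured(planStructure, retries = 3, fixingModel = model).getOrThrow()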

open suspend override fun <T> requestLLMStructuredOneShot(structure: StructuredData<T>): StructuredResponse<T>

Sends a request to the LLM using the given structured data and expects a structured response in one attempt. Updates the prompt with the raw response received from the LLM.

open suspend override fun requestLLMWithoutTools(): Message.Response

Sends a request to the Language Model (LLM) without including any tools, processes the response, and updates the prompt with the returned message.

fun rewritePrompt(body: (prompt: Prompt) -> Prompt)

Rewrites the current prompt by applying a transformation function.
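
The transformation receives the current Prompt and returns the prompt to use from that point on; this minimal sketch only shows the call shape.

    rewritePrompt { current ->
        // Inspect or transform the current prompt here; returning it unchanged is a no-op.
        current
    }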

Sets ai.koog.prompt.params.LLMParams.ToolChoice to ai.koog.prompt.params.LLMParams.ToolChoice.Auto so that the LLM automatically decides between calling tools and generating text.

inline fun <TArgs : Tool.Args, TResult : ToolResult> Flow<TArgs>.toParallelToolCalls(safeTool: SafeTool<TArgs, TResult>, concurrency: Int = 16): Flow<SafeTool.Result<TResult>>

Transforms a flow of arguments into a flow of results by asynchronously executing the given tool in parallel.

inline fun <TArgs : Tool.Args, TResult : ToolResult> Flow<TArgs>.toParallelToolCalls(tool: Tool<TArgs, TResult>, concurrency: Int = 16): Flow<SafeTool.Result<TResult>>

Executes the given tool in parallel for each element in the flow of arguments, up to the specified level of concurrency.

inline fun <TArgs : Tool.Args, TResult : ToolResult> Flow<TArgs>.toParallelToolCalls(toolClass: KClass<out Tool<TArgs, TResult>>, concurrency: Int = 16): Flow<SafeTool.Result<TResult>>

Transforms a Flow of tool argument objects into a Flow of parallel tool execution results, using the specified tool class.
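
A sketch of fanning a flow of arguments out over a tool; queries and SearchTool are hypothetical, and the concurrency is capped at 8 instead of the default 16.

    // queries: Flow<SearchTool.Args> and SearchTool : Tool<SearchTool.Args, SearchTool.Result> are assumed to exist.
    val results: Flow<SafeTool.Result<SearchTool.Result>> =
        queries.toParallelToolCalls(SearchTool::class, concurrency = 8)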

inline fun <TArgs : Tool.Args, TResult : ToolResult> Flow<TArgs>.toParallelToolCallsRaw(safeTool: SafeTool<TArgs, TResult>, concurrency: Int = 16): Flow<String>

Executes a flow of tool arguments in parallel by invoking the provided tool's raw execution method. Converts each argument in the flow into a string result returned from the tool.

inline fun <TArgs : Tool.Args, TResult : ToolResult> Flow<TArgs>.toParallelToolCallsRaw(toolClass: KClass<out Tool<TArgs, TResult>>, concurrency: Int = 16): Flow<String>

Converts a flow of arguments into a flow of raw string results by executing the corresponding tool calls in parallel.

Unsets ai.koog.prompt.params.LLMParams.ToolChoice. When left unspecified, this parameter typically defaults to ai.koog.prompt.params.LLMParams.ToolChoice.Auto.

Updates the current prompt by applying modifications defined in the provided block. The modifications are applied using a PromptBuilder instance, allowing for customization of the prompt's content, structure, and associated messages.
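
A sketch combining the two most common calls in a write session; the user(...) builder is an assumption about PromptBuilder's API.

    updatePrompt {
        user("Given the results above, propose the next step.")   // assumes PromptBuilder offers a user(...) builder
    }
    val next: Message.Response = requestLLM()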