AIAgentLLMWriteSession

A session for managing interactions with a large language model (LLM) and tools in an agent environment. This class provides functionality for executing LLM requests, managing tools, and customizing prompts dynamically within a specific session context.

JVM actual implementation of a mutable LLM session.

In addition to the common suspend APIs, this class exposes Java-friendly wrappers that run session operations on the strategy dispatcher.

Constructors

actual constructor(environment: AIAgentEnvironment, executor: PromptExecutor, tools: List<ToolDescriptor>, toolRegistry: ToolRegistry, prompt: Prompt, model: LLModel, responseProcessor: ResponseProcessor?, config: AIAgentConfig, clock: Clock)

Properties

val clock: kotlin.time.Clock

var model: ai.koog.prompt.llm.LLModel

Represents the active language model used within the session.

var prompt: ai.koog.prompt.dsl.Prompt

Represents the current prompt associated with the LLM session.

var responseProcessor: ai.koog.prompt.processor.ResponseProcessor?

Represents the active response processor within the session.

val toolRegistry: ai.koog.agents.core.tools.ToolRegistry

var tools: List<ai.koog.agents.core.tools.ToolDescriptor>

Provides a list of available tools in the session.

Functions

fun appendPrompt(body: ai.koog.prompt.dsl.PromptBuilder.() -> Unit)

Appends messages to the current prompt using PromptBuilder.
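For example, appendPrompt can add messages before the next request. A minimal sketch; the `llm.writeSession { ... }` entry point and the `system`/`user` PromptBuilder functions are assumptions about the surrounding Koog agent code:

```kotlin
// Sketch: append a system instruction and a user message, then request a reply.
llm.writeSession {
    appendPrompt {
        system("Answer concisely.")
        user("Summarize the last tool output.")
    }
    val response = requestLLM() // the assistant response is appended to the prompt
}
```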

inline suspend fun <ToolT : Tool<Any?, Any?>> AIAgentLLMWriteSession.callTool(args: Any?): SafeTool.Result<out Any?>

Invokes a tool of the specified type with the provided arguments and returns the result within a SafeTool.Result wrapper.

suspend fun <TArgs> AIAgentLLMWriteSession.callTool(toolName: String, args: TArgs): SafeTool.Result<out Any?>

Executes a tool by its name with the provided arguments.

inline suspend fun <TResult> AIAgentLLMWriteSession.callTool(toolFunction: KFunction<TResult>, vararg args: Any?): SafeTool.Result<TResult>

Invokes a specified tool function within the AI agent's write session context.

suspend fun <TArgs> AIAgentLLMWriteSession.callToolRaw(toolName: String, args: TArgs): String

Executes a tool identified by its name with the provided arguments and returns the raw string result.
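A sketch of calling a tool by name: the tool name, the `SearchArgs` type, and the exact shape of the `SafeTool.Result` variants are hypothetical, but the safe wrapper lets you branch on success or failure instead of catching exceptions:

```kotlin
// Sketch: call a registered tool by name and inspect the safe result.
llm.writeSession {
    val result = callTool("searchDocs", SearchArgs(query = "retry policy"))
    when (result) {
        is SafeTool.Result.Success -> println(result.result)   // assumed variant names
        is SafeTool.Result.Failure -> println("Tool failed: ${result.message}")
    }
    // Or fetch the raw string content directly:
    val raw: String = callToolRaw("searchDocs", SearchArgs(query = "retry policy"))
}
```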

fun changeLLMParams(newParams: ai.koog.prompt.params.LLMParams)

Updates LLM parameters on the current prompt.

fun changeModel(newModel: ai.koog.prompt.llm.LLModel)

Updates the active model in this session.
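A sketch of reconfiguring the session mid-strategy; the model descriptor and the `temperature` field on LLMParams are assumptions about your provider and Koog version:

```kotlin
// Sketch: switch models and lower the temperature before the next request.
llm.writeSession {
    changeModel(OpenAIModels.Chat.GPT4o)          // hypothetical model descriptor
    changeLLMParams(LLMParams(temperature = 0.2)) // assumes a temperature parameter
    val response = requestLLM()
}
```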

Clears the history of messages in the current AI Agent LLM Write Session.

open override fun close()
fun dropLastNMessages(n: Int, preserveSystemMessages: Boolean = true)

Removes the last n messages from the current prompt in the write session.
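A minimal sketch of history surgery, assuming a write session is in scope:

```kotlin
// Sketch: discard the two most recent messages (e.g. a failed tool exchange)
// while keeping system messages, then retry the request.
llm.writeSession {
    dropLastNMessages(2, preserveSystemMessages = true)
    val retry = requestLLM()
}
```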

Drops all trailing tool call messages from the current prompt.

fun <TArgs, TResult> findTool(tool: ai.koog.agents.core.tools.Tool<TArgs, TResult>): SafeTool<TArgs, TResult>

Finds a specific tool instance from the tool registry by a tool instance type.

fun <TArgs, TResult> findTool(toolClass: KClass<out ai.koog.agents.core.tools.Tool<TArgs, TResult>>): SafeTool<TArgs, TResult>

Finds a specific tool instance from the tool registry by tool class.

inline fun <TResult> AIAgentLLMWriteSession.findTool(toolFunction: KFunction<TResult>): SafeTool<ToolFromCallable.Args, TResult>

Finds a specific tool within the tool registry using the given tool function and returns it as a safe tool.
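A sketch of resolving a tool once and reusing it; `SearchTool` and its `Args` type are hypothetical, and this assumes SafeTool exposes an `execute(args)` method:

```kotlin
// Sketch: look up a tool by class and call it several times via the safe wrapper.
llm.writeSession {
    val search = findTool(SearchTool::class)
    val first = search.execute(SearchTool.Args("koog sessions"))
    val second = search.execute(SearchTool.Args("tool registry"))
}
```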

Finds a tool by its name and ensures its arguments are compatible with the specified type.

Finds and retrieves a tool by its name and argument/result types.

Appends a prompt using the provided prompt update action.

fun javaRequestLLMStreaming(structureDefinition: StructureDefinition, executorService: ExecutorService? = null): Flow.Publisher<StreamFrame>
fun leaveLastNMessages(n: Int, preserveSystemMessages: Boolean = true)

Keeps only the last N messages in the session's prompt by removing all earlier messages.

fun leaveMessagesFromTimestamp(timestamp: kotlin.time.Instant, preserveSystemMessages: Boolean = true)

Removes all messages from the current session's prompt that have a timestamp earlier than the specified timestamp.

suspend fun <T> parseResponseToStructuredResponse(response: ai.koog.prompt.message.Message.Assistant, config: ai.koog.prompt.structure.StructuredRequestConfig<T>, fixingParser: ai.koog.prompt.executor.model.StructureFixingParser? = null): ai.koog.prompt.structure.StructuredResponse<T>

Parses an assistant response into a strongly typed StructuredResponse according to the given configuration.

fun <T> parseResponseToStructuredResponse(response: Message.Assistant, config: StructuredRequestConfig<T>, fixingParser: StructureFixingParser? = null, executorService: ExecutorService? = null): StructuredResponse<T>

Parses a structured response from an assistant message using the specified configuration.

suspend fun replaceHistoryWithTLDR(strategy: HistoryCompressionStrategy = HistoryCompressionStrategy.WholeHistory, preserveMemory: Boolean = true)
fun replaceHistoryWithTLDR(strategy: HistoryCompressionStrategy = HistoryCompressionStrategy.WholeHistory, preserveMemory: Boolean = true, executorService: ExecutorService? = null)

Rewrites the LLM message history, leaving only the user message and the resulting TLDR.
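A sketch of compressing a long conversation before it exceeds the context window; the size check assumes Prompt exposes a `messages` list:

```kotlin
// Sketch: replace a long history with a TLDR using the default strategy.
llm.writeSession {
    if (prompt.messages.size > 50) { // assumes Prompt.messages is accessible
        replaceHistoryWithTLDR(
            strategy = HistoryCompressionStrategy.WholeHistory,
            preserveMemory = true,
        )
    }
}
```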

suspend fun requestLLM(): ai.koog.prompt.message.Message.Response

Sends a request to the LLM and appends the response to the prompt.

fun requestLLM(executorService: ExecutorService? = null): Message.Response

Sends a request to the language model using the current session configuration and returns a single response.
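The basic request cycle can be sketched as follows; `Message.Assistant` with a `content` property is an assumption about the message hierarchy:

```kotlin
// Sketch: add a user message, request one response, and read its text.
llm.writeSession {
    appendPrompt { user("What changed in the last release?") }
    val response = requestLLM() // appended to the prompt automatically
    if (response is Message.Assistant) {
        println(response.content)
    }
}
```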

suspend fun requestLLMForceOneTool(tool: Tool<*, *>): Message.Response
suspend fun requestLLMForceOneTool(tool: ToolDescriptor): Message.Response

Sends a request while forcing a specific tool and appends the response to the prompt.

fun requestLLMForceOneTool(tool: Tool<*, *>, executorService: ExecutorService? = null): Message.Response

Sends a request to the language model and forces it to use exactly one specific tool instance.

fun requestLLMForceOneTool(tool: ToolDescriptor, executorService: ExecutorService? = null): Message.Response

Sends a request to the language model and forces it to use exactly one specific tool, identified by a ToolDescriptor.

suspend fun requestLLMMultiple(): List<ai.koog.prompt.message.Message.Response>

Sends a request to the LLM and appends all received responses to the prompt.

fun requestLLMMultiple(executorService: ExecutorService? = null): List<Message.Response>

Sends a request to the language model and returns multiple responses.

suspend fun requestLLMMultipleChoices(): List<List<ai.koog.prompt.message.Message.Response>>

Sends a request to the LLM and returns all available response choices.

fun requestLLMMultipleChoices(executorService: ExecutorService? = null): List<LLMChoice>

Sends a request to the language model and returns multiple choice alternatives.

suspend fun requestLLMMultipleOnlyCallingTools(): List<ai.koog.prompt.message.Message.Response>

Sends a request that enforces tool calling and appends all received responses to the prompt.

fun requestLLMMultipleOnlyCallingTools(executorService: ExecutorService? = null): List<Message.Response>

Requests a response from the LLM enforcing tool usage (ToolChoice.Required), validates the session, and processes all returned messages (e.g. thinking + tool call).

suspend fun requestLLMMultipleWithoutTools(): List<ai.koog.prompt.message.Message.Response>

Sends a request without tool usage and appends all received responses to the prompt.

fun requestLLMMultipleWithoutTools(executorService: ExecutorService? = null): List<Message.Response>

Sends a request to the language model without utilizing any tools and returns multiple responses.

suspend fun requestLLMOnlyCallingTools(): ai.koog.prompt.message.Message.Response

Sends a request that enforces tool calling and appends the received response to the prompt.

fun requestLLMOnlyCallingTools(executorService: ExecutorService? = null): Message.Response

Sends a request to the language model that is allowed to only perform tool calls without generating a regular text response.

suspend fun requestLLMStreaming(): Flow<StreamFrame>

Sends a streaming request to the LLM.

suspend fun requestLLMStreaming(definition: StructureDefinition? = null): Flow<StreamFrame>

Streams a response from the LLM, optionally adding a structure definition to the prompt beforehand.

fun requestLLMStreaming(executorService: ExecutorService? = null): Flow.Publisher<StreamFrame>

Sends a request to the language model and returns a streaming response as a Flow.Publisher of StreamFrame.
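A sketch of consuming the stream; a `StreamFrame.Append` variant carrying a `text` field is an assumption about the frame hierarchy, which may differ in your version:

```kotlin
// Sketch: stream a response and print text chunks as they arrive.
llm.writeSession {
    requestLLMStreaming().collect { frame ->
        if (frame is StreamFrame.Append) print(frame.text) // assumed frame type
    }
}
```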

suspend fun <T> requestLLMStructured(config: StructuredRequestConfig<T>, fixingParser: StructureFixingParser? = null): Result<StructuredResponse<T>>
suspend fun <T> requestLLMStructured(serializer: KSerializer<T>, examples: List<T> = emptyList(), fixingParser: StructureFixingParser? = null): Result<StructuredResponse<T>>

Sends a request to the LLM and gets a structured response, appending the assistant message on success.

inline suspend fun <T> requestLLMStructured(examples: List<T> = emptyList(), fixingParser: StructureFixingParser? = null): Result<StructuredResponse<T>>

Requests a structured response from the language model using a reified serializer.

fun <T> requestLLMStructured(config: StructuredRequestConfig<T>, fixingParser: StructureFixingParser? = null, executorService: ExecutorService? = null): Result<StructuredResponse<T>>

Sends a structured request to the language model using a StructuredRequestConfig.

fun <T> requestLLMStructured(serializer: KSerializer<T>, examples: List<T> = emptyList(), fixingParser: StructureFixingParser? = null, executorService: ExecutorService? = null): Result<StructuredResponse<T>>

Sends a structured request to the language model using an explicit serializer and example values.
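A sketch using the reified overload with a kotlinx.serialization data class; the `WeatherReport` type is illustrative, and reading the parsed value via a `structure` property on StructuredResponse is an assumption:

```kotlin
import kotlinx.serialization.Serializable

@Serializable
data class WeatherReport(val city: String, val temperatureC: Double) // illustrative type

llm.writeSession {
    val result = requestLLMStructured<WeatherReport>(
        examples = listOf(WeatherReport("Berlin", 21.5)),
        fixingParser = null, // optionally supply a StructureFixingParser to repair malformed output
    )
    result
        .onSuccess { response -> println(response.structure.city) }
        .onFailure { error -> println("Structured parsing failed: $error") }
}
```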

suspend fun requestLLMWithoutTools(): ai.koog.prompt.message.Message.Response

Sends a request without tool usage and appends the received response to the prompt.

fun requestLLMWithoutTools(executorService: ExecutorService? = null): Message.Response

Sends a request to the language model without utilizing any tools and returns a single response.

suspend fun requestModeration(moderatingModel: LLModel? = null): ModerationResult

Sends a moderation request using the specified moderating model or the session model.

fun requestModeration(moderatingModel: LLModel? = null, executorService: ExecutorService? = null): ModerationResult

Sends a moderation request to the moderation model.

fun rewritePrompt(body: (prompt: Prompt) -> Prompt)

Rewrites the current prompt by applying a transformation function.
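A sketch of a whole-prompt transformation; the `withMessages` helper used to rebuild the prompt is hypothetical, so substitute whatever rebuilding API your Prompt version provides:

```kotlin
// Sketch: rewrite the prompt, e.g. to strip all tool messages from history.
llm.writeSession {
    rewritePrompt { old ->
        // `withMessages` is a hypothetical helper for rebuilding the prompt.
        old.withMessages { messages -> messages.filterNot { it is Message.Tool } }
    }
}
```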

fun setToolChoice(toolChoice: ai.koog.prompt.params.LLMParams.ToolChoice?)

Sets the ai.koog.prompt.params.LLMParams.ToolChoice for this LLM session.

Sets the ai.koog.prompt.params.LLMParams.ToolChoice to ai.koog.prompt.params.LLMParams.ToolChoice.Auto to make the LLM automatically decide between calling tools and generating text.

inline fun <TArgs, TResult> Flow<TArgs>.toParallelToolCalls(safeTool: SafeTool<TArgs, TResult>, concurrency: Int = 16): Flow<SafeTool.Result<TResult>>

Converts each flow item into a parallel tool call using an already resolved SafeTool.

inline fun <TArgs, TResult> Flow<TArgs>.toParallelToolCalls(tool: Tool<TArgs, TResult>, concurrency: Int = 16): Flow<SafeTool.Result<TResult>>

Converts each flow item into a parallel tool call using a tool instance.

inline fun <TArgs, TResult> Flow<TArgs>.toParallelToolCalls(toolClass: KClass<out Tool<TArgs, TResult>>, concurrency: Int = 16): Flow<SafeTool.Result<TResult>>

Converts each flow item into a parallel tool call using a tool class.
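A sketch of fanning a flow of arguments out to bounded-concurrency tool calls; `SearchTool` and its `Args` type are hypothetical:

```kotlin
import kotlinx.coroutines.flow.flowOf

// Sketch: run up to 4 tool calls in parallel over a flow of arguments.
llm.writeSession {
    val search = findTool(SearchTool::class)
    flowOf(SearchTool.Args("a"), SearchTool.Args("b"), SearchTool.Args("c"))
        .toParallelToolCalls(search, concurrency = 4)
        .collect { result -> println(result) }
}
```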

inline fun <TArgs, TResult> Flow<TArgs>.toParallelToolCallsRaw(safeTool: SafeTool<TArgs, TResult>, concurrency: Int = 16): Flow<String>

Converts each flow item into a parallel tool call and emits only raw string content.

inline fun <TArgs, TResult> Flow<TArgs>.toParallelToolCallsRaw(toolClass: KClass<out Tool<TArgs, TResult>>, concurrency: Int = 16): Flow<String>

Converts each flow item into a parallel tool call using a tool class and emits raw string content.

Unsets the ai.koog.prompt.params.LLMParams.ToolChoice. If left unspecified, the default value of this parameter is typically ai.koog.prompt.params.LLMParams.ToolChoice.Auto.

fun updatePrompt(body: ai.koog.prompt.dsl.PromptBuilder.() -> Unit)

Updates the current prompt by applying modifications defined in the provided block.