requestLLM

expect open suspend override fun requestLLM(message: String, allowToolCalls: Boolean): Message.Response

Sends a message to a Large Language Model (LLM) and optionally allows the use of tools during the LLM interaction. The message becomes part of the current prompt, and the LLM's response is processed accordingly, either with or without tool integrations based on the provided parameters.

Parameters

message

The content of the message to be sent to the LLM.

allowToolCalls

Specifies whether tool calls are allowed during the LLM interaction. Defaults to true.
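A minimal runnable sketch of this call shape. `FakeSession`, `Response`, and `reply` are hypothetical stand-ins for the library's session and `Message.Response` types (assumptions, not the real API); only the parameter shape — a suspend function taking `message` and an `allowToolCalls` flag defaulting to true — follows the documented signature:

```kotlin
import kotlin.coroutines.Continuation
import kotlin.coroutines.EmptyCoroutineContext
import kotlin.coroutines.startCoroutine

// Hypothetical stand-in for Message.Response (assumption, not the library's type)
data class Response(val content: String)

// Plain helper so the stubbed behavior is easy to see
fun reply(message: String, allowToolCalls: Boolean): String =
    if (allowToolCalls) "tool-enabled reply to: $message" else "plain reply to: $message"

// Hypothetical stand-in for the session type that hosts requestLLM
class FakeSession {
    // Mirrors the documented signature: suspend, with allowToolCalls defaulting to true
    suspend fun requestLLM(message: String, allowToolCalls: Boolean = true): Response =
        Response(reply(message, allowToolCalls))
}

fun main() {
    val body: suspend () -> Unit = {
        val session = FakeSession()
        // Default: tool calls are allowed during the interaction
        println(session.requestLLM("Summarize the build log.").content)
        // Opt out of tools for a plain text-only response
        println(session.requestLLM("Summarize the build log.", allowToolCalls = false).content)
    }
    // The stub never actually suspends, so this completes synchronously
    body.startCoroutine(Continuation(EmptyCoroutineContext) { it.getOrThrow() })
}
```

The flag only gates tool use; the message itself is appended to the current prompt either way.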

fun requestLLM(message: String, allowToolCalls: Boolean = true, executorService: ExecutorService? = null): Message.Response

Sends a request to the Large Language Model (LLM) and retrieves its response.

Return

A Message.Response object containing the LLM's response to the request.

Parameters

message

The input message to be sent to the LLM.

allowToolCalls

Determines whether the LLM is allowed to use tools during its response generation. Defaults to true.

executorService

An optional ExecutorService instance that enables custom thread management for the request. Defaults to null.
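A sketch of the overload with a caller-supplied thread pool. `FakeJvmSession` and `Response` are again hypothetical stand-ins; only the parameter shape (`executorService: ExecutorService? = null`) follows the documented signature:

```kotlin
import java.util.concurrent.ExecutorService
import java.util.concurrent.Executors

// Hypothetical stand-in for Message.Response (assumption, not the library's type)
data class Response(val content: String)

// Hypothetical stand-in for the session type that hosts this overload
class FakeJvmSession {
    // Mirrors the documented overload: an optional ExecutorService for custom thread management
    fun requestLLM(
        message: String,
        allowToolCalls: Boolean = true,
        executorService: ExecutorService? = null
    ): Response {
        // null (the default) leaves threading to the library; here we just answer inline
        val pool = executorService ?: return Response("inline reply to: $message")
        // A non-null pool routes the (stubbed) request onto the caller's threads
        return pool.submit<Response> { Response("pooled reply to: $message") }.get()
    }
}

fun main() {
    val session = FakeJvmSession()
    println(session.requestLLM("Explain this stack trace.").content)
    val pool = Executors.newFixedThreadPool(2)
    try {
        println(session.requestLLM("Explain this stack trace.", executorService = pool).content)
    } finally {
        pool.shutdown()
    }
}
```

If you pass your own pool, you also own its lifecycle: shut it down when the session is done with it.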


actual open suspend override fun requestLLM(message: String, allowToolCalls: Boolean): Message.Response

Sends a message to a Large Language Model (LLM) and optionally allows the use of tools during the LLM interaction. The message becomes part of the current prompt, and the LLM's response is processed accordingly, either with or without tool integrations based on the provided parameters.

Parameters

message

The content of the message to be sent to the LLM.

allowToolCalls

Specifies whether tool calls are allowed during the LLM interaction. Defaults to true.