requestLLMStreaming

expect open suspend override fun requestLLMStreaming(message: String, structureDefinition: StructureDefinition?): Flow<StreamFrame>(source)

Sends a message to a Large Language Model (LLM) and streams the LLM response. The message becomes part of the current prompt, and the LLM's response is streamed as it's generated.

Return

A flow of StreamFrame objects from the LLM response.

Parameters

message

The content of the message to be sent to the LLM.

structureDefinition

Optional structure to guide the LLM response.
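The Flow-returning overload above is consumed by collecting frames as they arrive. The sketch below is self-contained so it compiles without the Koog or kotlinx.coroutines dependencies: `Flow` and `FlowCollector` are minimal stand-ins for the kotlinx.coroutines types, the `Append`/`End` frame variants are assumptions rather than the real `StreamFrame` shape, and `requestLLMStreaming` is faked to emit canned frames.

```kotlin
import kotlin.coroutines.Continuation
import kotlin.coroutines.EmptyCoroutineContext
import kotlin.coroutines.startCoroutine

// Minimal stand-ins for kotlinx.coroutines.flow types, so this sketch
// needs no external dependency.
interface FlowCollector<T> { suspend fun emit(value: T) }
interface Flow<T> { suspend fun collect(collector: FlowCollector<T>) }

// Hypothetical frame variants; the real ai.koog.prompt.streaming.StreamFrame
// may differ.
sealed interface StreamFrame {
    data class Append(val text: String) : StreamFrame
    data class End(val finishReason: String?) : StreamFrame
}

// Fake standing in for requestLLMStreaming: emits a few canned frames.
fun requestLLMStreaming(message: String): Flow<StreamFrame> = object : Flow<StreamFrame> {
    override suspend fun collect(collector: FlowCollector<StreamFrame>) {
        for (chunk in listOf("Hello", ", ", "world")) {
            collector.emit(StreamFrame.Append(chunk))
        }
        collector.emit(StreamFrame.End("stop"))
    }
}

// Collect the flow, appending text frames as they stream in.
suspend fun collectResponse(message: String): String {
    val sb = StringBuilder()
    requestLLMStreaming(message).collect(object : FlowCollector<StreamFrame> {
        override suspend fun emit(value: StreamFrame) {
            when (value) {
                is StreamFrame.Append -> sb.append(value.text)
                is StreamFrame.End -> Unit // stream finished
            }
        }
    })
    return sb.toString()
}

// Minimal blocking runner so the sketch avoids a coroutine library; it only
// works because the fake flow never actually suspends.
fun <T> runSuspending(block: suspend () -> T): T {
    var outcome: Result<T>? = null
    block.startCoroutine(object : Continuation<T> {
        override val context = EmptyCoroutineContext
        override fun resumeWith(result: Result<T>) { outcome = result }
    })
    return outcome!!.getOrThrow()
}

fun main() {
    println(runSuspending { collectResponse("Say hello") }) // prints "Hello, world"
}
```

In real code you would instead collect the `kotlinx.coroutines.flow.Flow` inside a coroutine scope and, if `structureDefinition` was supplied, parse the accumulated text against that structure.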

fun requestLLMStreaming(message: String, structureDefinition: StructureDefinition?, executorService: ExecutorService? = null): Flow.Publisher<StreamFrame>(source)

Sends a request to the Large Language Model (LLM) and streams the response as a reactive-streams Publisher.

Return

A Publisher that emits StreamFrame objects representing the streamed response from the LLM.

Parameters

message

The message or query to be sent to the LLM for processing.

structureDefinition

An optional parameter specifying the structured data definition for parsing or validating the response.

executorService

An optional executor service used to run the underlying coroutine. Defaults to null, in which case a default executor is used.
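The second overload returns a `java.util.concurrent.Flow.Publisher`, which suggests it is the Java-interop entry point. The sketch below shows how such a publisher can be consumed; it is self-contained and hedged: `StreamFrame` is a hypothetical stand-in, and `requestLLMStreaming` is faked with a trivial synchronous publisher that ignores backpressure.

```kotlin
import java.util.concurrent.Flow

// Hypothetical stand-in for ai.koog.prompt.streaming.StreamFrame.
data class StreamFrame(val text: String)

// Fake standing in for the Publisher-returning overload: a trivial
// synchronous publisher that emits canned frames and ignores backpressure.
fun requestLLMStreaming(message: String): Flow.Publisher<StreamFrame> =
    Flow.Publisher { subscriber ->
        subscriber.onSubscribe(object : Flow.Subscription {
            override fun request(n: Long) {} // demand ignored in this sketch
            override fun cancel() {}
        })
        for (chunk in listOf("Hello", ", ", "world")) {
            subscriber.onNext(StreamFrame(chunk))
        }
        subscriber.onComplete()
    }

// Subscribe with a java.util.concurrent.Flow.Subscriber and accumulate text.
fun collectResponse(message: String): String {
    val sb = StringBuilder()
    requestLLMStreaming(message).subscribe(object : Flow.Subscriber<StreamFrame> {
        override fun onSubscribe(subscription: Flow.Subscription) =
            subscription.request(Long.MAX_VALUE) // unbounded demand
        override fun onNext(item: StreamFrame) { sb.append(item.text) }
        override fun onError(throwable: Throwable) = throwable.printStackTrace()
        override fun onComplete() {} // nothing to clean up in this sketch
    })
    return sb.toString()
}

fun main() {
    println(collectResponse("Say hello")) // prints "Hello, world"
}
```

A real publisher from this API would deliver frames asynchronously (on the given `executorService`, or a default one), so production subscribers should handle `onError` and signal completion to the caller, for example via a `CountDownLatch` or `CompletableFuture`.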