onLLMStreamingStarting

suspend fun onLLMStreamingStarting(
    eventId: String,
    executionInfo: AgentExecutionInfo,
    runId: String,
    prompt: Prompt,
    model: LLModel,
    tools: List<ToolDescriptor>,
    context: AIAgentContext
)

Invoked before streaming from a language model begins.

This method notifies all registered stream handlers that streaming is about to start, allowing them to perform preprocessing or logging operations.

Parameters

eventId
The unique identifier for the event group.

executionInfo
The execution information for the LLM streaming event.

runId
The unique identifier for this streaming session.

prompt
The prompt being sent to the language model.

model
The language model being used for streaming.

tools
The list of available tool descriptors for this streaming session.

context
The AI agent context associated with the streaming operation.
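For orientation, the sketch below shows a handler that implements this signature to perform the kind of logging or preprocessing described above. The class name LoggingStreamHandler, the import paths, and the log format are illustrative assumptions; only the parameter list is taken from this page, and the handler still needs to be registered with the agent's pipeline or event-handler feature in whatever way your framework version supports.

// Package paths below are assumptions based on a typical module layout; adjust as needed.
// AgentExecutionInfo and AIAgentContext come from the agent core module; their imports are
// omitted here because the exact packages may differ by version.
import ai.koog.agents.core.tools.ToolDescriptor
import ai.koog.prompt.dsl.Prompt
import ai.koog.prompt.llm.LLModel

// Hypothetical handler that mirrors the documented signature: it logs key facts
// about the upcoming streaming call before any tokens arrive.
class LoggingStreamHandler {
    suspend fun onLLMStreamingStarting(
        eventId: String,
        executionInfo: AgentExecutionInfo,
        runId: String,
        prompt: Prompt,
        model: LLModel,
        tools: List<ToolDescriptor>,
        context: AIAgentContext
    ) {
        // Only the parameters documented above are used; no framework internals are assumed.
        println("[$eventId / $runId] LLM streaming starting: model=$model, tools available=${tools.size}")
    }
}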