onLLMStreamingFrameReceived

suspend fun onLLMStreamingFrameReceived(
    eventId: String,
    executionInfo: AgentExecutionInfo,
    runId: String,
    prompt: Prompt,
    model: LLModel,
    streamFrame: StreamFrame,
    context: AIAgentContext
)

Invoked when a stream frame is received during the streaming process.

This method notifies all registered stream handlers about each incoming stream frame, allowing them to process, transform, or aggregate the streaming content in real time (see the example after the parameter list).

Parameters

eventId

The unique identifier for the event group.

executionInfo

The execution information for the LLM streaming event.

runId

The unique identifier for this streaming session.

prompt

The prompt being sent to the language model.

model

The language model being used for streaming.

streamFrame

The individual stream frame containing partial response data.

context

The AI agent context associated with the streaming operation.
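Example

A minimal sketch of a handler that implements this hook, assuming a plain class whose member matches the signature above; only that signature is taken from this page. How the handler is registered with the agent, and the concrete accessors of StreamFrame, are not shown here, so the frame is rendered with toString().

// Imports for the parameter types are omitted because their packages are not listed on this page.
class LoggingStreamHandler {

    // Accumulates the partial response data carried by incoming frames.
    private val transcript = StringBuilder()

    suspend fun onLLMStreamingFrameReceived(
        eventId: String,
        executionInfo: AgentExecutionInfo,
        runId: String,
        prompt: Prompt,
        model: LLModel,
        streamFrame: StreamFrame,
        context: AIAgentContext
    ) {
        // Render the frame generically and aggregate it into the running transcript.
        val rendered = streamFrame.toString()
        transcript.append(rendered)
        println("run=$runId event=$eventId frame=$rendered")
    }
}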