nodeLLMRequestStreamingAndSendResults

A node that performs LLM streaming, collects all stream frames, converts them to response messages, and updates the prompt with the results.

This node is useful when you want to:

  • Stream responses from the LLM for real-time feedback

  • Collect the complete streamed response as messages

  • Automatically update the conversation history with the streamed responses

The node will:

  1. Initiate a streaming request to the LLM

  2. Collect all stream frames (text, tool calls, etc.)

  3. Convert the collected frames into proper Message.Response objects

  4. Update the prompt with these messages for conversation continuity

  5. Return the collected messages

Return

A node delegate that accepts input of type T and returns a list of response messages
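
The returned list can be inspected directly by a downstream node or edge transformation. A hedged example, assuming the response hierarchy exposes Message.Assistant and Message.Tool.Call subtypes and a content property (these names are assumptions, not confirmed by this page):

    // Hypothetical post-processing of the node's output.
    fun summarize(responses: List<Message.Response>): String {
        // Separate tool-call messages from plain assistant text (subtype names assumed).
        val toolCalls = responses.filterIsInstance<Message.Tool.Call>()
        val assistantText = responses
            .filterIsInstance<Message.Assistant>()
            .joinToString("\n") { it.content }
        return "assistant text: $assistantText (plus ${toolCalls.size} tool call(s))"
    }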

Parameters

T

The type of input this node accepts (passed through without modification)

name

Optional node name for identification in the agent graph

structureDefinition

Optional structure definition to guide the LLM's response format
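
Both parameters are optional. A hedged declaration passing them explicitly; UserQuery and myStructureDefinition are illustrative placeholders, with the latter standing in for a structure definition built with the library's structured-output utilities:

    // Illustrative only: `UserQuery` and `myStructureDefinition` are assumed to be defined elsewhere.
    val structuredStreaming by nodeLLMRequestStreamingAndSendResults<UserQuery>(
        name = "answer-with-structure",              // identifies the node in the agent graph
        structureDefinition = myStructureDefinition  // guides the LLM's response format
    )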

See also

  • The related node for streaming without automatic prompt updates

  • The underlying streaming functionality used by this node