nodeLLMRequestStreamingAndSendResults
A node that performs a streaming LLM request, collects all stream frames, converts them into response messages, and updates the prompt with the results.
This node is useful when you want to:
- Stream responses from the LLM for real-time feedback
- Collect the complete streamed response as messages
- Automatically update the conversation history with the streamed responses
The node will:
1. Initiate a streaming request to the LLM
2. Collect all stream frames (text, tool calls, etc.)
3. Convert the collected frames into proper Message.Response objects
4. Update the prompt with these messages for conversation continuity
5. Return the collected messages (see the usage sketch below)
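For example, a minimal sketch of wiring this node into a strategy graph. It assumes the Koog strategy DSL (strategy, nodeStart/nodeFinish, forwardTo, transformed); the strategy name and the way the collected messages are joined into the final output are illustrative, not part of this node's API.

```kotlin
import ai.koog.agents.core.dsl.builder.forwardTo
import ai.koog.agents.core.dsl.builder.strategy

val agentStrategy = strategy<String, String>("streaming-demo") {
    // Declare the node; T = String is the pass-through input type.
    val streamAndSend by nodeLLMRequestStreamingAndSendResults<String>()

    // Feed the agent input straight into the streaming node.
    edge(nodeStart forwardTo streamAndSend)

    // The node returns the collected Message.Response list; turn it into the
    // strategy's String output before finishing.
    edge(streamAndSend forwardTo nodeFinish transformed { messages ->
        messages.joinToString(separator = "\n") { it.content }
    })
}
```

Because the prompt is updated automatically, any downstream node that issues another LLM request continues from a conversation history that already contains the streamed responses.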
Return
A node delegate that accepts input of type T and returns a list of response messages
Parameters
- T: The type of input this node accepts (passed through without modification)
- Optional node name for identification in the agent graph
- Optional structure definition to guide the LLM's response format
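A sketch of supplying the optional arguments at declaration time. The parameter names shown here (name, structureDefinition) are assumptions for illustration, not confirmed identifiers from the signature.

```kotlin
// Parameter names (name, structureDefinition) are assumed for illustration.
val streamingNode by nodeLLMRequestStreamingAndSendResults<String>(
    name = "stream-and-send"              // identifies the node in the agent graph
    // structureDefinition = myDefinition // would guide the LLM's response format
)
```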
See also
nodeLLMRequestStreaming for streaming without automatic prompt updates
requestLLMStreaming for the underlying streaming functionality