requestLLMStreaming
Sends a streaming request to the underlying LLM and returns the response as a stream of frames. This method verifies that the session is active before executing the request.
Return
A flow emitting StreamFrame objects that represent the streaming output of the language model.
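A minimal usage sketch of collecting the returned flow. The `session` receiver and the `StreamFrame` subtype names (`Append`, `End`) used here are assumptions for illustration and may differ from the actual API:

```kotlin
// Hypothetical: collect streamed frames from an active session.
// `session`, `StreamFrame.Append`, and `StreamFrame.End` are assumed names.
val frames = session.requestLLMStreaming()
frames.collect { frame ->
    when (frame) {
        is StreamFrame.Append -> print(frame.text)        // partial model output
        is StreamFrame.End    -> println()                 // stream finished
        else                  -> { /* other frame kinds, e.g. tool calls */ }
    }
}
```

Because the result is a flow, no request is sent until a terminal operator such as `collect` runs, and cancellation of the collecting coroutine cancels the stream.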