LLMStreamingStartingHandler
A functional interface whose implementation runs just before streaming from a large language model (LLM) begins. It receives the prompt, the available tools, the target LLM model, and a unique run identifier, allowing preprocessing or validation before any tokens are streamed.
This is particularly useful for custom input manipulation, logging, validation, or applying configuration to the streaming request based on external context.
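Because this is a functional interface, it can typically be implemented with a lambda. The sketch below is illustrative only: the parameter names, their types, and the handler's exact signature are assumptions for the purpose of the example and are defined by the library, so they may differ from what is shown here.

```kotlin
// Hypothetical sketch: the lambda parameters (prompt, tools, model, runId)
// and their order are assumed for illustration, not taken from the library's API.
val streamingStartingHandler = LLMStreamingStartingHandler { prompt, tools, model, runId ->
    // Log the run before any tokens are streamed.
    println("Run $runId: starting stream from $model with ${tools.size} tool(s) available")

    // Example validation: fail fast on an empty prompt before the request is sent.
    require(prompt.messages.isNotEmpty()) { "Prompt must not be empty for run $runId" }
}
```

A handler like this could then be registered wherever the library accepts an `LLMStreamingStartingHandler`, so that the logic runs once per streaming request before any output is produced.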