executeStreaming

open suspend override fun executeStreaming(prompt: Prompt, model: LLModel): Flow&lt;String&gt;

Executes the given prompt with the specified model and streams the response as a flow of text chunks, allowing consumers to process output incrementally as it is produced.

Parameters

prompt

The prompt to execute, containing the messages and parameters.

model

The LLM model to use for execution.
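A minimal usage sketch for collecting the streamed flow. The helper function name `streamAndCollect` and the way `executor`, `prompt`, and `model` are obtained are assumptions for illustration; `PromptExecutor`, `Prompt`, and `LLModel` are the types this page documents, and `collect` is the standard kotlinx.coroutines terminal operator on `Flow`:

```kotlin
// Hypothetical consumer of executeStreaming: prints each chunk as it
// arrives and returns the assembled response. The executor, prompt, and
// model are assumed to be constructed elsewhere with a concrete backend.
suspend fun streamAndCollect(
    executor: PromptExecutor,
    prompt: Prompt,
    model: LLModel,
): String {
    val full = StringBuilder()
    executor.executeStreaming(prompt, model).collect { chunk ->
        print(chunk)        // render incrementally, e.g. to a console or UI
        full.append(chunk)  // accumulate the complete response text
    }
    return full.toString()
}
```

Because the result is a cold `Flow`, no request is made until a terminal operator such as `collect` is applied, and standard flow operators (`onEach`, `map`, `catch`) can be composed before collection.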