LLMClient

interface LLMClient

Common interface for direct communication with LLM providers. It defines methods for executing prompts, streaming responses, and moderating content.

Functions

abstract suspend fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor> = emptyList()): List<Message.Response>

Executes a prompt and returns a list of response messages.
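A minimal usage sketch for execute. The concrete client (OpenAILLMClient), model catalog (OpenAIModels.Chat.GPT4o), and prompt builder DSL are assumptions about the surrounding framework; only the execute signature itself comes from this page.

```kotlin
// Sketch, assuming a provider-specific client such as OpenAILLMClient
// and a prompt {} builder DSL exist in the framework.
suspend fun runOnce() {
    val client = OpenAILLMClient(apiKey = System.getenv("OPENAI_API_KEY"))

    val prompt = prompt("greeting") {
        system("You are a concise assistant.")
        user("Say hello in one sentence.")
    }

    // tools defaults to emptyList(); pass ToolDescriptors to enable tool calls.
    val responses: List<Message.Response> = client.execute(
        prompt = prompt,
        model = OpenAIModels.Chat.GPT4o,
    )
    responses.forEach { println(it) }
}
```

Because execute is a suspend function, it must be called from a coroutine or another suspend function.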

open suspend fun executeMultipleChoices(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<LLMChoice>

Executes a prompt and returns a list of LLM choices, i.e. multiple alternative completions for the same prompt.
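A sketch of requesting several alternative completions. Note that, unlike execute, the tools parameter has no default here, so an explicit list must be passed; how you select among the returned LLMChoice values is up to the caller.

```kotlin
// Sketch: request multiple completions and inspect each one.
// The shape of LLMChoice beyond being a list element is not assumed.
suspend fun compareChoices(client: LLMClient, prompt: Prompt, model: LLModel) {
    val choices: List<LLMChoice> = client.executeMultipleChoices(
        prompt = prompt,
        model = model,
        tools = emptyList(), // no default value on this overload
    )
    choices.forEachIndexed { index, choice ->
        println("Choice $index: $choice")
    }
}
```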

abstract fun executeStreaming(prompt: Prompt, model: LLModel): Flow<String>

Executes a prompt and returns a streaming flow of response chunks.
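A sketch of consuming the streaming variant. Unlike the other methods, executeStreaming is not itself a suspend function: it returns a cold Flow, and suspension happens when the flow is collected.

```kotlin
import kotlinx.coroutines.flow.Flow

// Sketch: print response chunks incrementally as the model produces them.
suspend fun streamToConsole(client: LLMClient, prompt: Prompt, model: LLModel) {
    val chunks: Flow<String> = client.executeStreaming(prompt, model)
    chunks.collect { chunk ->
        print(chunk) // each emission is a partial response fragment
    }
    println() // finish the line once the stream completes
}
```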

abstract suspend fun moderate(prompt: Prompt, model: LLModel): ModerationResult

Analyzes the provided prompt for violations of content policies or other moderation criteria.
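A sketch of gating execution on a moderation pass. The isHarmful property on ModerationResult is an assumption about its shape; check the actual type before relying on it.

```kotlin
// Sketch: only execute the prompt if moderation raises no flags.
// ModerationResult.isHarmful is assumed, not confirmed by this page.
suspend fun executeIfSafe(
    client: LLMClient,
    prompt: Prompt,
    model: LLModel,
): List<Message.Response>? {
    val verdict = client.moderate(prompt, model)
    return if (verdict.isHarmful) {
        null // caller decides how to handle rejected prompts
    } else {
        client.execute(prompt, model)
    }
}
```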