LLMClient

interface LLMClient

Common interface for direct communication with LLM providers. This interface defines methods for executing prompts and streaming responses.

Functions

abstract suspend fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor> = emptyList()): List<Message.Response>

Executes a prompt against the given model, optionally exposing the provided tools, and returns the model's response messages.
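A minimal sketch of the execute() shape. The types below are simplified stand-ins for the library's Prompt and Message.Response (hypothetical, to keep the example dependency-free); the real method is a suspend function taking an LLModel and tool descriptors.

```kotlin
// Simplified stand-ins for the library's Prompt and Message.Response types.
data class Prompt(val text: String)
data class Response(val content: String)

// A toy client following the execute() shape: one prompt in, a list of
// response messages out. A real LLMClient would call a provider here,
// and the real execute() is suspend (suspension elided for brevity).
class EchoClient {
    fun execute(prompt: Prompt): List<Response> =
        listOf(Response("echo: " + prompt.text))
}

fun main() {
    val responses = EchoClient().execute(Prompt("hello"))
    println(responses.first().content) // prints "echo: hello"
}
```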

open suspend fun executeMultipleChoices(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<LLMChoice>

Executes a prompt and returns multiple alternative completions (LLM choices) for the same prompt.
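A toy sketch of the multiple-choices shape: one prompt in, several candidate completions back. The Choice type and the n parameter are hypothetical stand-ins; the real method returns List<LLMChoice>.

```kotlin
// Hypothetical stand-in for the library's LLMChoice type.
data class Choice(val text: String)

// One prompt yields n candidate completions, returned together so the
// caller can rank or pick among them.
fun executeMultipleChoices(prompt: String, n: Int = 3): List<Choice> =
    (1..n).map { i -> Choice("candidate $i for: $prompt") }

fun main() {
    val choices = executeMultipleChoices("greet the user", n = 2)
    println(choices.size) // prints "2"
}
```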

open fun executeStreaming(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor> = emptyList()): Flow<StreamFrame>

Executes a prompt and returns a cold flow that emits response frames incrementally as the model streams them.
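The real executeStreaming() returns a kotlinx.coroutines Flow&lt;StreamFrame&gt;; the sketch below uses a lazy Sequence to mimic the same chunk-by-chunk shape without a coroutines dependency, with Frame as a hypothetical stand-in for StreamFrame.

```kotlin
// Hypothetical stand-in for the library's StreamFrame type.
data class Frame(val chunk: String)

// A toy streaming client: nothing is produced until the caller starts
// consuming, and frames arrive one at a time, like a cold Flow.
class StreamingEcho {
    fun executeStreaming(prompt: String): Sequence<Frame> = sequence {
        prompt.split(" ").forEach { yield(Frame(it)) }
    }
}

fun main() {
    val text = StreamingEcho().executeStreaming("streamed hello world")
        .joinToString(" ") { it.chunk }
    println(text) // prints "streamed hello world"
}
```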

abstract fun llmProvider(): LLMProvider

Retrieves the LLMProvider instance associated with this client.

abstract suspend fun moderate(prompt: Prompt, model: LLModel): ModerationResult

Analyzes the provided prompt for violations of content policies or other moderation criteria.
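A toy illustration of the moderate() pattern, assuming a keyword-based check for simplicity; the real method delegates to the provider's moderation endpoint, and the ModerationResult below is a simplified stand-in for the library's type.

```kotlin
// Simplified stand-in for the library's ModerationResult.
data class ModerationResult(val flagged: Boolean, val reasons: List<String>)

// Hypothetical moderation: flag prompts containing banned terms and
// report which terms matched. A real client would ask the provider instead.
fun moderate(prompt: String, banned: Set<String> = setOf("secret")): ModerationResult {
    val hits = banned.filter { prompt.contains(it, ignoreCase = true) }
    return ModerationResult(flagged = hits.isNotEmpty(), reasons = hits)
}

fun main() {
    println(moderate("share the SECRET key").flagged) // prints "true"
}
```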

fun LLMClient.toRetryingClient(retryConfig: RetryConfig = RetryConfig.DEFAULT): RetryingLLMClient

Wraps this LLMClient in a RetryingLLMClient that retries failed requests according to the given RetryConfig.
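A hypothetical sketch of the behavior such a wrapper adds: re-invoke the underlying call up to a maximum number of attempts before rethrowing. FlakyClient, withRetry, and maxAttempts are illustrative names, not the library's API; RetryConfig may also control backoff and which errors are retryable.

```kotlin
// A client that fails a fixed number of times before succeeding,
// simulating transient provider errors.
class FlakyClient(private var failuresLeft: Int) {
    fun execute(prompt: String): String {
        if (failuresLeft > 0) { failuresLeft--; error("transient provider error") }
        return "ok: $prompt"
    }
}

// Hypothetical retry helper: retry on failure, rethrow the last error
// once the attempt budget is exhausted.
fun <T> withRetry(maxAttempts: Int = 3, block: () -> T): T {
    var last: Throwable? = null
    repeat(maxAttempts) {
        try { return block() } catch (e: IllegalStateException) { last = e }
    }
    throw last ?: error("unreachable")
}

fun main() {
    val client = FlakyClient(failuresLeft = 2)
    println(withRetry(maxAttempts = 3) { client.execute("ping") }) // prints "ok: ping"
}
```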