RetryingLLMClient
class RetryingLLMClient(delegate: LLMClient, config: RetryConfig = RetryConfig()) : LLMClient
A decorator that adds retry capabilities to any LLMClient implementation.
This is a pure decorator - it has no knowledge of specific providers or implementations. It simply wraps any LLMClient and retries operations based on configurable policies.
Example usage:
val client = AnthropicLLMClient(apiKey)
val retryingClient = RetryingLLMClient(client, RetryConfig.CONSERVATIVE)
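Conceptually, the decorator wraps each delegated call in a retry loop governed by the RetryConfig. The snippet below is only an illustrative sketch of that pattern; the function name retrying, the attempt count, and the exponential backoff are assumptions for illustration, not the library's actual implementation.
import kotlinx.coroutines.delay

// Illustrative retry loop around a suspending call (not the library's code).
suspend fun <T> retrying(
    maxAttempts: Int = 3,
    initialDelayMs: Long = 500L,
    block: suspend () -> T
): T {
    var delayMs = initialDelayMs
    repeat(maxAttempts - 1) {
        try {
            return block()
        } catch (e: Exception) {
            // A real policy would first check that the failure is retryable
            // (for example a rate limit or timeout) before backing off.
            delay(delayMs)
            delayMs *= 2
        }
    }
    return block() // final attempt: let any failure propagate
}
Because the policy is applied transparently inside the decorator, callers use the wrapped client exactly as they would use the delegate.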
Parameters
delegate
The LLMClient to wrap with retry logic
config
Configuration for retry behavior
Functions
open suspend override fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<Message.Response>
Executes a prompt and returns a list of response messages.
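A hedged usage sketch, assuming myPrompt and myModel are a Prompt and an LLModel already built with the library's own APIs (they are placeholders, not part of this page):
// Placeholders: myPrompt (Prompt) and myModel (LLModel) are assumed to exist.
val responses: List<Message.Response> =
    retryingClient.execute(prompt = myPrompt, model = myModel, tools = emptyList())
responses.forEach { response -> println(response) }
Failed attempts are retried according to the configured RetryConfig.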
open suspend override fun executeMultipleChoices(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<LLMChoice>
Executes a prompt and returns a list of LLM choices.
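A hedged sketch along the same lines (myPrompt and myModel are again placeholders):
// Request alternative completions and inspect the first choice, if any.
val choices: List<LLMChoice> =
    retryingClient.executeMultipleChoices(myPrompt, myModel, emptyList())
val firstChoice = choices.firstOrNull()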
open override fun executeStreaming(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): Flow<StreamFrame>
Executes a prompt and returns a streaming Flow of StreamFrame chunks.
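A hedged sketch of consuming the stream; collect is a suspending call, so this runs inside a coroutine, and myPrompt and myModel remain placeholders:
// Print each StreamFrame as it arrives.
retryingClient.executeStreaming(myPrompt, myModel, emptyList())
    .collect { frame -> println(frame) }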
Retrieves the configured LLMProvider instance in use.
Analyzes the provided prompt for violations of content policies or other moderation criteria.
Converts an instance of LLMClient into a retrying client with customizable retry behavior.
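This conversion is typically a one-liner. A hedged sketch follows, assuming the extension is named withRetry; the exact name and signature are not shown on this page and are an assumption here:
// Assumed extension name; consult the actual API for the exact function and defaults.
val resilientClient = AnthropicLLMClient(apiKey).withRetry(RetryConfig.CONSERVATIVE)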