AnthropicLLMClient

open class AnthropicLLMClient @JvmOverloads constructor(settings: AnthropicClientSettings = AnthropicClientSettings(), httpClient: KoogHttpClient, clock: Clock = Clock.System) : LLMClient

A client implementation for interacting with Anthropic's API through suspendable, coroutine-based calls.

This class supports executing text prompts and streaming interactions with the Anthropic API. It uses Kotlin Coroutines for asynchronous operations and supports full configuration of HTTP requests, including timeout handling and JSON serialization.

Parameters

settings

Configurable settings for the Anthropic client, which include the base URL and other options.

httpClient

A preconfigured Koog HTTP client used for API calls. Must have authentication and other request defaults already embedded. To use a Ktor-backed client with standard defaults, use the secondary constructor that accepts an API key and an io.ktor.client.HttpClient.

clock

Clock instance used for tracking response metadata timestamps.

Constructors

constructor(settings: AnthropicClientSettings = AnthropicClientSettings(), httpClient: KoogHttpClient, clock: Clock = Clock.System)
constructor(apiKey: String, settings: AnthropicClientSettings = AnthropicClientSettings(), baseClient: HttpClient = HttpClient(), clock: Clock = Clock.System)

Secondary constructor for creating an Anthropic client from a base Ktor HTTP client.
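
The secondary constructor can be sketched in use as follows; this is a hedged example assuming the defaults described above (the `ANTHROPIC_API_KEY` environment variable name is an illustrative choice, not part of the API):

```kotlin
// Construct a client from an API key via the secondary constructor.
// A Ktor-backed HttpClient with standard defaults is created internally;
// AnthropicClientSettings() supplies the default base URL and options.
val client = AnthropicLLMClient(
    apiKey = System.getenv("ANTHROPIC_API_KEY"), // illustrative key source
    settings = AnthropicClientSettings(),
)
```

Use the primary constructor instead when you need to supply a preconfigured `KoogHttpClient` with authentication and request defaults already embedded.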

Properties

open override val clientName: String

The name identifying this client and its Large Language Model (LLM) provider.

Functions

open override fun close()
open override suspend fun embed(text: String, model: LLModel): List<Double>

Embedding is not supported by the Anthropic API.

open override suspend fun embed(inputs: List<String>, model: LLModel): List<List<Double>>

Batch embedding is not supported by the Anthropic API.

open override suspend fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<Message.Response>
open suspend fun executeMultipleChoices(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<LLMChoice>
open fun executeStreaming(prompt: Prompt, model: LLModel): Flow<StreamFrame>
open override fun executeStreaming(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): Flow<StreamFrame>
open override fun llmProvider(): LLMProvider
open override suspend fun models(): List<LLModel>
open override suspend fun moderate(prompt: Prompt, model: LLModel): ModerationResult

Moderation is not supported by the Anthropic API.
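
A typical call sequence against this client can be sketched as below. This is a hedged, illustrative example: the `prompt { ... }` builder syntax and the way a model handle is obtained are assumptions about the surrounding library, not guaranteed by this page.

```kotlin
// Hypothetical usage sketch for AnthropicLLMClient.
suspend fun demo(client: AnthropicLLMClient, model: LLModel) {
    // Build a prompt; the builder DSL here is assumed.
    val request = prompt("example") {
        user("Summarize Kotlin coroutines in one sentence.")
    }

    // One-shot execution: returns the assistant's response messages.
    val responses = client.execute(request, model, tools = emptyList())
    responses.forEach { println(it) }

    // Streaming execution: frames arrive incrementally as a Flow<StreamFrame>.
    client.executeStreaming(request, model).collect { frame ->
        println(frame)
    }

    // Release underlying HTTP resources when done.
    client.close()
}
```

Note that `embed`, its batch overload, and `moderate` throw rather than degrade gracefully, since those capabilities are not offered by the Anthropic API; route embedding and moderation through a different provider's client if your application needs them.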