AbstractOpenAILLMClient

abstract class AbstractOpenAILLMClient<TResponse : OpenAIBaseLLMResponse, TStreamResponse : OpenAIBaseLLMStreamResponse>(settings: OpenAIBaseSettings, httpClient: KoogHttpClient, clock: Clock = Clock.System, logger: KLogger, toolsConverter: OpenAICompatibleToolDescriptorSchemaGenerator) : LLMClient

Abstract base class for OpenAI-compatible LLM clients. Provides common functionality for communicating with OpenAI and OpenAI-compatible APIs.

Parameters

settings

Configuration settings including base URL, API paths, and timeout configuration.

httpClient

A fully configured KoogHttpClient for making API requests. Must have authentication and other request defaults (base URL, timeouts, headers) already embedded. To use a Ktor-backed client with standard OpenAI-compatible defaults, use the secondary constructor that accepts an HttpClient and an API key.

clock

Clock instance used for tracking response metadata timestamps. Defaults to Clock.System.

logger

Logger used for client diagnostics.

toolsConverter

Converts ToolDescriptor instances into OpenAI-compatible JSON schemas for tool calling.

Constructors

constructor(settings: OpenAIBaseSettings, httpClient: KoogHttpClient, clock: Clock = Clock.System, logger: KLogger, toolsConverter: OpenAICompatibleToolDescriptorSchemaGenerator)
constructor(apiKey: String, settings: OpenAIBaseSettings, baseClient: HttpClient = HttpClient(), clientName: String = "OpenAICompatibleClient", clock: Clock = Clock.System, logger: KLogger, toolsConverter: OpenAICompatibleToolDescriptorSchemaGenerator)

Secondary constructor for creating a client backed by a Ktor HttpClient. Configures authentication, base URL, timeouts, and JSON serialization automatically from apiKey and settings.
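Because the class is abstract, the secondary constructor is invoked from a concrete subclass. The sketch below shows the wiring under stated assumptions: MyResponse, MyStreamResponse, and mySettings are placeholders (not library types), and KotlinLogging.logger from the kotlin-logging library is assumed as the KLogger source. A real subclass may need to implement further abstract members beyond those listed on this page.

```kotlin
// Hypothetical concrete subclass using the apiKey-based secondary constructor.
// MyResponse, MyStreamResponse, and mySettings are placeholders, not library names.
class MyCompatibleClient(apiKey: String) :
    AbstractOpenAILLMClient<MyResponse, MyStreamResponse>(
        apiKey = apiKey,
        settings = mySettings,
        logger = KotlinLogging.logger { },
        toolsConverter = OpenAICompatibleToolDescriptorSchemaGenerator(),
    ) {

    override fun llmProvider(): LLMProvider = LLMProvider.OpenAI

    override suspend fun moderate(prompt: Prompt, model: LLModel): ModerationResult =
        throw UnsupportedOperationException("Moderation is not supported by this provider")
}
```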

Types

object Companion

Properties

open val clientName: String

Functions

open override fun close()
open suspend fun embed(text: String, model: LLModel): List<Double>
open suspend fun embed(inputs: List<String>, model: LLModel): List<List<Double>>
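A minimal usage sketch for the two embed overloads, assuming `client` is a concrete subclass instance and `embeddingModel` is an embeddings-capable LLModel (both names are placeholders):

```kotlin
// Single input: one embedding vector.
val vector: List<Double> = client.embed("Hello, world!", embeddingModel)

// Batch input: one vector per input string, in the same order.
val vectors: List<List<Double>> = client.embed(listOf("first", "second"), embeddingModel)
```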
open suspend override fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<Message.Response>
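A sketch of a non-streaming call, assuming `client` is a concrete subclass instance; the `prompt { }` builder DSL and the `OpenAIModels.Chat.GPT4o` constant are taken from the wider Koog ecosystem and may differ in your setup:

```kotlin
val responses: List<Message.Response> = client.execute(
    prompt = prompt("demo") {
        user("Summarize Kotlin coroutines in one sentence.")
    },
    model = OpenAIModels.Chat.GPT4o, // assumed model constant
    tools = emptyList(),             // no tool calling in this sketch
)
responses.forEach { println(it) }
```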
open suspend override fun executeMultipleChoices(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<LLMChoice>
open fun executeStreaming(prompt: Prompt, model: LLModel): Flow<StreamFrame>
open override fun executeStreaming(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): Flow<StreamFrame>
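Streaming returns a cold Flow of StreamFrame values that can be collected as they arrive. A sketch, assuming `client`, `myPrompt`, and `model` are in scope and that `StreamFrame.Append` with a `text` property is the incremental-text frame variant (an assumption about the frame hierarchy):

```kotlin
client.executeStreaming(myPrompt, model).collect { frame ->
    when (frame) {
        is StreamFrame.Append -> print(frame.text) // incremental text chunk
        else -> { /* tool calls, end-of-stream markers, etc. */ }
    }
}
```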
abstract fun llmProvider(): LLMProvider
open suspend fun models(): List<LLModel>
abstract suspend fun moderate(prompt: Prompt, model: LLModel): ModerationResult