MistralAILLMClient

open class MistralAILLMClient @JvmOverloads constructor(settings: MistralAIClientSettings = MistralAIClientSettings(), httpClient: KoogHttpClient, clock: Clock = Clock.System, toolsConverter: OpenAICompatibleToolDescriptorSchemaGenerator = OpenAICompatibleToolDescriptorSchemaGenerator()) : AbstractOpenAILLMClient<MistralAIChatCompletionResponse, MistralAIChatCompletionStreamResponse> (source)

Implementation of LLMClient for Mistral AI.

Parameters

settings

The base URL, chat completion path, and timeouts for the Mistral AI API

httpClient

A fully configured KoogHttpClient for making API requests. Use the secondary constructor to create a Ktor-backed client configured with an API key.

clock

Clock instance used for tracking response metadata timestamps

toolsConverter

Generator used to convert ToolDescriptor definitions into OpenAI-compatible JSON schemas

Constructors

constructor(settings: MistralAIClientSettings = MistralAIClientSettings(), httpClient: KoogHttpClient, clock: Clock = Clock.System, toolsConverter: OpenAICompatibleToolDescriptorSchemaGenerator = OpenAICompatibleToolDescriptorSchemaGenerator())
constructor(apiKey: String, settings: MistralAIClientSettings = MistralAIClientSettings(), baseClient: HttpClient = HttpClient(), clock: Clock = Clock.System, toolsConverter: OpenAICompatibleToolDescriptorSchemaGenerator = OpenAICompatibleToolDescriptorSchemaGenerator())
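A minimal construction sketch using the secondary constructor, which builds a Ktor-backed client configured with the API key. The environment-variable name is an assumption; all other parameters fall back to their defaults:

```kotlin
// Sketch only: how the key is obtained is up to you; import paths may
// differ across Koog versions.
val client = MistralAILLMClient(
    apiKey = System.getenv("MISTRAL_API_KEY") ?: error("MISTRAL_API_KEY not set"),
    // settings, baseClient, clock, and toolsConverter all use their defaults
)
```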

Properties

open override val clientName: String

Functions

open override fun close()
open suspend override fun embed(text: String, model: LLModel): List<Double>

Embeds the given text using the MistralAI embeddings API.

open suspend override fun embed(inputs: List<String>, model: LLModel): List<List<Double>>

Embeds the given inputs using the MistralAI embeddings API.
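A rough usage sketch for both overloads. These are suspend functions, so they must be called from a coroutine; `embeddingModel` is an assumed placeholder for an LLModel that supports the MistralAI embeddings API:

```kotlin
// Assumes `client` is a constructed MistralAILLMClient and `embeddingModel`
// is an LLModel with embedding capability.
val single: List<Double> = client.embed("Hello, world!", embeddingModel)
val batch: List<List<Double>> = client.embed(listOf("first text", "second text"), embeddingModel)
```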

open suspend override fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<Message.Response>
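A sketch of a basic call, assuming `prompt` and `chatModel` have been built elsewhere; the empty tool list is just an illustration:

```kotlin
// Inside a coroutine: executes the prompt and returns the model's responses.
val responses: List<Message.Response> = client.execute(
    prompt = prompt,
    model = chatModel,
    tools = emptyList(), // pass ToolDescriptor instances to enable tool calling
)
```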
open suspend override fun executeMultipleChoices(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<LLMChoice>
open fun executeStreaming(prompt: Prompt, model: LLModel): Flow<StreamFrame>
open override fun executeStreaming(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): Flow<StreamFrame>
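A streaming sketch: the returned Flow can be collected from a coroutine. The concrete StreamFrame subtypes are not shown in this page, so the example simply prints each frame; `prompt` and `chatModel` are assumed:

```kotlin
// Collect streamed frames as they arrive from the Mistral AI API.
client.executeStreaming(prompt, chatModel).collect { frame ->
    println(frame)
}
```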
open override fun llmProvider(): LLMProvider

Returns the specific implementation of the LLMProvider associated with this client

open suspend override fun models(): List<LLModel>

Fetches the list of available model IDs from the MistralAI service. https://docs.mistral.ai/api/endpoint/models

open suspend override fun moderate(prompt: Prompt, model: LLModel): ModerationResult

Moderates text and image content based on the provided model's capabilities.
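A moderation sketch, again inside a coroutine; `moderationModel` is an assumed placeholder for a model with moderation capability:

```kotlin
// Submit the prompt's content for moderation and inspect the result.
val result: ModerationResult = client.moderate(prompt, moderationModel)
```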