MistralAILLMClient

open class MistralAILLMClient(apiKey: String, settings: MistralAIClientSettings = MistralAIClientSettings(), baseClient: HttpClient = HttpClient(), clock: Clock = Clock.System, toolsConverter: OpenAICompatibleToolDescriptorSchemaGenerator = OpenAICompatibleToolDescriptorSchemaGenerator()) : AbstractOpenAILLMClient<MistralAIChatCompletionResponse, MistralAIChatCompletionStreamResponse>, LLMEmbeddingProvider

Implementation of LLMClient for Mistral AI.

Parameters

apiKey

The API key for the Mistral AI API

settings

The base URL, chat completion path, and timeouts for the Mistral AI API

clock

Clock instance used for tracking response metadata timestamps

baseClient

The underlying Ktor HttpClient used for HTTP communication with the API

toolsConverter

Converts ToolDescriptor definitions into OpenAI-compatible JSON schemas for tool calls

Constructors

constructor(apiKey: String, settings: MistralAIClientSettings = MistralAIClientSettings(), baseClient: HttpClient = HttpClient(), clock: Clock = Clock.System, toolsConverter: OpenAICompatibleToolDescriptorSchemaGenerator = OpenAICompatibleToolDescriptorSchemaGenerator())
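
A minimal construction sketch. All parameters except apiKey fall back to their defaults; the environment variable name below is illustrative, not part of the library:

```kotlin
// Sketch: constructing the client with default settings, HTTP client,
// clock, and tool schema generator. Only the API key is required.
val client = MistralAILLMClient(
    apiKey = System.getenv("MISTRAL_API_KEY")
        ?: error("MISTRAL_API_KEY is not set"),
)
```

Pass a custom MistralAIClientSettings or HttpClient when you need a non-default endpoint, timeouts, or HTTP engine configuration.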

Properties

open val clientName: String

Functions

open override fun close()
open suspend override fun embed(text: String, model: LLModel): List<Double>

Embeds the given text using the MistralAI embeddings API.
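
A sketch of an embedding call, assuming a previously constructed client; the model handle here is an assumption (any LLModel that supports embeddings), not a documented constant:

```kotlin
// Sketch: embedding a string via the MistralAI embeddings API.
// `embeddingModel` is an illustrative LLModel with embedding support.
suspend fun embedExample(client: MistralAILLMClient, embeddingModel: LLModel) {
    val vector: List<Double> = client.embed(
        text = "Mistral embeddings example",
        model = embeddingModel,
    )
    println("embedding dimension = ${vector.size}")
}
```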

open suspend override fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<Message.Response>
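
A sketch of a single chat-completion call, assuming Koog's prompt DSL; the prompt contents and the `chatModel` handle are illustrative assumptions:

```kotlin
// Sketch: executing a prompt without tools and printing the responses.
// `chatModel` is an illustrative LLModel for a Mistral chat model.
suspend fun executeExample(client: MistralAILLMClient, chatModel: LLModel) {
    val examplePrompt = prompt("example") {
        system("You are a concise assistant.")
        user("Summarize what an LLM client does in one sentence.")
    }

    val responses: List<Message.Response> =
        client.execute(examplePrompt, model = chatModel, tools = emptyList())
    responses.forEach { println(it.content) }
}
```

Passing a non-empty tools list lets the model respond with tool calls, which appear among the returned Message.Response values.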
open suspend override fun executeMultipleChoices(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<LLMChoice>
open override fun executeStreaming(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): Flow<StreamFrame>
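
A streaming sketch: the returned Flow emits StreamFrame values as the model generates output. The frame handling below is an assumption about how frames are consumed:

```kotlin
// Sketch: collecting streamed frames as they arrive.
// `prompt` and `chatModel` are illustrative, as above.
suspend fun streamingExample(client: MistralAILLMClient, examplePrompt: Prompt, chatModel: LLModel) {
    client.executeStreaming(examplePrompt, model = chatModel, tools = emptyList())
        .collect { frame: StreamFrame ->
            // Handle each incremental frame, e.g. append text to a UI.
            println(frame)
        }
}
```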
open override fun llmProvider(): LLMProvider

Returns the specific implementation of the LLMProvider associated with this client.

open suspend override fun models(): List<String>

Fetches the list of available model IDs from the MistralAI service. https://docs.mistral.ai/api/endpoint/models
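
A short sketch of listing the model IDs exposed by the service:

```kotlin
// Sketch: fetching and printing the available model IDs.
suspend fun modelsExample(client: MistralAILLMClient) {
    val ids: List<String> = client.models()
    ids.forEach(::println)
}
```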

open suspend override fun moderate(prompt: Prompt, model: LLModel): ModerationResult

Moderates text and image content based on the provided model's capabilities.
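
A moderation sketch; the `moderationModel` handle is an illustrative assumption for an LLModel with moderation capability:

```kotlin
// Sketch: moderating a prompt and inspecting the result.
// `examplePrompt` and `moderationModel` are illustrative, as above.
suspend fun moderateExample(client: MistralAILLMClient, examplePrompt: Prompt, moderationModel: LLModel) {
    val result: ModerationResult = client.moderate(examplePrompt, moderationModel)
    println(result)
}
```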