Package-level declarations

Types

class MistralAIClientSettings(baseUrl: String = "https://api.mistral.ai", chatCompletionsPath: String = "v1/chat/completions", val embeddingsPath: String = "v1/embeddings", val moderationPath: String = "v1/moderations", val modelsPath: String = "v1/models", timeoutConfig: ConnectionTimeoutConfig = ConnectionTimeoutConfig()) : OpenAIBaseSettings

Represents the settings for configuring a Mistral AI client.
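A minimal sketch of overriding the defaults shown in the constructor signature above, e.g. to route requests through a gateway. The proxy URL is illustrative; any parameter left out keeps its documented default:

```kotlin
// Hypothetical usage: point the client at an API-compatible gateway.
// Endpoint paths keep the defaults from the constructor signature above.
val settings = MistralAIClientSettings(
    baseUrl = "https://my-gateway.example.com",
    timeoutConfig = ConnectionTimeoutConfig()
)
```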

open class MistralAILLMClient(apiKey: String, settings: MistralAIClientSettings = MistralAIClientSettings(), baseClient: HttpClient = HttpClient(), clock: Clock = Clock.System) : AbstractOpenAILLMClient<MistralAIChatCompletionResponse, MistralAIChatCompletionStreamResponse> , LLMEmbeddingProvider

Implementation of LLMClient for Mistral AI.
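A minimal construction sketch based on the constructor signature above. Reading the key from an environment variable is an assumption of this example, not a library requirement; `settings`, `baseClient`, and `clock` fall back to their documented defaults:

```kotlin
// Hypothetical usage: create a client with an API key from the environment.
// The remaining constructor parameters use their default values.
val client = MistralAILLMClient(
    apiKey = System.getenv("MISTRAL_API_KEY")
        ?: error("MISTRAL_API_KEY is not set")
)
```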


Object containing a collection of predefined Mistral AI model configurations. These models span various use cases including chat, reasoning, coding, vision, and audio tasks.

class MistralAIParams(temperature: Double? = null, maxTokens: Int? = null, numberOfChoices: Int? = null, speculation: String? = null, schema: LLMParams.Schema? = null, toolChoice: LLMParams.ToolChoice? = null, user: String? = null, additionalProperties: Map<String, JsonElement>? = null, val topP: Double? = null, val stop: List<String>? = null, val randomSeed: Int? = null, val presencePenalty: Double? = null, val frequencyPenalty: Double? = null, val parallelToolCalls: Boolean? = null, val promptMode: String? = null, val safePrompt: Boolean? = null) : LLMParams

Mistral AI chat-completion parameters layered on top of LLMParams, adding Mistral-specific options such as topP, randomSeed, and safePrompt.
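A sketch of building a parameter set; field names come from the constructor signature above, while the values are illustrative assumptions:

```kotlin
// Hypothetical usage: Mistral-specific sampling parameters.
val params = MistralAIParams(
    temperature = 0.7,
    maxTokens = 1024,
    topP = 0.9,
    randomSeed = 42,   // fixed seed for reproducible sampling
    safePrompt = true  // request the API's built-in safety prompt
)
```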