MistralAILLMClient
open class MistralAILLMClient(
    apiKey: String,
    settings: MistralAIClientSettings = MistralAIClientSettings(),
    baseClient: HttpClient = HttpClient(),
    clock: Clock = Clock.System,
    toolsConverter: OpenAICompatibleToolDescriptorSchemaGenerator = OpenAICompatibleToolDescriptorSchemaGenerator()
) : AbstractOpenAILLMClient<MistralAIChatCompletionResponse, MistralAIChatCompletionStreamResponse>, LLMEmbeddingProvider
Implementation of LLMClient for Mistral AI.
Parameters
apiKey
The API key for the Mistral AI API.
settings
The client settings, including the base URL, chat completion path, and timeouts for the Mistral AI API.
baseClient
The underlying Ktor HttpClient used to send requests.
clock
Clock instance used for tracking response metadata timestamps.
toolsConverter
Converts ToolDescriptor definitions into OpenAI-compatible tool schemas.
Constructors
constructor(apiKey: String, settings: MistralAIClientSettings = MistralAIClientSettings(), baseClient: HttpClient = HttpClient(), clock: Clock = Clock.System, toolsConverter: OpenAICompatibleToolDescriptorSchemaGenerator = OpenAICompatibleToolDescriptorSchemaGenerator())
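A minimal construction-and-call sketch. The import paths, the `prompt { }` builder, and the `MistralAIModels` constant are assumptions about the Koog library layout, not taken from this page; adjust them to the version you depend on. All constructor parameters except `apiKey` fall back to their defaults here.

```kotlin
// NOTE: import paths and the model constant below are assumptions; verify
// them against your Koog version before use.
import ai.koog.prompt.dsl.prompt
import ai.koog.prompt.executor.clients.mistralai.MistralAILLMClient
import ai.koog.prompt.llm.MistralAIModels
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    // Read the key from the environment rather than hard-coding it.
    val client = MistralAILLMClient(apiKey = System.getenv("MISTRAL_API_KEY"))

    val responses = client.execute(
        prompt = prompt("demo") {
            system("You are a concise assistant.")
            user("Summarize Kotlin coroutines in one sentence.")
        },
        model = MistralAIModels.Chat.MistralSmall, // assumed model constant
        tools = emptyList(),                       // no tool calling in this sketch
    )
    responses.forEach { println(it) }
}
```

Passing a non-empty `tools` list enables tool calling; the `toolsConverter` constructor parameter controls how each `ToolDescriptor` is serialized for the API.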
Functions
open suspend override fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<Message.Response>
open suspend override fun executeMultipleChoices(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<LLMChoice>
open override fun executeStreaming(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): Flow<StreamFrame>
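Because `executeStreaming` returns a cold `Flow<StreamFrame>`, the response can be consumed incrementally. A sketch under the same assumptions as above (import paths, the model constant, and the `StreamFrame.Append` frame type are assumptions; adjust to your Koog version):

```kotlin
// NOTE: import paths, the model constant, and the StreamFrame subtype below
// are assumptions; verify them against your Koog version before use.
import ai.koog.prompt.dsl.prompt
import ai.koog.prompt.executor.clients.mistralai.MistralAILLMClient
import ai.koog.prompt.llm.MistralAIModels
import ai.koog.prompt.streaming.StreamFrame
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    val client = MistralAILLMClient(apiKey = System.getenv("MISTRAL_API_KEY"))

    client.executeStreaming(
        prompt = prompt("stream-demo") { user("Count from 1 to 5.") },
        model = MistralAIModels.Chat.MistralSmall, // assumed model constant
        tools = emptyList(),
    ).collect { frame ->
        // Print assistant text as it arrives; other frame kinds
        // (tool calls, end-of-stream markers) are ignored in this sketch.
        if (frame is StreamFrame.Append) print(frame.text)
    }
}
```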
Returns the specific implementation of the LLMProvider associated with this client.
Moderates text and image content based on the provided model's capabilities.