DashscopeLLMClient

class DashscopeLLMClient(
    apiKey: String,
    settings: DashscopeClientSettings = DashscopeClientSettings(),
    baseClient: HttpClient = HttpClient(),
    clock: Clock = Clock.System,
    toolsConverter: OpenAICompatibleToolDescriptorSchemaGenerator = OpenAICompatibleToolDescriptorSchemaGenerator()
) : AbstractOpenAILLMClient<DashscopeChatCompletionResponse, DashscopeChatCompletionStreamResponse>

Implementation of AbstractOpenAILLMClient for the DashScope API using its OpenAI-compatible endpoints.

Parameters

apiKey

The API key for the DashScope API

settings

Client settings: the base URL, chat completion path, and timeouts for the DashScope API. Defaults to the base URL "https://dashscope-intl.aliyuncs.com/compatible-mode/v1" and a 900s timeout

baseClient

HTTP client for making requests

clock

Clock instance used for tracking response metadata timestamps

toolsConverter

Schema generator used to convert ToolDescriptor definitions into OpenAI-compatible tool schemas

Constructors

constructor(apiKey: String, settings: DashscopeClientSettings = DashscopeClientSettings(), baseClient: HttpClient = HttpClient(), clock: Clock = Clock.System, toolsConverter: OpenAICompatibleToolDescriptorSchemaGenerator = OpenAICompatibleToolDescriptorSchemaGenerator())
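
As a sketch of typical construction (exact package imports depend on your Koog version and are omitted; the `DASHSCOPE_API_KEY` environment variable name is an illustrative assumption, not part of this API):

```kotlin
// Hedged construction sketch: reads the API key from an environment
// variable rather than hard-coding it. All other parameters fall back
// to their defaults: DashscopeClientSettings(), HttpClient(),
// Clock.System, and the OpenAI-compatible tool schema generator.
val client = DashscopeLLMClient(
    apiKey = System.getenv("DASHSCOPE_API_KEY")
        ?: error("DASHSCOPE_API_KEY is not set"),
)
```

Passing a custom `settings` or `baseClient` follows the same shape; the defaults shown in the signature above apply whenever an argument is omitted.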

Properties

open val clientName: String

Functions

open override fun close()
open suspend override fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<Message.Response>
open suspend override fun executeMultipleChoices(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<LLMChoice>
open override fun executeStreaming(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): Flow<StreamFrame>
open override fun llmProvider(): LLMProvider
open suspend fun models(): List<String>
open suspend override fun moderate(prompt: Prompt, model: LLModel): ModerationResult
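
The core call paths above can be sketched as follows. This is a hedged usage example, not official sample code: construction of `Prompt` and `LLModel` is framework-specific and assumed to happen elsewhere, and `execute` is a suspend function, so it must run inside a coroutine.

```kotlin
// Hypothetical usage sketch against the functions listed on this page.
suspend fun demo(client: DashscopeLLMClient, prompt: Prompt, model: LLModel) {
    // Single-shot execution with no tools attached.
    val responses: List<Message.Response> =
        client.execute(prompt, model, tools = emptyList())
    responses.forEach { println(it) }

    // Streaming: executeStreaming returns a Flow of frames;
    // collect them as they arrive from the API.
    client.executeStreaming(prompt, model, tools = emptyList())
        .collect { frame -> println(frame) }

    // Release the underlying HTTP client when done.
    client.close()
}
```

Passing a non-empty `tools` list enables tool calling; the `toolsConverter` supplied at construction time converts each `ToolDescriptor` into the OpenAI-compatible schema sent to DashScope.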