LLMClient API
Functions
abstract suspend fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor> = emptyList()): List<Message.Response>
Executes a prompt and returns a list of response messages.
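To illustrate the `execute` contract, here is a minimal self-contained sketch. `Prompt`, `LLModel`, `ToolDescriptor`, and `Message.Response` are stand-in types, not the library's real definitions, and `EchoClient` is a hypothetical implementation that completes synchronously.

```kotlin
import kotlin.coroutines.Continuation
import kotlin.coroutines.EmptyCoroutineContext
import kotlin.coroutines.startCoroutine

// Stand-in types mirroring the names in the signature above; the real
// library types are richer than these sketches.
data class Prompt(val text: String)
data class LLModel(val id: String)
data class ToolDescriptor(val name: String)
object Message { data class Response(val content: String) }

// Hypothetical client: echoes the prompt back as a single response message.
class EchoClient {
    suspend fun execute(
        prompt: Prompt,
        model: LLModel,
        tools: List<ToolDescriptor> = emptyList()
    ): List<Message.Response> =
        listOf(Message.Response("[${model.id}] ${prompt.text}"))
}

// Stdlib-only driver for a synchronously completing suspend call.
fun <T> runSuspend(block: suspend () -> T): T {
    var result: Result<T>? = null
    block.startCoroutine(object : Continuation<T> {
        override val context = EmptyCoroutineContext
        override fun resumeWith(r: Result<T>) { result = r }
    })
    return result!!.getOrThrow()
}

fun main() {
    val responses = runSuspend {
        EchoClient().execute(Prompt("Hello"), LLModel("demo-model"))
    }
    println(responses.single().content)  // prints "[demo-model] Hello"
}
```

A real implementation would suspend while awaiting the provider's network response; the list return type allows a single call to yield several messages (for example, an assistant message plus tool calls).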
open suspend fun executeMultipleChoices(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor> = emptyList()): List<LLMChoice>
Executes a prompt and returns a list of LLM choices.
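Since `executeMultipleChoices` is `open`, a client without native multi-choice support can provide a default by wrapping its single `execute` result as one choice. The following is a hedged sketch of that pattern with stand-in types; the library's actual default behavior may differ.

```kotlin
import kotlin.coroutines.Continuation
import kotlin.coroutines.EmptyCoroutineContext
import kotlin.coroutines.startCoroutine

// Stand-in types; the library's real Prompt/LLModel/choice types differ.
data class Prompt(val text: String)
data class LLModel(val id: String)
data class Response(val content: String)
typealias LLMChoice = List<Response>

// Hypothetical client: a backend without native multi-choice support
// satisfies executeMultipleChoices by wrapping its single execute()
// result as one LLMChoice.
class SingleChoiceClient {
    suspend fun execute(prompt: Prompt, model: LLModel): List<Response> =
        listOf(Response("[${model.id}] ${prompt.text}"))

    suspend fun executeMultipleChoices(prompt: Prompt, model: LLModel): List<LLMChoice> =
        listOf(execute(prompt, model))  // one choice holding all messages
}

// Stdlib-only driver for a synchronously completing suspend call.
fun <T> runSuspend(block: suspend () -> T): T {
    var result: Result<T>? = null
    block.startCoroutine(object : Continuation<T> {
        override val context = EmptyCoroutineContext
        override fun resumeWith(r: Result<T>) { result = r }
    })
    return result!!.getOrThrow()
}
```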
open fun executeStreaming(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): Flow<StreamFrame>
Executes a prompt and returns a streaming flow of response chunks.
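The consumption pattern for streaming can be sketched as follows. The real API returns a kotlinx.coroutines `Flow<StreamFrame>`; a `Sequence` stands in here so the sketch needs only the standard library, and both `StreamFrame`'s shape and `ChunkingClient` are assumptions, not the library's definitions.

```kotlin
// Stand-in frame type; the real StreamFrame carries more structure.
sealed interface StreamFrame {
    data class Append(val text: String) : StreamFrame
    object End : StreamFrame
}

// Hypothetical streaming client: emits the response in fixed-size chunks,
// then a terminal frame.
class ChunkingClient {
    fun executeStreaming(prompt: String, chunkSize: Int = 4): Sequence<StreamFrame> =
        sequence {
            prompt.chunked(chunkSize).forEach { yield(StreamFrame.Append(it)) }
            yield(StreamFrame.End)
        }
}

fun main() {
    val text = StringBuilder()
    for (frame in ChunkingClient().executeStreaming("streaming demo")) {
        when (frame) {
            is StreamFrame.Append -> text.append(frame.text)  // accumulate chunks
            StreamFrame.End -> println(text)                  // full text once done
        }
    }
}
```

With the real `Flow`, the `for` loop becomes `flow.collect { frame -> ... }` inside a coroutine; the accumulate-then-finalize shape is the same.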
Basic JSON schema generator supported by the LLMClient. Returns BasicJsonSchemaGenerator by default.
Standard JSON schema generator supported by the LLMClient. Returns StandardJsonSchemaGenerator by default.
Retrieves the LLMProvider instance associated with this client.
Analyzes the provided prompt for violations of content policies or other moderation criteria.
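The shape of a moderation check can be sketched as below. The exact signature and result type of the library's moderation method are not shown on this page, so everything here — `Prompt`, `ModerationResult`, and the keyword-based `KeywordModerator` — is a hypothetical stand-in illustrating the analyze-and-report pattern, not the real API.

```kotlin
// Stand-in types; the real moderation result is richer.
data class Prompt(val text: String)
data class ModerationResult(val flagged: Boolean, val categories: List<String>)

// Hypothetical keyword-based moderator: analyzes the prompt text and
// reports which banned terms (treated as categories here) it violates.
class KeywordModerator(private val banned: Set<String>) {
    fun moderate(prompt: Prompt): ModerationResult {
        val hits = banned.filter { prompt.text.lowercase().contains(it) }
        return ModerationResult(flagged = hits.isNotEmpty(), categories = hits)
    }
}
```

A real client would delegate this analysis to the provider's moderation endpoint rather than match keywords locally.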