GoogleLLMClient

open class GoogleLLMClient @JvmOverloads constructor(settings: GoogleClientSettings = GoogleClientSettings(), httpClient: KoogHttpClient, clock: Clock = Clock.System) : LLMClient

Implementation of LLMClient for Google's Gemini API.

This client supports both standard and streaming text generation with optional tool calling capabilities.

Parameters

settings

Custom client settings; defaults to the standard API endpoint and timeouts

httpClient

A preconfigured Koog HTTP client used for API calls. Must have authentication and other request defaults already embedded. To use a Ktor-backed client with standard defaults, use the secondary constructor that accepts an API key and an io.ktor.client.HttpClient.

clock

Clock instance used for tracking response metadata timestamps.

Constructors

constructor(settings: GoogleClientSettings = GoogleClientSettings(), httpClient: KoogHttpClient, clock: Clock = Clock.System)
constructor(apiKey: String, settings: GoogleClientSettings = GoogleClientSettings(), baseClient: HttpClient = HttpClient(), clock: Clock = Clock.System)

Secondary constructor for creating a GoogleLLMClient backed by a Ktor HTTP client.
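A minimal construction sketch using the secondary constructor. The import path is an assumption based on Koog's typical package layout and is not stated on this page:

```kotlin
// Sketch only: the import path below is an assumption about the Koog
// package layout, not part of this reference page.
import ai.koog.prompt.executor.clients.google.GoogleLLMClient

fun buildClient(): GoogleLLMClient {
    val apiKey = System.getenv("GOOGLE_API_KEY")
        ?: error("GOOGLE_API_KEY is not set")
    // Secondary constructor: wraps a default Ktor HttpClient with the
    // standard endpoint, timeouts, and authentication already applied.
    return GoogleLLMClient(apiKey = apiKey)
}
```

The primary constructor is intended for callers who already manage a preconfigured KoogHttpClient; the secondary constructor covers the common case of an API key plus default Ktor settings.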

Properties

open override val clientName: String

The name identifying this client implementation.

Functions

open override fun close()
open suspend override fun embed(text: String, model: LLModel): List<Double>

Embeds the given text using the Google AI embeddings API.

open suspend override fun embed(inputs: List<String>, model: LLModel): List<List<Double>>

Embeds the given inputs using the Google AI batch embeddings API.
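A hedged sketch of the two embedding overloads above. The `GoogleModels.TextEmbedding004` constant is an assumption; substitute whichever embedding `LLModel` your Koog version exposes:

```kotlin
// Sketch: import paths and the GoogleModels constant are assumptions
// about the surrounding Koog API, not defined on this page.
import ai.koog.prompt.executor.clients.google.GoogleLLMClient
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    val client = GoogleLLMClient(apiKey = System.getenv("GOOGLE_API_KEY"))

    // Single-text overload: one embedding vector.
    val vector: List<Double> =
        client.embed("hello world", GoogleModels.TextEmbedding004)

    // Batch overload: one vector per input, in input order.
    val batch: List<List<Double>> =
        client.embed(listOf("first", "second"), GoogleModels.TextEmbedding004)

    println("dims=${vector.size}, batch size=${batch.size}")
}
```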

open suspend override fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<Message.Response>
open suspend override fun executeMultipleChoices(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<LLMChoice>
open fun executeStreaming(prompt: Prompt, model: LLModel): Flow<StreamFrame>
open override fun executeStreaming(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): Flow<StreamFrame>
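A sketch of consuming the streaming overloads, which return a cold `Flow<StreamFrame>`. The prompt-DSL shape and the `GoogleModels` constant are assumptions about the surrounding Koog API:

```kotlin
// Sketch: the prompt DSL and GoogleModels.Gemini2_5Flash are assumed
// from typical Koog usage; only executeStreaming is defined on this page.
import ai.koog.prompt.dsl.prompt
import ai.koog.prompt.executor.clients.google.GoogleLLMClient
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    val client = GoogleLLMClient(apiKey = System.getenv("GOOGLE_API_KEY"))
    val p = prompt("demo") { user("Write a short haiku.") }

    // The flow is cold: the request is issued when collect starts, and
    // each StreamFrame carries an incremental chunk of the response.
    client.executeStreaming(p, GoogleModels.Gemini2_5Flash).collect { frame ->
        println(frame)
    }
}
```

The overload that also takes `tools: List<ToolDescriptor>` behaves the same way but may emit tool-call frames in addition to text frames.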
open override fun llmProvider(): LLMProvider
open suspend override fun models(): List<LLModel>

Retrieves the list of language models available through the Google API. See https://ai.google.dev/api/models#method:-models.list

open suspend override fun moderate(prompt: Prompt, model: LLModel): ModerationResult

Prompt moderation is not supported by the Google API; invoking this method always throws an exception.