SpringAiLLMClient
An LLMClient implementation that delegates to a Spring AI ChatModel.
This adapter allows Koog agents to use any Spring AI chat model provider (Anthropic, OpenAI, Google, Ollama, etc.) as their underlying LLM backend.
Tool execution is always owned by the Koog agent framework. Spring AI receives only tool definitions (via org.springframework.ai.tool.ToolCallback implementations whose call() always throws) together with internalToolExecutionEnabled=false, so Spring AI never attempts to execute tools itself.
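The definition-only tool pattern described above can be sketched as follows. Note that ToolDefinition, ToolCallback, and DefinitionOnlyToolCallback here are simplified local stand-ins invented for illustration, not the actual Spring AI or Koog types; the real adapter implements org.springframework.ai.tool.ToolCallback.

```kotlin
// Simplified stand-ins for the Spring AI tool types, so this sketch
// compiles without Spring AI on the classpath (hypothetical names).
interface ToolDefinition { val name: String }
interface ToolCallback {
    val toolDefinition: ToolDefinition
    fun call(toolInput: String): String
}

// The adapter exposes each Koog tool to Spring AI as a definition-only
// callback: Spring AI sees the tool's schema, but any attempt to execute
// it fails fast, because execution is owned by the Koog agent loop.
class DefinitionOnlyToolCallback(private val toolName: String) : ToolCallback {
    override val toolDefinition =
        object : ToolDefinition { override val name = toolName }

    override fun call(toolInput: String): String =
        throw UnsupportedOperationException(
            "Tool '$toolName' must be executed by the Koog agent, not by Spring AI"
        )
}

fun main() {
    val callback = DefinitionOnlyToolCallback("get_weather")
    println(callback.toolDefinition.name) // prints: get_weather
    println(runCatching { callback.call("{}") }.isFailure) // prints: true
}
```

Combined with internalToolExecutionEnabled=false, the throwing call() acts as a safety net: even if Spring AI's tool-execution path were somehow triggered, it would fail loudly instead of silently bypassing the agent.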
Parameters
the Spring AI chat model to delegate to
the LLMProvider to report for this client
the clock used for creating response metadata timestamps
the CoroutineDispatcher used for blocking model calls
optional customizer for provider-specific ChatOptions tuning
optional Spring AI ModerationModel for content moderation; if null, moderate throws UnsupportedOperationException
Constructors
Types
Functions
Streams LLM responses by subscribing to ChatModel.stream and converting each chunk into Koog StreamFrame events.
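The chunk-to-frame conversion described above can be sketched as follows. This is a hypothetical, simplified model: the real implementation subscribes to ChatModel.stream (a Reactor Flux of ChatResponse chunks) and emits Koog StreamFrame events, whereas here both sides are plain Kotlin types invented for illustration so the mapping logic is visible without Spring AI or Reactor on the classpath.

```kotlin
// Simplified stand-in for Koog's streaming frames (hypothetical shape).
sealed interface StreamFrame {
    data class Append(val text: String) : StreamFrame        // partial text delta
    data class End(val finishReason: String?) : StreamFrame  // terminal frame
}

// Simplified stand-in for a streamed Spring AI response chunk.
data class Chunk(val textDelta: String?, val finishReason: String? = null)

// Convert each chunk into zero or more frames: non-empty text deltas
// become Append frames, and a finish reason terminates the stream.
fun toFrames(chunks: Sequence<Chunk>): Sequence<StreamFrame> = sequence {
    for (chunk in chunks) {
        chunk.textDelta?.takeIf { it.isNotEmpty() }
            ?.let { yield(StreamFrame.Append(it)) }
        if (chunk.finishReason != null) yield(StreamFrame.End(chunk.finishReason))
    }
}

fun main() {
    val frames = toFrames(
        sequenceOf(Chunk("Hel"), Chunk("lo"), Chunk(null, finishReason = "stop"))
    ).toList()
    println(frames)
    // prints: [Append(text=Hel), Append(text=lo), End(finishReason=stop)]
}
```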
Returns a list containing a single model derived from the configured LLMProvider and ChatModel, with no capabilities or parameters reported.
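A minimal sketch of that single-model listing, assuming a simplified stand-in for the Koog model type (LLModel and its field names here are hypothetical, invented for illustration):

```kotlin
// Hypothetical stand-in for the Koog model descriptor.
data class LLModel(
    val provider: String,
    val id: String,
    val capabilities: List<String> = emptyList(), // intentionally empty:
    val parameters: Map<String, Any> = emptyMap() // not reported by this client
)

// Build the one-element list from the configured provider id and the
// delegate chat model's name, leaving capabilities and parameters empty.
fun models(providerId: String, modelName: String): List<LLModel> =
    listOf(LLModel(provider = providerId, id = modelName))

fun main() {
    println(models("anthropic", "claude-sonnet").single().id)
    // prints: claude-sonnet
}
```

Callers that need capability metadata should therefore not rely on this listing; it only identifies which provider and model the client delegates to.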