KoogSpringAiChatProperties

@ConfigurationProperties(prefix = "koog.spring.ai.chat")
data class KoogSpringAiChatProperties(val enabled: Boolean = true, val chatModelBeanName: String? = null, val moderationModelBeanName: String? = null, val provider: String? = null, val dispatcher: DispatcherConfig = DispatcherConfig())(source)

Configuration properties for the Koog Spring AI Chat Model adapter.

Prefix: koog.spring.ai.chat

Constructors

constructor(enabled: Boolean = true, chatModelBeanName: String? = null, moderationModelBeanName: String? = null, provider: String? = null, dispatcher: DispatcherConfig = DispatcherConfig())

Properties

val chatModelBeanName: String?

Optional bean name of the org.springframework.ai.chat.model.ChatModel to use when multiple chat models are registered. When null and exactly one chat model is registered, that single candidate is used by default.

val dispatcher: DispatcherConfig

Dispatcher / threading settings for blocking Spring AI model calls.

val enabled: Boolean

Whether the Koog Spring AI Chat auto-configuration is enabled. Default: true.

val moderationModelBeanName: String?

Optional bean name of the org.springframework.ai.moderation.ModerationModel to use when multiple moderation models are registered. When null, the single candidate (if any) is used; with multiple candidates the injection is skipped to avoid org.springframework.beans.factory.NoUniqueBeanDefinitionException.

val provider: String?

Optional LLM provider identifier (e.g. google, openai, anthropic). When set, the ai.koog.prompt.llm.LLMProvider passed to SpringAiLLMClient is resolved from the well-known Koog providers by this identifier. When null (default), the provider is auto-detected from the org.springframework.ai.chat.model.ChatModel implementation class name. If auto-detection fails, a generic spring-ai provider is used as a fallback.
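For illustration, a minimal application.yaml sketch that binds these properties under the koog.spring.ai.chat prefix. The bean names and provider value are hypothetical examples, not values defined by this class:

```yaml
koog:
  spring:
    ai:
      chat:
        enabled: true
        # Hypothetical provider id; see the provider property for detection rules.
        provider: openai
        # Hypothetical bean names; only needed when multiple ChatModel /
        # ModerationModel beans are registered in the context.
        chat-model-bean-name: openAiChatModel
        moderation-model-bean-name: openAiModerationModel
```

Relaxed binding applies as usual for Spring Boot, so chat-model-bean-name in YAML maps to the chatModelBeanName constructor parameter.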