OllamaLLMAutoConfiguration

@AutoConfiguration
@PropertySource(value = ["classpath:/META-INF/config/koog/ollama-llm.properties"])
@EnableConfigurationProperties(value = [OllamaKoogProperties::class])
class OllamaLLMAutoConfiguration(properties: OllamaKoogProperties)

Auto-configuration class for integrating the Ollama Large Language Model (LLM) service into applications.

This configuration initializes and provides the necessary beans to enable interaction with the Ollama LLM API. It relies on properties defined in the OllamaKoogProperties class to set up the service.

The configuration is conditional: its beans are only created if the ai.koog.ollama.enabled property is set to true, and the executor bean additionally requires an OllamaClient bean to be present in the context.

Initializes the following beans:

- OllamaClient: a client for communicating with the Ollama API, built from the settings in OllamaKoogProperties.
- SingleLLMPromptExecutor: a prompt executor that wraps the OllamaClient, with retry behavior taken from the application's properties.

This configuration allows seamless integration with the Ollama API while enabling properties-based customization.
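
For example, once ai.koog.ollama.enabled=true is set in the application properties, the auto-configured executor can be injected like any other Spring bean. A minimal sketch; the ChatService class is hypothetical and the Koog package paths may vary between versions:

import ai.koog.prompt.executor.llms.SingleLLMPromptExecutor
import org.springframework.stereotype.Service

// Hypothetical service: Spring injects the executor created by ollamaExecutor().
@Service
class ChatService(private val executor: SingleLLMPromptExecutor) {
    // Use the executor to run prompts against the local Ollama instance;
    // the exact execute(...) call depends on the Koog version in use.
}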

Constructors

constructor(properties: OllamaKoogProperties)

Functions

@Bean
@ConditionalOnBean(value = [OllamaClient::class])
fun ollamaExecutor(client: OllamaClient): SingleLLMPromptExecutor

Creates and configures an instance of SingleLLMPromptExecutor that wraps the provided OllamaClient. The configured executor includes retry capabilities based on the application's properties.
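
Because this bean is conditional on an OllamaClient being present, it also wraps a user-supplied client if one is defined in place of the auto-configured one. A sketch of such an override; the configuration class is hypothetical, and the baseUrl constructor parameter is an assumption that may differ between Koog versions:

import ai.koog.prompt.executor.ollama.client.OllamaClient
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration

// Hypothetical override: define your own OllamaClient; ollamaExecutor() will wrap it.
@Configuration
class CustomOllamaClientConfig {
    @Bean
    fun ollamaClient(): OllamaClient =
        OllamaClient(baseUrl = "http://ollama.internal:11434") // assumed parameter name
}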

@Bean
@ConditionalOnProperty(prefix = "ai.koog.ollama", name = ["enabled"], havingValue = "true")
fun ollamaLLMClient(): OllamaClient

Creates an OllamaClient bean configured with application properties.
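
This bean is only registered when ai.koog.ollama.enabled=true; components that need the raw client rather than the executor can inject it directly. A sketch, where the OllamaModelLister class is hypothetical:

import ai.koog.prompt.executor.ollama.client.OllamaClient
import org.springframework.stereotype.Component

// Hypothetical component: depends on the auto-configured client directly.
// It can only be created when ai.koog.ollama.enabled=true; otherwise no OllamaClient bean exists.
@Component
class OllamaModelLister(private val client: OllamaClient) {
    // Use the client for calls not covered by the prompt executor.
}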