OllamaClient
Client for interacting with the Ollama API, with comprehensive model support.
Implements:
LLMClient for executing prompts and streaming responses.
LLMEmbeddingProvider for generating embeddings from input text.
Parameters
The base URL of the Ollama server. Defaults to "http://localhost:11434".
The underlying HTTP client used for making requests.
Configuration for connection, request, and socket timeouts.
Clock instance used for tracking response metadata timestamps.
The ContextWindowStrategy to use for computing context window lengths. Defaults to ContextWindowStrategy.None.
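To illustrate what a context-window strategy does, here is a hypothetical re-sketch of the concept in plain Kotlin. The `WindowStrategy` interface, `none`, and `fixed` below are illustrative names only; the library's actual `ContextWindowStrategy` API may be shaped differently.

```kotlin
// Hypothetical sketch of the idea behind a context-window strategy:
// given a model's reported context length, decide which window length
// to use. Names here are assumptions, not the library's real API.
fun interface WindowStrategy {
    fun contextLength(modelDefault: Long): Long?
}

// "None"-style strategy: compute no explicit context window.
val none = WindowStrategy { null }

// A fixed-size strategy: always use a caller-chosen window length.
fun fixed(tokens: Long) = WindowStrategy { tokens }

fun main() {
    println(none.contextLength(8192))        // prints "null"
    println(fixed(4096).contextLength(8192)) // prints "4096"
}
```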
Constructors
Functions
Returns the model card for the given model name, or null if no such model exists on the server.
Returns model cards for all models available on the server.
Provides the type of Large Language Model (LLM) provider used by the client.
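Independent of this client, listing the models on an Ollama server corresponds to Ollama's public REST endpoint `GET /api/tags`. The sketch below calls that endpoint directly with the JDK's HTTP client, assuming an Ollama server is running at the default address; it bypasses `OllamaClient` entirely and only illustrates the kind of request the model-card functions above presumably wrap.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Ollama's REST endpoint for listing locally available models.
    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:11434/api/tags"))
        .GET()
        .build()

    val result = runCatching {
        HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString())
    }

    // Succeeds only when a local Ollama server is actually running.
    result
        .onSuccess { println(it.body()) } // JSON with a "models" array
        .onFailure { println("Ollama server not reachable: ${it.message}") }
}
```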