
A strategy for letting the Ollama server decide the context window length. Ollama determines the length as follows:

  • If the model definition contains a num_ctx parameter, the context window length is set to that value.

  • If an OLLAMA_CONTEXT_LENGTH environment variable is set, the context window length is set to that value.

  • Otherwise, the context window length is set to the default value of 2048.
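The fallback chain above can be sketched as follows. This is a minimal illustration of the precedence, not the actual implementation; the function and parameter names (`resolveContextLength`, `modelNumCtx`, `envContextLength`) are hypothetical:

```kotlin
// Hypothetical sketch of the precedence described above; names are illustrative only.
const val DEFAULT_CONTEXT_LENGTH = 2048L

fun resolveContextLength(modelNumCtx: Long?, envContextLength: Long?): Long =
    modelNumCtx                   // 1. num_ctx from the model definition, if present
        ?: envContextLength       // 2. OLLAMA_CONTEXT_LENGTH environment variable, if set
        ?: DEFAULT_CONTEXT_LENGTH // 3. otherwise, the default of 2048
```

For example, `resolveContextLength(4096L, 8192L)` returns `4096L` because the model definition takes precedence, while `resolveContextLength(null, null)` falls back to `2048L`.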

Functions

open override fun computeContextLength(prompt: Prompt, model: LLModel): Long?

Computes the context length for a given prompt and language model. This may involve calculating the number of tokens used in the prompt and determining if it fits within the model's context length constraints.
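One way such a computation might look is sketched below. This is purely illustrative: the `Prompt` and `LLModel` shapes here are stand-ins, not the real API types, and the character-based token estimate is an assumption, not the library's actual tokenization:

```kotlin
// Hypothetical stand-ins for the real Prompt and LLModel types.
data class Prompt(val text: String)
data class LLModel(val maxContextLength: Long)

// Rough token estimate (~4 characters per token, an assumed heuristic),
// returning null when the prompt would not fit within the model's limit.
fun computeContextLength(prompt: Prompt, model: LLModel): Long? {
    val estimatedTokens = (prompt.text.length / 4 + 1).toLong()
    return if (estimatedTokens <= model.maxContextLength) estimatedTokens else null
}
```

A `null` result signals that the prompt exceeds the model's context length constraints, mirroring the nullable `Long?` return type in the signature above.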