computeContextLength

open override fun computeContextLength(prompt: Prompt, model: LLModel): Long?

Computes the context length for a given prompt and language model. This may involve calculating the number of tokens used in the prompt and determining whether it fits within the model's context length constraints.

Return

The context length as a Long, indicating the number of tokens used in the prompt, or null if it cannot be calculated.

Parameters

prompt

The Prompt containing the list of messages, unique identifier, and language model parameters that describe the input for the LLM.

model

The LLModel representing the language model used to process the prompt, which includes its provider, identifier, capabilities, and context length.
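A minimal sketch of what an implementation of this contract might look like. The `Message`, `Prompt`, and `LLModel` stand-ins below are simplified assumptions (the real types carry more fields, such as provider and capabilities), the whitespace token estimate is a placeholder for the model's actual tokenizer, and returning null when the prompt exceeds the context window is one possible interpretation of "cannot be calculated":

```kotlin
// Hypothetical minimal stand-ins for the real Prompt / LLModel types.
data class Message(val content: String)
data class Prompt(val messages: List<Message>)
data class LLModel(val id: String, val contextLength: Long)

// Naive whitespace-based token estimate; a real executor would use the
// model provider's own tokenizer.
fun estimateTokens(text: String): Long =
    text.split(Regex("\\s+")).count { it.isNotBlank() }.toLong()

fun computeContextLength(prompt: Prompt, model: LLModel): Long? {
    val used = prompt.messages.sumOf { estimateTokens(it.content) }
    // One possible convention: return null when the estimate exceeds the
    // model's context window, i.e. no usable context length can be reported.
    return if (used <= model.contextLength) used else null
}

fun main() {
    val prompt = Prompt(listOf(Message("Hello world"), Message("How are you?")))
    val model = LLModel("example-model", contextLength = 4096)
    println(computeContextLength(prompt, model)) // prints 5
}
```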