FitPrompt

data class FitPrompt(val promptTokenizer: PromptTokenizer? = null, val contextChunkSize: Long = 2048, val minimumChunkCount: Long? = null, val maximumChunkCount: Long? = null) : ContextWindowStrategy

A strategy for computing the context window length based on the prompt length.

Parameters

promptTokenizer

The PromptTokenizer to use for computing the prompt length, or null to use the last reported token usage.

contextChunkSize

The granularity to use for computing the context window length. Defaults to 2048.

minimumChunkCount

The minimum number of chunks in the context window, or null for no minimum.

maximumChunkCount

The maximum number of chunks in the context window, or null for no maximum.

Example: with contextChunkSize = 512, minimumChunkCount = 2, and maximumChunkCount = 4, the minimum context length is 1024 tokens and the maximum is 2048.
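The clamped rounding described above can be sketched as plain arithmetic. `fitToChunks` is a hypothetical helper written for illustration, not part of this API:

```kotlin
// Hypothetical sketch: round a prompt length up to whole chunks, then clamp
// the chunk count between the optional minimum and maximum.
fun fitToChunks(
    promptTokens: Long,
    contextChunkSize: Long = 2048,
    minimumChunkCount: Long? = null,
    maximumChunkCount: Long? = null,
): Long {
    // Ceiling division: smallest chunk count that covers the prompt.
    var chunks = (promptTokens + contextChunkSize - 1) / contextChunkSize
    minimumChunkCount?.let { chunks = maxOf(chunks, it) }
    maximumChunkCount?.let { chunks = minOf(chunks, it) }
    return chunks * contextChunkSize
}
```

With the example values, a 700-token prompt rounds up to 2 chunks (1024 tokens), while a 3000-token prompt would need 6 chunks but is clamped to 4 (2048 tokens).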

Constructors

constructor(promptTokenizer: PromptTokenizer? = null, contextChunkSize: Long = 2048, minimumChunkCount: Long? = null, maximumChunkCount: Long? = null)

Properties

val contextChunkSize: Long = 2048
val maximumChunkCount: Long? = null
val minimumChunkCount: Long? = null
val promptTokenizer: PromptTokenizer? = null

Functions

open override fun computeContextLength(prompt: Prompt, model: LLModel): Long?

Computes the context length for a given prompt and language model. This may involve counting the tokens used in the prompt and determining whether it fits within the model's context length constraints.
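A self-contained sketch of this contract, including the fallback when no tokenizer is configured, might look as follows. `TokenCounter` and `lastReportedUsage` are stand-ins invented here; the real strategy uses PromptTokenizer and the last reported token usage:

```kotlin
// Stand-in for PromptTokenizer: anything that can count tokens in a prompt.
fun interface TokenCounter {
    fun count(prompt: String): Long
}

// Hypothetical sketch: prefer an exact token count from the tokenizer;
// otherwise fall back to the last reported usage; if neither is available,
// return null (no context length can be computed).
fun computeContextLengthSketch(
    prompt: String,
    tokenizer: TokenCounter?,
    lastReportedUsage: Long?,
    contextChunkSize: Long = 2048,
): Long? {
    val tokens = tokenizer?.count(prompt) ?: lastReportedUsage ?: return null
    // Round up to the chunk granularity.
    return ((tokens + contextChunkSize - 1) / contextChunkSize) * contextChunkSize
}
```

Returning null signals that the strategy has no basis for an estimate, leaving the context length to the model's default.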