latestTokenUsage
Represents the total token usage of the most recent response message in the current prompt.
This value is determined by iterating through the list of messages within the prompt and locating the last message of type Message.Response. If found, the tokensCount from its metadata is returned. If no response message exists, the value defaults to 0.
Useful for tracking the token count of the most recently generated response in the LLM chat flow.
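For illustration, a minimal Kotlin sketch of the lookup described above. The stand-in Message, Metadata, and tokensCount declarations are assumptions modelled on this description rather than the framework's actual classes, and the real property lives on the prompt rather than on a bare message list:

```kotlin
// Minimal stand-in types so the sketch compiles on its own; the real
// framework classes will differ from these.
sealed interface Message {
    data class Metadata(val tokensCount: Int)
    data class User(val content: String) : Message
    data class Response(val content: String, val metadata: Metadata) : Message
}

// The lookup this property performs, expressed as an extension over a plain
// message list for the purposes of this sketch.
val List<Message>.latestTokenUsage: Int
    get() = filterIsInstance<Message.Response>() // keep only response messages
        .lastOrNull()                            // most recent response, if any
        ?.metadata
        ?.tokensCount
        ?: 0                                     // no response message yet -> default to 0

fun main() {
    val messages = listOf(
        Message.User("Hello"),
        Message.Response("Hi!", Message.Metadata(tokensCount = 42)),
    )
    println(messages.latestTokenUsage)             // 42
    println(emptyList<Message>().latestTokenUsage) // 0
}
```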