LLMCallCompletedEvent

data class LLMCallCompletedEvent(val runId: String, val prompt: Prompt, val model: String, val responses: List<Message.Response>, val moderationResponse: ModerationResult? = null, val timestamp: Long = Clock.System.now().toEpochMilliseconds()) : DefinedFeatureEvent

Represents an event signaling the completion of an LLM (Large Language Model) call.

This event encapsulates the responses produced by the LLM, serving as a record that marks the end of a particular interaction cycle. It is used within the system to capture the relevant output data and to ensure proper tracking and logging of LLM-related interactions.
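A minimal sketch of how an observer might consume this event is shown below. The handler function itself is hypothetical and only the documented properties of the event are read; the import of LLMCallCompletedEvent is assumed.

// Hypothetical handler illustrating how the event's documented properties can be read.
fun onLLMCallCompleted(event: LLMCallCompletedEvent) {
    println("Run ${event.runId} completed with model ${event.model}")
    println("Received ${event.responses.size} response(s) at ${event.timestamp}")
    // moderationResponse is optional and only present when moderation was performed
    event.moderationResponse?.let { moderation ->
        println("Moderation result: $moderation")
    }
}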

Constructors

constructor(runId: String, prompt: Prompt, model: String, responses: List<Message.Response>, moderationResponse: ModerationResult? = null, timestamp: Long = Clock.System.now().toEpochMilliseconds())

Properties

open override val messageType: FeatureMessage.Type

Specifies the type of the feature message for this event.

val model: String

The description of the LLM model used during the call, in the format 'llm_provider:model_id'.
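For example, a value following this format might look like 'openai:gpt-4o' (an illustrative value; the actual provider and model identifiers depend on the LLM client in use).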

val moderationResponse: ModerationResult?

The moderation response, if any, returned by the LLM. This is typically used to capture and track content moderation results.

val prompt: Prompt

The input prompt, encapsulated as a Prompt object. This represents the structured set of messages and configuration parameters sent to the LLM.

val responses: List<Message.Response>

A list of responses generated by the LLM, represented as instances of Message.Response. Each response contains content, metadata, and additional context about the interaction.

val runId: String

The unique identifier of the LLM run.

open override val timestamp: Long

The timestamp of the event, in milliseconds since the Unix epoch.