BedrockLLMClient

class BedrockLLMClient @JvmOverloads constructor(bedrockClient: BedrockRuntimeClient, apiMethod: BedrockAPIMethod = BedrockAPIMethod.InvokeModel, moderationGuardrailsSettings: BedrockGuardrailsSettings? = null, fallbackModelFamily: BedrockModelFamilies? = null, clock: Clock = Clock.System) : LLMClient

Creates a new Bedrock LLM client configured with the given Bedrock runtime client and settings.

Return

A configured LLMClient instance for Bedrock

Parameters

bedrockClient

The AWS Bedrock runtime client used to communicate with the service. It can be fully configured (region, credentials, HTTP options) before being passed in.

apiMethod

The API method used to interact with Bedrock models that support messages. Defaults to BedrockAPIMethod.InvokeModel.

moderationGuardrailsSettings

Optional settings for AWS Bedrock Guardrails (see the AWS documentation) used by the LLMClient.moderate request.

fallbackModelFamily

Optional fallback model family to use for unsupported models. If not provided, unsupported models will throw an exception.

clock

A clock used for time-based operations

Constructors

constructor(bedrockClient: BedrockRuntimeClient, apiMethod: BedrockAPIMethod = BedrockAPIMethod.InvokeModel, moderationGuardrailsSettings: BedrockGuardrailsSettings? = null, fallbackModelFamily: BedrockModelFamilies? = null, clock: Clock = Clock.System)
constructor(identityProvider: IdentityProvider, settings: BedrockClientSettings = BedrockClientSettings(), clock: Clock = Clock.System)

Creates a new Bedrock LLM client configured with the specified identity provider and settings.
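A minimal construction sketch, assuming the AWS SDK for Kotlin is on the classpath; the region value is a placeholder and the import path reflects the SDK's usual package layout:

```kotlin
import aws.sdk.kotlin.services.bedrockruntime.BedrockRuntimeClient

suspend fun buildClient(): BedrockLLMClient {
    // Build the underlying AWS runtime client; region is a placeholder.
    val runtimeClient = BedrockRuntimeClient {
        region = "us-east-1"
    }

    // Wrap it in the LLM client; apiMethod, guardrails settings,
    // fallback model family, and clock all keep their defaults here.
    return BedrockLLMClient(bedrockClient = runtimeClient)
}
```

Passing a preconfigured runtime client keeps credential resolution and transport configuration in one place, while the LLM client only adds the prompt-execution layer on top.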

Properties

open val clientName: String

Functions

open override fun close()
open suspend override fun embed(text: String, model: LLModel): List<Double>

Embeds the given text using the AWS Bedrock InvokeModel API.

open suspend override fun embed(inputs: List<String>, model: LLModel): List<List<Double>>

Batch embedding is not currently supported by the Bedrock client.
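A usage sketch for single-text embedding, assuming an embedding-capable LLModel is available in your setup (the model parameter below is a placeholder, not a specific constant):

```kotlin
suspend fun embedExample(client: BedrockLLMClient, embeddingModel: LLModel) {
    // Single-input embedding goes through the Bedrock InvokeModel API.
    val vector: List<Double> = client.embed("Hello, Bedrock!", embeddingModel)
    println("Embedding dimensions: ${vector.size}")

    // The batch overload is documented as unsupported, so a call like
    // client.embed(listOf("a", "b"), embeddingModel) is expected to fail.
}
```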

@JvmName(name = "embed")
fun embedBlocking(text: String, model: LLModel, executorService: ExecutorService?): List<Double>
@JvmName(name = "embed")
fun embedBlocking(inputs: List<String>, model: LLModel, executorService: ExecutorService?): List<List<Double>>
fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>, executorService: ExecutorService?): List<Message.Response>
open suspend override fun execute(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<Message.Response>
open suspend fun executeMultipleChoices(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): List<LLMChoice>
open fun executeStreaming(prompt: Prompt, model: LLModel): Flow<StreamFrame>
open override fun executeStreaming(prompt: Prompt, model: LLModel, tools: List<ToolDescriptor>): Flow<StreamFrame>
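A streaming sketch, assuming a Prompt and LLModel have already been prepared; the handling of StreamFrame is illustrative, since the concrete frame variants depend on the library version:

```kotlin
import kotlinx.coroutines.flow.collect

suspend fun streamExample(client: BedrockLLMClient, prompt: Prompt, model: LLModel) {
    // Frames arrive incrementally as the model generates output.
    client.executeStreaming(prompt, model).collect { frame ->
        // Inspect the StreamFrame subtypes in your version to extract text deltas.
        println(frame)
    }
}
```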
open override fun llmProvider(): LLMProvider
open suspend fun models(): List<LLModel>
fun moderate(prompt: Prompt, model: LLModel, executorService: ExecutorService?): ModerationResult

open suspend override fun moderate(prompt: Prompt, model: LLModel): ModerationResult

Moderates the provided prompt using AWS Bedrock Guardrails.
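A moderation sketch. This assumes the client was constructed with moderationGuardrailsSettings, since Guardrails evaluation requires a configured guardrail; the printed form of ModerationResult is left opaque because its fields are not shown here:

```kotlin
suspend fun moderateExample(client: BedrockLLMClient, prompt: Prompt, model: LLModel) {
    // Runs the prompt through the AWS Bedrock Guardrails configured on this client.
    val result: ModerationResult = client.moderate(prompt, model)
    println("Moderation result: $result")
}
```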