requestLLMStructured

open suspend fun <T> requestLLMStructured(structure: StructuredData<T>, retries: Int = 1, fixingModel: LLModel = OpenAIModels.Chat.GPT4o): Result<StructuredResponse<T>>

Coerces the LLM to provide output that conforms to the given structure definition. If the response cannot be parsed into the expected structure, the request is retried up to retries times, using fixingModel to repair the malformed output.
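
A minimal call-site sketch follows. Only the requestLLMStructured signature above is taken from this page; the WeatherForecast class, the fetchForecast helper, the AIAgentLLMWriteSession receiver, the way forecastStructure is built, and the structure property on StructuredResponse are illustrative assumptions, and framework imports are omitted because they depend on the surrounding module setup.

```kotlin
import kotlinx.serialization.Serializable

// Target type for the structured response; the class and its fields are
// illustrative, not part of this API.
@Serializable
data class WeatherForecast(
    val city: String,
    val temperatureCelsius: Int,
    val conditions: String,
)

// Call-site sketch: the receiver type, the prebuilt structure definition,
// and the `structure` property on StructuredResponse are assumptions.
suspend fun fetchForecast(
    session: AIAgentLLMWriteSession,                // assumed session type exposing requestLLMStructured
    forecastStructure: StructuredData<WeatherForecast>,
): WeatherForecast? {
    val result = session.requestLLMStructured(
        structure = forecastStructure,
        retries = 3,                                // re-request up to 3 times if the output cannot be parsed
        fixingModel = OpenAIModels.Chat.GPT4o,      // model asked to repair malformed output
    )
    // The call returns Result<StructuredResponse<WeatherForecast>>; unwrap it
    // and return null on failure instead of throwing.
    return result.getOrNull()?.structure
}
```

Because the function returns a Result rather than throwing, callers can decide at the call site whether a failed coercion should be swallowed (as above), logged, or propagated with getOrThrow().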

See also