ConfiguredLLMClientOptions interface
Configuration options for an LLMClient instance.
Properties
history_variable | Memory variable used for storing conversation history.
input_variable | Memory variable used for storing the user's input message.
logRepairs | If true, any repair attempts will be logged to the console.
max_history_messages | Maximum number of conversation history messages that will be persisted to memory.
max_repair_attempts | Maximum number of automatic repair attempts that will be made.
model | AI model used for completing prompts.
template | Prompt used for the conversation.
tokenizer | Tokenizer used when rendering the prompt or counting tokens.
validator | Response validator used when completing prompts.
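A minimal sketch of an object satisfying this interface. The package name, the type argument for TContent, and the literal values below are assumptions for illustration; the placeholder declarations use indexed access types so they stay aligned with whatever the actual property types are.

```typescript
// Sketch only: the package name, type argument, and literal values are assumptions.
import { ConfiguredLLMClientOptions } from "@microsoft/teams-ai";

// Placeholders standing in for real model, template, tokenizer, and
// validator instances; indexed access types keep them type-correct.
declare const model: ConfiguredLLMClientOptions<string>["model"];
declare const template: ConfiguredLLMClientOptions<string>["template"];
declare const tokenizer: ConfiguredLLMClientOptions<string>["tokenizer"];
declare const validator: ConfiguredLLMClientOptions<string>["validator"];

const options: ConfiguredLLMClientOptions<string> = {
    history_variable: "conversation.history", // memory key for chat history (illustrative)
    input_variable: "temp.input",             // memory key for the user's message (illustrative)
    logRepairs: true,                         // log repair attempts to the console
    max_history_messages: 10,                 // cap on persisted history messages
    max_repair_attempts: 3,                   // cap on automatic repair attempts
    model,
    template,
    tokenizer,
    validator
};
```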
Property Details
history_variable
Memory variable used for storing conversation history.
history_variable: string
Property Value
string
input_variable
Memory variable used for storing the user's input message.
input_variable: string
Property Value
string
logRepairs
If true, any repair attempts will be logged to the console.
logRepairs: boolean
Property Value
boolean
max_history_messages
Maximum number of conversation history messages that will be persisted to memory.
max_history_messages: number
Property Value
number
max_repair_attempts
Maximum number of automatic repair attempts that will be made.
max_repair_attempts: number
Property Value
number
model
AI model used for completing prompts.
model: PromptCompletionModel
Property Value
PromptCompletionModel
template
Prompt used for the conversation.
tokenizer
Tokenizer used when rendering the prompt or counting tokens.
tokenizer: Tokenizer
Property Value
Tokenizer
validator
Response validator used when completing prompts.
validator: PromptResponseValidator<TContent>
Property Value
PromptResponseValidator<TContent>
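If these configured options mirror the optional options accepted by the LLMClient constructor (an assumption; this page only documents the resolved shape), wiring them up might look roughly like this:

```typescript
// Hypothetical usage: assumes LLMClient's constructor accepts an options
// object in which only model and template are required.
import { LLMClient, ConfiguredLLMClientOptions } from "@microsoft/teams-ai";

declare const model: ConfiguredLLMClientOptions<string>["model"];
declare const template: ConfiguredLLMClientOptions<string>["template"];

const client = new LLMClient<string>({
    model,
    template,
    logRepairs: true,        // surface repair attempts in the console
    max_repair_attempts: 3   // stop repairing after three tries
});
```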