OpenAIModel class
A PromptCompletionModel for calling OpenAI and Azure OpenAI hosted models.
Remarks
The model has been updated to support calling OpenAI's new o1 family of models. That support currently comes with a few constraints. These constraints are mostly handled for you, but they are worth noting (a configuration sketch follows the list):
- The o1 models introduce a new max_completion_tokens parameter and deprecate the max_tokens parameter. The model automatically converts the incoming max_tokens parameter to max_completion_tokens for you. Be aware, however, that the o1 models have hidden token usage and costs that are not constrained by the max_completion_tokens parameter, so you may see an increase in token usage and costs when using them.
- The o1 models do not currently support system messages, so the useSystemMessages parameter is ignored when calling them.
- The o1 models do not currently support the temperature, top_p, and presence_penalty parameters, so those settings are ignored.
- The o1 models do not currently support the use of tools, so you will need to use the "monologue" augmentation to call actions.
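For illustration, here is a minimal sketch of constructing the client for an o1 model. The option names (apiKey, defaultModel, useSystemMessages, logRequests) are assumed from OpenAIModelOptions and should be verified against your installed version of @microsoft/teams-ai; the o1-specific handling described above happens inside the model, not in your code.
import { OpenAIModel } from '@microsoft/teams-ai';

// Sketch only: option names below are assumed from OpenAIModelOptions.
const model = new OpenAIModel({
    apiKey: process.env.OPENAI_KEY!,
    defaultModel: 'o1-mini',     // an o1 family model
    useSystemMessages: true,     // ignored when calling o1 models, as noted above
    logRequests: true
});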
Constructors
OpenAIModel(OpenAIModelOptions | AzureOpenAIModelOptions) | Creates a new OpenAIModel instance. |
Properties
events | Events emitted by the model. |
options | Options the client was configured with. |
Methods
completePrompt(TurnContext, Memory, PromptFunctions, Tokenizer, PromptTemplate) | Completes a prompt using OpenAI or Azure OpenAI. |
Constructor Details
OpenAIModel(OpenAIModelOptions | AzureOpenAIModelOptions)
Creates a new OpenAIModel instance.
new OpenAIModel(options: OpenAIModelOptions | AzureOpenAIModelOptions)
Parameters
- options
Options for configuring the model client.
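For comparison, the sketch below constructs the client against Azure OpenAI. The azureApiKey, azureDefaultDeployment, azureEndpoint, and azureApiVersion field names are assumed from AzureOpenAIModelOptions; confirm them against the current typings.
import { OpenAIModel } from '@microsoft/teams-ai';

// Sketch only: field names below are assumed from AzureOpenAIModelOptions.
const model = new OpenAIModel({
    azureApiKey: process.env.AZURE_OPENAI_KEY!,
    azureDefaultDeployment: 'gpt-4o',                    // your deployment name
    azureEndpoint: process.env.AZURE_OPENAI_ENDPOINT!,
    azureApiVersion: '2023-03-15-preview'                // use a version your resource supports
});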
Property Details
events
Events emitted by the model.
events: PromptCompletionModelEmitter
Property Value
The events emitted by the model.
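Given a constructed model, you can attach handlers to this emitter to observe prompt completions. The event names and handler arguments below (beforeCompletion, responseReceived) are assumptions about PromptCompletionModelEmitter; check its typings before relying on them.
// Sketch only: event names and handler arguments are assumed.
model.events.on('beforeCompletion', (context, memory, functions, tokenizer, template, streaming) => {
    console.log(`Completing prompt '${template.name}' (streaming: ${streaming})`);
});
model.events.on('responseReceived', (context, memory, response) => {
    console.log(`Prompt completed with status '${response.status}'`);
});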
options
Options the client was configured with.
options: OpenAIModelOptions | AzureOpenAIModelOptions
Property Value
The options the client was configured with.
Method Details
completePrompt(TurnContext, Memory, PromptFunctions, Tokenizer, PromptTemplate)
Completes a prompt using OpenAI or Azure OpenAI.
function completePrompt(context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate): Promise<PromptResponse<string>>
Parameters
- context
- TurnContext
Current turn context.
- memory
- Memory
An interface for accessing state values.
- functions
- PromptFunctions
Functions to use when rendering the prompt.
- tokenizer
- Tokenizer
Tokenizer to use when rendering the prompt.
- template
- PromptTemplate
Prompt template to complete.
Returns
Promise<PromptResponse<string>>
A PromptResponse with the status and message.
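The method is normally invoked for you by the planner, but it can also be called directly. The sketch below handles the result, assuming a constructed model, a 'success' status value on PromptResponse, and that the context, memory, functions, tokenizer, and template values from the signature above are already in scope.
// Sketch only: assumes the completePrompt arguments are already in scope.
const response = await model.completePrompt(context, memory, functions, tokenizer, template);
if (response.status === 'success') {
    // The completed message returned by the model.
    console.log(response.message?.content);
} else {
    console.error(`Completion failed with status '${response.status}'`, response.error);
}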