OpenAIAssistantBase Class

Manages the interaction with OpenAI Assistants.

Note: This class is experimental and may change in the future.

Inheritance
OpenAIAssistantBase

Constructor
Initialize an OpenAIAssistantBase.
OpenAIAssistantBase(ai_model_id: str, client: AsyncOpenAI, service_id: str, *, kernel: Kernel | None = None, id: str | None = None, name: str | None = None, description: str | None = None, instructions: str | None = None, enable_code_interpreter: bool | None = None, enable_file_search: bool | None = None, enable_json_response: bool | None = None, code_interpreter_file_ids: list[str] | None = [], temperature: float | None = None, top_p: float | None = None, vector_store_id: str | None = None, metadata: dict[str, Any] | None = {}, max_completion_tokens: int | None = None, max_prompt_tokens: int | None = None, parallel_tool_calls_enabled: bool | None = True, truncation_message_count: int | None = None, assistant: Assistant | None = None, polling_options: RunPollingOptions = None, file_search_file_ids: Annotated[list[str] | None, MaxLen(max_length=20)] = None)
Parameters
All parameters after service_id are keyword-only.

Name | Description |
---|---|
ai_model_id | Required. The AI model id. |
client | Required. The client, either AsyncOpenAI or AsyncAzureOpenAI. |
service_id | Required. The service id. |
kernel | Optional. The kernel. Defaults to None. |
id | Optional. The id. Defaults to None. |
name | Optional. The name. Defaults to None. |
description | Optional. The description. Defaults to None. |
default_headers | Optional. The default headers. Defaults to None. |
instructions | Optional. The instructions. Defaults to None. |
enable_code_interpreter | Optional. Enable the code interpreter. Defaults to False. |
enable_file_search | Optional. Enable file search. Defaults to False. |
enable_json_response | Optional. Enable JSON responses. Defaults to False. |
code_interpreter_file_ids | Optional. The code interpreter file ids. Defaults to []. |
temperature | Optional. The temperature. Defaults to None. |
top_p | Optional. The top p. Defaults to None. |
vector_store_id | Optional. The vector store id. Defaults to None. |
metadata | Optional. The metadata. Defaults to {}. |
max_completion_tokens | Optional. The max completion tokens. Defaults to None. |
max_prompt_tokens | Optional. The max prompt tokens. Defaults to None. |
parallel_tool_calls_enabled | Optional. Enable parallel tool calls. Defaults to True. |
truncation_message_count | Optional. The truncation message count. Defaults to None. |
assistant | Optional. The Assistant instance. Defaults to None. |
polling_options | Optional. The run polling options. Defaults to None. |
file_search_file_ids | Optional. The file search file ids (at most 20). Defaults to None. |
kwargs | Extra keyword arguments. |
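A construction sketch may help orient readers. The argument names come from the signature above, but the import paths and the model id are assumptions that vary across semantic-kernel and openai releases; imports are deferred into the function so the sketch stays self-contained.

```python
async def build_assistant():
    # Import paths are assumptions and may differ between
    # semantic-kernel / openai versions.
    from openai import AsyncOpenAI
    from semantic_kernel.agents.open_ai.open_ai_assistant_base import OpenAIAssistantBase

    client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment
    return OpenAIAssistantBase(
        ai_model_id="gpt-4o",        # hypothetical model id
        client=client,               # AsyncOpenAI or AsyncAzureOpenAI
        service_id="assistants",
        name="doc-helper",           # keyword-only optional settings follow
        instructions="Answer questions about the uploaded files.",
        enable_code_interpreter=True,
    )
```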
Methods
Name | Description |
---|---|
add_chat_message | Add a chat message. |
add_file | Add a file for use with the Assistant. |
create_assistant | Create the assistant. |
create_channel | Create a channel. |
create_thread | Create a thread. |
create_vector_store | Create a vector store. |
delete | Delete the assistant. |
delete_file | Delete a file. |
delete_thread | Delete a thread. |
delete_vector_store | Delete a vector store. |
get_channel_keys | Get the channel keys. |
get_thread_messages | Get the messages for the specified thread. |
invoke | Invoke the chat assistant. The supplied arguments take precedence over the assistant-level attributes. |
invoke_stream | Invoke the chat assistant with streaming. |
model_post_init | Behaves like a BaseModel method to initialise private attributes; receives the context that pydantic-core passes when calling it. |
modify_assistant | Modify the assistant. |
add_chat_message
Add a chat message.
async add_chat_message(thread_id: str, message: ChatMessageContent) -> Message
Parameters
Name | Description |
---|---|
thread_id | Required. The thread id. |
message | Required. The chat message. |

Returns
Type | Description |
---|---|
<xref:Message> | The message. |
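A usage sketch for posting a user message. The add_chat_message signature is from this reference, but the ChatMessageContent and AuthorRole import paths are assumptions; adjust them to your semantic-kernel version.

```python
async def post_user_message(assistant, thread_id: str):
    # Import paths are assumptions; they may differ by version.
    from semantic_kernel.contents import ChatMessageContent
    from semantic_kernel.contents.utils.author_role import AuthorRole

    # Only USER and ASSISTANT roles are accepted (see allowed_message_roles).
    message = ChatMessageContent(role=AuthorRole.USER, content="Summarize the report.")
    return await assistant.add_chat_message(thread_id, message)
```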
add_file
Add a file for use with the Assistant.
async add_file(file_path: str, purpose: Literal['assistants', 'vision']) -> str
Parameters
Name | Description |
---|---|
file_path | Required. The file path. |
purpose | Required. The purpose; either "assistants" or "vision". |

Returns
Type | Description |
---|---|
str | The file id. |

Exceptions
Type | Description |
---|---|
AgentInitializationError | If the client has not been initialized or the file is not found. |
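A minimal sketch against the signature above, assuming an already-initialized assistant; the file path is illustrative.

```python
async def upload_file(assistant) -> str:
    # purpose must be "assistants" or "vision"; AgentInitializationError
    # is raised if the client is missing or the file cannot be found.
    return await assistant.add_file("data/report.pdf", purpose="assistants")
```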
create_assistant
Create the assistant.
async create_assistant(ai_model_id: str | None = None, description: str | None = None, instructions: str | None = None, name: str | None = None, enable_code_interpreter: bool | None = None, code_interpreter_file_ids: list[str] | None = None, enable_file_search: bool | None = None, vector_store_id: str | None = None, metadata: dict[str, str] | None = {}, **kwargs: Any) -> Assistant
Parameters
Name | Description |
---|---|
ai_model_id | Optional. The AI model id. Defaults to None. |
description | Optional. The description. Defaults to None. |
instructions | Optional. The instructions. Defaults to None. |
name | Optional. The name. Defaults to None. |
enable_code_interpreter | Optional. Enable the code interpreter. Defaults to None. |
enable_file_search | Optional. Enable file search. Defaults to None. |
code_interpreter_file_ids | Optional. The file ids. Defaults to None. |
vector_store_id | Optional. The vector store id. Defaults to None. |
metadata | Optional. The metadata. Defaults to {}. |
kwargs | Extra keyword arguments. |

Returns
Type | Description |
---|---|
<xref:Assistant> | The assistant. |
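A sketch of provisioning the server-side assistant, assuming a constructed instance; the name and instructions are illustrative, and arguments supplied here override the constructor-level settings.

```python
async def provision(assistant) -> str:
    # Per-call arguments take precedence over constructor settings.
    created = await assistant.create_assistant(
        name="doc-helper",
        instructions="Answer questions about the uploaded files.",
        enable_code_interpreter=True,
    )
    return created.id  # the returned Assistant carries the server-side id
```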
create_channel
Create a channel.
async create_channel() -> AgentChannel
create_thread
Create a thread.
async create_thread(*, code_interpreter_file_ids: list[str] | None = [], messages: list[ChatMessageContent] | None = [], vector_store_id: str | None = None, metadata: dict[str, str] = {}) -> str
Keyword-Only Parameters
Name | Description |
---|---|
code_interpreter_file_ids | Optional. The code interpreter file ids. Defaults to an empty list. |
messages | Optional. The chat messages. Defaults to an empty list. |
vector_store_id | Optional. The vector store id. Defaults to None. |
metadata | Optional. The metadata. Defaults to an empty dictionary. |

Returns
Type | Description |
---|---|
str | The thread id. |
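A usage sketch, assuming an initialized assistant; the metadata value is illustrative.

```python
async def start_thread(assistant) -> str:
    # All arguments are keyword-only; the defaults create an empty thread.
    return await assistant.create_thread(metadata={"session": "demo"})
```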
create_vector_store
Create a vector store.
async create_vector_store(file_ids: str | list[str]) -> str
Parameters
Name | Description |
---|---|
file_ids | Required. The file ids, either a single file id (str) or a list of file ids. |

Returns
Type | Description |
---|---|
str | The vector store id. |

Exceptions
Type | Description |
---|---|
AgentExecutionError | If there is an error creating the vector store. |
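A sketch against the signature above; the file ids are placeholders, and either a single id or a list is accepted.

```python
async def make_vector_store(assistant) -> str:
    # Accepts a single file id (str) or a list of ids; raises
    # AgentExecutionError if the store cannot be created.
    return await assistant.create_vector_store(["file-abc", "file-def"])
```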
delete
Delete the assistant.
async delete() -> bool
Returns
Type | Description |
---|---|
bool | True if the assistant is deleted. |
delete_file
Delete a file.
async delete_file(file_id: str) -> None
Parameters
Name | Description |
---|---|
file_id | Required. The file id. |
delete_thread
Delete a thread.
async delete_thread(thread_id: str) -> None
Parameters
Name | Description |
---|---|
thread_id | Required. The thread id. |
delete_vector_store
Delete a vector store.
async delete_vector_store(vector_store_id: str) -> None
Parameters
Name | Description |
---|---|
vector_store_id | Required. The vector store id. |

Exceptions
Type | Description |
---|---|
AgentExecutionError | If there is an error deleting the vector store. |
get_channel_keys
Get the channel keys.
get_channel_keys() -> Iterable[str]
Returns
Type | Description |
---|---|
Iterable[str] | The channel keys. |
get_thread_messages
Get the messages for the specified thread.
async get_thread_messages(thread_id: str) -> AsyncIterable[ChatMessageContent]
Parameters
Name | Description |
---|---|
thread_id | Required. The thread id. |

Returns
Type | Description |
---|---|
AsyncIterable[ChatMessageContent] | The messages in the thread. |
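A consumption sketch, assuming the usual .content accessor on ChatMessageContent; note that the method returns an async iterable rather than a list.

```python
async def read_thread(assistant, thread_id: str) -> list[str]:
    # get_thread_messages returns an async iterable, so consume it
    # with `async for` rather than awaiting it directly.
    contents = []
    async for message in assistant.get_thread_messages(thread_id):
        contents.append(message.content)
    return contents
```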
invoke
Invoke the chat assistant.
The supplied arguments will take precedence over the specified assistant level attributes.
async invoke(thread_id: str, *, ai_model_id: str | None = None, enable_code_interpreter: bool | None = False, enable_file_search: bool | None = False, enable_json_response: bool | None = None, max_completion_tokens: int | None = None, max_prompt_tokens: int | None = None, parallel_tool_calls_enabled: bool | None = True, truncation_message_count: int | None = None, temperature: float | None = None, top_p: float | None = None, metadata: dict[str, str] | None = None, **kwargs: Any) -> AsyncIterable[ChatMessageContent]
Parameters
Name | Description |
---|---|
thread_id | Required. The thread id. |

Keyword-Only Parameters
Name | Description |
---|---|
ai_model_id | Optional. The AI model id. Defaults to None. |
enable_code_interpreter | Optional. Enable the code interpreter. Defaults to False. |
enable_file_search | Optional. Enable file search. Defaults to False. |
enable_json_response | Optional. Enable JSON responses. Defaults to None. |
max_completion_tokens | Optional. The max completion tokens. Defaults to None. |
max_prompt_tokens | Optional. The max prompt tokens. Defaults to None. |
parallel_tool_calls_enabled | Optional. Enable parallel tool calls. Defaults to True. |
truncation_message_count | Optional. The truncation message count. Defaults to None. |
temperature | Optional. The temperature. Defaults to None. |
top_p | Optional. The top p. Defaults to None. |
metadata | Optional. The metadata. Defaults to None. |
kwargs | Extra keyword arguments. |

Returns
Type | Description |
---|---|
AsyncIterable[ChatMessageContent] | The response messages. |
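An invocation sketch, assuming an initialized assistant and an existing thread; the temperature and token values are illustrative overrides.

```python
async def run_once(assistant, thread_id: str) -> None:
    # Per-call keyword arguments take precedence over the
    # assistant-level attributes set in the constructor.
    async for message in assistant.invoke(
        thread_id,
        temperature=0.2,
        max_completion_tokens=512,
    ):
        print(message.content)
```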
invoke_stream
Invoke the chat assistant with streaming.
async invoke_stream(thread_id: str, *, messages: list[ChatMessageContent] | None = None, ai_model_id: str | None = None, enable_code_interpreter: bool | None = False, enable_file_search: bool | None = False, enable_json_response: bool | None = None, max_completion_tokens: int | None = None, max_prompt_tokens: int | None = None, parallel_tool_calls_enabled: bool | None = True, truncation_message_count: int | None = None, temperature: float | None = None, top_p: float | None = None, metadata: dict[str, str] | None = None, **kwargs: Any) -> AsyncIterable[ChatMessageContent]
Parameters
Name | Description |
---|---|
thread_id | Required. The thread id. |

Keyword-Only Parameters
Name | Description |
---|---|
messages | Optional. The chat messages. Defaults to None. |
ai_model_id | Optional. The AI model id. Defaults to None. |
enable_code_interpreter | Optional. Enable the code interpreter. Defaults to False. |
enable_file_search | Optional. Enable file search. Defaults to False. |
enable_json_response | Optional. Enable JSON responses. Defaults to None. |
max_completion_tokens | Optional. The max completion tokens. Defaults to None. |
max_prompt_tokens | Optional. The max prompt tokens. Defaults to None. |
parallel_tool_calls_enabled | Optional. Enable parallel tool calls. Defaults to True. |
truncation_message_count | Optional. The truncation message count. Defaults to None. |
temperature | Optional. The temperature. Defaults to None. |
top_p | Optional. The top p. Defaults to None. |
metadata | Optional. The metadata. Defaults to None. |
kwargs | Extra keyword arguments. |

Returns
Type | Description |
---|---|
AsyncIterable[ChatMessageContent] | The streamed response content. |
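A streaming sketch, assuming the yielded chunks expose a .content accessor like the non-streaming path.

```python
async def run_streaming(assistant, thread_id: str) -> str:
    # invoke_stream yields partial ChatMessageContent chunks;
    # concatenate the pieces to rebuild the full reply.
    reply = ""
    async for chunk in assistant.invoke_stream(thread_id):
        reply += chunk.content or ""
    return reply
```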
model_post_init
This function is meant to behave like a BaseModel method to initialise private attributes.
It takes context as an argument since that's what pydantic-core passes when calling it.
model_post_init(context: Any, /) -> None
Positional-Only Parameters
Name | Description |
---|---|
context | Required. The context passed by pydantic-core. |
modify_assistant
Modify the assistant.
async modify_assistant(assistant_id: str, **kwargs: Any) -> Assistant
Parameters
Name | Description |
---|---|
assistant_id | Required. The assistant's current ID. |
kwargs | Extra keyword arguments. |

Returns
Type | Description |
---|---|
<xref:Assistant> | The modified assistant. |
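A modification sketch; `name` is one illustrative field to change, passed through the documented **kwargs.

```python
async def rename_assistant(assistant, assistant_id: str):
    # Extra keyword arguments describe the fields to modify;
    # the updated Assistant is returned.
    return await assistant.modify_assistant(assistant_id, name="doc-helper-v2")
```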
Attributes
model_computed_fields
A dictionary of computed field names and their corresponding ComputedFieldInfo objects.
model_computed_fields: ClassVar[Dict[str, ComputedFieldInfo]] = {}
model_config
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'populate_by_name': True, 'validate_assignment': True}
model_fields
Metadata about the fields defined on the model, mapping of field names to [FieldInfo][pydantic.fields.FieldInfo] objects.
This replaces Model.fields from Pydantic V1.
model_fields: ClassVar[Dict[str, FieldInfo]] = {'ai_model_id': FieldInfo(annotation=str, required=True), 'assistant': FieldInfo(annotation=Union[Assistant, NoneType], required=False, default=None), 'client': FieldInfo(annotation=AsyncOpenAI, required=True), 'code_interpreter_file_ids': FieldInfo(annotation=Union[list[str], NoneType], required=False, default_factory=list, metadata=[MaxLen(max_length=20)]), 'description': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'enable_code_interpreter': FieldInfo(annotation=Union[bool, NoneType], required=False, default=False), 'enable_file_search': FieldInfo(annotation=Union[bool, NoneType], required=False, default=False), 'enable_json_response': FieldInfo(annotation=Union[bool, NoneType], required=False, default=False), 'file_search_file_ids': FieldInfo(annotation=Union[list[str], NoneType], required=False, default_factory=list, metadata=[MaxLen(max_length=20)]), 'id': FieldInfo(annotation=str, required=False, default_factory=<lambda>), 'instructions': FieldInfo(annotation=Union[str, NoneType], required=False, default=None), 'kernel': FieldInfo(annotation=Kernel, required=False, default_factory=Kernel), 'max_completion_tokens': FieldInfo(annotation=Union[int, NoneType], required=False, default=None), 'max_prompt_tokens': FieldInfo(annotation=Union[int, NoneType], required=False, default=None), 'metadata': FieldInfo(annotation=Union[dict[str, Any], NoneType], required=False, default_factory=dict, metadata=[MaxLen(max_length=16)]), 'name': FieldInfo(annotation=str, required=False, default_factory=<lambda>), 'parallel_tool_calls_enabled': FieldInfo(annotation=Union[bool, NoneType], required=False, default=True), 'polling_options': FieldInfo(annotation=RunPollingOptions, required=False, default_factory=RunPollingOptions), 'temperature': FieldInfo(annotation=Union[float, NoneType], required=False, default=None), 'top_p': FieldInfo(annotation=Union[float, NoneType], required=False, default=None), 
'truncation_message_count': FieldInfo(annotation=Union[int, NoneType], required=False, default=None), 'vector_store_id': FieldInfo(annotation=Union[str, NoneType], required=False, default=None)}
tools
ai_model_id
ai_model_id: str
allowed_message_roles
allowed_message_roles: ClassVar[list[str]] = [AuthorRole.USER, AuthorRole.ASSISTANT]
assistant
assistant: Assistant | None
channel_type
channel_type: ClassVar[type[AgentChannel]] = None
client
client: AsyncOpenAI
code_interpreter_file_ids
code_interpreter_file_ids: list[str] | None
enable_code_interpreter
enable_code_interpreter: bool | None
enable_file_search
enable_file_search: bool | None
enable_json_response
enable_json_response: bool | None
error_message_states
error_message_states: ClassVar[list[str]] = ['failed', 'canceled', 'expired']
file_search_file_ids
file_search_file_ids: list[str] | None
is_experimental
is_experimental = True
max_completion_tokens
max_completion_tokens: int | None
max_prompt_tokens
max_prompt_tokens: int | None
metadata
metadata: dict[str, Any] | None
parallel_tool_calls_enabled
parallel_tool_calls_enabled: bool | None
polling_options
polling_options: RunPollingOptions
polling_status
polling_status: ClassVar[list[str]] = ['queued', 'in_progress', 'cancelling']
temperature
temperature: float | None
top_p
top_p: float | None
truncation_message_count
truncation_message_count: int | None
vector_store_id
vector_store_id: str | None