ChatCompletionsClient Class

Definition

The ChatCompletions service client.

public class ChatCompletionsClient
type ChatCompletionsClient = class
Public Class ChatCompletionsClient
Inheritance
Object → ChatCompletionsClient

Constructors

ChatCompletionsClient()

Initializes a new instance of ChatCompletionsClient for mocking.

ChatCompletionsClient(Uri, AzureKeyCredential, AzureAIInferenceClientOptions)

Initializes a new instance of ChatCompletionsClient.

ChatCompletionsClient(Uri, AzureKeyCredential)

Initializes a new instance of ChatCompletionsClient.

ChatCompletionsClient(Uri, TokenCredential, AzureAIInferenceClientOptions)

Initializes a new instance of ChatCompletionsClient.

ChatCompletionsClient(Uri, TokenCredential)

Initializes a new instance of ChatCompletionsClient.
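A minimal construction sketch, assuming the Azure.AI.Inference NuGet package is referenced; the endpoint URL and key below are placeholders:

```csharp
using Azure;
using Azure.AI.Inference;

// Placeholder endpoint and key — substitute your own deployment values.
var client = new ChatCompletionsClient(
    new Uri("https://your-endpoint.models.ai.azure.com"),
    new AzureKeyCredential("your-api-key"));
```

The overloads that accept a TokenCredential follow the same shape, substituting a credential such as one from Azure.Identity, and the AzureAIInferenceClientOptions overloads additionally let you configure client behavior such as the service API version.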

Properties

Pipeline

The HTTP pipeline for sending and receiving REST requests and responses.

Methods

Complete(ChatCompletionsOptions, CancellationToken)

Gets chat completions for the provided chat messages. Completions support a wide variety of tasks and generate text that continues from or "completes" provided prompt data. The method makes a REST API call to the /chat/completions route on the given endpoint.

CompleteAsync(ChatCompletionsOptions, CancellationToken)

Gets chat completions for the provided chat messages. Completions support a wide variety of tasks and generate text that continues from or "completes" provided prompt data. The method makes a REST API call to the /chat/completions route on the given endpoint.
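A sketch of a simple completion call, assuming the message and response types from the Azure.AI.Inference package; the prompt content is illustrative:

```csharp
var options = new ChatCompletionsOptions
{
    Messages =
    {
        new ChatRequestSystemMessage("You are a helpful assistant."),
        new ChatRequestUserMessage("What is the capital of France?"),
    },
};

// Synchronous variant; CompleteAsync(options) is the awaitable equivalent.
Response<ChatCompletions> response = client.Complete(options);
Console.WriteLine(response.Value.Content);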

CompleteStreaming(ChatCompletionsOptions, CancellationToken)

Begins a chat completions request and returns an object that streams response data as it becomes available.

CompleteStreamingAsync(ChatCompletionsOptions, CancellationToken)

Begins a chat completions request and returns an object that streams response data as it becomes available.
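A streaming sketch, assuming the same ChatCompletionsOptions instance as a non-streaming call and the streaming update type from the Azure.AI.Inference package:

```csharp
// Each update carries an incremental fragment of the generated text.
StreamingResponse<StreamingChatCompletionsUpdate> stream =
    await client.CompleteStreamingAsync(options);

await foreach (StreamingChatCompletionsUpdate update in stream)
{
    Console.Write(update.ContentUpdate);
}
```

Streaming is useful when you want to surface partial output to a user as it is generated rather than waiting for the full response.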

GetModelInfo(CancellationToken)

Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint. This method works only with Serverless API or Managed Compute endpoints; it does not work with GitHub Models or Azure OpenAI endpoints.

GetModelInfo(RequestContext)

[Protocol Method] Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint. This method works only with Serverless API or Managed Compute endpoints; it does not work with GitHub Models or Azure OpenAI endpoints.

GetModelInfoAsync(CancellationToken)

Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint. This method works only with Serverless API or Managed Compute endpoints; it does not work with GitHub Models or Azure OpenAI endpoints.

GetModelInfoAsync(RequestContext)

[Protocol Method] Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint. This method works only with Serverless API or Managed Compute endpoints; it does not work with GitHub Models or Azure OpenAI endpoints.
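A sketch of querying model information on a supported endpoint; the ModelInfo property names below are assumptions based on the convenience-method return type:

```csharp
// Works only against Serverless API or Managed Compute endpoints (see above).
Response<ModelInfo> info = client.GetModelInfo();
Console.WriteLine($"Model: {info.Value.ModelName}, type: {info.Value.ModelType}");
```

The [Protocol Method] overloads taking a RequestContext return the raw service response instead of a deserialized model, which is useful when you need direct control over the request and response payloads.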

Applies to