ChatCompletionsClient.GetModelInfoAsync Method

Definition

Overloads

GetModelInfoAsync(RequestContext)

[Protocol Method] Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint.

GetModelInfoAsync(CancellationToken)

Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint.

GetModelInfoAsync(RequestContext)

[Protocol Method] Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint.

C#:
public virtual System.Threading.Tasks.Task<Azure.Response> GetModelInfoAsync (Azure.RequestContext context);

F#:
abstract member GetModelInfoAsync : Azure.RequestContext -> System.Threading.Tasks.Task<Azure.Response>
override this.GetModelInfoAsync : Azure.RequestContext -> System.Threading.Tasks.Task<Azure.Response>

Visual Basic:
Public Overridable Function GetModelInfoAsync (context As RequestContext) As Task(Of Response)

Parameters

context
RequestContext

The request context, which can override default behaviors of the client pipeline on a per-call basis.
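For illustration, a RequestContext can also carry per-call options; the sketch below (assuming a client constructed as in the examples that follow) uses ErrorOptions.NoThrow so that a non-success status code is reported on the response instead of being thrown as an exception.

RequestContext context = new RequestContext
{
    ErrorOptions = ErrorOptions.NoThrow
};

Response response = await client.GetModelInfoAsync(context);
if (response.IsError)
{
    // With NoThrow, inspect the status instead of catching an exception.
    Console.WriteLine($"GetModelInfoAsync failed with status {response.Status}.");
}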

Returns

The response returned from the service.

Exceptions

RequestFailedException

Service returned a non-success status code.
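A minimal sketch of handling this exception, assuming a client constructed as in the examples below:

try
{
    Response response = await client.GetModelInfoAsync(new RequestContext());
    Console.WriteLine($"GetModelInfoAsync returned status {response.Status}.");
}
catch (RequestFailedException ex)
{
    // Thrown when the /info call returns a non-success status code.
    Console.WriteLine($"Request failed: {ex.Status} ({ex.ErrorCode})");
}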

Examples

This sample shows how to call GetModelInfoAsync and parse the result.

Uri endpoint = new Uri("<https://my-service.azure.com>");
AzureKeyCredential credential = new AzureKeyCredential("<key>");
ChatCompletionsClient client = new ChatCompletionsClient(endpoint, credential);

Response response = await client.GetModelInfoAsync(null);

JsonElement result = JsonDocument.Parse(response.ContentStream).RootElement;
Console.WriteLine(result.GetProperty("model_name").ToString());
Console.WriteLine(result.GetProperty("model_type").ToString());
Console.WriteLine(result.GetProperty("model_provider_name").ToString());

This sample shows how to call GetModelInfoAsync with all parameters and parse the result.

Uri endpoint = new Uri("<https://my-service.azure.com>");
AzureKeyCredential credential = new AzureKeyCredential("<key>");
ChatCompletionsClient client = new ChatCompletionsClient(endpoint, credential);

Response response = await client.GetModelInfoAsync(new RequestContext());

JsonElement result = JsonDocument.Parse(response.ContentStream).RootElement;
Console.WriteLine(result.GetProperty("model_name").ToString());
Console.WriteLine(result.GetProperty("model_type").ToString());
Console.WriteLine(result.GetProperty("model_provider_name").ToString());

Applies to

GetModelInfoAsync(CancellationToken)

Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint.

C#:
public virtual System.Threading.Tasks.Task<Azure.Response<Azure.AI.Inference.ModelInfo>> GetModelInfoAsync (System.Threading.CancellationToken cancellationToken = default);

F#:
abstract member GetModelInfoAsync : System.Threading.CancellationToken -> System.Threading.Tasks.Task<Azure.Response<Azure.AI.Inference.ModelInfo>>
override this.GetModelInfoAsync : System.Threading.CancellationToken -> System.Threading.Tasks.Task<Azure.Response<Azure.AI.Inference.ModelInfo>>

Visual Basic:
Public Overridable Function GetModelInfoAsync (Optional cancellationToken As CancellationToken = Nothing) As Task(Of Response(Of ModelInfo))

Parameters

cancellationToken
CancellationToken

The cancellation token to use.
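For example (a sketch, assuming a client constructed as in the samples below), a CancellationTokenSource can bound how long the call is allowed to run:

// Cancel the /info call if it takes longer than 30 seconds.
using CancellationTokenSource cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));

Response<ModelInfo> response = await client.GetModelInfoAsync(cts.Token);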

Returns

Information about the AI model, returned as a ModelInfo wrapped in the service response.

Examples

This sample shows how to call GetModelInfoAsync.

Uri endpoint = new Uri("<https://my-service.azure.com>");
AzureKeyCredential credential = new AzureKeyCredential("<key>");
ChatCompletionsClient client = new ChatCompletionsClient(endpoint, credential);

Response<ModelInfo> response = await client.GetModelInfoAsync();

This sample shows how to call GetModelInfoAsync with all parameters.

Uri endpoint = new Uri("<https://my-service.azure.com>");
AzureKeyCredential credential = new AzureKeyCredential("<key>");
ChatCompletionsClient client = new ChatCompletionsClient(endpoint, credential);

Response<ModelInfo> response = await client.GetModelInfoAsync(cancellationToken: CancellationToken.None);
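Continuing from the sample above, the convenience overload returns a strongly typed ModelInfo rather than a raw response. A sketch of reading it, assuming ModelInfo exposes ModelName, ModelType, and ModelProviderName properties corresponding to the JSON fields shown in the protocol-method examples:

ModelInfo info = response.Value;
Console.WriteLine(info.ModelName);
Console.WriteLine(info.ModelType);
Console.WriteLine(info.ModelProviderName);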

Applies to