ChatCompletionsClient.GetModelInfo Method

Definition

Overloads

GetModelInfo(RequestContext)

[Protocol Method] Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint.

GetModelInfo(CancellationToken)

Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint.

GetModelInfo(RequestContext)

[Protocol Method] Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint.

C#:
public virtual Azure.Response GetModelInfo (Azure.RequestContext context);

F#:
abstract member GetModelInfo : Azure.RequestContext -> Azure.Response
override this.GetModelInfo : Azure.RequestContext -> Azure.Response

Visual Basic:
Public Overridable Function GetModelInfo (context As RequestContext) As Response

Parameters

context
RequestContext

The request context, which can override default behaviors of the client pipeline on a per-call basis.
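
For illustration, a minimal sketch of per-call customization. ErrorOptions.NoThrow, Response.IsError, and Response.Status are general Azure.Core members rather than anything specific to this client, and the endpoint and key values are placeholders:

using System;
using Azure;
using Azure.AI.Inference;

ChatCompletionsClient client = new ChatCompletionsClient(
    new Uri("<https://my-service.azure.com>"),
    new AzureKeyCredential("<key>"));

// Ask the pipeline not to throw on error status codes for this call only.
RequestContext context = new RequestContext { ErrorOptions = ErrorOptions.NoThrow };
Response response = client.GetModelInfo(context);

if (response.IsError)
{
    Console.WriteLine($"GetModelInfo returned status {response.Status}.");
}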

Returns

The response returned from the service.

Exceptions

RequestFailedException

Service returned a non-success status code.
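
As a hedged sketch of handling that failure, the call can be wrapped in a try/catch for RequestFailedException (the exception type and its Status property come from Azure.Core; the client construction mirrors the examples below):

using System;
using Azure;
using Azure.AI.Inference;

ChatCompletionsClient client = new ChatCompletionsClient(
    new Uri("<https://my-service.azure.com>"),
    new AzureKeyCredential("<key>"));

try
{
    Response response = client.GetModelInfo(new RequestContext());
}
catch (RequestFailedException ex)
{
    // Status carries the HTTP status code returned by the service.
    Console.WriteLine($"GetModelInfo failed with status {ex.Status}: {ex.Message}");
}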

Examples

This sample shows how to call GetModelInfo and parse the result.

using System;
using System.Text.Json;
using Azure;
using Azure.AI.Inference;

Uri endpoint = new Uri("<https://my-service.azure.com>");
AzureKeyCredential credential = new AzureKeyCredential("<key>");
ChatCompletionsClient client = new ChatCompletionsClient(endpoint, credential);

// A null RequestContext applies the client's default pipeline behavior.
Response response = client.GetModelInfo(null);

// Parse the raw JSON payload returned by the /info route.
JsonElement result = JsonDocument.Parse(response.ContentStream).RootElement;
Console.WriteLine(result.GetProperty("model_name").ToString());
Console.WriteLine(result.GetProperty("model_type").ToString());
Console.WriteLine(result.GetProperty("model_provider_name").ToString());

Applies to

GetModelInfo(CancellationToken)

Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint.

C#:
public virtual Azure.Response<Azure.AI.Inference.ModelInfo> GetModelInfo (System.Threading.CancellationToken cancellationToken = default);

F#:
abstract member GetModelInfo : System.Threading.CancellationToken -> Azure.Response<Azure.AI.Inference.ModelInfo>
override this.GetModelInfo : System.Threading.CancellationToken -> Azure.Response<Azure.AI.Inference.ModelInfo>

Visual Basic:
Public Overridable Function GetModelInfo (Optional cancellationToken As CancellationToken = Nothing) As Response(Of ModelInfo)

Parameters

cancellationToken
CancellationToken

The cancellation token to use.
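
For example, a CancellationTokenSource can bound how long the call may take. A minimal sketch, with an illustrative 10-second timeout and placeholder endpoint and key:

using System;
using System.Threading;
using Azure;
using Azure.AI.Inference;

ChatCompletionsClient client = new ChatCompletionsClient(
    new Uri("<https://my-service.azure.com>"),
    new AzureKeyCredential("<key>"));

// Cancel the request if it has not completed within 10 seconds.
using CancellationTokenSource cts = new CancellationTokenSource(TimeSpan.FromSeconds(10));
Response<ModelInfo> response = client.GetModelInfo(cts.Token);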

Returns

The model information returned from the service, as a Response<ModelInfo>.

Examples

This sample shows how to call GetModelInfo.

using System;
using Azure;
using Azure.AI.Inference;

Uri endpoint = new Uri("<https://my-service.azure.com>");
AzureKeyCredential credential = new AzureKeyCredential("<key>");
ChatCompletionsClient client = new ChatCompletionsClient(endpoint, credential);

Response<ModelInfo> response = client.GetModelInfo();
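
The result can then be read through the strongly typed ModelInfo. A short follow-up sketch, assuming the ModelName, ModelType, and ModelProviderName properties (the strongly typed counterparts of the model_name, model_type, and model_provider_name fields shown in the protocol-method example):

// Read the strongly typed model information from the response.
ModelInfo info = response.Value;
Console.WriteLine(info.ModelName);
Console.WriteLine(info.ModelType);
Console.WriteLine(info.ModelProviderName);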

Applies to