EmbeddingsClient.GetModelInfoAsync Method

Definition

Namespace: Azure.AI.Inference

Overloads

GetModelInfoAsync(RequestContext)

[Protocol Method] Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint. This method works only with Serverless API or Managed Compute endpoints; it does not work with GitHub Models or Azure OpenAI endpoints.

GetModelInfoAsync(CancellationToken)

Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint. This method works only with Serverless API or Managed Compute endpoints; it does not work with GitHub Models or Azure OpenAI endpoints.

GetModelInfoAsync(RequestContext)

Source:
EmbeddingsClient.cs

[Protocol Method] Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint. This method works only with Serverless API or Managed Compute endpoints; it does not work with GitHub Models or Azure OpenAI endpoints.

public virtual System.Threading.Tasks.Task<Azure.Response> GetModelInfoAsync (Azure.RequestContext context);
abstract member GetModelInfoAsync : Azure.RequestContext -> System.Threading.Tasks.Task<Azure.Response>
override this.GetModelInfoAsync : Azure.RequestContext -> System.Threading.Tasks.Task<Azure.Response>
Public Overridable Function GetModelInfoAsync (context As RequestContext) As Task(Of Response)

Parameters

context
RequestContext

The request context, which can override default behaviors of the client pipeline on a per-call basis.

Returns

The response returned from the service.

Exceptions

RequestFailedException

Service returned a non-success status code.

Examples

This sample shows how to call GetModelInfoAsync and parse the result.

Uri endpoint = new Uri("<endpoint>");
AzureKeyCredential credential = new AzureKeyCredential("<key>");
EmbeddingsClient client = new EmbeddingsClient(endpoint, credential);

Response response = await client.GetModelInfoAsync(null);

JsonElement result = JsonDocument.Parse(response.ContentStream).RootElement;
Console.WriteLine(result.GetProperty("model_name").ToString());
Console.WriteLine(result.GetProperty("model_type").ToString());
Console.WriteLine(result.GetProperty("model_provider_name").ToString());


Applies to

GetModelInfoAsync(CancellationToken)

Source:
EmbeddingsClient.cs

Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint. This method works only with Serverless API or Managed Compute endpoints; it does not work with GitHub Models or Azure OpenAI endpoints.

public virtual System.Threading.Tasks.Task<Azure.Response<Azure.AI.Inference.ModelInfo>> GetModelInfoAsync (System.Threading.CancellationToken cancellationToken = default);
abstract member GetModelInfoAsync : System.Threading.CancellationToken -> System.Threading.Tasks.Task<Azure.Response<Azure.AI.Inference.ModelInfo>>
override this.GetModelInfoAsync : System.Threading.CancellationToken -> System.Threading.Tasks.Task<Azure.Response<Azure.AI.Inference.ModelInfo>>
Public Overridable Function GetModelInfoAsync (Optional cancellationToken As CancellationToken = Nothing) As Task(Of Response(Of ModelInfo))

Parameters

cancellationToken
CancellationToken

The cancellation token to use.

Returns

The model information returned from the service, as a Response<ModelInfo>.

Examples

This sample shows how to call GetModelInfoAsync.

Uri endpoint = new Uri("<endpoint>");
AzureKeyCredential credential = new AzureKeyCredential("<key>");
EmbeddingsClient client = new EmbeddingsClient(endpoint, credential);

Response<ModelInfo> response = await client.GetModelInfoAsync();

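As a follow-on sketch, the returned value can be read and the call bounded with a cancellation token. The property names ModelName, ModelType, and ModelProviderName are assumed here to mirror the model_name, model_type, and model_provider_name fields shown in the protocol example above.

Uri endpoint = new Uri("<endpoint>");
AzureKeyCredential credential = new AzureKeyCredential("<key>");
EmbeddingsClient client = new EmbeddingsClient(endpoint, credential);

// Bound the call with a timeout via a cancellation token (30 seconds chosen arbitrarily here).
using CancellationTokenSource cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
Response<ModelInfo> response = await client.GetModelInfoAsync(cts.Token);

// Assumed ModelInfo property names, mirroring the wire fields model_name, model_type, and model_provider_name.
ModelInfo modelInfo = response.Value;
Console.WriteLine(modelInfo.ModelName);
Console.WriteLine(modelInfo.ModelType);
Console.WriteLine(modelInfo.ModelProviderName);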

Applies to