ModelProfile Class

Contains the results of a profiling run.

A model profile is a resource requirement recommendation for a model. A ModelProfile object is returned from the profile method of the Model class.

Initialize the ModelProfile object.

Inheritance
azureml.core.profile._ModelEvaluationResultBase
ModelProfile

Constructor

ModelProfile(workspace, name)

Parameters

Name Description
workspace
Required
Workspace
The workspace object containing the model.

name
Required
str

The name of the profile to create and retrieve.

Remarks

The following example shows how to return a ModelProfile object.


   profile = Model.profile(ws, "profilename", [model], inference_config, input_dataset=dataset)
   profile.wait_for_completion(True)
   profiling_details = profile.get_details()
   print(profiling_details)

Methods

get_details

Get the details of the profiling result.

Return the observed metrics (various latency percentiles, maximum utilized CPU and memory, etc.) and the recommended resource requirements in case of success.

serialize

Convert this Profile into a JSON-serialized dictionary.

wait_for_completion

Wait for the model to finish profiling.

get_details

Get the details of the profiling result.

Return the observed metrics (various latency percentiles, maximum utilized CPU and memory, etc.) and the recommended resource requirements in case of success.

get_details()

Returns

Type Description

dict

A dictionary of recommended resource requirements.
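As a sketch, the returned details can be inspected like any Python dictionary. The key names used below (recommendedCpu, recommendedMemoryInGB, and so on) are illustrative assumptions, not guaranteed field names; check the actual dictionary returned by get_details.

```python
# Hypothetical example of inspecting a profiling details dictionary.
# The key names below are illustrative assumptions only.
profiling_details = {
    "recommendedCpu": 0.5,
    "recommendedMemoryInGB": 0.5,
    "maxUtilizedCpu": 0.3,
    "maxUtilizedMemoryInGB": 0.4,
}

# Pull out the recommended resource requirements.
cpu = profiling_details.get("recommendedCpu")
memory = profiling_details.get("recommendedMemoryInGB")
print(f"Recommended: {cpu} CPU cores, {memory} GB memory")
```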

serialize

Convert this Profile into a JSON-serialized dictionary.

serialize()

Returns

Type Description

str

The JSON representation of this Profile.
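Because the result is JSON-serializable, it can be round-tripped with the standard json module. The dictionary contents below are a stand-in for the output of serialize, assumed for illustration.

```python
import json

# Stand-in for the dictionary returned by ModelProfile.serialize();
# the contents are illustrative assumptions only.
serialized_profile = {"name": "profilename", "state": "Succeeded"}

# Encode to a JSON string (e.g. for logging or persisting to disk),
# then decode it back to verify the round trip.
as_json = json.dumps(serialized_profile, indent=2)
restored = json.loads(as_json)
print(restored["name"])
```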

wait_for_completion

Wait for the model to finish profiling.

wait_for_completion(show_output=False)

Parameters

Name Description
show_output

Boolean option to print more verbose output.

Default value: False
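The show_output flag follows the common poll-and-print pattern. The stub class below is a hypothetical stand-in, not the real ModelProfile, used only to illustrate how such a call behaves with show_output=True.

```python
import time

# Hypothetical stand-in illustrating the show_output pattern of a
# wait_for_completion-style call; this is NOT the real ModelProfile.
class FakeProfile:
    def __init__(self):
        self._states = ["Running", "Running", "Succeeded"]

    def wait_for_completion(self, show_output=False):
        # Poll until the profiling run reaches a terminal state.
        for state in self._states:
            if show_output:
                print(f"Profiling status: {state}")
            time.sleep(0)  # real code would sleep between polls
        return self._states[-1]

final_state = FakeProfile().wait_for_completion(show_output=True)
print(final_state)
```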