JobAddParameter Class
An Azure Batch Job to add.
All required parameters must be populated in order to send to Azure.
- Inheritance
  - msrest.serialization.Model
    - JobAddParameter
Constructor
JobAddParameter(*, id: str, pool_info, display_name: str = None, priority: int = None, max_parallel_tasks: int = -1, allow_task_preemption: bool = None, constraints=None, job_manager_task=None, job_preparation_task=None, job_release_task=None, common_environment_settings=None, on_all_tasks_complete=None, on_task_failure=None, metadata=None, uses_task_dependencies: bool = None, network_configuration=None, **kwargs)
Parameters
Name | Description |
---|---|
id | Required. The ID can contain any combination of alphanumeric characters including hyphens and underscores, and cannot contain more than 64 characters. The ID is case-preserving and case-insensitive (that is, you may not have two IDs within an Account that differ only by case). |
display_name | The display name need not be unique and can contain any Unicode characters up to a maximum length of 1024. |
priority | The priority of the Job. Priority values can range from -1000 to 1000, with -1000 being the lowest priority and 1000 being the highest priority. The default value is 0. |
max_parallel_tasks | The maximum number of tasks that can be executed in parallel for the job. The value of maxParallelTasks must be -1 or greater than 0 if specified. If not specified, the default value is -1, which means there is no limit to the number of tasks that can be run at once. You can update a job's maxParallelTasks after it has been created using the update job API. Default value: -1. |
allow_task_preemption | Whether Tasks in this job can be preempted by other high priority jobs. If the value is set to True, other high priority jobs submitted to the system will take precedence and will be able to requeue tasks from this job. You can update a job's allowTaskPreemption after it has been created using the update job API. |
constraints | The execution constraints for the Job. |
job_manager_task | Details of a Job Manager Task to be launched when the Job is started. If the Job does not specify a Job Manager Task, the user must explicitly add Tasks to the Job. If the Job does specify a Job Manager Task, the Batch service creates the Job Manager Task when the Job is created, and will try to schedule the Job Manager Task before scheduling other Tasks in the Job. The Job Manager Task's typical purpose is to control and/or monitor Job execution, for example by deciding what additional Tasks to run or determining when the work is complete. (However, a Job Manager Task is not restricted to these activities; it is a fully-fledged Task in the system and can perform whatever actions are required for the Job.) For example, a Job Manager Task might download a file specified as a parameter, analyze the contents of that file and submit additional Tasks based on those contents. |
job_preparation_task | The Job Preparation Task. If a Job has a Job Preparation Task, the Batch service will run the Job Preparation Task on a Node before starting any Tasks of that Job on that Compute Node. |
job_release_task | The Job Release Task. A Job Release Task cannot be specified without also specifying a Job Preparation Task for the Job. The Batch service runs the Job Release Task on the Nodes that have run the Job Preparation Task. The primary purpose of the Job Release Task is to undo changes to Compute Nodes made by the Job Preparation Task. Example activities include deleting local files, or shutting down services that were started as part of Job preparation. |
common_environment_settings | Individual Tasks can override an environment setting specified here by specifying the same setting name with a different value. |
pool_info | Required. The Pool on which the Batch service runs the Job's Tasks. |
on_all_tasks_complete | The action the Batch service should take when all Tasks in the Job are in the completed state. Note that if a Job contains no Tasks, then all Tasks are considered complete. This option is therefore most commonly used with a Job Manager Task; if you want to use automatic Job termination without a Job Manager, you should initially set onAllTasksComplete to noAction and update the Job properties to set onAllTasksComplete to terminateJob once you have finished adding Tasks. The default is noAction. Possible values include: 'noAction', 'terminateJob'. |
on_task_failure | str or OnTaskFailure. The action the Batch service should take when any Task in the Job fails. A Task is considered to have failed if it has a failureInfo. A failureInfo is set if the Task completes with a non-zero exit code after exhausting its retry count, or if there was an error starting the Task, for example due to a resource file download error. The default is noAction. Possible values include: 'noAction', 'performExitOptionsJobAction'. |
metadata | The Batch service does not assign any meaning to metadata; it is solely for the use of user code. |
uses_task_dependencies | Whether Tasks in the Job can define dependencies on each other. The default is false. |
network_configuration | The network configuration for the Job. |
Keyword-Only Parameters
Name | Description |
---|---|
id | Required |
pool_info | Required |
display_name | Required |
priority | Required |
max_parallel_tasks | Default value: -1 |
allow_task_preemption | Required |
constraints | Required |
job_manager_task | Required |
job_preparation_task | Required |
job_release_task | Required |
common_environment_settings | Required |
on_all_tasks_complete | Required |
on_task_failure | Required |
metadata | Required |
uses_task_dependencies | Required |
network_configuration | Required |
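For orientation, here is a minimal sketch of constructing a JobAddParameter and submitting it with a BatchServiceClient. The account name, account key, batch URL, pool ID and job ID are placeholders, the pool is assumed to already exist, and in older azure-batch releases the client keyword is base_url rather than batch_url.

```python
from azure.batch import BatchServiceClient
from azure.batch.batch_auth import SharedKeyCredentials
from azure.batch.models import JobAddParameter, PoolInformation

# Placeholder account details -- substitute your own Batch account values.
credentials = SharedKeyCredentials("mybatchaccount", "<account-key>")
client = BatchServiceClient(
    credentials,
    batch_url="https://mybatchaccount.eastus.batch.azure.com",
)

# Only id and pool_info are required; the Job runs on an existing pool.
job = JobAddParameter(
    id="my-job-1",
    pool_info=PoolInformation(pool_id="my-pool"),
    priority=100,                  # -1000 (lowest) .. 1000 (highest); default 0
    uses_task_dependencies=False,
)

client.job.add(job)                # submit the Job to the Batch service
```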
Methods
Name | Description |
---|---|
as_dict | Return a dict that can be serialized to JSON using json.dump. Advanced usage can pass a key_transformer callback; see the detailed description below. For XML serialization, pass the kwarg is_xml=True. |
deserialize | Parse a str using the RestAPI syntax and return a model. |
enable_additional_properties_sending | |
from_dict | Parse a dict using the given key extractors and return a model. By default the key extractors rest_key_case_insensitive_extractor, attribute_key_case_insensitive_extractor and last_rest_key_case_insensitive_extractor are considered. |
is_xml_model | |
serialize | Return the JSON that would be sent to Azure from this model. This is an alias to as_dict(full_restapi_key_transformer, keep_readonly=False). For XML serialization, pass the kwarg is_xml=True. |
validate | Validate this model recursively and return a list of ValidationError. |
as_dict
Return a dict that can be serialized to JSON using json.dump.
Advanced usage might optionally use a callback as parameter:
Key is the attribute name used in Python. Attr_desc is a dict of metadata; it currently contains 'type' with the msrest type and 'key' with the RestAPI encoded key. Value is the current value in this object.
The string returned will be used to serialize the key. If the return type is a list, this is considered a hierarchical result dict.
See the three examples in this file:
- attribute_transformer
- full_restapi_key_transformer
- last_restapi_key_transformer
If you want XML serialization, you can pass the kwarg is_xml=True.
as_dict(keep_readonly=True, key_transformer=<function attribute_transformer>, **kwargs)
Parameters
Name | Description |
---|---|
key_transformer | A key transformer function. |
keep_readonly | Default value: True |
Returns
Type | Description |
---|---|
dict | A JSON-compatible dict object |
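A brief sketch of as_dict on a hypothetical job instance, showing the effect of the optional key_transformer; the IDs used are placeholders.

```python
from azure.batch.models import JobAddParameter, PoolInformation
from msrest.serialization import full_restapi_key_transformer

job = JobAddParameter(id="my-job-1", pool_info=PoolInformation(pool_id="my-pool"))

# Default transformer keeps Python attribute names: {'id': ..., 'pool_info': {...}}
print(job.as_dict())

# REST-style keys instead: {'id': ..., 'poolInfo': {...}}
print(job.as_dict(key_transformer=full_restapi_key_transformer))
```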
deserialize
Parse a str using the RestAPI syntax and return a model.
deserialize(data, content_type=None)
Parameters
Name | Description |
---|---|
data | Required. A str using RestAPI structure; JSON by default. |
content_type | JSON by default, set application/xml if XML. Default value: None |
Returns
Type | Description |
---|---|
JobAddParameter | An instance of this model |
Exceptions
Type | Description |
---|---|
DeserializationError | Raised if something went wrong while parsing. |
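A small sketch of deserialize, assuming a REST-style JSON string like the body the service would return; the values are placeholders.

```python
import json

from azure.batch.models import JobAddParameter

payload = json.dumps({"id": "my-job-1", "poolInfo": {"poolId": "my-pool"}})

# Parse REST-style JSON back into a model instance.
job_copy = JobAddParameter.deserialize(payload)
print(job_copy.id, job_copy.pool_info.pool_id)   # -> my-job-1 my-pool
```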
enable_additional_properties_sending
enable_additional_properties_sending()
from_dict
Parse a dict using the given key extractors and return a model.
By default, the key extractors rest_key_case_insensitive_extractor, attribute_key_case_insensitive_extractor and last_rest_key_case_insensitive_extractor are considered.
from_dict(data, key_extractors=None, content_type=None)
Parameters
Name | Description |
---|---|
data | Required. A dict using RestAPI structure. |
content_type | JSON by default, set application/xml if XML. Default value: None |
key_extractors | Default value: None |
Returns
Type | Description |
---|---|
JobAddParameter | An instance of this model |
Exceptions
Type | Description |
---|---|
DeserializationError | Raised if something went wrong while parsing. |
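A short from_dict sketch; the dictionary keys shown are REST-style, which the default key extractors accept case-insensitively (attribute-style keys such as pool_info are also accepted), and the values are placeholders.

```python
from azure.batch.models import JobAddParameter

# Default extractors accept REST-style or attribute-style keys, case-insensitively.
job_copy = JobAddParameter.from_dict({
    "id": "my-job-1",
    "poolInfo": {"poolId": "my-pool"},
})
print(job_copy.pool_info.pool_id)   # -> my-pool
```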
is_xml_model
is_xml_model()
serialize
Return the JSON that would be sent to Azure from this model.
This is an alias to as_dict(full_restapi_key_transformer, keep_readonly=False).
If you want XML serialization, you can pass the kwarg is_xml=True.
serialize(keep_readonly=False, **kwargs)
Parameters
Name | Description |
---|---|
keep_readonly | Whether to serialize the readonly attributes. Default value: False |
Returns
Type | Description |
---|---|
dict | A JSON-compatible dict object |
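A quick serialize sketch on a hypothetical job instance; the commented output is indicative only and depends on which optional properties are set.

```python
from azure.batch.models import JobAddParameter, PoolInformation

job = JobAddParameter(
    id="my-job-1",
    pool_info=PoolInformation(pool_id="my-pool"),
    priority=100,
)

# REST-style body as it would be sent to the service; read-only attributes are dropped.
body = job.serialize()
# e.g. {'id': 'my-job-1', 'poolInfo': {'poolId': 'my-pool'}, 'priority': 100}
print(body)
```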
validate
Validate this model recursively and return a list of ValidationError.
validate()
Returns
Type | Description |
---|---|
list | A list of validation errors |
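A small validate sketch on an intentionally incomplete instance; the exact error messages depend on the msrest version.

```python
from azure.batch.models import JobAddParameter

# Required fields left unset surface as ValidationError entries.
incomplete = JobAddParameter(id=None, pool_info=None)
for error in incomplete.validate() or []:
    print(error)
```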