Metadata-Version: 2.0
Name: azureml-contrib-pipeline-steps
Version: 1.16.0
Summary: Azure Machine Learning Parallel Run Step
Home-page: https://docs.microsoft.com/python/api/overview/azure/ml/?view=azure-ml-py
Author: Microsoft Corp
License: Proprietary https://aka.ms/azureml-preview-sdk-license 
Platform: UNKNOWN
Requires-Python: >=3.5,<4
Description-Content-Type: text/markdown
Requires-Dist: azureml-core (~=1.16.0)
Requires-Dist: azureml-dataset-runtime (~=1.16.0)
Requires-Dist: azureml-pipeline-core (~=1.16.0)

# Note
This package has been deprecated; its functionality has moved to [azureml-pipeline-steps](https://pypi.org/project/azureml-pipeline-steps/). Please refer to the [azureml-pipeline-steps documentation](https://docs.microsoft.com/python/api/azureml-pipeline-steps) for more information.

# Azure Machine Learning Batch Inference

Azure Machine Learning Batch Inference targets large inference jobs that are not time-sensitive. It provides cost-effective, scalable inference compute with high throughput for asynchronous applications, and is optimized for fire-and-forget inference over large collections of data.

# Getting Started with Batch Inference Public Preview

The Batch Inference public preview offers a platform for large-scale inference and generic parallel map-style operations. Please visit [Azure Machine Learning Notebooks](https://github.com/Azure/MachineLearningNotebooks) to find tutorials on how to use this service.
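A parallel map-style batch job is expressed as a `ParallelRunStep` inside a pipeline. The configuration sketch below uses the successor package `azureml-pipeline-steps`; it cannot run standalone because it assumes an existing Azure ML workspace, and names such as `"images-to-score"`, `"cpu-cluster"`, and `score.py` are placeholders for resources you would register yourself:

```python
# Configuration sketch (assumes an Azure ML workspace and registered resources).
from azureml.core import Workspace, Environment
from azureml.pipeline.core import Pipeline, PipelineData
from azureml.pipeline.steps import ParallelRunConfig, ParallelRunStep

ws = Workspace.from_config()  # reads a local config.json for the workspace
input_ds = ws.datasets["images-to-score"]  # hypothetical registered dataset
output_dir = PipelineData(name="scores", datastore=ws.get_default_datastore())

parallel_run_config = ParallelRunConfig(
    source_directory="scripts",      # folder containing the entry script
    entry_script="score.py",         # must define init() and run(mini_batch)
    mini_batch_size="10",            # items passed to each run() call
    error_threshold=10,              # failed items tolerated before aborting
    output_action="append_row",      # concatenate per-batch results into one file
    environment=Environment.get(ws, "AzureML-Minimal"),
    compute_target=ws.compute_targets["cpu-cluster"],
    node_count=2,                    # nodes to fan the mini-batches across
)

step = ParallelRunStep(
    name="batch-scoring",
    parallel_run_config=parallel_run_config,
    inputs=[input_ds.as_named_input("input_ds")],
    output=output_dir,
)

pipeline = Pipeline(workspace=ws, steps=[step])
# pipeline.submit("batch-inference-experiment")  # submits against the workspace
```

The service splits the input dataset into mini-batches, calls `run()` on each one in parallel across the cluster nodes, and aggregates the returned rows according to `output_action`.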
