Metadata-Version: 2.0
Name: azureml-contrib-pipeline-steps
Version: 1.1.0rc0
Summary: Azure Machine Learning Parallel Run Step
Home-page: https://docs.microsoft.com/en-us/azure/machine-learning/service/
Author: Microsoft Corp
License: Proprietary https://aka.ms/azureml-preview-sdk-license 
Platform: UNKNOWN
Description-Content-Type: text/markdown
Requires-Dist: azureml-core (==1.1.0rc0.*)
Requires-Dist: azureml-pipeline-core (==1.1.0rc0.*)
Requires-Dist: azureml-pipeline-steps (==1.1.0rc0.*)
Requires-Dist: azureml-dataprep (~=1.1)

# Azure Machine Learning Batch Inference

Azure Machine Learning Batch Inference targets large inference jobs that are not time-sensitive. It provides cost-effective, scalable inference compute with high throughput for asynchronous applications, and is optimized for fire-and-forget inference over large collections of data.

# Getting Started with Batch Inference Public Preview

The Batch Inference public preview offers a platform for running large-scale inference or generic parallel map-style operations. Visit [Azure Machine Learning Notebooks](https://github.com/Azure/MachineLearningNotebooks) for tutorials on how to use this service.
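The parallel map-style contract mentioned above is expressed through a user-provided entry script that defines `init()` (called once per worker) and `run(mini_batch)` (called once per mini-batch of inputs). The sketch below illustrates that shape only; the stand-in "model" and file handling are hypothetical, and the authoritative contract is described in the Batch Inference tutorials linked above.

```python
# Hypothetical entry script for a Batch Inference (ParallelRunStep) job.
# The service calls init() once per worker process, then calls run()
# repeatedly with mini-batches of inputs; run() returns one result per item.

import os


def init():
    # One-time setup per worker, e.g. loading a model into a global.
    global model
    # Stand-in "model" for illustration: maps a file path to a label.
    model = lambda path: os.path.basename(path).upper()


def run(mini_batch):
    # For file datasets, mini_batch is a list of file paths.
    # Return a list; the service aggregates these into the job output.
    results = []
    for file_path in mini_batch:
        results.append(model(file_path))
    return results
```

Because each `run()` call is independent, the service can fan mini-batches out across many workers and nodes, which is what makes the fire-and-forget, high-throughput pattern work.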
