End-to-end modeling workflow with Azure
Access this AI accelerator on GitHub
DataRobot offers a comprehensive API that lets you build fully automated workflows in the coding environment of your choice. This accelerator shows how to enable end-to-end processing of data stored natively in Azure.
In this notebook, you'll use data stored in Azure to train a collection of models in DataRobot. You'll then deploy a recommended model and use DataRobot's batch prediction API to produce predictions and write them back to the source Azure container.
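The ingestion and training part of that workflow might look like the minimal sketch below. It assumes the `azure-storage-blob` and `datarobot` (3.x) Python packages; the API token, connection string, container, blob, project, and target names are placeholders, not values from the accelerator itself.

```python
import io

import datarobot as dr
import pandas as pd
from azure.storage.blob import BlobServiceClient

# Connect to DataRobot (endpoint and token are placeholders).
dr.Client(endpoint="https://app.datarobot.com/api/v2", token="<YOUR_API_TOKEN>")

# Pull the training dataset from an Azure storage container into a DataFrame.
# The connection string, container, and blob names are illustrative.
service = BlobServiceClient.from_connection_string("<AZURE_CONNECTION_STRING>")
blob = service.get_blob_client(container="<CONTAINER>", blob="training.csv")
training_df = pd.read_csv(io.BytesIO(blob.download_blob().readall()))

# Create a project from the DataFrame and run Autopilot against a chosen target.
project = dr.Project.create(sourcedata=training_df, project_name="Azure end-to-end")
project.analyze_and_model(target="<TARGET_COLUMN>", worker_count=-1)
project.wait_for_autopilot()
```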
This accelerator notebook covers the following activities; the deployment and scoring steps are sketched in code after the list:
- Acquire a training dataset from an Azure storage container
- Build a new DataRobot project
- Deploy a recommended model
- Score via DataRobot's batch prediction API
- Write results back to the source Azure container
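A minimal sketch of those deployment and scoring steps follows, continuing from the `project` object created above. It assumes Azure credentials have already been stored in DataRobot; the deployment label, storage account, container, blob names, and credential ID are placeholders, and the Azure intake/output settings reflect one way to use the batch prediction API's Azure Blob Storage adapters rather than the accelerator's exact configuration.

```python
import datarobot as dr

# Retrieve the model DataRobot recommends for deployment.
recommendation = dr.ModelRecommendation.get(project.id)
model = recommendation.get_model()

# Deploy the recommended model to a prediction server.
prediction_server = dr.PredictionServer.list()[0]
deployment = dr.Deployment.create_from_learning_model(
    model_id=model.id,
    label="Azure end-to-end deployment",
    default_prediction_server_id=prediction_server.id,
)

# Score with the batch prediction API, reading the scoring data from Azure
# Blob Storage and writing predictions back to the source container.
job = dr.BatchPredictionJob.score(
    deployment.id,
    intake_settings={
        "type": "azure",
        "url": "https://<ACCOUNT>.blob.core.windows.net/<CONTAINER>/scoring.csv",
        "credential_id": "<AZURE_CREDENTIAL_ID>",
    },
    output_settings={
        "type": "azure",
        "url": "https://<ACCOUNT>.blob.core.windows.net/<CONTAINER>/predictions.csv",
        "credential_id": "<AZURE_CREDENTIAL_ID>",
    },
)
job.wait_for_completion()
```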