Code-first tools
The table below lists the programmatic tools DataRobot offers in addition to its APIs:
Resource | Description |
---|---|
Blueprint Workshop | Construct and modify DataRobot blueprints and their tasks using a programmatic interface. |
DataRobot Prediction Library | The DataRobot Prediction Library is a Python library for making predictions using the various prediction methods DataRobot supports. It provides a common prediction interface, making it easy to swap out the underlying implementation. |
DataRobotX (DRX) | DataRobotX, or DRX, is a collection of DataRobot extensions designed to enhance your data science experience. DRX provides a streamlined experience for common workflows but also offers new, experimental high-level abstractions. |
DataRobot User Models (DRUM) | A repository that contains tools, templates, and information for assembling, debugging, testing, and running your custom inference models, custom tasks, and custom notebook environments with DataRobot. |
MLOps agents | The MLOps agents allow you to monitor and manage external models (those running outside of DataRobot MLOps). With this functionality, predictions and information from these models can be reported as part of MLOps deployments. |
Management agent | The MLOps management agent provides a standard mechanism to automate model deployment to any type of infrastructure. It pairs automated deployment with automated monitoring to ease the burden of managing remote models in production, especially with critical MLOps features such as challenger models and retraining. |
DRApps | DRApps is a simple command line interface (CLI) providing the tools required to host a custom application, such as a Streamlit app, in DataRobot using a DataRobot execution environment. This allows you to run apps without building your own Docker image. Custom applications don't provide any storage; however, you can access the full DataRobot API and other services. |
DataRobot model metrics library (DMM) | A repository that contains a framework to compute machine learning model metrics over time and produce aggregated metrics. It also provides examples of how to run this library and integrate it with your custom metrics in DataRobot. You can also review the supporting DataRobot documentation. |
MLOps utilities for Spark | A utilities library to integrate MLOps tasks with Spark. |
Apache Spark API for Scoring Code | Use the Spark API to integrate DataRobot Scoring Code JARs into Spark clusters. |
DataRobot provider for Apache Airflow | This quickstart guide on the DataRobot provider for Apache Airflow illustrates the setup and configuration process by implementing a basic Apache Airflow DAG (Directed Acyclic Graph) to orchestrate an end-to-end DataRobot AI pipeline. |
MLflow integration for DataRobot | Export a model from MLflow and import it into the DataRobot Model Registry, creating key values from the training parameters, metrics, tags, and artifacts in the MLflow model. |
Updated January 30, 2025