On-premise users: click in-app to access the full platform documentation for your version of DataRobot.

MLOps preview features

This section provides preliminary documentation for features currently in the preview pipeline. If a feature is not enabled for your organization, it is not visible.

Although these features have been tested within the engineering and quality environments, they should not be used in production at this time. Note that preview functionality is subject to change and that support SLA agreements do not apply.

Availability information

Contact your DataRobot representative or administrator for information on enabling or disabling preview features.

Available MLOps preview documentation

Service health and accuracy history: Compare the current model and up to five previous models in one place and on the same scale.

Model logs for model packages: View model logs for model packages from the Model Registry to see successful operations (INFO status) and errors (ERROR status).

Automated deployment and replacement of Scoring Code in Snowflake: Create a DataRobot-managed Snowflake prediction environment to deploy and replace DataRobot Scoring Code in Snowflake.

Run the monitoring agent in DataRobot: Run the monitoring agent within the DataRobot platform, one instance per prediction environment.

Monitoring jobs for custom metrics: Define monitoring jobs that allow DataRobot to pull calculated custom metric values from outside of DataRobot into the metric defined on the Custom Metrics tab.

Custom model proxy for external models (Self-Managed AI Platform only): Create custom model proxies for external models in the Custom Model Workshop.

MLOps reporting for unstructured models: Report MLOps statistics from custom inference models created with an unstructured regression, binary, or multiclass target type.

MLflow integration for DataRobot: Export a model from MLflow and import it into the DataRobot Model Registry, creating key values from the training parameters, metrics, tags, and artifacts in the MLflow model.

Multipart upload for the batch prediction API: Upload scoring data through multiple files to improve file intake for large datasets.

Automated deployment and replacement in SageMaker: Create a DataRobot-managed SageMaker prediction environment to deploy and replace DataRobot Scoring Code or a custom model in SageMaker.
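To illustrate the idea behind multipart upload, the sketch below splits a large CSV into header-repeating parts, each of which could then be uploaded as a separate file. This is a minimal, hypothetical example using only the Python standard library; the function name and part-sizing are assumptions for illustration, and it does not call the DataRobot batch prediction API.

```python
import csv
import io


def split_csv_for_multipart(csv_text: str, rows_per_part: int) -> list[str]:
    """Split CSV text into parts, repeating the header row in each part.

    Illustrative only: with a multipart intake, each returned part would be
    uploaded as its own file instead of one large request body.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, body = rows[0], rows[1:]
    parts = []
    for start in range(0, len(body), rows_per_part):
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(header)                          # every part is a valid CSV
        writer.writerows(body[start:start + rows_per_part])
        parts.append(buf.getvalue())
    return parts


# Example: 5 scoring rows split into parts of 2 rows each -> 3 parts
data = "id,feature\n" + "\n".join(f"{i},{i * 10}" for i in range(5)) + "\n"
parts = split_csv_for_multipart(data, rows_per_part=2)
```

Because each part carries the header, any part can be parsed independently if an upload needs to be retried.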

Updated July 11, 2024