
MLOps public preview features

This section provides preliminary documentation for features currently in the public preview pipeline. If a feature is not enabled for your organization, it is not visible in the application.

Although these features have been tested in engineering and quality environments, they should not be used in production at this time. Public preview functionality is subject to change, and Support SLA agreements do not apply.

Availability information

Contact your DataRobot representative or administrator for information on enabling or disabling public preview features.

Available MLOps public preview documentation

| Public preview for... | Describes... |
|---|---|
| Real-time predictions on DataRobot serverless prediction environments | Create DataRobot serverless prediction environments to make scalable real-time predictions on Kubernetes, with configurable compute instance settings. |
| Service health and accuracy history | Compare the service health and accuracy history of the current model and up to five previous models in one place and on the same scale. |
| Model logs for model packages | View model logs for model packages from the Model Registry to see successful operations (INFO status) and errors (ERROR status). |
| Automated deployment and replacement of Scoring Code in Snowflake | Create a DataRobot-managed Snowflake prediction environment to deploy and replace DataRobot Scoring Code in Snowflake. |
| Run the monitoring agent in DataRobot | Run the monitoring agent within the DataRobot platform, one instance per prediction environment. |
| Custom jobs in the Model Registry | Create custom jobs in the Model Registry to define tests for your models and deployments. |
| Monitoring jobs for custom metrics | Define monitoring jobs that allow DataRobot to pull calculated custom metric values from outside DataRobot into the metric defined on the Custom Metrics tab. |
| Hosted custom metrics | Upload and host reusable code to easily add custom metrics to future deployments. |
| Remote repository file browser for custom models and tasks | Browse the folders and files in a remote repository to select the files you want to add to a custom model or task. |
| Runtime parameters for custom models | Add runtime parameters to a custom model through the model metadata (a sketch follows this table). |
| Custom model proxy for external models (Self-Managed AI Platform only) | Create custom model proxies for external models in the Custom Model Workshop. |
| MLOps reporting for unstructured models | Report MLOps statistics from custom inference models created with an unstructured regression, binary, or multiclass target type. |
| MLflow integration for DataRobot | Export a model from MLflow and import it into the DataRobot Model Registry, creating key values from the training parameters, metrics, tags, and artifacts in the MLflow model. |
| Tableau Analytics Extension for deployments | Use the Tableau analytics extension to integrate DataRobot predictions into your Tableau project. |
| Multipart upload for the batch prediction API | Upload scoring data through multiple files to improve file intake for large datasets (a sketch follows this table). |
| Accelerate decision-making with Decision Intelligence Flows | Build, manage, and monitor decision flows for production models. |
| Automated deployment and replacement in SageMaker | Create a DataRobot-managed SageMaker prediction environment to deploy and replace DataRobot Scoring Code or a custom model in SageMaker. |
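
For the runtime parameters entry above, the following is a minimal sketch of how a parameter declared in a custom model's model-metadata.yaml might be read at load time. The parameter name THRESHOLD, its default value, and the dictionary returned by load_model are illustrative assumptions; the sketch assumes the datarobot-drum package and its RuntimeParameters accessor are available in the model environment.

```python
# Minimal sketch: reading a runtime parameter inside a custom model's custom.py.
# Assumes model-metadata.yaml declares the (hypothetical) parameter, for example:
#
#   runtimeParameterDefinitions:
#     - fieldName: THRESHOLD
#       type: string
#       defaultValue: "0.5"
#       description: Illustrative decision threshold.
#
from datarobot_drum import RuntimeParameters


def load_model(code_dir: str):
    """Hook called once at startup; read the parameter value set on the deployment."""
    threshold = float(RuntimeParameters.get("THRESHOLD"))
    print(f"Using decision threshold: {threshold}")
    return {"threshold": threshold}
```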
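For the multipart upload entry above, the following is a minimal sketch of splitting scoring data across several uploads for one batch prediction job via the REST API. The endpoint paths, payload field names, and example file names are assumptions to verify against the API reference for your version.

```python
# Minimal sketch: multipart local-file intake for a batch prediction job.
# Assumptions to verify: the job is created with intakeSettings
# {"type": "localFile", "multipart": true}, parts are uploaded with
# PUT .../csvUpload/part/<N>/, and the upload is closed with
# POST .../csvUpload/finalizeMultipart/.
import os
import requests

API = "https://app.datarobot.com/api/v2"            # adjust for your installation
HEADERS = {"Authorization": "Bearer " + os.environ["DATAROBOT_API_TOKEN"]}
DEPLOYMENT_ID = "..."                                # your deployment ID
PARTS = ["scores_part0.csv", "scores_part1.csv"]     # pre-split CSV files (hypothetical names)

# 1. Create the job with multipart local-file intake.
job = requests.post(
    f"{API}/batchPredictions/",
    headers=HEADERS,
    json={
        "deploymentId": DEPLOYMENT_ID,
        "intakeSettings": {"type": "localFile", "multipart": True},
        "outputSettings": {"type": "localFile"},
    },
)
job.raise_for_status()
job_id = job.json()["id"]

# 2. Upload each file as a numbered part.
for i, path in enumerate(PARTS):
    with open(path, "rb") as f:
        resp = requests.put(
            f"{API}/batchPredictions/{job_id}/csvUpload/part/{i}/",
            headers={**HEADERS, "Content-Type": "text/csv"},
            data=f,
        )
        resp.raise_for_status()

# 3. Finalize the upload so the job starts scoring.
requests.post(
    f"{API}/batchPredictions/{job_id}/csvUpload/finalizeMultipart/",
    headers=HEADERS,
).raise_for_status()
```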

Updated March 13, 2024