
August 2022

August 24, 2022

With the latest deployment, DataRobot's Managed AI Cloud delivered the following new GA and Public Preview features. See the deployment history for past feature announcements.

Features grouped by capability
Name | GA | Public Preview
Data and integrations
UI/UX improvements to No-Code AI Apps | ✔ |
New data connection UI | | ✔
Predictions and MLOps
Clear deployment statistics | ✔ |
Challenger insights for multiclass and external models | ✔ |
Remote repository file browser for custom models and tasks | | ✔
Deployment prediction and training data export for custom metrics | | ✔

GA

UI/UX improvements to No-Code AI Apps

This release introduces the following improvements to No-Code AI Apps:

  • An in-app tour has been added to help you set up Optimizer applications. Click the ? in the upper-right and select Show Optimizer Guide.

  • Applications now open in Consume mode instead of Build mode.

  • In Consume > Optimization Details, the What-if and Optimizer widgets have been moved towards the top of the page.

  • In Optimizer applications, you previously needed to select a prediction row to calculate an optimization. Now, you can click the Optimize Row button in the All Rows widget to calculate and display the optimized prediction without leaving the page.

  • In Build mode, widgets no longer display an example.

Clear deployment statistics

Now generally available, you can clear a deployment's monitoring data by model version and date range, allowing you to remove monitoring data that was sent inadvertently or generated during integration testing. If your organization has enabled the deployment approval workflow, approval must be given before any monitoring data can be cleared from the deployment.

From the deployment inventory, choose the deployment for which you want to reset statistics, then click the actions menu and select Clear statistics.

Complete the settings in the Clear Deployment Statistics window to configure the conditions of the reset.

After fully configuring the settings, click Clear statistics. DataRobot clears the monitoring data from the deployment for the indicated date range.

For more information, see the Clear deployment statistics documentation.

Challenger insights for multiclass and external models

Now generally available, you can compute challenger model insights for multiclass models and external models.

  • Multiclass classification projects only support accuracy comparison.

  • External models (regardless of project type) require an external challenger comparison dataset.

To compare an external model challenger, you need to provide a dataset that includes the actuals and the prediction results. When you upload the comparison dataset, you can specify a column containing the prediction results.

To add a comparison dataset for an external model challenger, follow the Generate model comparisons process, and on the Model Comparison tab, upload your comparison dataset with a Prediction column identifier. Make sure the prediction dataset you provide includes the prediction results generated by the external model at the location identified by the Prediction column.
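As a minimal sketch of what such a comparison dataset might look like, the snippet below assembles a CSV with pandas. The column names (`feature_1`, `actual`, `external_prediction`) are hypothetical; whatever column holds the external model's scores is the one you identify as the Prediction column when uploading:

```python
import pandas as pd

# Hypothetical comparison dataset for an external challenger. Real datasets
# would contain the deployment's scoring features, the observed outcomes
# (actuals), and the external model's prediction results.
scoring_rows = pd.DataFrame({
    "feature_1": [3.2, 1.7, 4.4],
    "actual": [1, 0, 1],                        # observed outcomes
    "external_prediction": [0.91, 0.12, 0.78],  # external model's scores
})
scoring_rows.to_csv("external_challenger_comparison.csv", index=False)
```

When uploading this file on the Model Comparison tab, you would point the Prediction column identifier at `external_prediction`.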

Once you compute model insights, the Model Insights page displays comparison tabs depending on the project type:

The available tabs (Accuracy, Dual lift, Lift, ROC, and Predictions Difference) vary by project type: regression, binary, multiclass, or time series.

For more information, see the View model comparisons documentation.

Public Preview

New data connection UI

Now available for public preview, DataRobot introduces improvements to the data connection user interface that simplify the process of adding and configuring data connections from the AI Catalog > Data Connection page. Instead of opening multiple windows to set up a data connection, after selecting a data store you can configure parameters and authenticate credentials in the same window. For each data connection, only the required fields are displayed; however, you can define additional parameters under Advanced Options at the bottom of the page.

Using credentials to connect to data sources has also been simplified. Once you enter credentials when configuring a data connection, DataRobot automatically applies them when you create a new AI Catalog dataset from that connection.

Required feature flag: Enable New Data Connection UI

Public Preview documentation.

Remote repository file browser for custom models and tasks

Now available as a public preview feature, you can browse the folders and files in a remote repository to select the files you want to add to a custom model or task. When you add a model or add a task to the Custom Model Workshop, you can add files to that model or task from a wide range of repositories, including Bitbucket, GitHub, GitHub Enterprise, S3, GitLab, and GitLab Enterprise. After you add a repository to DataRobot, you can pull files from the repository and include them in the custom model or task.

When you pull from a remote repository, in the Pull from GitHub repository dialog box, you can select the checkbox for any files or folders you want to pull into the custom model.

In addition, you can click Select all to select every file in the repository, or, after you select one or more files, you can click Deselect all to clear your selections.

Note

This example uses GitHub; however, the process is the same for each repository type.

Required feature flag: Enable File Browser for Pulling Model or Task Files from Remote Repositories

Public Preview documentation.

Deployment prediction and training data export for custom metrics

Now available as a public preview feature, you can export a deployment's stored training and prediction data—both the scoring data, and the prediction results—to compute and monitor custom business or performance metrics outside DataRobot.

To export a deployment's stored prediction and training data:

  1. In the top navigation bar, click Deployments.

  2. On the Deployments tab, click the deployment from which you want to export stored prediction or training data.

    Note

    To access the Data Export tab, the deployment must store prediction data. Ensure that you Enable prediction rows storage for challenger analysis in the deployment settings.

  3. In the deployment, click the Data Export tab.

To open or download training data:

  • Under Training Data, click the open icon to open the training data in the AI Catalog.

  • Click the download icon to download the training data.

To open or download prediction data:

  1. Configure the following settings to specify the stored prediction data you want to export:

    Setting Description
    Model Select the deployment's model, current or previous, to export prediction data for.
    Range (UTC) Select the start and end dates of the period you want to export prediction data from.
    Resolution Select the granularity of the date slider. Select from hourly, daily, weekly, and monthly granularity based on the time range selected. If the time range is longer than 7 days, hourly granularity is not available.
    Reset Reset the data export settings to the default.
  2. Under Prediction Data, click Generate Prediction Data.

    Prediction data generation considerations

    When generating prediction data, consider the following:

    • When generating prediction data, you can export up to 200,000 rows. If the time range you set exceeds 200,000 rows of prediction data, decrease the range.

    • In the AI Catalog, you can have up to 100 prediction export items. If generating prediction data for export would cause the number of prediction export items in the AI Catalog to exceed that limit, delete old prediction export AI Catalog items.

    • When generating prediction data for time series deployments, two prediction export items are added to the AI Catalog: one for the prediction data and the other for the prediction results. The Data Export tab links to the prediction results.

    The prediction data export appears in the table below.

  3. After the prediction data is generated:

    • Click the open icon to open the prediction data in the AI Catalog.

    • Click the download icon to download the prediction data.
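The Resolution behavior described above (hourly granularity unavailable for ranges longer than 7 days) can be illustrated with a small helper. This is a sketch of the stated rule, not DataRobot code:

```python
from datetime import timedelta

def available_resolutions(time_range: timedelta) -> list[str]:
    """Return the date-slider granularities available for a time range.

    Mirrors the rule above: hourly granularity is offered only when the
    selected range is 7 days or shorter.
    """
    resolutions = ["daily", "weekly", "monthly"]
    if time_range <= timedelta(days=7):
        resolutions.insert(0, "hourly")
    return resolutions

print(available_resolutions(timedelta(days=3)))   # hourly is available
print(available_resolutions(timedelta(days=30)))  # hourly is not
```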

To use the exported deployment data to create your own custom metrics, you can implement a script that reads the CSV file containing the exported data and calculates metrics from the resulting values, including the columns automatically generated during the export process.

This example uses the exported prediction data to calculate and plot the change in the time_in_hospital feature over a 30-day period using the DataRobot prediction timestamp (DR_RESERVED_PREDICTION_TIMESTAMP) as the DataFrame index (or row labels). It also uses the exported training data as the plot's baseline:

import pandas as pd

feature_name = "time_in_hospital"

# Compute the baseline from the exported training data.
training_df = pd.read_csv("<path_to_training_data_csv>")
baseline = training_df[feature_name].mean()

# Load the exported prediction data and index it by the prediction timestamp.
prediction_df = pd.read_csv("<path_to_prediction_data_csv>")
prediction_df["DR_RESERVED_PREDICTION_TIMESTAMP"] = pd.to_datetime(
    prediction_df["DR_RESERVED_PREDICTION_TIMESTAMP"]
)
predictions = prediction_df.set_index("DR_RESERVED_PREDICTION_TIMESTAMP")[feature_name]

# Plot the 30-day rolling mean of the feature against the training baseline.
ax = predictions.rolling("30D").mean().plot()
ax.axhline(y=baseline, color="C1", label="training data baseline")
ax.legend()
ax.figure.savefig("feature_over_time.png")

DataRobot automatically adds the following columns to the prediction data generated for export:

Column Description
DR_RESERVED_PREDICTION_TIMESTAMP Contains the prediction timestamp.
DR_RESERVED_PREDICTION Identifies regression prediction values.
DR_RESERVED_PREDICTION_{Label} Identifies classification prediction values.
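As a sketch of how the classification columns can be used (assuming a hypothetical binary deployment with labels True and False), the snippet below recovers the predicted label for each exported row:

```python
import pandas as pd

# Hypothetical exported rows for a binary deployment; real exports also
# include the scoring features and DR_RESERVED_PREDICTION_TIMESTAMP.
exported = pd.DataFrame({
    "DR_RESERVED_PREDICTION_True": [0.83, 0.21, 0.64],
    "DR_RESERVED_PREDICTION_False": [0.17, 0.79, 0.36],
})

# Each DR_RESERVED_PREDICTION_{Label} column holds that class's predicted
# score; the predicted label is the column with the highest value per row.
label_cols = [c for c in exported.columns if c.startswith("DR_RESERVED_PREDICTION_")]
predicted = exported[label_cols].idxmax(axis=1).str.replace(
    "DR_RESERVED_PREDICTION_", "", regex=False
)
print(predicted.tolist())  # ['True', 'False', 'True']
```

The same pattern extends to multiclass deployments, where one `DR_RESERVED_PREDICTION_{Label}` column exists per class.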

Required feature flag: Enable Training and Prediction Data Export for Deployments

Public Preview documentation.

Deprecation announcements

USER/Open Source models deprecated and soon disabled

With this release, all models containing USER/Open source (“user”) tasks are deprecated. The deprecation of existing models will roll out over the next few months, and its implications will be announced in subsequent releases. See the full announcement in the June Cloud Announcements.


All product and company names are trademarks™ or registered® trademarks of their respective holders. Use of them does not imply any affiliation with or endorsement by them.


Updated September 27, 2022