
Code-first (V9.1)

July 31, 2023

The DataRobot v9.1 release includes many new features and capabilities for code-first users, described below. See additional details of Release 9.1 in the data and modeling and MLOps release announcements.

Code-first features
  • Notebooks
    • DataRobot Notebooks (GA)
    • Integrated terminals (Preview)
    • Azure OpenAI Service integration (Preview)
  • API enhancements
    • DataRobot REST API v2.31 (GA)
    • Python client v3.2 (GA)
    • R client v2.18.4 (GA)
    • DataRobotX (Preview)
    • R client v2.31 (Preview)

Notebooks

GA

DataRobot Notebooks now GA

Now generally available, DataRobot Notebooks provide an in-browser editor for creating and executing notebooks for data science analysis and modeling. Notebooks display computation results in various formats, including text, images, graphs, plots, tables, and more. You can customize the output display using open-source plugins. Cells can also contain Markdown rich text for commentary and explanation of the coding workflow. As you develop and edit a notebook, DataRobot stores a history of revisions that you can return to at any time.

DataRobot Notebooks offer a dashboard that hosts notebook creation, upload, and management. Individual notebooks have containerized, built-in environments with commonly used machine learning libraries that you can easily set up in a few clicks. Notebook environments seamlessly integrate with DataRobot's API, allowing a robust coding experience supported by keyboard shortcuts for cell functions, in-line documentation, and saved environment variables for secrets management and automatic authentication.
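
For example, a notebook cell can authenticate against DataRobot without exposing any secrets, provided the API token and endpoint are stored as notebook environment variables. A minimal sketch, assuming the standard DATAROBOT_API_TOKEN and DATAROBOT_ENDPOINT variable names:

```python
# Minimal sketch of a notebook cell that uses the preinstalled DataRobot
# Python client. dr.Client() with no arguments reads credentials from the
# DATAROBOT_API_TOKEN and DATAROBOT_ENDPOINT environment variables, so no
# secrets need to appear in the notebook itself.
import datarobot as dr

dr.Client()

# Quick check that the connection works: list a few recent projects.
for project in dr.Project.list()[:5]:
    print(project.id, project.project_name)
```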

Preview

Integrated notebook terminals

Now available for preview, DataRobot Notebooks support integrated terminal windows. When you have a notebook session running, you can open one or more integrated terminals to execute terminal commands, such as running .py scripts or installing packages. Terminal integration also provides full support for a system shell (bash), so you can run installed programs. When you create a terminal window in a DataRobot Notebook, the notebook page divides into two sections: one for the notebook itself and another for the terminal.

Required feature flag: Enable Notebooks Terminal

Preview documentation.

Azure OpenAI Service integration

Now available for preview, the Azure OpenAI Service integration lets you power code development workflows in DataRobot Notebooks by applying OpenAI large language models to assist with code generation. With this integration, you can leverage state-of-the-art generative models with Azure's enterprise-grade security and compliance capabilities. Select Assist in a DataRobot notebook and provide a prompt for the Code Assistant to generate code in a cell.

Required feature flag: Enable Notebooks OpenAI Integration

Preview documentation.

API

GA

DataRobot REST API v2.31

Version 2.31 of DataRobot's REST API is now generally available. For a complete list of changes introduced in v2.31, view the REST API changelog.

New features

  • New route to retrieve deployment fairness scores over time:
    • GET /api/v2/deployments/(deploymentId)/fairnessScoresOverTime/
  • New route to retrieve deployment prediction statistics over time:
    • GET /api/v2/deployments/(deploymentId)/predictionsOverTime/
  • New routes to calculate and retrieve sliced insights:
    • POST /api/v2/insights/featureEffects/
    • GET /api/v2/insights/featureEffects/models/(entityId)/
    • POST /api/v2/insights/featureImpact/
    • GET /api/v2/insights/featureImpact/models/(entityId)/
    • POST /api/v2/insights/liftChart/
    • GET /api/v2/insights/liftChart/models/(entityId)/
    • POST /api/v2/insights/residuals/
    • GET /api/v2/insights/residuals/models/(entityId)/
    • POST /api/v2/insights/rocCurve/
    • GET /api/v2/insights/rocCurve/models/(entityId)/
  • New routes to create and manage data slices for use with sliced insights:
    • POST /api/v2/dataSlices/
    • DELETE /api/v2/dataSlices/
    • DELETE /api/v2/dataSlices/(dataSliceId)/
    • GET /api/v2/dataSlices/(dataSliceId)/
    • GET /api/v2/projects/(projectId)/dataSlices/
    • POST /api/v2/dataSlices/(dataSliceId)/sliceSizes/
    • GET /api/v2/dataSlices/(dataSliceId)/sliceSizes/
  • New route to register a Leaderboard model:
    • POST /api/v2/modelPackages/fromLeaderboard/
  • New routes to create and manage Value Trackers (formerly Use Cases), one of which is called in the sketch after this list:
    • POST /api/v2/valueTrackers/
    • GET /api/v2/valueTrackers/
    • GET /api/v2/valueTrackers/(valueTrackerId)/
    • PATCH /api/v2/valueTrackers/(valueTrackerId)/
    • DELETE /api/v2/valueTrackers/(valueTrackerId)/
    • GET /api/v2/valueTrackers/(valueTrackerId)/activities/
    • GET /api/v2/valueTrackers/(valueTrackerId)/attachments/
    • POST /api/v2/valueTrackers/(valueTrackerId)/attachments/
    • DELETE /api/v2/valueTrackers/(valueTrackerId)/attachments/(attachmentId)/
    • GET /api/v2/valueTrackers/(valueTrackerId)/attachments/(attachmentId)/
    • GET /api/v2/valueTrackers/(valueTrackerId)/realizedValueOverTime/
    • GET /api/v2/valueTrackers/(valueTrackerId)/sharedRoles/
    • PATCH /api/v2/valueTrackers/(valueTrackerId)/sharedRoles/
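
Any HTTP client can call these routes. Below is a minimal sketch, using Python's requests library, that lists Value Trackers via the new GET /api/v2/valueTrackers/ route; the environment variable names, Bearer auth scheme, and response handling are illustrative assumptions, not part of the changelog.

```python
# Minimal sketch of calling one of the new v2.31 routes directly.
# Endpoint and token come from placeholder environment variables.
import os

import requests

endpoint = os.environ["DATAROBOT_ENDPOINT"]  # e.g., https://app.datarobot.com/api/v2
token = os.environ["DATAROBOT_API_TOKEN"]

response = requests.get(
    f"{endpoint}/valueTrackers/",
    headers={"Authorization": f"Bearer {token}"},
)
response.raise_for_status()

# Most DataRobot list routes return results under a "data" key; adjust if
# your response shape differs.
for tracker in response.json().get("data", []):
    print(tracker.get("id"), tracker.get("name"))
```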

Python client v3.2

Version 3.2 of DataRobot's Python client is now generally available. For a complete list of changes introduced in v3.2, view the Python client changelog.

New features

  • Added DatetimePartitioning.datetime_partitioning_log_retrieve to download the datetime partitioning log.
  • Added method DatetimePartitioning.datetime_partitioning_log_list to list the datetime partitioning log.
  • Added DatetimePartitioning.get_input_data to retrieve the input data used to create optimized datetime partitioning.
  • Added the class DatetimePartitioningId, which can be passed as a partitioning_method to Project.analyze_and_model.
  • Added the ability to share deployments. See deployment sharing for more information.
  • Added new methods get_bias_and_fairness_settings and update_bias_and_fairness_settings to retrieve or update bias and fairness settings.
  • Added a new class UseCase for interacting with the DataRobot Use Cases API.
  • Added a new class Application for retrieving DataRobot Applications available to the user.
  • Added a new class SharingRole to hold user or organization access rights.
  • Added a new class BatchMonitoringJob for interacting with batch monitoring jobs.
  • Added a new class BatchMonitoringJobDefinition for interacting with batch monitoring job definitions.
  • Added new methods for handling monitoring job definitions:
    • BatchMonitoringJobDefinition.list
    • BatchMonitoringJobDefinition.get
    • BatchMonitoringJobDefinition.create
    • BatchMonitoringJobDefinition.update
    • BatchMonitoringJobDefinition.delete
    • BatchMonitoringJobDefinition.run_on_schedule
    • BatchMonitoringJobDefinition.run_once
  • Added a new method to retrieve a monitoring job: BatchMonitoringJob.get.
  • Added the ability to filter return objects by a Use Case ID passed to the following methods: Dataset.list and Project.list.
  • Added the ability to automatically add a newly created dataset or project to a Use Case by passing a UseCase object, a list of UseCase objects, a UseCase ID, or a list of UseCase IDs using the keyword argument use_cases to the following methods:

    • Dataset.create_from_file
    • Dataset.create_from_in_memory_data
    • Dataset.create_from_url
    • Dataset.create_from_data_source
    • Dataset.create_from_query_generator
    • Dataset.create_project
    • Project.create
    • Project.create_from_data_source
    • Project.create_from_dataset
    • Project.create_segmented_project_from_clustering_model
    • Project.start
  • Added the ability to set a default Use Case for requests, as shown in the sketch after this list. It can be set in several ways.

    • If the user configures the client via Client(...), then invoke Client(..., default_use_case=<id>).
    • If the user configures the client via drconfig.yaml, then add the property default_use_case: <id>.
    • If the user configures the client via environment variables, then set the env var DATAROBOT_DEFAULT_USE_CASE.
    • The default Use Case can also be set programmatically as a context manager via UseCase.get(<id>).
  • Added the method Deployment.get_predictions_over_time to retrieve a deployment's predictions-over-time data.
  • Added a new class FairnessScoresOverTime to retrieve fairness-over-time information.
  • Added a new method Deployment.get_fairness_scores_over_time to retrieve a deployment's fairness scores over time.
  • Added a new use_gpu parameter to the method Project.analyze_and_model to set whether the project should allow GPU usage.
  • Added a new use_gpu parameter to the class Project indicating whether the project allows GPU usage.
  • Added a new class TrainingData for retrieving the training data assigned to a CustomModelVersion.
  • Added a new class HoldoutData for retrieving the holdout data assigned to a CustomModelVersion.
  • Added parameter unsupervised_type to the class DatetimePartitioning.
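
Taken together, the Use Case additions above allow a workflow like the following minimal sketch. The IDs and file path are placeholders, and the use_cases keyword on Dataset.list is assumed to mirror the creation methods; see the Python client changelog for the definitive signatures.

```python
# Minimal sketch of the new Use Case support in Python client v3.2.
import datarobot as dr

# Option 1: set a default Use Case when configuring the client, so newly
# created assets are associated with it automatically.
dr.Client(
    token="YOUR_API_TOKEN",                       # placeholder
    endpoint="https://app.datarobot.com/api/v2",  # placeholder
    default_use_case="64a0f00000000000000000aa",  # placeholder Use Case ID
)

# Option 2: pass a Use Case explicitly at creation time via `use_cases`.
use_case = dr.UseCase.get("64a0f00000000000000000aa")  # placeholder ID
dataset = dr.Dataset.create_from_file("training_data.csv", use_cases=use_case)
project = dr.Project.create_from_dataset(dataset.id, use_cases=use_case)

# Listings can be filtered to a single Use Case (keyword name assumed to
# match the creation methods).
print(len(dr.Dataset.list(use_cases=use_case)))
```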

R client v2.18.4

Version 2.18.4 of the R client is now generally available. It can be accessed via CRAN. The datarobot package now requires R >= 3.5. For a complete list of changes introduced in v2.18.4, view the R client changelog.

New features

  • The R client will now output a warning when you attempt to access certain resources (projects, models, deployments, etc.) that are deprecated or disabled by the DataRobot platform migration to Python 3.

  • Added support for comprehensive autopilot: use mode = AutopilotMode.Comprehensive.

Preview

DataRobotX

Now available for preview, DataRobotX, or DRX, is a collection of DataRobot extensions designed to enhance your data science experience. DRX provides a streamlined experience for common workflows but also offers new, experimental high-level abstractions.

DRX offers unique experimental workflows, including the following:

  • Smart downsampling with PySpark
  • Enrich datasets using LLMs
  • Feature importance rank ensembling (FIRE)
  • Deploy custom models
  • Track experiments in MLflow

Preview documentation.

R client v2.31

Version 2.31 of the R client is now available for preview. It can be installed from GitHub.

This version of the R client addresses an issue where a new feature in version 5.0.1 of the curl package caused any invocation of datarobot:::UploadData (and therefore SetupProject) to fail with the error No method asJSON S3 class: form_file.

For a complete list of changes introduced in v2.31, view the R client changelog.


