
AI Platform releases

April SaaS feature announcements

April 2025

This page provides announcements of newly released features in April 2025, available in DataRobot's SaaS multi-tenant AI Platform, with links to additional resources. From the release center, you can also access past announcements and Self-Managed AI Platform release notes.

Self-managed/STS documentation now publicly accessible

Starting with release 11.0, version-specific documentation for self-managed and single-tenant SaaS users will be available at http://docs.datarobot.com/11.0/en/docs/index.html, with future versioned releases hosted on the same site. This means DataRobot documentation is easily accessible without any additional installation on your part. Additionally, just like the SaaS documentation, the self-managed public documentation will be updated when we add detail, examples, or corrections. If you work in an air-gapped environment, ask your administrator to allow-list the site or contact DataRobot Support for a PDF version.

Previously, version-specific documentation was only available in-app. Now when you open docs using either of the methods below, you are directed to the public site:

  • Clicking an “Open documentation” link in the app.
  • Clicking the ? icon in the upper-right corner of the app.

Note that the self-managed site is not indexed by Google, so searches do not return two results for each page; only results from the SaaS documentation are returned. Those results often answer your question, but to confirm details specific to your version, use the search functionality on the self-managed documentation site.

April features

The following list describes each new feature, grouped by capability. Features marked N/A are not specific to the NextGen or Classic interface:

Applications
  • Perform common DataRobot tasks with Pulumi

GenAI
  • NVIDIA AI Enterprise integration*
  • New versions of Gemini released

Predictions and MLOps
  • Create custom model proxies for external models
  • Batch prediction support for Alibaba Cloud MaxCompute

Platform
  • NextGen is soon to be the default landing page (N/A)

API enhancements
  • Use Covalent to simplify compute orchestration (N/A)
  • Python client v3.7 (N/A)
  • DataRobot REST API v2.36 (N/A)
  • Browse Python API client documentation on docs.datarobot.com (N/A)

Applications

Perform common DataRobot tasks with Pulumi

You can now access notebooks that outline how to perform common DataRobot tasks using Pulumi and the declarative API. Browse notebooks for deploying custom models, custom applications, and governed custom LLMs.
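
For orientation, the minimal sketch below shows the general shape of a Pulumi program that declares a DataRobot resource. The `pulumi_datarobot` package name and the `UseCase` resource arguments are assumptions based on the provider at the time of writing; follow the linked notebooks for the authoritative workflows.

```python
# Minimal Pulumi program sketch (assumes the pulumi-datarobot provider is
# installed and DATAROBOT_API_TOKEN / DATAROBOT_ENDPOINT are configured).
import pulumi
import pulumi_datarobot as datarobot  # assumed provider package name

# Declare a Use Case to hold related DataRobot assets.
use_case = datarobot.UseCase(
    "release-notes-demo",                 # Pulumi resource name
    name="Declarative API demo",          # display name in DataRobot
    description="Created with Pulumi as a declarative API example.",
)

# Export the Use Case ID so other stacks or scripts can reference it.
pulumi.export("use_case_id", use_case.id)
```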

GenAI

NVIDIA AI Enterprise integration

NVIDIA AI Enterprise and DataRobot provide a pre-built AI stack designed to integrate with your organization's existing DataRobot infrastructure, giving you access to robust evaluation, governance, and monitoring features. The integration includes a comprehensive set of tools for end-to-end AI orchestration, accelerating your organization's data science pipelines so you can rapidly deploy production-grade AI applications on NVIDIA GPUs in DataRobot Serverless Compute.

In DataRobot, create custom AI applications tailored to your organization's needs by selecting NVIDIA Inference Microservices (NVIDIA NIM) from a gallery of AI applications and agents. NVIDIA NIM provides pre-built and pre-configured microservices within NVIDIA AI Enterprise, designed to accelerate the deployment of generative AI across enterprises.

For more information on the NVIDIA AI Enterprise and DataRobot integration, review the workflow summary documentation or the task-specific documentation listed below:

  • Create an inference endpoint for NVIDIA NIM: Register and deploy with NVIDIA NIM to create inference endpoints accessible through code or the DataRobot UI.
  • Evaluate a text generation NVIDIA NIM in the playground: Add a deployed text generation NVIDIA NIM to a blueprint in the playground to access an array of comparison and evaluation tools.
  • Use an embedding NVIDIA NIM to create a vector database: Add a registered or deployed embedding NVIDIA NIM to a Use Case with a vector database to enrich prompts in the playground with relevant context before they are sent to the LLM.
  • Use NVIDIA NeMo Guardrails in a moderation framework to secure your application: Connect NVIDIA NeMo Guardrails to deployed text generation models to guard against off-topic discussions, unsafe content, and jailbreaking attempts.
  • Use a text generation NVIDIA NIM in an application template: Customize application templates from DataRobot to use a registered or deployed NVIDIA NIM text generation model.

New versions of Gemini released

With this deployment, Gemini 1.5 Pro v001 and Gemini 1.5 Flash v001 have both been replaced with v002. On May 24, 2025, v001 will be permanently disabled, and on September 24, 2025, both Gemini 1.5 Pro v002 and Gemini 1.5 Flash v002 will be retired. If an LLM blueprint is in the playground, it has been automatically switched to v002. If you have a registered model or deployment that uses v001, you must send the LLM blueprint to the Registry’s model workshop again and redeploy it to start using v002. Alternatively, if you use the Bolt-on Governance API for inference, specify gemini-1.5-flash-002 or gemini-1.5-pro-002 as the model ID in the inference request without redeploying the LLM blueprint.
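
As a hedged illustration of that last option, the sketch below sends a chat completion request to an LLM deployment through its OpenAI-compatible interface and names the new Gemini version as the model ID. The base URL path and environment variable names are assumptions; check the Bolt-on Governance API documentation for your deployment's actual endpoint.

```python
# Sketch: request the new Gemini version through a deployment's
# OpenAI-compatible chat completions endpoint (Bolt-on Governance).
# The base_url path below is an assumption--verify it against your
# deployment's documented endpoint before use.
import os
from openai import OpenAI

DEPLOYMENT_ID = os.environ["DATAROBOT_DEPLOYMENT_ID"]  # your LLM deployment

client = OpenAI(
    base_url=f"https://app.datarobot.com/api/v2/deployments/{DEPLOYMENT_ID}",
    api_key=os.environ["DATAROBOT_API_TOKEN"],  # DataRobot API key, not an OpenAI key
)

response = client.chat.completions.create(
    model="gemini-1.5-flash-002",  # or "gemini-1.5-pro-002"
    messages=[{"role": "user", "content": "Summarize the April release notes."}],
)
print(response.choices[0].message.content)
```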

See the full list of LLM availability in DataRobot, with links to creator documentation, for assistance in choosing a replacement model.

Predictions and MLOps

Create custom model proxies for external models

In the Model workshop, you can create a custom model as a proxy for an external model. A proxy model contains proxy code that connects to the external model, allowing you to use features like compliance documentation, challenger analysis, and custom model tests with a model running on infrastructure outside of DataRobot. For more information, see the documentation.
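
As a rough sketch of what proxy code can look like, the example below implements a custom model score hook that forwards scoring data to a hypothetical external REST endpoint. The endpoint URL, environment variable, and response shape are illustrative assumptions; base your own proxy code on the custom model documentation and your external model's actual API.

```python
# custom.py sketch for a proxy model: the score hook forwards the scoring
# data to an external model's REST API and returns its predictions.
# EXTERNAL_MODEL_URL and the JSON response shape are hypothetical.
import os
import pandas as pd
import requests

EXTERNAL_MODEL_URL = os.environ.get(
    "EXTERNAL_MODEL_URL", "https://example.com/external-model/predict"
)

def score(data: pd.DataFrame, model, **kwargs) -> pd.DataFrame:
    """Send the scoring data to the external model and return its predictions."""
    response = requests.post(
        EXTERNAL_MODEL_URL,
        json={"rows": data.to_dict(orient="records")},
        timeout=30,
    )
    response.raise_for_status()
    # Assume the external service returns {"predictions": [...]}.
    predictions = response.json()["predictions"]
    # For a regression target, DataRobot expects a "Predictions" column.
    return pd.DataFrame({"Predictions": predictions})
```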

Batch prediction support for Alibaba Cloud MaxCompute

DataRobot now supports intake and write-back through a MaxCompute data connection when you configure a JDBC prediction source and destination with the Job Definitions UI or the Batch Prediction API. For a complete list of supported output options, see the data sources supported for batch predictions.
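
As a hedged sketch of the Batch Prediction API path, the example below scores through JDBC intake and output settings with the Python client. The deployment, data store, and credential IDs and the table names are placeholders, and the exact settings for a MaxCompute connection may differ, so verify them against the batch prediction documentation.

```python
# Sketch: batch predictions with JDBC intake/output via the Python client.
# All IDs and table names below are placeholders.
import datarobot as dr

dr.Client()  # reads DATAROBOT_API_TOKEN / DATAROBOT_ENDPOINT from the environment

DEPLOYMENT_ID = "<deployment-id>"
DATA_STORE_ID = "<maxcompute-data-connection-id>"
CREDENTIAL_ID = "<stored-credential-id>"

job = dr.BatchPredictionJob.score(
    DEPLOYMENT_ID,
    intake_settings={
        "type": "jdbc",
        "data_store_id": DATA_STORE_ID,
        "credential_id": CREDENTIAL_ID,
        "table": "scoring_input",
    },
    output_settings={
        "type": "jdbc",
        "data_store_id": DATA_STORE_ID,
        "credential_id": CREDENTIAL_ID,
        "table": "scoring_output",
        "statement_type": "insert",  # write predictions back with INSERT
    },
)
job.wait_for_completion()
```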

Platform

NextGen is soon to be the default landing page

The NextGen homepage will soon be the default landing page when you access app.datarobot.com. When that happens, if you request a specific page (for example, app.datarobot.com/projects/123abc/models), you are still brought to the requested page. To make DataRobot Classic the default page instead of NextGen, select User settings > System and disable the toggle.

API enhancements

Use Covalent to simplify compute orchestration

Now available as a premium feature, DataRobot offers Covalent, an open-source distributed computing platform and code-first solution that simplifies building and scaling complex AI and high-performance computing applications. You define your compute needs (CPUs, GPUs, storage, deployment, and so on) directly in Python code, and Covalent handles the rest, so you don't have to deal with the complexities of server management and cloud configuration. Covalent accelerates agentic AI application development with advanced compute orchestration and optimization.

As a DataRobot user, you can access the Covalent SDK in a Python environment (whether that be in a DataRobot notebook or your own development environment) and use your DataRobot API key to leverage all Covalent features, including fine-tuning and model serving. The Covalent SDK enables compute-intensive workloads, such as model training and testing, to run as server-managed workflows. The workload is broken down into tasks that are arranged in a workflow. The tasks and the workflow are Python functions decorated with Covalent’s electron and lattice interfaces, respectively.
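
For a concrete picture of the electron/lattice pattern, here is a minimal open-source Covalent sketch. It omits any DataRobot-specific executor or API key configuration, which the premium feature documentation covers.

```python
# Minimal Covalent workflow: tasks are electrons, the workflow is a lattice.
# Running it requires a Covalent server (e.g., `covalent start` for the
# open-source version); DataRobot-specific executor setup is not shown here.
import covalent as ct

@ct.electron
def featurize(raw):
    # Toy "feature engineering" task.
    return [x * 2 for x in raw]

@ct.electron
def train(features):
    # Toy "training" task: pretend the model score is the mean feature value.
    return sum(features) / len(features)

@ct.lattice
def workflow(raw):
    # The lattice composes electrons into a dependency graph.
    return train(featurize(raw))

dispatch_id = ct.dispatch(workflow)([1.0, 2.0, 3.0])
result = ct.get_result(dispatch_id, wait=True)
print(result.result)
```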

Python client v3.7

v3.7 for DataRobot's Python client is now generally available. For a complete list of changes introduced in v3.7, view the Python client changelog.

DataRobot REST API v2.36

v2.36 of DataRobot's REST API is now generally available. For a complete list of changes introduced in v2.36, view the REST API changelog.

Browse Python API client documentation on docs.datarobot.com

You can now access reference documentation for the Python API client directly from the documentation portal, in addition to ReadTheDocs. The reference documentation outlines the functionality supported by the Python client, matching the organization of the REST API documentation.

All product and company names are trademarks™ or registered® trademarks of their respective holders. Use of them does not imply any affiliation with or endorsement by them.