Predictions

DataRobot offers several methods for getting predictions on new data from a model (also known as scoring). You can read an overview of the available methods below. Before proceeding with a prediction method, be sure to review the prediction file size limits.

Real-time predictions: Make real-time predictions by connecting to HTTP and requesting predictions for a model via a synchronous call. After DataRobot receives the request, it immediately returns a response containing the prediction results.
Batch predictions: Score large datasets in batches with one asynchronous prediction job.
Portable predictions: Execute predictions outside of the DataRobot application using Scoring Code or the Portable Prediction Server.
Monitor external predictions: Use monitoring job definitions to let DataRobot monitor deployments that run and store feature data and predictions outside of DataRobot, enabling closer integration with external data sources.

Predictions overview

DataRobot offers several methods for getting predictions on new data. Select a tab to learn about these methods:

Make real-time predictions by connecting to HTTP and requesting predictions for a model via a synchronous call. After DataRobot receives the request, it immediately returns a response containing the prediction results.

Use a deployment

The simplest method for making real-time predictions is to deploy a model from the Leaderboard and make prediction requests with the Prediction API.

After deploying a model, you can also navigate to a deployment's Prediction API tab to access and configure scripting code to make simple scoring requests. The deployment also hosts integration snippets.
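For example, a minimal scoring request with the Python requests library might look like the sketch below. The server URL, deployment ID, API token, and DataRobot-Key are placeholders; copy the exact values and headers from the integration snippet in your deployment's Prediction API tab.

```python
# Minimal sketch of a real-time Prediction API request for a deployed model.
# All credentials and IDs below are placeholders taken from the deployment's
# Prediction API tab, not working values.
import requests

API_URL = "https://example.datarobot.com/predApi/v1.0/deployments/{deployment_id}/predictions"
API_TOKEN = "YOUR_API_TOKEN"
DATAROBOT_KEY = "YOUR_DATAROBOT_KEY"   # required on the managed AI Platform
DEPLOYMENT_ID = "YOUR_DEPLOYMENT_ID"

with open("to_score.csv", "rb") as f:
    response = requests.post(
        API_URL.format(deployment_id=DEPLOYMENT_ID),
        data=f,
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "DataRobot-Key": DATAROBOT_KEY,
            "Content-Type": "text/plain; charset=UTF-8",  # CSV payload
        },
    )
response.raise_for_status()
print(response.json())  # prediction results, one entry per scored row
```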

Both batch prediction methods stem from deployments. After deploying a model, you can make batch predictions via the UI by accessing the deployment, or use the Batch Prediction API.

Use the Make Predictions tab

Navigate to a deployment's Make Predictions tab and use the interface to configure batch prediction jobs.

Use the batch prediction API

The Batch Prediction API provides flexible intake and output options for scoring large datasets using the prediction servers you have already deployed. The API is exposed through the DataRobot Public API and can be consumed using any REST-enabled client or the DataRobot Python package's Public API bindings.
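As an illustration, the sketch below scores a local CSV file through the Python package and writes the results back to disk. The endpoint, token, deployment ID, and file names are placeholders.

```python
# Minimal sketch of an asynchronous batch prediction job run with the
# DataRobot Python package ("datarobot") against an existing deployment.
# Endpoint, token, deployment ID, and file paths are placeholders.
import datarobot as dr

dr.Client(endpoint="https://app.datarobot.com/api/v2", token="YOUR_API_TOKEN")

job = dr.BatchPredictionJob.score(
    deployment="YOUR_DEPLOYMENT_ID",
    intake_settings={"type": "localFile", "file": "to_score.csv"},
    output_settings={"type": "localFile", "path": "predictions.csv"},
)
job.wait_for_completion()  # block until the asynchronous job finishes
```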

Portable predictions allow you to execute prediction jobs outside of the DataRobot application. The portable prediction methods are detailed below.

Use Scoring Code

You can export Scoring Code from DataRobot in Java or Python to make predictions. Scoring Code is portable and executable in any computing environment. This method is useful for low-latency applications that cannot fully support REST API performance or lack network access.

Availability information

DataRobot's exportable models and independent prediction environment option, which allows you to export a model from a model-building environment to a dedicated, isolated prediction environment, is not available for managed AI Platform deployments.
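As a sketch, an exported Scoring Code JAR can be run for batch CSV scoring by shelling out to Java, as below. The JAR file name and command-line arguments are assumptions based on the standard Scoring Code CLI; check the options bundled with your exported model (for example, java -jar your_model.jar --help).

```python
# Minimal sketch of scoring a CSV file with an exported Scoring Code JAR.
# The JAR name and CLI flags are assumptions; verify them against the
# help output of your exported model package.
import subprocess

subprocess.run(
    [
        "java", "-jar", "model.jar",  # the exported Scoring Code package
        "csv",                        # batch CSV scoring mode
        "--input=to_score.csv",
        "--output=predictions.csv",
    ],
    check=True,  # raise an error if the scoring process fails
)
```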

Use the Portable Prediction Server

The Portable Prediction Server (PPS) is a remote DataRobot execution environment for DataRobot model packages (MLPKG files) distributed as a self-contained Docker image. It can host one or more production models. The models are accessible through DataRobot's Prediction API for predictions and Prediction Explanations.
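Once the PPS container is running, it can be queried like any other prediction server. The sketch below assumes a locally running container in single-model mode listening on port 8080; the host, port, and route are assumptions to confirm against the PPS documentation for your image version.

```python
# Minimal sketch of sending a CSV file to a locally running Portable
# Prediction Server in single-model mode. Host, port, and route are
# assumptions for illustration.
import requests

with open("to_score.csv", "rb") as f:
    response = requests.post(
        "http://localhost:8080/predictions",
        data=f,
        headers={"Content-Type": "text/csv"},
    )
response.raise_for_status()
print(response.json())  # predictions returned by the PPS
```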

Use RuleFit models

DataRobot RuleFit models generate fast Python or Java Scoring Code that can run anywhere with no dependencies. Once a RuleFit model is created, you can export it as a Python module or a Java class and run the exported script.
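For instance, an exported RuleFit Python module might be used as sketched below. The module name (rulefit_model.py) and the scoring entry point are hypothetical placeholders; use the file and function that DataRobot generates for your model.

```python
# Minimal sketch of calling an exported RuleFit Python module. The module
# name and scoring function below are hypothetical placeholders, not the
# generated API.
import pandas as pd
import rulefit_model  # the exported RuleFit scoring module (placeholder name)

df = pd.read_csv("to_score.csv")
predictions = rulefit_model.score(df)  # placeholder entry point
print(predictions[:10])
```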


Updated January 31, 2024