Prediction methods and file size limits

DataRobot supports many methods of making predictions, including the DataRobot UI and APIs such as Python, R, and REST. The prediction method you use depends on factors like the size of your prediction data, whether you're validating a model prior to deployment or using and monitoring it in production, whether you need immediate low-latency predictions, and whether you want to schedule batch prediction jobs. The following table details each method and its file size limit.

Note

For on-premise installations, prediction file size limits differ from those listed below and are configurable.

Prediction methods

| Prediction method | Details | File size limit |
|---|---|---|
| Leaderboard predictions (UI and API) | To make predictions on a non-deployed model using the UI, expand the model on the Leaderboard and select Predict > Make Predictions. Upload prediction data from a local file, URL, data source, or the AI Catalog. You can also make predictions using the modeling predictions API, also called the "V2 predictions API." Use this API to test predictions on small datasets using your modeling workers. Predictions may be limited to 100 requests per user, per hour, depending on your pricing plan. | 1 GB |
| Deployment predictions (UI) | To make predictions on a deployed model using the UI, select Deployments, choose a deployment, and select Predictions (MLOps users only). | 10 GB |
| Batch Prediction API | The Batch Prediction API is optimized for high throughput and provides production-grade connectivity options: you can push data through the API directly or connect to the AI Catalog, cloud storage, databases, or data warehouses (requires MLOps). | Unlimited |
| Prediction API (real-time) | To make low-latency, real-time predictions on a deployed model, use the Prediction API. | 50 MB |
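The sketches below show, for each of the API-based methods above, one way to call it from Python. As a rough illustration of the modeling predictions ("V2") API used for Leaderboard predictions, this first sketch scores a small dataset against a non-deployed model with the DataRobot Python client. The endpoint, token, project ID, model ID, and file path are placeholders, and exact client calls may vary between client versions.

```python
# Sketch: scoring a small dataset against a non-deployed Leaderboard model
# with the DataRobot Python client (the "V2" modeling predictions API).
# Placeholders: endpoint, token, project/model IDs, and file path.
import datarobot as dr

dr.Client(endpoint="https://app.datarobot.com/api/v2", token="YOUR_API_TOKEN")

project = dr.Project.get("PROJECT_ID")
model = dr.Model.get(project.id, "MODEL_ID")

# Upload the prediction data to the project, then request predictions
# from the chosen model; this consumes modeling workers.
dataset = project.upload_dataset("./to_score.csv")    # subject to the 1 GB limit
predict_job = model.request_predictions(dataset.id)
predictions = predict_job.get_result_when_complete()  # returns a DataFrame
print(predictions.head())
```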
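For the Batch Prediction API, the Python client can submit a batch scoring job against a deployment. This sketch scores a local CSV and writes results to a local file; the deployment ID and file paths are placeholders, and the local-file intake/output shown here can be swapped for cloud storage, JDBC, or AI Catalog settings depending on your setup.

```python
# Sketch: batch scoring a deployed model with the Batch Prediction API
# via the DataRobot Python client. IDs and paths are placeholders.
import datarobot as dr

dr.Client(endpoint="https://app.datarobot.com/api/v2", token="YOUR_API_TOKEN")

deployment = dr.Deployment.get("DEPLOYMENT_ID")

# Local-file intake and output shown here; other intake/output types
# (e.g., cloud storage, JDBC, AI Catalog) follow the same pattern.
job = dr.BatchPredictionJob.score(
    deployment.id,
    intake_settings={"type": "localFile", "file": "./to_score.csv"},
    output_settings={"type": "localFile", "path": "./predictions.csv"},
)
job.wait_for_completion()
```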
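The real-time Prediction API is a REST endpoint served by your prediction server. The sketch below posts a small CSV with the `requests` library; the host, deployment ID, token, and DataRobot key are placeholders, and you should copy the exact URL and headers from your deployment's integration snippet in the DataRobot UI.

```python
# Sketch: real-time scoring against a deployed model over REST.
# Host, deployment ID, token, and key are placeholders; copy the exact
# endpoint and headers from the deployment's prediction API snippet.
import requests

PREDICTION_SERVER = "https://example.orm.datarobot.com"  # placeholder host
DEPLOYMENT_ID = "YOUR_DEPLOYMENT_ID"

url = f"{PREDICTION_SERVER}/predApi/v1.0/deployments/{DEPLOYMENT_ID}/predictions"
headers = {
    "Content-Type": "text/csv; charset=UTF-8",
    "Authorization": "Bearer YOUR_API_TOKEN",
    "DataRobot-Key": "YOUR_DATAROBOT_KEY",  # if required by your prediction server
}

# The request body is the raw scoring data (50 MB limit for this method).
with open("./to_score.csv", "rb") as f:
    response = requests.post(url, headers=headers, data=f)

response.raise_for_status()
print(response.json())  # per-row prediction results
```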
