
Deployment settings

Deprecation notice

The Settings > Data and Settings > Monitoring tabs are deprecated and scheduled for removal. The new deployment settings workflow provides an organized and intuitive interface, separating deployment configuration and monitoring setup tasks into dedicated settings pages. During the deprecation period, you can still use the Data tab; the Monitoring tab, however, now directs you to the service health settings.

You can add data to a deployment and configure monitoring, notifications, and challenger behavior using the settings associated with each deployment tab:

| Topic | Describes |
| --- | --- |
| Set up service health monitoring | Enable segmented analysis to assess service health, data drift, and accuracy statistics filtered by unique segment attributes and values. |
| Set up data drift monitoring | Enable data drift monitoring on a deployment's Data Drift Settings tab. |
| Set up accuracy monitoring | Enable accuracy monitoring on a deployment's Accuracy Settings tab. |
| Set up fairness monitoring | Enable fairness monitoring on a deployment's Fairness Settings tab. |
| Set up humility rules | Enable humility monitoring by creating rules that allow models to recognize, in real time, when they make uncertain predictions or receive data they have not seen before. |
| Configure retraining | Enable Automated Retraining for a deployment by defining the general retraining settings and then creating retraining policies. |
| Configure challengers | Enable challenger comparison by configuring a deployment to store prediction request data at the row level and replay predictions on a schedule. |
| Review predictions settings | Review the Predictions Settings tab to view details about your deployment's inference data. |
| Enable data export | Enable data export to compute and monitor custom business or performance metrics. |
| Set up custom metrics monitoring | Enable custom metrics monitoring by defining the "at risk" and "failing" thresholds for the custom metrics you created. |
| Set prediction intervals for time series deployments | Enable prediction intervals in the prediction response for deployed time series models. |
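Many of the toggles above can also be set programmatically rather than through the UI. As a rough sketch only: the endpoint path and every field name below are assumptions modeled on DataRobot's public REST API and may differ in your version, so verify them against the API reference before use. The snippet builds a hypothetical settings payload (the kind a PATCH to a deployment settings endpoint would carry) using only the standard library:

```python
import json

# Hypothetical payload for PATCH /api/v2/deployments/{id}/settings/.
# All field names below are assumptions; confirm them against the API
# reference for your DataRobot version before sending any request.
settings = {
    "targetDrift": {"enabled": True},       # data drift: track target drift
    "featureDrift": {"enabled": True},      # data drift: track feature drift
    "associationId": {                      # needed for accuracy monitoring
        "columnNames": ["transaction_id"],  # hypothetical association ID column
        "requiredInPredictionRequests": True,
    },
    "challengerModels": {"enabled": True},  # challenger comparison
    "predictionsDataCollection": {"enabled": True},  # row-level prediction storage
    "segmentAnalysis": {                    # segmented service health / drift
        "enabled": True,
        "attributes": ["region"],           # hypothetical segment attribute
    },
}

# Serialize the payload as it would appear in the request body.
body = json.dumps(settings, indent=2)
print(body)
```

In practice the payload would be sent with an authenticated HTTP client, or through the `datarobot` Python package, which exposes similar per-feature toggles as deployment methods.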

Updated August 10, 2023