Deployment settings

After you create and configure a deployment, you can use each feature's settings tab to add or update deployment functionality:

| Topic | Describes |
|-------|-----------|
| Set up service health monitoring | Enable segmented analysis to assess service health, data drift, and accuracy statistics by filtering them into unique segment attributes and values. |
| Set up data drift monitoring | Enable data drift monitoring on a deployment's Data Drift Settings tab. |
| Set up accuracy monitoring | Enable accuracy monitoring on a deployment's Accuracy Settings tab. |
| Set up fairness monitoring | Enable fairness monitoring on a deployment's Fairness Settings tab. |
| Set up humility rules | Enable humility monitoring by creating rules that let models recognize, in real time, when they make uncertain predictions or receive data they have not seen before. |
| Configure retraining | Enable Automated Retraining for a deployment by defining the general retraining settings and then creating retraining policies. |
| Configure challengers | Enable challenger comparison by configuring a deployment to store prediction request data at the row level and replay predictions on a schedule. |
| Review predictions settings | Review the Predictions Settings tab to view details about your deployment's inference data. |
| Enable data exploration | Enable data exploration to export deployment data, allowing you to compute and monitor custom business or performance metrics. |
| Set up custom metrics monitoring | Enable custom metrics monitoring by defining the "at risk" and "failing" thresholds for the custom metrics you created. |
| Set up timeliness tracking | Enable timeliness tracking on a deployment's Usage Settings tab to reveal when deployment status indicators are based on old data. |
| Set prediction intervals for time series deployments | Enable prediction intervals in the prediction response for deployed time series models. |
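
Several of these settings can also be updated programmatically with the DataRobot Python client. Below is a minimal sketch covering two of the topics above, data drift monitoring and the association ID required for accuracy monitoring, assuming you have an API token and an existing deployment; the endpoint, token, deployment ID, and `transaction_id` column name are placeholders, not values from this page.

```python
# Minimal sketch: enable data drift tracking and set an association ID
# on an existing deployment with the DataRobot Python client.
# Placeholders: YOUR_API_TOKEN, YOUR_DEPLOYMENT_ID, "transaction_id".
import datarobot as dr

# Connect to DataRobot (hypothetical credentials).
dr.Client(endpoint="https://app.datarobot.com/api/v2", token="YOUR_API_TOKEN")

# Fetch an existing deployment by its ID.
deployment = dr.Deployment.get(deployment_id="YOUR_DEPLOYMENT_ID")

# Data drift monitoring: track drift for both the target and the features.
deployment.update_drift_tracking_settings(
    target_drift_enabled=True,
    feature_drift_enabled=True,
)

# Accuracy monitoring needs an association ID so each prediction can later
# be matched to its actual outcome ("transaction_id" is an assumed column).
deployment.update_association_id_settings(
    column_names=["transaction_id"],
    required_in_prediction_requests=True,
)
```

Anything configured this way is reflected on the corresponding settings tab, so the UI and the client can be used interchangeably for these options.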

Updated April 18, 2024