The Holdout column displays an evaluation metric that measures a model's accuracy on unseen ("new") data. The score is calculated from the trained model's predictions on the holdout partition. DataRobot reserves a portion of your data as holdout (20% by default); it never trains models on this data, but instead uses it to validate model quality after training.
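The idea of reserving a holdout partition can be sketched in a few lines. This is an illustrative simplification, not DataRobot's internal partitioning logic; the function name and parameters are assumptions for the example.

```python
import random

def split_holdout(rows, holdout_fraction=0.2, seed=0):
    """Reserve a fraction of rows as a holdout partition.

    Illustrative sketch only: DataRobot performs its partitioning
    internally; this function is a hypothetical stand-in.
    """
    rng = random.Random(seed)
    shuffled = rows[:]
    rng.shuffle(shuffled)
    n_holdout = int(len(shuffled) * holdout_fraction)
    # Holdout rows are set aside and never used for training.
    holdout = shuffled[:n_holdout]
    training = shuffled[n_holdout:]
    return training, holdout

training, holdout = split_holdout(list(range(100)))
print(len(training), len(holdout))  # 80 20
```

With the default 20% fraction, 100 rows yield 80 training rows and 20 holdout rows that the model never sees during training.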
You should only unlock your holdout data after having made all your model-related decisions. Once your project's holdout has been unlocked, it cannot be re-locked.
If you run full or Quick Autopilot and DataRobot returns a model recommended and prepared for deployment, the steps described below differ slightly.
To display a specific model's Holdout score:

1. Click Unlock project Holdout for all models on the rightmost panel.
2. Confirm your decision by clicking Unlock holdout.
When you unlock holdout, the label on the project menu changes to Holdout is unlocked and a value displays in the Holdout column.
Once you have unlocked the holdout data, review the Leaderboard scores on this test data. Then open the Lift Chart and switch the Data Source dropdown between Validation and Holdout to compare the accuracy of the model's predictions on each partition.
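The comparison above can be reduced to a simple check: if the holdout score diverges sharply from the validation score, the model may not generalize well. This is a hypothetical sketch; the function name, the example LogLoss values, and the 0.05 threshold are illustrative assumptions, not part of DataRobot's API.

```python
def holdout_gap(validation_score, holdout_score):
    """Absolute gap between validation and holdout scores.

    Hypothetical helper: a large gap suggests the model performs
    noticeably worse (or better) on truly unseen data.
    """
    return abs(validation_score - holdout_score)

# Hypothetical LogLoss scores read from the Leaderboard:
val, hold = 0.412, 0.438
gap = holdout_gap(val, hold)
print(f"gap = {gap:.3f}")  # gap = 0.026
if gap > 0.05:  # illustrative threshold, tune for your metric
    print("Investigate: holdout performance diverges from validation.")
```

A small gap is reassuring; a large one is a cue to revisit feature lists or partitioning before trusting the Leaderboard ranking.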