The Forecasting Accuracy tab provides a visual indicator of how well a model predicts at each forecast distance in the project's forecast window. It is available for all time series projects (both single series and multiseries). Use it to help determine, for example, how much harder it is to accurately forecast four days out as opposed to two days out. The chart depicts how accuracy changes as you move further into the future.
For each forecast distance, the points represent:
- Green (Backtest 1): the validation score displayed on the Leaderboard, i.e., the score from the first (most recent) backtest.
- Blue (All Backtests): the backtesting score displayed on the Leaderboard, i.e., the average validation score across all backtests.
- Red (Holdout): the holdout score.
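The per-distance scores above can be sketched as follows. This is a minimal illustration, not DataRobot's implementation: it assumes actuals and predictions arranged as arrays of shape `(n_forecasts, n_distances)` per backtest, uses RMSE as a stand-in for the project metric, and generates synthetic data in which error grows with forecast distance.

```python
import numpy as np

def rmse_by_distance(actuals, predictions):
    """RMSE at each forecast distance.

    actuals, predictions: arrays of shape (n_forecasts, n_distances),
    where column d holds values at forecast distance d.
    """
    return np.sqrt(np.mean((actuals - predictions) ** 2, axis=0))

rng = np.random.default_rng(0)
n_backtests, n_forecasts, n_distances = 3, 50, 4

# Synthetic data: prediction noise scales with forecast distance,
# so accuracy should degrade further into the future.
actuals = rng.normal(size=(n_backtests, n_forecasts, n_distances))
noise = rng.normal(scale=np.arange(1, n_distances + 1) * 0.1,
                   size=(n_backtests, n_forecasts, n_distances))
predictions = actuals + noise

per_backtest = np.array([rmse_by_distance(actuals[b], predictions[b])
                         for b in range(n_backtests)])
backtest_1 = per_backtest[0]           # "Backtest 1" point (green)
all_backtests = per_backtest.mean(0)   # "All Backtests" point (blue)
```

Plotting `backtest_1` and `all_backtests` against forecast distance reproduces the shape of the chart: one point per distance per series, with error typically rising at longer distances.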
To change the metric used in the display, select a different optimization metric from the Leaderboard.
Updated September 20, 2022