When is a Forecast Good Enough?
- By Datamago team
- Published April 15, 2022
- 5 min read
You can always eke out a little more accuracy by making one last adjustment to the historical data or finding that perfect variable. But there's an opportunity cost in how you spend your time. Perfectly predicting the future is impossible; instead, it's better to focus on balancing performance with a reasonable-looking forecast. How that balance looks is up to you, but we'll offer three general guidelines for determining when a forecast is good enough: a minimum performance score, forecast appearance, and diagnostics tests.
1) Achieve a Performance Score of at least ‘moderate’
The Performance Score, which is based on standard statistical measurements of error, indicates how the forecast may perform in the future and is the best objective measure of quality. It's calculated by removing a certain number of rows from the end of the historical data (a.k.a. the validation set), then making a blind forecast and comparing it to the actual values that were removed. The closer the predicted values are to the actual ones, the better the score, which ranges from 'needs improvement' up through 'moderate' and 'good' to 'excellent'.
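The validation-set idea above can be sketched in a few lines. This is an illustrative toy, not Datamago's actual implementation; the placeholder "model" (a rolling mean of the last four points) stands in for whatever model generates the blind forecast.

```python
# Sketch of the holdout validation procedure described above.

def validation_split(history, holdout):
    """Remove the last `holdout` rows to form the validation set."""
    return history[:-holdout], history[-holdout:]

# Toy monthly series: train on all but the last 3 points.
series = [100, 104, 110, 108, 115, 121, 119, 126, 131, 129, 137, 142]
train, actual = validation_split(series, holdout=3)

# A "blind" forecast sees only the training data. Here we use a
# simple placeholder model: repeat the mean of the last 4 points.
blind_forecast = [sum(train[-4:]) / 4] * len(actual)

# The score is then derived from how close the blind forecast
# comes to the values that were held out.
misses = [abs(f - a) for f, a in zip(blind_forecast, actual)]
```

The smaller the misses, the better the resulting score.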
Note that a ‘needs improvement’ score may be acceptable in a small number of situations. For example, daily data with lots of fluctuation can score poorly on the validation set forecast due to under- or over-forecasting certain sections. But if the validation forecast and actual forecast propagate the historical patterns, follow the trend, and pass one or more forecast diagnostics tests (more on that below), accepting the forecast is worth considering.
The Mean Absolute Error (MAE) and Mean Absolute Percentage Error (MAPE), from which the performance scores are derived, are also available for advanced users (blog posts coming soon). They can be toggled by clicking the three-dot menu at the upper right of the forecasts or validation forecasts list, then selecting 'advanced'.
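For the curious, here are MAE and MAPE as they are typically defined. This is a generic sketch of the standard formulas, not necessarily Datamago's exact calculation:

```python
# MAE and MAPE computed over a validation set.

def mae(actual, predicted):
    """Mean Absolute Error: average absolute miss, in the data's own units."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mape(actual, predicted):
    """Mean Absolute Percentage Error: average miss as a percentage
    of the actual values (undefined when an actual value is zero)."""
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

actual = [100.0, 110.0, 120.0]     # held-out values
predicted = [95.0, 112.0, 130.0]   # blind forecast

print(mae(actual, predicted))      # average miss in units
print(mape(actual, predicted))     # average miss in percent
```

MAPE is easier to compare across series of different scales, while MAE stays in the units you care about (sales, riders, megawatt-hours).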
Beware of any anomalies or shifting patterns as they can degrade performance. Learn more about how to identify and fix anomalies here.
2) The forecast should look reasonable
The forecast should look reasonable given your assumptions—based on experience and domain expertise—about the future. If you expect a shift from the underlying historical context, the forecast should reflect that assumption in relation to whichever variable, special event, or value adjustment helps guide it (learn more about variables here, and special events and other adjustments here). Otherwise, the forecast should reflect the past by carrying forward the historical trend and seasonality.
If both the forecast and validation forecast look reasonable given your assumptions, a Performance Score lower than 'excellent' or 'good' may be acceptable. But the key is not to rely on intuition alone: confidence intervals also exist to provide a range of potential outcomes.
This is an example [1] of a forecast that may not look reasonable—depending on your forecasting goals and assumptions—even though it has a ‘good’ performance score (learn how to improve the forecast here):
This is an example [2] of a forecast with a ‘moderate’ score but a reasonable-looking result given the historical pattern:
3) Pass at least one forecast diagnostics test
Forecast diagnostics indicate how well the forecast carries forward the historical data's statistical properties, including autocorrelation, standard deviation, and seasonality (when applicable). In other words, they measure how well the forecast fits with the recent past.
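To make this concrete, here is one way such checks might work. This is a sketch, not the product's actual diagnostics, and the pass tolerances are arbitrary for illustration:

```python
# Compare two statistical properties of the forecast against the history:
# standard deviation (volatility) and lag-1 autocorrelation (persistence).

def std(xs):
    """Population standard deviation."""
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

def lag1_autocorr(xs):
    """Correlation between each value and the next one."""
    m = sum(xs) / len(xs)
    num = sum((xs[i] - m) * (xs[i + 1] - m) for i in range(len(xs) - 1))
    den = sum((x - m) ** 2 for x in xs)
    return num / den

history = [10, 12, 11, 13, 12, 14, 13, 15]
forecast = [14, 16, 15, 17]

# A diagnostic "passes" if the forecast's statistic stays close to the
# history's; the thresholds below are made up for this example.
similar_std = abs(std(forecast) - std(history)) < 1.0
similar_acf = abs(lag1_autocorr(forecast) - lag1_autocorr(history)) < 0.5
```

A forecast that flattens out (near-zero standard deviation) or loses the zig-zag of the history (very different autocorrelation) would fail checks like these even if its point error looked acceptable.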
The validation set forecast is also compared to a naive method, in which each forecasted value is the same as the last actual value in the training data.
Your forecast should pass at least the naive test and one of the others to be considered good enough.
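The naive-baseline comparison boils down to a few lines. The model forecast below is a hypothetical stand-in; the point is that any model worth keeping should beat simply repeating the last known value:

```python
# Naive test: does the model's validation forecast beat a forecast
# that just repeats the last training value?

def mae(actual, predicted):
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

train = [100, 104, 110, 108, 115, 121]
actual = [126, 131, 129]            # held-out validation rows
model_forecast = [124, 128, 132]    # hypothetical model output

naive_forecast = [train[-1]] * len(actual)   # repeat 121 three times

passes_naive_test = mae(actual, model_forecast) < mae(actual, naive_forecast)
```

If a model can't beat the naive forecast on the validation set, its apparent sophistication isn't adding predictive value.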
You can find the diagnostics by clicking the ‘Performance’ menu item on the left-hand side of the forecast page, then clicking ‘Advanced’ at the top.
For tips on how to improve your forecast, learn about fixing anomalies here and smoothing here.
1. Wind Energy Production in the U.S. Source: U.S. Energy Information Administration.
2. Fremont Bridge Bicycle Crossings. Source: Seattle Department of Transportation.