Research output: Contribution to Journal/Magazine › Journal article › peer-review
TY - JOUR
T1 - A “Softer” approach to the measurement of forecast accuracy
AU - Boylan, John
PY - 2011
Y1 - 2011
N2 - Foresight’s Summer 2010 issue contained a letter to the editor from David Hawitt, suggesting that forecast managers would be more receptive to hearing about forecast accuracy rather than forecast error. If forecast error is measured by the mean absolute percentage error (MAPE), then forecast accuracy would be the complement, 100% - MAPE. David argued that while reporting (e.g.) a 40% average error might generate “knee-jerk reactions and the creation of unrealistic goals,” reporting 60% accuracy stimulates executive thinking about “What can we do to improve this?” Then, in the Spring 2011 issue of Foresight, Mark Little and Jim Hoover offered their commentaries on the Hawitt recommendation. Mark had a different take on the issue: “Rather than attempt to express accuracy in a form executives think they understand, it may be better to focus on the improvements in business outcomes (the KPIs) that result through better forecasts.” The issue was carried forward in recent LinkedIn exchanges, which John Boylan mentions below in his reflections. John sees the discussion in terms of the different perspectives of academics and practitioners, and attempts to reconcile these positions through the lens of Soft Systems Methodology. Copyright International Institute of Forecasters, 2011
AB - Foresight’s Summer 2010 issue contained a letter to the editor from David Hawitt, suggesting that forecast managers would be more receptive to hearing about forecast accuracy rather than forecast error. If forecast error is measured by the mean absolute percentage error (MAPE), then forecast accuracy would be the complement, 100% - MAPE. David argued that while reporting (e.g.) a 40% average error might generate “knee-jerk reactions and the creation of unrealistic goals,” reporting 60% accuracy stimulates executive thinking about “What can we do to improve this?” Then, in the Spring 2011 issue of Foresight, Mark Little and Jim Hoover offered their commentaries on the Hawitt recommendation. Mark had a different take on the issue: “Rather than attempt to express accuracy in a form executives think they understand, it may be better to focus on the improvements in business outcomes (the KPIs) that result through better forecasts.” The issue was carried forward in recent LinkedIn exchanges, which John Boylan mentions below in his reflections. John sees the discussion in terms of the different perspectives of academics and practitioners, and attempts to reconcile these positions through the lens of Soft Systems Methodology. Copyright International Institute of Forecasters, 2011
M3 - Journal article
VL - 23
SP - 16
EP - 20
JO - Foresight: The International Journal of Applied Forecasting
JF - Foresight: The International Journal of Applied Forecasting
ER -