A “Softer” approach to the measurement of forecast accuracy

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

A “Softer” approach to the measurement of forecast accuracy. / Boylan, John.
In: Foresight: The International Journal of Applied Forecasting, Vol. 23, 2011, p. 16-20.

Harvard

Boylan, J 2011, 'A “Softer” approach to the measurement of forecast accuracy', Foresight: The International Journal of Applied Forecasting, vol. 23, pp. 16-20. <https://ideas.repec.org/a/for/ijafaa/y2011i23p16-20.html>

APA

Boylan, J. (2011). A “Softer” approach to the measurement of forecast accuracy. Foresight: The International Journal of Applied Forecasting, 23, 16-20. https://ideas.repec.org/a/for/ijafaa/y2011i23p16-20.html

Vancouver

Boylan J. A “Softer” approach to the measurement of forecast accuracy. Foresight: The International Journal of Applied Forecasting. 2011;23:16-20.

Author

Boylan, John. / A “Softer” approach to the measurement of forecast accuracy. In: Foresight: The International Journal of Applied Forecasting. 2011 ; Vol. 23. pp. 16-20.

BibTeX

@article{6591e8640e13428d8545cfb7321deed4,
title = "A “Softer” approach to the measurement of forecast accuracy",
abstract = "Foresight{\textquoteright}s Summer 2010 issue contained a letter to the editor from David Hawitt, suggesting that forecast managers would be more receptive to hearing about forecast accuracy rather than forecast error. If forecast error is measured by the mean absolute percentage error (MAPE), then forecast accuracy would be the complement, 100% - MAPE. David argued that while reporting (e.g.) a 40% average error might generate “knee-jerk reactions and the creation of unrealistic goals,” reporting 60% accuracy stimulates executive thinking about “What can we do to improve this?” Then, in the Spring 2011 issue of Foresight, Mark Little and Jim Hoover offered their commentaries on the Hawitt recommendation. Mark had a different take on the issue: “Rather than attempt to express accuracy in a form executives think they understand, it may be better to focus on the improvements in business outcomes (the KPIs) that result through better forecasts.” The issue was carried forward in recent LinkedIn exchanges, which John Boylan mentions below in his reflections. John sees the discussion in terms of the different perspectives of academics and practitioners, and attempts to reconcile these positions through the lens of Soft Systems Methodology. Copyright International Institute of Forecasters, 2011",
author = "John Boylan",
year = "2011",
language = "English",
volume = "23",
pages = "16--20",
journal = "Foresight: The International Journal of Applied Forecasting",
publisher = "International Institute of Forecasters",
}

RIS

TY - JOUR

T1 - A “Softer” approach to the measurement of forecast accuracy

AU - Boylan, John

PY - 2011

Y1 - 2011

N2 - Foresight’s Summer 2010 issue contained a letter to the editor from David Hawitt, suggesting that forecast managers would be more receptive to hearing about forecast accuracy rather than forecast error. If forecast error is measured by the mean absolute percentage error (MAPE), then forecast accuracy would be the complement, 100% - MAPE. David argued that while reporting (e.g.) a 40% average error might generate “knee-jerk reactions and the creation of unrealistic goals,” reporting 60% accuracy stimulates executive thinking about “What can we do to improve this?” Then, in the Spring 2011 issue of Foresight, Mark Little and Jim Hoover offered their commentaries on the Hawitt recommendation. Mark had a different take on the issue: “Rather than attempt to express accuracy in a form executives think they understand, it may be better to focus on the improvements in business outcomes (the KPIs) that result through better forecasts.” The issue was carried forward in recent LinkedIn exchanges, which John Boylan mentions below in his reflections. John sees the discussion in terms of the different perspectives of academics and practitioners, and attempts to reconcile these positions through the lens of Soft Systems Methodology. Copyright International Institute of Forecasters, 2011

AB - Foresight’s Summer 2010 issue contained a letter to the editor from David Hawitt, suggesting that forecast managers would be more receptive to hearing about forecast accuracy rather than forecast error. If forecast error is measured by the mean absolute percentage error (MAPE), then forecast accuracy would be the complement, 100% - MAPE. David argued that while reporting (e.g.) a 40% average error might generate “knee-jerk reactions and the creation of unrealistic goals,” reporting 60% accuracy stimulates executive thinking about “What can we do to improve this?” Then, in the Spring 2011 issue of Foresight, Mark Little and Jim Hoover offered their commentaries on the Hawitt recommendation. Mark had a different take on the issue: “Rather than attempt to express accuracy in a form executives think they understand, it may be better to focus on the improvements in business outcomes (the KPIs) that result through better forecasts.” The issue was carried forward in recent LinkedIn exchanges, which John Boylan mentions below in his reflections. John sees the discussion in terms of the different perspectives of academics and practitioners, and attempts to reconcile these positions through the lens of Soft Systems Methodology. Copyright International Institute of Forecasters, 2011

M3 - Journal article

VL - 23

SP - 16

EP - 20

JO - Foresight: The International Journal of Applied Forecasting

JF - Foresight: The International Journal of Applied Forecasting

ER -
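
The accuracy measure debated in the abstract is simply the complement of the mean absolute percentage error, i.e. accuracy = 100% - MAPE. The short Python sketch below is not taken from the article; the demand and forecast figures are invented purely for illustration, chosen so that the numbers match the 40% error / 60% accuracy example quoted in the abstract.

def mape(actuals, forecasts):
    # Mean absolute percentage error, expressed in percent.
    errors = [abs(a - f) / a for a, f in zip(actuals, forecasts)]
    return 100.0 * sum(errors) / len(errors)

def forecast_accuracy(actuals, forecasts):
    # The "accuracy" complement discussed above: 100% minus MAPE.
    return 100.0 - mape(actuals, forecasts)

# Illustrative (made-up) data: every period has a 40% absolute percentage error.
actuals = [100, 200, 150]
forecasts = [140, 120, 210]

print(mape(actuals, forecasts))               # 40.0 -> reported as "40% average error"
print(forecast_accuracy(actuals, forecasts))  # 60.0 -> reported as "60% accuracy"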