Final published version
Licence: CC BY: Creative Commons Attribution 4.0 International License
Research output: Contribution to Journal/Magazine › Journal article › peer-review
TY - JOUR
T1 - Forecast value added in demand planning
AU - Fildes, Robert
AU - Goodwin, Paul
AU - De Baets, Shari
PY - 2024/12
Y1 - 2024/12
N2 - Forecast value added (FVA) analysis is commonly used to measure the improved accuracy and bias achieved by judgmentally modifying system forecasts. Assessing the factors that prompt such adjustments, and their effect on forecast performance, is important in demand forecasting and planning. To address these issues, we collected the publicly available data on around 147,000 forecasts from six studies and analysed them using a common framework. Adjustments typically led to improvements in bias and accuracy for only just over half of stock keeping units (SKUs), though there was variation across datasets. Positive adjustments were confirmed as more likely to worsen performance. Negative adjustments typically led to improvements, particularly when they were large. The evidence that forecasters made effective use of relevant information not available to the algorithm was weak. Instead, they appeared to respond to irrelevant cues, or those of less diagnostic value. The key question is how organizations can improve on their current forecasting processes to achieve greater forecast value added. For example, a debiasing procedure applied to adjusted forecasts proved effective at improving forecast performance.
AB - Forecast value added (FVA) analysis is commonly used to measure the improved accuracy and bias achieved by judgmentally modifying system forecasts. Assessing the factors that prompt such adjustments, and their effect on forecast performance, is important in demand forecasting and planning. To address these issues, we collected the publicly available data on around 147,000 forecasts from six studies and analysed them using a common framework. Adjustments typically led to improvements in bias and accuracy for only just over half of stock keeping units (SKUs), though there was variation across datasets. Positive adjustments were confirmed as more likely to worsen performance. Negative adjustments typically led to improvements, particularly when they were large. The evidence that forecasters made effective use of relevant information not available to the algorithm was weak. Instead, they appeared to respond to irrelevant cues, or those of less diagnostic value. The key question is how organizations can improve on their current forecasting processes to achieve greater forecast value added. For example, a debiasing procedure applied to adjusted forecasts proved effective at improving forecast performance.
U2 - 10.1016/j.ijforecast.2024.07.006
DO - 10.1016/j.ijforecast.2024.07.006
M3 - Journal article
JO - International Journal of Forecasting
JF - International Journal of Forecasting
SN - 0169-2070
ER -