In the early 1980s, two of the authors of Principles of Business Forecasting, Robert Fildes and Keith Ord, were leaders in the movement to use experiments to determine which principles lead to more accurate forecasts. Those experiments showed that forecasting methods today can provide much more accurate forecasts than the methods of 50 years ago. This movement was a golden age of progress in forecasting. However, this knowledge has not been widely adopted in business and government.

In recent years, Fildes and Ord have been concerned with ways of implementing the new approaches. To my knowledge, Principles of Business Forecasting is the first textbook to include some of the new methods. It is aimed at beginning forecasters as well as practitioners, so it also covers the basics. Despite the improvements in methodology, forecasting accuracy in business and government has been shown to be deteriorating. Much of the forecasting is done by people with no awareness of scientific forecasting methods. Principles of Business Forecasting is directed at that market.

Research by Kesten Green and me has found that forecasters have instead turned to "analytics" using "big data." That approach ignores the cumulative knowledge about forecasting methods, and it violates both the Golden Rule of Forecasting and Occam's razor. In his 1972 paper "Alchemy in the Behavioral Sciences," Hillel Einhorn warned of the danger of relying on the computer. I hope that Ord, Fildes, and Kourentzes's book will bring forecasting back to basics and convince forecasters to use more accurate methods and principles. There is profit to be made.

J. Scott Armstrong, The Wharton School, University of Pennsylvania.

Scott Armstrong and Kesten Green are making a last call for help with their paper, "Forecasting methods and principles: Evidence-based checklists". Their ambition for the paper is that, through the use of checklists, it will make scientific forecasting accessible to all researchers, practitioners, clients, and other stakeholders who care about forecast accuracy.

Here is the abstract of the paper:

Problem: Few practitioners or academics use findings from nearly a century of experimental research that would allow them to substantially reduce forecast errors. In order to improve forecasting practice, this paper develops evidence-based guidance in a form that is easy for forecasters and decision-makers to access, understand, and use: checklists.
Methods: Meta-analyses of experimental research on forecasting were used to identify the principles and methods that lead to accurate out-of-sample forecasts. Cited authors were contacted to check that summaries of their research were correct. Checklists to help forecasters and their clients practice evidence-based forecasting were then developed from the research findings. Finally, appeals to identify errors of omission or commission in the analyses and summaries of research findings were sent to leading researchers.
Findings: Seventeen simple forecasting methods can between them be used to provide accurate forecasts for diverse problems. Knowledge on forecasting is summarized in the form of five checklists with guidance on the selection of the most suitable methods for the problem, and their implementation.
Originality: Three of the five checklists—addressing (1) evidence-based methods, (2) regression analysis, and (3) assessing uncertainty—are new. A fourth—the Golden Rule checklist—has been improved. The fifth—the Simple Forecasting checklist (Occam's Razor)—remains the same.
Usefulness: Forecasters can use the checklists as tools to reduce forecast errors—often by more than one-half—compared to those of forecasts from commonly used methods. Scientists can use the checklists to devise valid tests of the predictive validity of their hypotheses. Finally, clients and other interested parties can use the checklists to determine whether forecasts were derived using evidence-based procedures and can, therefore, be trusted.

Please send the authors suggestions of evidence they may have missed (papers with relevant experimental evidence from tests of alternative methods), or mistakes they have made, by November 21 at the latest. The latest version of the working paper is available from ResearchGate, here.

The Unscaled Mean Bounded Relative Absolute Error (UMBRAE) is a new forecast-error measure proposed, and well supported, in Chen, Twycross, and Garibaldi (2017), "A new accuracy measure based on bounded relative error for time series forecasting". The new measure appears to be a promising alternative and is certainly worthy of further comparative research. Some analysts may want to continue using the RAE until further testing is done; we suggest using both measures in the meantime.
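As a rough sketch of how the measure works: each period's bounded relative absolute error compares the method's error with a benchmark method's error, the mean of these is MBRAE, and UMBRAE "unscales" that mean. The function name and the handling of the zero-error tie below are our choices, not the authors'; the formula follows the definition in Chen et al. (2017).

```python
def umbrae(actual, forecast, benchmark):
    """Unscaled Mean Bounded Relative Absolute Error (Chen et al., 2017).

    Each period's bounded relative absolute error is |e| / (|e| + |e*|),
    where e is the forecast error and e* the benchmark forecast's error.
    The mean of these is MBRAE, and UMBRAE = MBRAE / (1 - MBRAE), so a
    value below 1 means the method beat the benchmark on average.
    """
    braes = []
    for a, f, b in zip(actual, forecast, benchmark):
        e, e_star = abs(a - f), abs(a - b)
        if e + e_star == 0:
            continue  # exact tie with zero error: skipped here (our choice)
        braes.append(e / (e + e_star))
    mbrae = sum(braes) / len(braes)
    return mbrae / (1 - mbrae)
```

A natural benchmark is the naive (random-walk) forecast, i.e., passing the previous period's actual as `benchmark`.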

We have added Ev Gardner's spreadsheet for damped trend exponential smoothing to the Software Page.
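We cannot reproduce the spreadsheet here, but the underlying method (Gardner and McKenzie's damped trend exponential smoothing) follows standard recursions; the sketch below uses our own initialization of the level and trend, which the spreadsheet may handle differently.

```python
def damped_trend_forecast(y, alpha, beta, phi, horizon):
    """Damped trend exponential smoothing (Gardner & McKenzie).

    level: l_t = alpha*y_t + (1 - alpha)*(l_{t-1} + phi*b_{t-1})
    trend: b_t = beta*(l_t - l_{t-1}) + (1 - beta)*phi*b_{t-1}
    h-step forecast: l_T + (phi + phi^2 + ... + phi^h)*b_T
    With phi = 1 this is Holt's linear trend; phi < 1 flattens the
    trend toward a horizontal line as the horizon grows.
    """
    level, trend = y[0], y[1] - y[0]  # simple initialization (our choice)
    for t in range(1, len(y)):
        prev_level = level
        level = alpha * y[t] + (1 - alpha) * (prev_level + phi * trend)
        trend = beta * (level - prev_level) + (1 - beta) * phi * trend
    return [level + sum(phi ** i for i in range(1, h + 1)) * trend
            for h in range(1, horizon + 1)]
```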

Don Miller and Dan Williams have recently re-posted spreadsheets and X-12 specifications that can be used to implement the seasonal damping method they proposed in "Shrinkage Estimators Of Time Series Seasonal Factors And Their Effect On Forecasting Accuracy," International Journal of Forecasting 19(4): 669-684, and "Damping seasonal factors: Shrinkage estimators for the X-12-ARIMA program," International Journal of Forecasting 20(4): 529-549.

Damping is needed to counteract the excessive seasonal variation produced by classical decomposition and X-12, which is an artifact of random noise in the data. These articles show that seasonal damping yields a small but significant reduction in forecast error. The benefit is more pronounced when seasonal factors are estimated by classical decomposition, but remains significant when X-12 is used.
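To illustrate the idea only: damping shrinks each multiplicative seasonal factor part-way toward the neutral value of 1.0. The sketch below uses a single fixed weight `w` for simplicity; Miller and Williams's actual shrinkage estimators choose the degree of shrinkage from the data, so this is not their method, just the general principle.

```python
def damp_seasonal_factors(factors, w):
    """Shrink multiplicative seasonal factors toward 1.0.

    A factor of 1.0 means no seasonal effect; pulling each raw factor
    part-way toward it reduces the excess variation that random noise
    induces in estimated factors. w = 1.0 keeps the raw factors and
    w = 0.0 removes seasonality entirely.
    """
    damped = [1.0 + w * (f - 1.0) for f in factors]
    # renormalize so the factors still average to 1 over a full season
    mean = sum(damped) / len(damped)
    return [f / mean for f in damped]
```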

The spreadsheet software for seasonal factor damping is available from the Software Page.