Special Issue on Simple Versus Complex Forecasting

Journal of Business Research

Volume 68, Issue 8, Pages 1657-1818 (August 2015)

  1. Simple versus complex forecasting: The evidence. 1678-1685
    Green, K. C., & Armstrong, J. S., http://dx.doi.org/10.1016/j.jbusres.2015.03.026
  2. Golden Rule of Forecasting: Be conservative. 1717-1731
    Armstrong, J. S., Green, K. C., & Graefe, A., http://dx.doi.org/10.1016/j.jbusres.2015.03.031
  3. Is there a Golden Rule? 1742-1745
    Fildes, R., & Petropoulos, F., http://dx.doi.org/10.1016/j.jbusres.2015.01.059
  4. Is a more liberal approach to conservatism needed in forecasting? 1753-1754
    Goodwin, P., http://dx.doi.org/10.1016/j.jbusres.2015.01.060
  5. The Golden Rule of Forecasting: Objections, refinements, and enhancements. 1702-1704
    Soyer, E., & Hogarth, R. M., http://dx.doi.org/10.1016/j.jbusres.2015.03.029
  6. Conservative forecasting with the damped trend. 1739-1741
    Gardner Jr., E. S., http://dx.doi.org/10.1016/j.jbusres.2015.03.033
  7. Golden rule of forecasting rearticulated: Forecast unto others as you would have them forecast unto you. 1768-1771
    Green, K. C., Armstrong, J. S., & Graefe, A., http://dx.doi.org/10.1016/j.jbusres.2015.03.036
  8. The bias bias. 1772-1784
    Brighton, H., & Gigerenzer, G., http://dx.doi.org/10.1016/j.jbusres.2015.01.061
  9. Simple versus complex selection rules for forecasting many time series. 1692-1701
    Fildes, R., & Petropoulos, F., http://dx.doi.org/10.1016/j.jbusres.2015.03.028
  10. Forecasting new product trial with analogous series. 1732-1738
    Wright, M. J., & Stern, P., http://dx.doi.org/10.1016/j.jbusres.2015.03.032
  11. Decomposition of time-series by level and change. 1755-1758
    Tessier, T. H., & Armstrong, J. S., http://dx.doi.org/10.1016/j.jbusres.2015.03.035
  12. Improving forecasts using equally weighted predictors. 1792-1799
    Graefe, A., http://dx.doi.org/10.1016/j.jbusres.2015.03.038
  13. Picking profitable investments: The success of equal weighting in simulated venture capitalist decision making. 1705-1716
    Woike, J. K., Hoffrage, U., & Petty, J. S., http://dx.doi.org/10.1016/j.jbusres.2015.03.030
  14. Forecasting intermittent inventory demands: simple parametric methods vs. bootstrapping. 1746-1752
    Syntetos, A. A., Babai, M. Z., & Gardner Jr., E. S., http://dx.doi.org/10.1016/j.jbusres.2015.03.034
  15. Relative performance of methods for forecasting special events. 1785-1791
    Nikolopoulos, K., Litsa, A., Petropoulos, F., Bougioukos, V., & Khammash, M., http://dx.doi.org/10.1016/j.jbusres.2015.03.037
  16. Improving forecasts for noisy geographic time series. 1810-1818
    Huddleston, S. H., Porter, J. H., & Brown, D. E., http://dx.doi.org/10.1016/j.jbusres.2015.03.040
  17. When simple alternatives to Bayes formula work well: Reducing the cognitive load when updating probability forecasts. 1686-1691
    Goodwin, P., http://dx.doi.org/10.1016/j.jbusres.2015.03.027
  18. Collective wisdom: Methods of confidence interval aggregation. 1759-1767
    Lyon, A., Wintle, B. C., & Burgman, M., http://dx.doi.org/10.1016/j.jbusres.2014.08.012
  19. Communicating forecasts: The simplicity of simulated experience. 1800-1809
    Hogarth, R. M., & Soyer, E., http://dx.doi.org/10.1016/j.jbusres.2015.03.039

Green, K. C. and Armstrong, J. S. (2007). "The Ombudsman: Value of Expertise for Forecasting Decisions in Conflicts," Interfaces, 37, 287-299.

Green and Armstrong's paper (available in full text as a working paper and from the publisher) provides evidence on the accuracy of forecasts from the method usually used to forecast the decisions people will make in conflict situations: unaided expert judgment. The authors obtained 106 forecasts from experts and 169 forecasts from novices about eight real conflicts. The conflicts included a military conflict in the Middle East, a hostile takeover attempt in the telecommunications industry, and a union-management dispute between nurses and the hospital that employed them. Experts' forecasts were little better than novices', and neither group's forecasts were meaningfully more accurate than choosing at random.

The Green and Armstrong paper provides evidence on two principles:  

6.3 Use structured rather than unstructured forecasting methods.

G&A's evidence on this principle is indirect. The research reported in G&A does not include structured methods. Other research by the same authors, however, shows that two structured methods (structured analogies and simulated interaction) provided forecasts for the same situations that were substantially more accurate than those from unaided expert judgment in G&A.  

6.7 Match the forecasting method(s) to the situation.

Prior research has shown that unaided judgment is not an appropriate forecasting method when situations are complex, when relationships are unclear, when feedback on predictions is poor, and when experts are biased. All four of these problems are likely to arise in conflict situations. Although this evidence has been available for many years, unaided expert judgment is still the method of choice for forecasting decisions in conflicts.

Kesten Green

August 28, 2007

Goodwin, P. (2002), "Integrating management judgment with statistical methods to improve short-term forecasts," Omega, 30, 127-135.

This paper reviews the research literature to assess the effectiveness of methods that are designed to allow management judgment and statistical methods to be integrated when short-term point forecasts are required. A systematic search led to 45 empirical studies.

Two integration processes are identified: voluntary integration (where the forecaster is able to choose how much weight the statistical forecast will have in establishing the 'final' forecast) and mechanical integration (where the 'final' forecast is obtained by applying a statistical process to the judgmental forecasts). The main findings are:

Voluntary Integration

· When only time series information is available (i.e., there is no domain knowledge), judgmental adjustments of statistical forecasts will tend to reduce accuracy because people attempt to forecast the noise in the series (supporting principle 11.4 Limit subjective adjustments of quantitative forecasts).

· Judgmental adjustments should therefore only be made on the basis of 'important' domain knowledge that is not included in the statistical forecast (supporting principle 7.5 Adjust for events expected in the future).

· Requiring forecasters to record reasons for any adjustments is likely to reduce the likelihood of damaging adjustments being made (supporting principle 8.3 Ask experts to justify their forecasts in writing).

· More research is needed to establish the effectiveness of structured decomposition of judgmental adjustments (see principle 11.2 Use structured judgment as inputs to quantitative models).

Mechanical integration 

· Combination of judgmental and statistical forecasts is most effective when the forecasts are unbiased and their errors are negatively correlated (supporting principle 12.1 Combine forecasts from approaches that differ).

· In many business environments there is insufficient data to justify using anything other than equal weights in the combination (supporting principle 12.4 Start with equal weights).

· There is some empirical evidence that statistical correction for judgmental bias is likely to be more effective than combination in situations where there is an absence of 'strong' time patterns, but forecasters are in possession of domain knowledge that is difficult to model statistically.

· When only time series information is available there is, as yet, no evidence to suggest that the use of judgmental bootstrapping will improve accuracy (see principle 11.5 Use judgmental bootstrapping instead of expert forecasts).
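As a minimal sketch of the equal-weights combination recommended above (function name and data are illustrative, not from the paper):

```python
def combine_equal_weights(judgmental, statistical):
    """Mechanical integration with equal weights: the 'final' forecast
    is the simple average of the judgmental and statistical forecasts."""
    return [(j + s) / 2 for j, s in zip(judgmental, statistical)]

# When the two sets of errors are negatively correlated, averaging
# cancels much of the error. Suppose the actual values turn out to be
# [12, 18]: each individual forecast errs by 2, the average by 0.
final = combine_equal_weights([10, 20], [14, 16])  # -> [12.0, 18.0]
```

The point of equal weights is robustness: with little data on past relative accuracy, estimated weights tend to overfit, so the simple average is hard to beat.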

Voluntary versus mechanical integration

· It is essential that forecasting methods are acceptable to decision makers (Principle 1.5 Obtain decision makers' agreement on methods). Acceptance is more likely to be achieved through voluntary integration methods.


The complementary strengths that management judgment and statistical methods can bring to the forecasting process have been widely discussed. This paper reviews research on the effectiveness of methods that are designed to allow judgment and statistical methods to be integrated when short-term point forecasts are required. The application of both voluntary and mechanical integration methods are considered and conditions identified where the use of particular methods is appropriate, according to current research. While acknowledging the power of mechanical integration methods that exclude the judgmental forecaster from the integration process, the paper suggests that future research effort should focus on the design of forecasting support systems that facilitate voluntary integration. (Reprinted with permission, Copyright © 2002 Elsevier Science Ltd.)


This page is for researchers, practitioners, and students who want to know about papers related to forecasting principles.

Papers with information on forecasting principles

Full-text copies of most of the other papers can be obtained through JSTOR or Business Premier (EBSCO) at subscribing institutions.

Special Issue on "Simple versus Complex Forecasting"

A Special Issue of the Journal of Business Research (Volume 68, Issue 8, pp 1657-1818), guest edited by Kesten Green and Scott Armstrong, is devoted to evidence on the effect of simplicity versus complexity on forecast accuracy. The Table of Contents for the Special Issue is available from the link below. The issue includes a summary of the evidence in the first paper, and a paper describing a unifying theory of forecasting, the Golden Rule of Forecasting.

Reviews of important papers on forecasting

Add your papers to the RePEc Archive or the Munich Personal RePEc Archive (MPRA)

You may add your own materials to RePEc through a department or institutional archive. All institutions are welcome to join and contribute their materials by establishing and maintaining their own RePEc archive. RePEc does not support personal archives (only institutional archives).

To add your work to the RePEc Archive independently of an institution you can set up and maintain an account on the Munich Personal RePEc Archive (MPRA).

Working Papers by authors seeking reviews and advice

If you would like your paper posted here, please send a copy or a URL to the editors.



Papers with information on forecasting principles

The papers below are provided in full text (PDF format). Articles from Elsevier Science, John Wiley, and other publishers have been reproduced with permission. Single copies of these articles may be downloaded and printed for the reader's personal research and study. Researchers are invited to submit relevant papers.


Green, K. C. & Armstrong, J. S. (2011), "Role thinking: Standing in other people’s shoes to forecast decisions in conflicts," International Journal of Forecasting, 27, 69-80. - Full text


Armstrong, J. S., & K. C. Green (2010), "Demand Forecasting: Evidence-based Methods," Working Paper - Full Text

Armstrong, J. S., K. C. Green, R. Jones, & M. Wright (2010), "Predicting elections from politicians’ faces," International Journal of Public Opinion Research, 22, 511-522. - Full text

Speirs-Bridge, A., F. Fidler, M. McBride, L. Flander, G. Cumming, and M. Burgman (2010), "Reducing overconfidence in the interval judgments of experts," Risk Analysis, 30, 512-523. - Full text


Herzog, S. M., & R. Hertwig (2009), "The wisdom of many in one mind: Improving individual judgments with dialectical bootstrapping," Psychological Science, 20, 231-237. - Full text


Green, K. C. & J. S. Armstrong (2007), "Structured analogies for forecasting," International Journal of Forecasting, 23, 365-376. - Full Text

Green, K. C. & J. S. Armstrong (2007), "Value of expertise for forecasting decisions in conflicts," Interfaces, 37, 287-299. - Working Paper; Published Paper

Green, Kesten C., J. Scott Armstrong, and Andreas Graefe, "Methods to Elicit Forecasts from Groups: Delphi and Prediction Markets Compared," forthcoming in Foresight: The International Journal of Applied Forecasting (Fall 2007). - Full Text

Jones, Randall J., J. Scott Armstrong, and Alfred G. Cuzán (2007), "Forecasting Elections Using Expert Surveys: An Application to the U.S. Presidential Election," Prepared for presentation at the annual meeting of the American Political Science Association, Chicago, August 30 – September 2, 2007. - Full Text


Armstrong, J. Scott (2006), "Findings from Evidence-Based Forecasting: Methods for Reducing Forecast Error," International Journal of Forecasting, 22, 583-598. - Full Text

Armstrong, J. Scott (2006), "How to Make Better Forecasts and Decisions: Avoid Face-to-face Meetings (with commentary)," Foresight: The International Journal of Applied Forecasting, Fall, 3-15. - Full Text

Armstrong, J. Scott (2006), "Significance Tests Harm Progress in Forecasting," International Journal of Forecasting, 23, 321-327. - Full Text

Armstrong, J. Scott and Robert Fildes (2006), "Making Progress in Forecasting," International Journal of Forecasting, 22, 433-441. - Full Text

Using an index model called "The 13 Keys to the White House," American University professor Allan Lichtman has correctly called every American presidential election since 1860. Scott Armstrong and Alfred Cuzán in "Index Methods for Forecasting: An Application to the American Presidential Elections," Foresight, Issue 3, February 2006, examine how this model performs in comparison with several other models. - Full Text


Armstrong, J. Scott (2005), "Findings from Evidence-based Forecasting Methods for Reducing Forecast Error," International Journal of Forecasting, 22, 583-598. - Full Text

Armstrong, J. Scott (2005), "The Forecasting Canon: Nine Generalizations to Improve Forecast Accuracy," Foresight: The International Journal of Applied Forecasting, 1 (1), June, 29-35. - Full Text

Armstrong, J. S., Fred Collopy, and J. Thomas Yokum (2005), "Decomposition by Causal Forces: A Procedure for Forecasting Complex Time Series," International Journal of Forecasting, 21, 25-36. - Full Text

Goodwin, P. (2005). "How to integrate management judgment with statistical forecasts," Foresight: The International Journal of Applied Forecasting, 1 (1), 8-11. - Full Text

Kesten C. Green & J. Scott Armstrong (2005), "The War in Iraq: Should we have expected better forecasts?," Foresight: The International Journal of Applied Forecasting, 1, October, 50-52. - Full Text


Armstrong, J. S. (2004), "Damped Seasonality Factors: Introduction," International Journal of Forecasting, 20, 525-527. - Full Text

Cuzán, Alfred G., J. Scott Armstrong, and Randall J. Jones, "Combining Methods to Forecast the 2004 Presidential Election: The Pollyvote." - Full Text

Green, Kesten and J. Scott Armstrong, "Structured Analogies for Forecasting." - Full Text


Armstrong, J. Scott, and Ruth A. Pagell (2003), "Reaping Benefits from Management Research: Lessons from the Forecasting Principles Project," Interfaces, 33 (6), 89-111. - Full Text Provides data showing that invited papers are twenty times more likely to be important than are those accepted through the traditional peer review system. This paper was followed by commentary and by this reply:

Armstrong, J. Scott (2003), "Incentives for Developing and Communicating Principles: A Reply," Interfaces, 33 (8), 109-111. Full Text (PDF)

Also, see review by Raymond Hubbard in International Journal of Forecasting, 20 (2004), 740-741.

Fader, Peter S., Bruce G.S. Hardie, and Robert Zeithammer (2003), "Forecasting New Product Trial in a Controlled Test Market Environment," Journal of Forecasting. - Full Text


Green, Kesten C. (2002), "Forecasting decisions in conflict situations: A comparison of game theory, role-playing, and unaided judgment," International Journal of Forecasting, 18, 321-344. - Full Text Neither unaided judgment nor game theorists provided useful forecasts of decisions made in conflicts. In contrast, simulated interactions (a type of role playing) were quite accurate. The following papers were written in response, along with other commentaries.

Armstrong, J. S. (2002), "Assessing Game Theory, Role Playing, and Unaided Judgment," International Journal of Forecasting, 18 (3), 345-352. - Full Text

See also Green's reply to commentators, "Embroiled in a conflict: Who do you call?," International Journal of Forecasting, 18, 389-395. - Full Text


Adya, M., F. Collopy, J. S. Armstrong and M. Kennedy (2001), "Automatic Identification of Time Series Features for Rule-Based Forecasting," International Journal of Forecasting, 17, 143-157. - Automatic procedures, which are less expensive and more reliable than judgmental procedures, produced rule-based forecasts with little loss in forecast accuracy.

Armstrong, J. Scott (2001), "Combining Forecasts," Principles of Forecasting: A Handbook for Researchers and Practitioners, J. Scott Armstrong (ed.): Norwell, MA: Kluwer Academic Publishers. - Reprinted permission of Kluwer/Springer.

Armstrong, J. Scott (2001), "Should We Redesign Forecasting Competitions?," International Journal of Forecasting, 17, 542-545. - In the future, competitions should start with hypotheses as to which methods will be most effective under what conditions. In addition, they should allow for the use of domain knowledge.

Armstrong, J. S., and F. Collopy (2001), "Identification of Asymmetric Prediction Intervals through Causal Forces," Journal of Forecasting, 20, 273-283. - When forecast errors are large, as is common in annual forecasting, errors are asymmetrical only in percentage terms. Log transformations can correct for this asymmetry, although asymmetry in the logs occurs for "contrary" time series.

Armstrong, J. S., and T. Yokum (2001), "Potential Diffusion of Expert Systems in Forecasting," Technological Forecasting and Social Change, 67, 93-103. Review by M. Adya.


Adya, M., J. S. Armstrong, F. Collopy & M. Kennedy (2000), "An Application of Rule-based Forecasting for a Situation Lacking Domain Knowledge," International Journal of Forecasting, 16, 477-484. - This paper was part of the M3-competition. A simplified version of RBF performed well without using domain knowledge.

Armstrong, J., V. Morwitz & V. Kumar (2000), "Sales Forecasts for Existing Consumer Products and Services: Do Purchase Intentions Contribute to Accuracy?" International Journal of Forecasting, 16, 383-397. - Intentions data are useful, even when one has historical sales data. In the study of consumer durables, intentions led to a 1/3 reduction in forecast errors. This paper was named as one of four outstanding papers published in the International Journal of Forecasting for the period 2000-2001.

Collopy, F. & J. S. Armstrong (2000), "Another Error Measure for Selection of the Best Forecasting Method: The Unbiased Absolute Percentage Error," working paper. - The Unbiased Absolute Percentage Error does not offer any advantage over the Relative Absolute Error.


Armstrong, J. S. (1999), "Forecasting for Environmental Decision-Making," in V.H. Dale and M.E. English, eds., Tools to Aid Environmental Decision Making, New York: Springer-Verlag, 192-225.

Armstrong, J. S. (1999), "Sales Forecasting," in The IEBM Encyclopedia of Marketing, Michael J. Baker (Ed.), London, International Thomson Business Press, 278-290.

Armstrong, J. S. & R. J. Brodie (1999), "Forecasting for Marketing," in G. J. Hooley and M. K. Hussey (eds.), Quantitative Methods in Marketing, 2nd ed., London: International Thompson Business Press, 92-119. - This review of empirical evidence leads to recommendations on how to improve forecast accuracy in marketing.


Armstrong, J. S. (1998), "Commentaries on 'Generalizing about Univariate Forecasting Methods: Further Empirical Evidence,'" International Journal of Forecasting, 14, 359-366. - Comparative studies such as the M-competitions should fully describe the conditions that characterize the data.

Armstrong, J. S. & F. Collopy (1998), "Integration of Statistical Methods and Judgment for Time Series Forecasting: Principles from Empirical Research," published in G. Wright and P. Goodwin (eds.), Forecasting with Judgment, John Wiley & Sons Ltd., pp. 269-293, with review by N. R. Sanders, International Journal of Forecasting, 15 (1999), 345-346. - Presents all feasible ways to combine judgment and statistical forecasts, and reviews empirical evidence on each.


Armstrong, J. S. & R. Fildes (1995), "On the Selection of Error Measures for Comparisons Among Forecasting Methods," Journal of Forecasting, 14, 67-71 - This review of empirical evidence leads to the conclusion that mean square errors are highly unreliable and should not be used in forecasting.

Yokum, J. & J. S. Armstrong (1995), "Beyond Accuracy: Comparison of Criteria Used to Select Forecasting Methods," International Journal of Forecasting, 11, 591-597. - According to experts, accuracy is always important, yet other criteria are nearly as important.


Armstrong, J. S. (1994), "The Fertile Field of Meta-Analysis: Cumulative Progress in Agricultural Forecasting," International Journal of Forecasting, 10, 147-149.

Armstrong, J. S. & R. Brodie (1994), "Effects of Portfolio Planning Methods on Decision Making: Experimental Results," International Journal of Research in Marketing, 11, 73-84.

Armstrong, J. S. & R. Brodie (1994), "Portfolio Planning Methods: Faculty Approach or Faculty Research? A Rejoinder to 'Making Better Decisions' by Wensley," International Journal of Research in Marketing, 11, 91-93.

Armstrong, J. & F. Collopy (1994), "How Serious are Methodological Issues in Surveys: A Reexamination of the Clarence Thomas Polls" (Internet publication). - Decisions can be improved if criteria are established before making a decision. Once a decision is made, decision makers were reluctant to change even if their decision violates their own criteria.

Collopy, F., M. Adya & J. Armstrong (1994), "Principles for Examining Predictive Validity: The Case of Information Systems Spending Forecasts," Information Systems Research, 5, 170-179.

MacGregor, D. & J. S. Armstrong (1994), "Judgmental Decomposition: When Does It Work?" International Journal of Forecasting, 10, 495-506. - Decomposition is useful when dealing with large numbers (or very small numbers) and when one knows more about the parts than the whole. Otherwise, it leads to increased errors.

Stewart, T. & C. Lusk (1994), "Seven Components of Judgmental Forecasting Skill: Implications for Research and the Improvement of Forecasts," Journal of Forecasting, 13, 579-599.


Armstrong, J. S. & F. Collopy (1993), "Causal Forces: Structuring Knowledge for Time-series Extrapolation," Journal of Forecasting, 12, 103-115. - Domain knowledge, expressed as expectations about trends in time series, led to improved accuracy in time series forecasts.


Armstrong, J. S. & F. Collopy (1992), "Error Measures for Generalizing about Forecasting Methods: Empirical Comparisons," International Journal of Forecasting, 8, 69-80 - Proposes the Relative Absolute Error (RAE) and compares it with other error measures for comparisons among time series methods. It is reliable and easy to interpret.
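A minimal sketch of how the RAE can be computed (function names are my own; the benchmark is conventionally the random-walk, i.e. no-change, forecast):

```python
import math

def cum_rae(method_forecasts, benchmark_forecasts, actuals):
    """Cumulative Relative Absolute Error: the method's summed absolute
    error divided by the benchmark's over the forecast horizon.
    Values below 1 indicate the method beat the benchmark."""
    method_err = sum(abs(f - a) for f, a in zip(method_forecasts, actuals))
    bench_err = sum(abs(b - a) for b, a in zip(benchmark_forecasts, actuals))
    return method_err / bench_err

def geometric_mean_rae(raes):
    """Summarize RAEs across many series with a geometric mean,
    which limits the influence of extreme individual values."""
    return math.exp(sum(math.log(r) for r in raes) / len(raes))

# Example: the method errs by 1 in each period, the no-change
# benchmark by 2, so the method halves the benchmark's error.
rae = cum_rae([10, 12], [9, 9], [11, 11])  # -> 0.5
```

Being a ratio against a common benchmark, the RAE can be compared and averaged across series measured in different units, which is what makes it suitable for generalizing about methods.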

Collopy, F. & J. S. Armstrong (1992), "Expert Opinions about Extrapolation and the Mystery of the Overlooked Discontinuities," International Journal of Forecasting, 8, 575-582. - Traditional time series extrapolative methods do not perform well when there are discontinuities in the data, according to this survey of forecasting experts.

Collopy, F. & J. S. Armstrong (1992), "Rule-Based Forecasting: Development and Validation of an Expert Systems Approach for Combining Time Series Extrapolations," Management Science, 38, 1394-1414. - Uses prior knowledge about forecasting methods and domain knowledge to formulate rules for time-series forecasting.


Armstrong, J. S. (1991), "Prediction of Consumer Behavior by Experts and Novices," Journal of Consumer Research, 18, 251-256. - Academics who are familiar with research on consumer behavior were no more accurate than practitioners or high school students when making forecasts about the outcomes of studies on consumer behavior.


Armstrong, J. S. (1989), "Combining Forecasts: The End of the Beginning or the Beginning of the End," International Journal of Forecasting, 5, 585-588.

Armstrong, J. S. & P. Hutcherson (1989), "Predicting the Outcome of Marketing Negotiations: Role-Playing versus Unaided Opinions," International Journal of Research in Marketing, 6, 227-239. - Role-playing is much more accurate than unaided judgment at predicting decisions made in negotiations.

Clemen, Robert T. (1989), "Combining Forecasts: A Review and Annotated Bibliography," International Journal of Forecasting, 5, 559-583.

Collopy, F. & J. S. Armstrong (1989), "Toward Computer-Aided Forecasting Systems: Gathering, Coding, and Validating the Knowledge," in George R. Widmeyer (ed.), DSS-899 Transactions: Ninth International Conference on Decision Support Systems, Institute of Management Science, 103-119. - Describes how to develop rules for forecasting.

Dakin, S. & J. S. Armstrong (1989), "Predicting Job Performance: A Comparison of Expert Opinion and Research Findings," International Journal of Forecasting, 5, 187-194. - Despite a wealth of useful research on personnel selection, practitioners rely on invalid procedures.


Armstrong, J. S. (1988), "Book Review of Corporate Strategic Planning," Journal of Marketing, 54, 114-119. - In practice, formal planning is useful.

Armstrong, J. S. (1988), "Communication of Research in Forecasting: The Journal," International Journal of Forecasting, 4, 321-324. - Reports on a survey of 201 readers of two forecasting journals. A substantial number of the findings were used by practitioners and researchers.

Armstrong, J. S. (1988), "Research Needs in Forecasting," International Journal of Forecasting, 4, 449-465.

Armstrong, J. S. (1988), "Review of Ravi Batra, The Great Depression of 1990" International Journal of Forecasting, 4, 493-502. - Identifies faulty forecasting procedures in a popular book.


Armstrong, J. S. (1987), "Forecasting Methods for Conflict Situations," in G. Wright and P. Ayton (eds.), Judgmental Forecasting, 157-176. - For a more recent report, see the Role Playing chapter in Principles of Forecasting.

Armstrong, J. S., R. Brodie & S. McIntyre (1987), "Forecasting Methods for Marketing: Review of Empirical Research," International Journal of Forecasting, 3, 335-376.


Armstrong, J. S. (1986), "Research on Forecasting: A Quarter-Century Review, 1960-1984," Interfaces, 16, 89-109. - This review shows that substantial progress has been made since 1960. That said, much effort is still being devoted to areas that have little promise, while some highly promising areas are ignored.

Armstrong, J. S. (1986), "Review of Steven J. Rosenstone, Forecasting Presidential Elections," International Journal of Forecasting, 2, 248-249.

Armstrong, J. S., E. B. Dagum, R. Fildes & S. Makridakis (1986), "Publishing Standards for Research in Forecasting (An Editorial)," International Journal of Forecasting, 2, 133-137.


Armstrong, J. S. (1984), "Do Judgmental Researchers Use Their Own Research? A Review of Judgment Under Uncertainty: Heuristics and Biases," Journal of Forecasting, 3, 235-239.

Armstrong, J. S. (1984), "Forecasting by Extrapolation: Conclusions from 25 Years of Research," Interfaces, 14 (Nov.-Dec.), 52-66, with commentaries and reply. - Relatively simple extrapolation methods were found to be more accurate than sophisticated methods.


Armstrong, J. S. (1983), "Relative Accuracy of Judgmental and Extrapolative Methods in Forecasting Annual Earnings," Journal of Forecasting, 2, 437-447. - Management forecasts of firms' annual earnings were more accurate than those by professional independent analysts, and these were more accurate than statistical extrapolations.

Armstrong, J. S. (1983), "Strategic Planning and Forecasting Fundamentals," in Kenneth Albert (ed.), The Strategic Management Handbook. New York: McGraw Hill. - The analysis and setting of objectives has long been regarded as a major step in formal strategic planning. Informal planners seldom devote much energy to this step. For example, in Baker's (1957) summary of the planning for the Edsel, less than 1 percent of the discussion concerned objectives.

Armstrong, J. S. & E. Lusk (1983), "Commentary on the Makridakis Time Series Competition (M-Competition)," with commentary by Gardner, Geurts, Lopes, Markland, McLaughlin, Newbold, Pack, and replies by Andersen, Carbone, Fildes, Parzen, Newton, Winkler, & Makridakis, Journal of Forecasting, 2, 259-311.


Armstrong, J. S. (1982), "Research on Scientific Journals: Implications for Editors and Authors," Journal of Forecasting, 1, 83-104.

Armstrong, J. S. (1982), "Strategies for Implementing Change: An Experiential Approach," Group and Organization Studies, 7, 457-475.

Carbone, R. & J. S. Armstrong (1982), "Evaluation of Extrapolative Forecasting Methods: Results of a Survey of Academicians and Practitioners," Journal of Forecasting 1, 215-217. - Reports on a survey of academics and practitioners dealing with beliefs about forecasting methods. For example, they favored the RMSE for evaluation.

Makridakis, Spyros, J. Scott Armstrong, Robert Carbone & Robert Fildes (1982), "An Editorial Statement," Journal of Forecasting, 1, 1-2.


Armstrong, J. S. (1981), "How Expert Are the Experts?," Inc., December, 15-16. - A simpler version of the paper from the Technology Review (1980).


Armstrong, J. S. (1980), "The Seer-Sucker Theory: The Value of Experts in Forecasting," Technology Review, June/July, 16-24. - Expertise above a minimal level does little to improve the accuracy of forecasts of change over time.


Armstrong, J. S. (1978), "Econometric Forecasting and the Science Court," Journal of Business, 51, 595-600.

Armstrong, J. S. (1978), "Forecasting with Econometric Methods: Folklore versus Fact," Journal of Business, 51, 549-564. - This study, using a survey of the world's leading econometricians, found conflicts between the econometricians' procedures and results from empirical studies. The paper was published along with commentaries by econometricians. The next paper followed this.

Armstrong, J. S. (1978), "Review of Don A. Dillman, Mail and Telephone Surveys," Journal of Business, 54 (4), 622-625. - Note: A 2nd edition of this book, Mail and Internet Surveys, was published by Wiley in 2000.


Tessier, T. & J. Armstrong (1977), "Improving Current Sales Estimates with Econometric Models," (Internet publication).


Armstrong, J. S. (1975), "Tom Swift and his Electric Regression Analysis Machine: 1973," Psychological Reports, 36, 806. - Illustrates problems with non-theoretical statistical analyses by reporting on a published study that used stepwise regression with 115 variables and 19 observations.

Armstrong, J. S., W. Denniston, Jr. & M. Gordon (1975), "The Use of the Decomposition Principle in Making Judgments," Organizational Behavior and Human Performance, 13, 257-263. - Judgmental decomposition improved estimation for difficult problems. See MacGregor and Armstrong (1994) for a more recent treatment.


Armstrong, J. S. & A. Shapiro (1974), "Analyzing Quantitative Models," Journal of Marketing, 38, 61-66. - We present a framework for analyzing models, and illustrate its use in analyzing a popular marketing model.


Armstrong, J. S. & M. Grohman (1972), "A Comparative Study of Methods for Long-Range Market Forecasting," Management Science, 19, 211-221. - Econometric methods were more accurate than extrapolation methods.


Armstrong, J. S. & T. Overton (1971), "Brief vs. Comprehensive Descriptions in Measuring Intentions to Purchase," Journal of Marketing Research, 8, 114-117. - Brief descriptions of new products were as accurate as elaborate and expensive descriptions.


Armstrong, J. S. (1970), "An Application of Econometric Models to International Marketing," Journal of Marketing Research, 7, 190-198. - Econometric methods produced substantially more accurate long-range forecasts of the photographic markets in 17 countries than did extrapolations.

Armstrong, J. S. & J. Andress (1970), "Exploratory Analysis of Marketing Data: Trees vs. Regression," Journal of Marketing Research, 7, 487-492. - Trees (using the Automatic Interaction Detector) were more accurate than regression in forecasting sales at 2,717 gas stations. Also see Armstrong, J. (1971), "Exploratory Analysis of Marketing Data: A Reply," Journal of Marketing Research, 8, 311-313.


Armstrong, J. S. & J. Farley (1969), "A Note on the Use of Markov Chains in Forecasting Store Choice," Management Science, 16, B281-B285. - Markov chains proved to be of little value in forecasting customers' choice of a food store.


Armstrong, J. S. (1967), "Derivation of Theory by Means of Factor Analysis, or Tom Swift and His Electric Factor Analysis Machine," The American Statistician, December, 17-21.

Return to top

Reviews of important papers on forecasting before 1985

J. Scott Armstrong's summaries and critiques (written from 1982 to 1985 for the Journal of Forecasting) of important articles on forecasting originally appeared in various major journals. Authors of the original papers were invited to comment, and they did so in almost all cases. The list is not comprehensive; it is simply a selection of papers that were thought to be important. These summaries are reproduced here with the kind permission of the journals' editors and John Wiley.

Click on the author(s) and date to read the review.

  • Alexander III, Elmore R. and Ronnie D. Wilkins (1982), "Performance rating validity: the relationship of objective and subjective measures of performance," Group and Organization Studies, 7, 485-496.

  • Anderson, Craig A. (1983), "Abstract and concrete data in the perseverance of social theories: when weak data lead to unshakeable beliefs," Journal of Experimental Social Psychology, 19, 93-108.

  • Anderson, Craig A. (1983), "Imagination and expectation: the effect of imagining behavioral scripts on personal intentions," Journal of Personality and Social Psychology, 45, 293-305.

  • Arkes, Hal R. et al. (1981), "Hindsight bias among physicians weighing the likelihood of diagnoses," Journal of Applied Psychology, 66, 252-254.

  • Baker, Earl J. (1979), "Predicting response to hurricane warnings: a reanalysis of data from four studies," Mass Emergencies, 4, 9-24.

  • Borman, Walter C. (1982), "Validity of behavioral assessment for predicting military recruiter performance," Journal of Applied Psychology, 67, 3-9.

  • Camerer, Colin (1981), "General conditions for the success of bootstrapping models," Organizational Behavior and Human Performance, 27, 411-422.

  • Currim, Imran S. (1981), "Using segmentation approaches for better prediction and understanding from consumer mode choice models," Journal of Marketing Research, 18, 301-309.

  • Daub, Mervin (1981), "The accuracy of Canadian short-term economic forecasts revisited," Canadian Journal of Economics, 14, 499-507.

  • Fischer, Gregory W. (1981), "When oracles fail - a comparison of four procedures for aggregating subjective probability forecasts," Organizational Behavior and Human Performance, 28, 96-110.

  • Fischer, Gregory W. (1982), "Scoring-rule feedback and the overconfidence syndrome in subjective probability forecasting," Organizational Behavior and Human Performance, 20, 352-369.

  • Fralicx, Rodney and Namburg S. Raju (1982), "A comparison of five methods for combining multiple criteria into a single composite," Educational and Psychological Measurement, 42, 823-827.

  • Glantz, Michael H. (1982), "Consequences and responsibilities in drought forecasting: the case of Yakima, 1977," Water Resources Research, 18, 3-13.

  • Holmes, David S. et al. (1980), "Biorhythms: their utility for predicting post-operative recuperative time, death and athletic performance," Journal of Applied Psychology, 65, 233-236.

  • Keen, Howard Jr. (1981), "Who forecasts best? Some evidence from the Livingston survey," Business Economics, 16, 24-29.

  • Marks, Robert E. (1980), "The value of 'almost' perfect weather information to the Australian tertiary sector," Australian Journal of Management, 5, 67-85.

  • Moore, William L. (1982), "Predictive power of joint space models constructed with composition techniques," Journal of Business Research, 10, 217-236.

  • More, Roger A. and Blair Little (1980), "The application of discriminant analysis to the prediction of sales forecast uncertainty in new product situations," Journal of the Operational Research Society, 31, 71-77.

  • Murphy, Allan H. et al. (1980), "Misinterpretations of precipitation probability forecasts," Bulletin of the American Meteorological Society, 61, 695-701.

  • Sewell, Murphy A. (1981), "Relative information contributions of consumer purchase intentions and management judgment as explanators of sales," Journal of Marketing Research, 18, 249-253.

  • Simonton, Dean Keith (1981), "Presidential greatness and performance: can we predict leadership in the White House?" Journal of Personality, 49, 306-323.

Return to top

Reviews of important papers on forecasting from 1985 and on

J. Scott Armstrong's summaries and critiques of important articles on forecasting originally appeared in the International Journal of Forecasting and other major journals. Authors of the original papers were invited to comment, and they did so in almost all cases. The list is not comprehensive; it is simply a selection of papers that were thought to be important. Papers from the International Journal of Forecasting were not included. These summaries are reproduced here with the kind permission of the journals' editors and Elsevier Science.

Click on the author(s) and date to read the review.

Return to top