Given the frequency and cost of inaccurate forecasts, there has been remarkably little research comparing methods for forecasting decisions in conflicts.

  • Type of Information

- Evidence on accuracy
- Expert expectations
- Descriptions of methods
- Theory and commentary

Type of Information

Evidence on the accuracy of conflict forecasting methods

  • Armstrong, J. S. & Green, K. C. (2005). "Competitor-oriented objectives: The myth of market share," Monash University Econometrics and Business Statistics Working Paper 17/05. - Full Text

  • Armstrong, J. S. (2002). "Assessing game theory, role playing, and unaided judgment," International Journal of Forecasting, 18 (3), 345-352 - Full Text

  • Armstrong, J. S. (2001). "Role playing: A method to forecast decisions." In Armstrong, J. S. (Ed.), Principles of Forecasting: A Handbook for Researchers and Practitioners. Norwell, MA: Kluwer Academic Publishers, 15-30 - Abstract

  • Babcock, L., Loewenstein, G., Issacharoff, S., & Camerer, C. (1995). "Biased judgments of fairness in bargaining." The American Economic Review, 85(5), 1337-1343.

  • Beck, N., King, G., & Zeng, L. (2000). "Improving quantitative studies of international conflict: A conjecture." American Political Science Review, 94 (1), 21-35 - Full Text

  • Bueno de Mesquita, B., & Stokman, F. N. (1994). Models of exchange and of expected utility maximisation: A comparison of accuracy. In Bueno de Mesquita, B., & Stokman, F. N. (Eds.), European community decision making: Models, applications, and comparisons. New Haven, CT: Yale University Press, 214-228.

  • Feder, S. A. (1987). Factions and Policon: New ways to analyze politics. Studies in Intelligence, 31(1), 41-57.

  • Fraser, N. M. (1986). Political and social forecasting using conflict analysis. European Journal of Political Economy, 2(2), 203-222.

  • Fraser, N. M., & Hipel, K. W. (1984). Conflict analysis: Models and resolutions. New York: North-Holland.

  • Ghemawat, P., & McGahan, A. M. (1998). Order backlogs and strategic pricing: The case of the US large turbine generator industry. Strategic Management Journal, 19(3), 255-268.

  • Green, K. C. & Armstrong, J. S. (2007). "Value of expertise for forecasting decisions in conflicts," Interfaces, 37, 287-299. - Working Paper; Published Paper.

  • Green, K. C. & Armstrong, J. S. (2007). "Structured analogies for forecasting," International Journal of Forecasting, 23, 365-376. - Working Paper; Published Paper.
  • Green, K. C. & Armstrong, J. S. (2005), "The war in Iraq: Should we have expected better forecasts?" Foresight, 2, 50-52. - Full Text

  • Green, K. C. (2005), "Game theory, simulated interaction, and unaided judgment for forecasting decisions in conflicts," International Journal of Forecasting, 21, 463-472. - Full Text [Top 25 Hottest Articles in IJF during September quarter 2005]

  • Green, K. C. (2002). Forecasting decisions in conflict situations: A comparison of game theory, role-playing, and unaided judgement. International Journal of Forecasting, 18, 321-344. - Full Text [Best Paper Award 2002-2003]

  • King, G. & Zeng, L. (2001). "Improving forecasts of state failure." World Politics, 53, 623-658. - Full Text

  • Rowe, G., & Wright, G. (2001). Expert opinions in forecasting: The role of the Delphi technique. In Armstrong, J. S. (Ed.), Principles of forecasting: a handbook for researchers and practitioners. Norwell, MA: Kluwer Academic Publishers, 125-144. - Abstract

  • Sambanis, N. (2003). Using case studies to expand the theory of civil war. World Bank Conflict Prevention and Reconstruction Unit Working Paper No. 5 - Full Text

  • Tetlock, P. E. (1992). Good judgment in international politics: Three psychological perspectives. Political Psychology, 13(3), 517-539.


Expert expectations on the accuracy of forecasts from different methods

Scott Armstrong and Kesten Green surveyed diverse experts on their expectations of the accuracy of conflict forecasting methods when used by experts and by novices (see questionnaire). They asked the experts what proportion of forecasts they expected to be correct for various combinations of method and expertise, given that choosing among the possible decisions at random would yield 28% accuracy. On average, the respondents expected experts to correctly pick the actual outcome 45% of the time if they used unaided judgement and half of the time if they used game theory, structured analogies, or simulated interaction. They expected 30% of novices' unaided-judgement forecasts and 40% of novices' simulated-interaction forecasts to be accurate.

Descriptions of methods

Outlines of how to implement conflict forecasting methods are provided on these pages and in the Forecasting Dictionary. More comprehensive descriptions are provided in:

  • Armstrong, J. S. (2001). Role playing: A method to forecast decisions. In Armstrong, J. S. (Ed.), Principles of Forecasting: A Handbook for Researchers and Practitioners. Norwell, MA: Kluwer Academic Publishers, 15-30 - Abstract

  • Collier, P., Hoeffler, A., & Soderbom, M. (2001). On the duration of civil war. World Bank Working Paper No. 2681 - Full Text

  • Dixit, A., & Skeath, S. (1999). Games of strategy. New York: Norton.

  • Elbadawi, I. & Sambanis, N. (2001). How much war will we see? Estimating the incidence of civil war in 161 countries. World Bank Working Paper No. 2533 - Full Text

  • Green, K. C. & Armstrong, J. S. (2004). "Structured analogies for forecasting" - Monash University Econometrics and Business Statistics Working Paper 17/04 - Full Text

  • Hargreaves Heap, S. P., & Varoufakis, Y. (1995). Game theory: A critical introduction. New York: Routledge.

  • Khong, Y. F. (1992). Analogies at War: Korea, Munich, Dien Bien Phu, and the Vietnam Decisions of 1965. Princeton NJ: Princeton University Press.

  • Nalebuff, B. J., & Brandenburger, A. M. (1996). Co-opetition. London: Harper Collins.

  • Neustadt, R. E., & May, E. R. (1986). Thinking in time: The uses of history for decision makers. New York: Free Press.

  • Rowe, G., & Wright, G. (2001). Expert opinions in forecasting: The role of the Delphi technique. In Armstrong, J. S. (Ed.), Principles of forecasting: A handbook for researchers and practitioners. Norwell, MA: Kluwer Academic Publishers, 125-144. - Abstract

  • Sambanis, N. (2003). Using case studies to expand the theory of civil war. World Bank Conflict Prevention and Reconstruction Unit Working Paper No. 5 - Full Text

  • Wolfers, J. & Zitzewitz, E. (2004). Prediction markets. NBER Working Papers Series, Working Paper 10504.


Theory and Commentary

Kesten Green's research findings on the accuracy of game theorists' forecasts relative to simulated-interaction forecasts, published in the International Journal of Forecasting (18:3), were accompanied by six commentaries from nine authors and a reply by the author:

  • Armstrong, J. S. (2002). Assessing game theory, role playing, and unaided judgment. International Journal of Forecasting, 18, 345-352 - Full Text

  • Bolton, G. E. (2002). Game theory's role in role-playing. International Journal of Forecasting, 18, 353-358 - Full Text

  • Erev, I., Roth, A. E., Slonim, R. L., & Barron, G. (2002). Predictive value and the usefulness of game theoretic models. International Journal of Forecasting, 18, 359-368 - Full Text

  • Goodwin, P. (2002). Forecasting games: Can game theory win? International Journal of Forecasting, 18, 369-374 - Full Text

  • Green, K. C. (2002). Embroiled in a conflict: Who do you call? International Journal of Forecasting, 18, 389-395 - Full Text

  • Shefrin, H. (2002). Behavioural decision making, forecasting, game theory, and role-play, International Journal of Forecasting, 18, 375-382 - Full Text

  • Wright, G. (2002). Game theory, game theorists, university students, role-playing and forecasting ability. International Journal of Forecasting, 18, 383-387 - Full Text

Theories about conflict forecasting are discussed to varying degrees in many of the articles and books referred to above. The following address theory more fully than most:

  • Elbadawi, I. & Sambanis, N. (2001). How much war will we see? Estimating the incidence of civil war in 161 countries. World Bank Working Paper No. 2533 - Full Text

  • Khong, Y. F. (1992). Analogies at War: Korea, Munich, Dien Bien Phu, and the Vietnam Decisions of 1965. Princeton NJ: Princeton University Press.

  • Schrodt, P. A. (2004). Forecasting conflict using open sources. Paper presented at the Graduate School of Journalism, University of California Berkeley, 19 March - Full Text

  • Schrodt, P. A. (2002). Forecasts and contingencies: From methodology to policy. Paper presented at the American Political Science Association meetings, Boston, 29 August -1 September - Full Text

  • Sambanis, N. (2003). Using case studies to expand the theory of civil war. World Bank Conflict Prevention and Reconstruction Unit Working Paper No. 5 - Full Text


Abstracts of Relevant Forthcoming Papers

  • Nine papers and a panel at the International Symposium on Forecasting 2005, in San Antonio, Texas, USA

    • "Terrorist Attack Prediction using Discrete Choice Models," Michael Smith and co-author, Department of Systems and Information Engineering, University of Virginia, Charlottesville, VA 22904, USA

      Terrorists employ a range of attack modes, including suicide bombings, improvised explosive devices (IEDs), mortar and rocket firings, and portable air defense missiles. The range of attack modes and the rareness of these events make effective defensive measures difficult, with the result that defensive actions typically impose greater restrictions on the larger population. Predicting the locations and times of terrorist events can enable more directed defensive efforts. While a number of predictive technologies might be used for this problem, very few are capable of dealing explicitly with the inherent decision making used by terrorists in their attack planning. This paper describes an approach to terrorist incident prediction that uses discrete spatial choice models to predict the behavior of the terrorist. This work builds on our previous work using point process models and spatial choice analysis to forecast criminal behavior and suicide bombings. We give examples of the use of this approach and an evaluation of its performance. These evaluations show that discrete spatial choice models are more effective at predicting future attack locations than the more commonly used methods that employ kernel density estimates.
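The discrete spatial choice approach the abstract describes amounts to a multinomial-logit model over candidate locations. A minimal sketch of that setup follows; the site features and weights here are hypothetical, and in practice the weights would be estimated by maximum likelihood from historical incident data:

```python
import math

def site_choice_probs(site_features, beta):
    """Multinomial-logit spatial choice: P(site j) is proportional to
    exp(beta . x_j), where x_j is site j's feature vector."""
    utilities = [sum(b * x for b, x in zip(beta, xs)) for xs in site_features]
    m = max(utilities)  # subtract the max for numerical stability
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical features per candidate site: [population density, distance score]
probs = site_choice_probs([[0.9, 0.2], [0.4, 0.8], [0.7, 0.5]], [2.0, -1.5])
```

The probabilities sum to one over the candidate set, so the model yields a ranking of locations by predicted attack likelihood rather than a single point prediction.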

    • "Strategic Early Warning for Organized and Serious Crime," Dr. David Carment, Carleton University, Dr. Gregory O'Hayon, Criminal Intelligence Service Canada (CISC), Bruno Nordeste, Carleton University, and Stewart Prest, Carleton University

      The criminal threat environment is ever-changing. As a consequence, law enforcement agencies have had to develop forecasting capabilities based on intelligence gathering and analysis that will improve targeting and resource allocation. Criminal Intelligence Service Canada (CISC), in partnership with Carleton University, has spearheaded an effort to develop a strategic early warning methodology and intelligence network in order to forecast threats. The methodology draws on existing capabilities developed by the Country Indicators for Foreign Policy at Carleton University. Both the methodology and the outputs derived from it have been adapted to meet the specific needs of law enforcement personnel and decision makers. We first examine the rationale and purpose for developing a strategic early warning capability. Second, we outline the methodology for identifying potential risks and their relationship to criminal activity. Third, we describe current research using the existing framework. Finally, we specify directions for future work and implications for both policy and strategy development. Early results from this project have been positive, as evidenced by the community's feedback to our warning product (SENTINEL). Early warning methodologies from other fields (including public health, military and national security) have proven useful to law enforcement's mission.

    • "Oracle of Battle, Part 3," Jonathan E. Czarnecki, Naval Postgraduate School, Monterey, CA 93943

      Part one of this project outlined the research problem: explanation and prediction of joint combat operations. It proposed a theory that contained testable hypotheses, and suggested methods by which the hypotheses could be examined. Part two initiated the testing of the theory. The testing revealed weaknesses in the theory and method (systematic judgment). It also suggested means by which the theory and accompanying hypotheses could be improved.

      Part three will demonstrate the efficacy of the improvements, and explore the reliability and validity of the measures that bring form to the theory. The reliability of the variable matrix is .75, very acceptable considering the small size of the population. The prima facie and content validity as assessed by the judges is consistent and strong. Thus, the basic empirical elements of the theory seem sound.

      The expansion of the variability and the inclusion of an operational leadership variable, recommended by the last round of judges, improve the explanatory power of the theory, but only marginally and at the cost of statistical significance. Finally, the judges in this round strongly recommended that the theory be limited to description and explanation, and not be extended to prediction.

    • "Forecasting the Unforecastable: The Impact of 9/11 on Las Vegas Gaming Revenues," Virginia Commonwealth University

      When a major disruption occurs, a time series can no longer be predicted reliably from the historical data. The issue most important to the forecaster's client, the decision maker, becomes identifying which of several alternative scenarios will characterize the future. For example, the events of 9/11 caused a major loss of gaming revenue in Las Vegas. Would revenues return to the previous trend line, return to the previous trend but at a lower level, remain flat indefinitely, or continue to deteriorate? Decisions to build/expand casinos, lay off employees, cancel contracts, and temporarily close hotel wings depended on which scenario would prevail. The challenge to the forecaster is to provide reliable answers rapidly as new data become available.

      In this paper, we develop a method for forecasting time series after significant disruptions. We develop simple models of the response to a disruption that blend easily with the pre-disruption time series model. Using Bayesian methods to adjust business judgments as new data arrive, we are able to identify the nature of the response and to develop a reliable forecast as rapidly as possible. We illustrate the method using gaming revenue for Clark County (Las Vegas) before and immediately after 9/11.
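The scenario-identification step the abstract describes can be illustrated with a single Bayesian update of scenario probabilities; the scenarios, forecasts, and Gaussian error assumption below are hypothetical, not the authors' actual model:

```python
import math

def update_scenario_weights(priors, scenario_forecasts, observation, sigma):
    """One Bayesian update of scenario probabilities after a new data point.

    Each scenario predicts the next observation; likelihoods assume
    Gaussian forecast errors with standard deviation sigma."""
    likelihoods = [
        math.exp(-0.5 * ((observation - f) / sigma) ** 2)
        for f in scenario_forecasts
    ]
    unnorm = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(unnorm)
    return [w / total for w in unnorm]

# Hypothetical: three scenarios for next month's revenue (return to trend,
# lower level, continued decline), equal priors, and an observed value of 80.
weights = update_scenario_weights([1/3, 1/3, 1/3], [100.0, 82.0, 60.0], 80.0, 10.0)
# The observation near 82 shifts most probability mass to the second scenario.
```

Repeating the update as each new data point arrives is what lets the forecaster identify the prevailing scenario "as rapidly as possible" in the sense the abstract describes.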

    • "Forecasting Domestic Conflict," Florida International University and University of Peloponnese

      We take domestic conflicts across the world, measured and classified in terms of the number of deaths, and forecast their future occurrence likelihood using four models: (1) a Poisson autoregressive model, (2) a Markov switching model, (3) an artificial neural network, and (4) a smooth transition autoregressive model. The first two models capture any conditionalities present in the original data. As the data-generating process is unknown a priori, we use the neural network framework to investigate whether the conflict process itself is state-independent. Additionally, as the sample ranges from 1950 to 2003, we choose the smooth transition model to explore potential nonlinear patterns. We also evaluate the first two models and the last in the presence of economic, institutional, and political control variables identified in the literature. Various model forecasts are then combined and compared with the individual model forecasts to generate improvements in forecasting performance. However, the final results provide only ambiguous improvement in out-of-sample predictions and call for a more general approach to correctly classify the data pattern.

    • "CASCON and MIT Research on Conflict," MIT Sloan School of Management and Center for International Studies

      The focus of this discussion will be on methods for better understanding the process by which disputes either do or do not escalate to threats of violence or to outright hostilities. One research approach currently underway is to use system dynamics to operationalize theories linking post-conflict conditions of one conflict to the precursors of subsequent conflict. This work attempts to develop quantitative measures where possible, identify data sources, and test theories against reality. In another vein of research, CASCON (Computerized System for Analyzing Conflict) supports decision analysis by historical analogy using a conceptual map of 571 factors influencing the dynamics of the conflict process. Factors were developed by generalizing case-specific events and circumstances identified as significant by case experts. Using the factor map, expert case knowledge is captured in an extensible database, and software enables the analyst to explore patterns of similarities and dissimilarities between a current case and historical cases. In contrast to other approaches that use either simplified theories or collections of unconnected facts, CASCON offers a method for applying a multivariate abstract framework to assist in organizing research on a conflict situation, whether incipient or ongoing, in identifying comparable historical situations, and in analyzing potential future courses.

    • "What we know about forecasting methods for conflicts?" International Graduate School of Business, University of South Australia, and The Wharton School, University of Pennsylvania

      We have learned much about forecasting for conflicts over the past 28 years. On the one hand, we have evidence that the accuracy of judgmental forecasts from domain experts and from game theorists is no better than chance. On the other hand, we know that forecasts from two methods, structured analogies and simulated interaction (a form of role playing), are substantially more accurate, especially when forecasts are combined. Across the eight situations we used in our research, the error reduction relative to chance of combined forecasts was 13% for experts' unaided judgement and for game theorists, versus 31% for structured analogies and 83% for simulated interaction. Our findings are contrary to people's expectations. This situation presents opportunities for those who are first to adopt the new methods. Improvements in decision-making will occur even if only one party adopts the improved forecasting methods. There is still much to learn. For example, might some game-theoretic analysis aid forecasting under some conditions? Can the Delphi technique or prediction markets provide useful forecasts for conflict situations? Are there conditions under which simulated interaction fails to provide accurate forecasts?
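A note on the arithmetic: an error-reduction figure of this kind compares a method's error rate with the error rate of choosing at random. A minimal sketch, with illustrative numbers only (the 28% chance accuracy matches the survey figure mentioned above, but the 88% method accuracy is made up for the example):

```python
def error_reduction_vs_chance(chance_accuracy, method_accuracy):
    """Proportional error reduction of a forecasting method relative to chance.

    Error is 1 - accuracy; the reduction is expressed as a share of the
    chance error, so 0.0 means no better than chance and 1.0 means perfect."""
    chance_error = 1.0 - chance_accuracy
    method_error = 1.0 - method_accuracy
    return (chance_error - method_error) / chance_error

# Illustrative only: with chance accuracy of 28% (error 72%), a method that
# is accurate 88% of the time reduces error by (0.72 - 0.12) / 0.72, i.e. 83%.
print(round(error_reduction_vs_chance(0.28, 0.88), 2))  # 0.83
```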

    • "Red Teaming Approaches for Homeland Security: A Review of Current and Innovative Methodologies," Ms. Shelley Asher, Ms. Catherine Bott, and a co-author, Homeland Security Institute

      From a U.S. homeland security perspective, the need for understanding and anticipating the adversary's adaptive nature in a dynamic environment is greater now than ever before. Existing approaches used in the homeland security community for assessing the adversary's perspective will be discussed, including wargaming, vulnerability assessment, table top exercises, and red cell approaches. To enhance the ability to address current and future adversaries, innovative red team approaches must be identified. We looked outside and inside the defense and intelligence communities to identify innovative methodologies that could be adapted for use in red teaming. The adversary was not always defined in terms of terrorists or terrorist groups in the methodologies, but rather as competitors, unions, suppliers, or customers. Five innovative methodologies were identified: (1) Competitive Intelligence Wargaming, (2) Simulated Interaction, (3) Structured Analogies, (4) Structured Idea Generation, and (5) Rapid Ethnography. The innovative methodologies and issues for potentially adapting them to a homeland security context will be discussed. Additionally, implications for sharing and adopting methodologies across broad sectors and fields will be discussed.

    • "Role thinking: Does standing in the other guy's shoes improve forecast accuracy?" International Graduate School of Business, University of South Australia

      Two methods have been shown to provide accurate forecasts of the decisions that people will make in conflict situations. The first is simulated interaction, a kind of role playing, using novices; the second is structured analogies, a formal analysis of similar situations by experts. In contrast, when they use their unaided judgement, experts and novices alike provide forecasts that are no more accurate than chance. The success of the simulated interaction method suggests that realistic modelling of role and interaction between parties is important, while the success of the structured analogies method reinforces findings, from other research, that forecasts derived from expert judgements using a structured process are more accurate than those that derive from unaided judgement. Is it possible to obtain forecasts that are more accurate than unaided-judgement forecasts by encouraging participants to think about roles and interactions in a formal way? I will provide a tentative answer to this question using findings from research on the relative accuracy of forecasts from novices' role thinking.

    • Panel: "How to accelerate adoption of superior conflict forecasting methods"

      Panellists are drawn from the International Graduate School of Business, University of South Australia (GPO Box 2471, Adelaide SA 5001, Australia); the Department of Systems and Information Engineering, University of Virginia (Charlottesville, VA 22904, USA); the Homeland Security Institute (2900 South Quincy Street, Arlington, VA 22206); the MIT Sloan School of Management and Center for International Studies; and the Kent Center for Analytic Tradecraft.

      Empirical research has led to the identification of methods for forecasting in conflicts that are superior to the current practice of using unaided judgement. Accurate forecasts of how people will behave in conflicts offer the prospect of better decisions. Adoption of superior practices can be slow, however. For example, it was 264 years after the discovery that lemons could be used to prevent scurvy that the British Merchant Marine changed their practices to take advantage of this knowledge. We think that improving conflict forecasting practice can and should be more rapid. Panellists will describe briefly their experiences in facilitating the adoption of superior methods and their thoughts on how best to achieve wider adoption. We will invite questions and suggestions from the audience.

  • Five papers presented at the International Symposium on Forecasting 2004, in Sydney, Australia

    • "How to Use Experts to Forecast in the War on Terrorism," J. Scott Armstrong, The Wharton School, University of Pennsylvania, and Kesten C. Green, Victoria Management School, Victoria University of Wellington

      In 2003, the Pentagon proposed the use of a market on terrorism as a way to assess dangers. While it was politically unpopular, one might ask whether it was a good idea. In many situations, combining the predictions of large groups of unbiased people can provide accurate forecasts. As we will show, given the conditions involved in terrorism forecasting, markets are unlikely to increase accuracy substantially. They might also have negative consequences for behavior. We examine alternative approaches to using experts to forecast acts of terrorism: these include the Delphi technique and structured analogies. Information on these approaches to forecasting conflicts with terrorist organizations is available at conflictforecasting.com.

    • "An Oracle of Battle: Forecasting Results of Joint Military Operations," Jonathan E. Czarnecki, Naval War College of the United States

      War, campaigns, operations, and combat appear to be chaotic in their application of violence. However, within that chaos there are common processes and behaviors that seem to transcend history and culture. Are there common elements or variables critical to all joint military operations? If there are, can one begin to develop a theory that describes and explains this class of societal behavior? Finally, can one use such a theory to forecast the success or failure of joint operations, and thus gain insight into forecasting the results of the war in which such operations occur?

      This paper argues that there are common variables critical to joint military operations. It develops a theory that can concisely explain and describe these operations through four independent variables: training; integrated combat fires; decision space; and information processing. Using selected historical data from the post-1975 United States experience with joint military operations and applying psychometric judgmental scaling methods, the paper tests the theory's ability to explain the results of past joint military operations. It concludes that the theory has merit, and recommends further refinement through the continuation of research and production of forecasts.

    • "Simulated Interaction: An Approach to Terrorism Forecasting," Kesten C. Green, Victoria Management School, Victoria University of Wellington and J. Scott Armstrong, The Wharton School, University of Pennsylvania

      Important decisions in the war on terrorism are based on predictions of the decisions that allies, adversaries, and terrorist leaders will make. Decision makers typically resort to unaided judgment, but other approaches, such as game theory and acting out the interactions between the parties (a procedure we call simulated interaction), have been proposed. Forecasts from simulated interactions using novice role players have been found to be more accurate than forecasts from both experts using their unaided judgment and game theorists. We review the evidence on these forecasting methods for conflicts and make suggestions on how simulated interaction would be useful for assessing alternative strategies and tactics: for example, the reactions of Iraqi groups to different constitutions, or the reactions of hijackers to different types of armed response. A description of the simulated interaction method is available at conflictforecasting.com.

    • "Course of Action Analysis Interactive Role-play War-game," Kent M. Miller and Don E. Brown, Department of Systems and Information Engineering, University of Virginia

      Although widely used in business, the legal profession, and the military, studies demonstrating the predictive value of interactive role-playing in conflict forecasting are both sparse and suspect. More recent studies have explored the comparative accuracy of role playing in forecasting a single decision or outcome. We hypothesize that using simulated interactions to forecast the ostensible set of plausible decisions and/or outcomes is of greater utility to decision makers in conflict environments than a single decision and/or outcome forecast. The U.S. military employs an interactive war-game to forecast the "success" or "failure" of a prospective battle plan when played against a single enemy course of action. A deficiency in the current doctrine is the inability to account for uncertainty in the threat reactions. Our methodology remedies this shortcoming by allowing simultaneous play of multiple enemy courses of action. We believe the resulting risk assessment will facilitate the identification and development of more robust courses of action. (We define robustness as how well a course of action is expected to perform, taking into account the ostensible set of possible adversary reactions.) We present our findings from experiments conducted within the U.S. military.

    • "Comparing Conflict Prediction: Economic Motive vs. Non-Linear Framework," Dimitrios D. Thomakos and Prasad S. Bhattacharya, Department of Economics, Florida International University

      The empirical literature on domestic conflict shows an inverted U-shaped relationship between democracy, development, and onset of civil war. Our study examines this aspect to predict conflict intensity for seventeen Latin American countries using two different modeling perspectives.

      First, we build an economic model using explanatory variables from existing theoretical work. The conflict intensity is then analyzed from this model using ordinal regression and multinomial logit techniques. Using data from the International Peace Research Institute, the World Bank, and the Statistical Abstracts of Latin America, we find that overdependence on agricultural exports, along with a lack of public and private investment in an economy characterized by poor socio-political performance, could lead to higher intensity of conflict.

      Second, we explore how our results change, and possibly improve, by using a variety of potentially more powerful models. We examine whether an artificial neural network framework, the Cox proportional hazards model, and a Markov switching model can improve the accuracy of classification prediction for the intensity of conflict. Our results indicate that, for predictive purposes, there may be advantages in combining prior knowledge in the form of explanatory economic variables with a nonlinear classification model, rather than relying exclusively on the performance of the traditional ordinal regression approach.

Relevant Journals

Mass Media Coverage

To date, the list of mass media coverage is a short one. If you know of a mass media report that we have not listed, please contact Kesten Green.

December 2004

Unusually, a journalist for The Atlantic Monthly sought to anticipate the news. "Will Iran be next?" (PDF version) describes a war game devised to predict US plans to deal with an Iran intent on arming itself with nuclear weapons.

November/December 2004

Which terrorist threats should we be concerned about and which are the product of feverish imaginations? "Rethinking doomsday" (PDF version) in the Bulletin of the Atomic Scientists attempts to answer this question.

November 2004

The National Geographic ran an article on "The World of Terror" in their November 2004 issue. Highlights of the article, a public forum, a poll, links, and bibliography are available here.

October 2004

Is it possible to accurately forecast the behavior of terrorists? An article by Kesten Green in The Oracle addresses this question on page 8.

August 2004

The Defense Department's interest in artificial intelligence, or "We developed technology to have the machine sort through this stuff" (PDF version), is the subject of a Mercury News story.

July 6, 2004

Terrorism forecasting using structured analogies and simulated interaction was the subject of an interview with Scott Armstrong on the Australian Broadcasting Corporation's "The World Today" program. (Also available in PDF format.)

May 01 2004

"Database measured 'terrorism quotient'" - Seisint's approach to identifying people who may have terrorist intentions led to arrests (also available in PDF format)

July 11, 2003

Play acting in order to increase big-ticket sales: one software company CEO describes using simulated interactions to improve the performance of his sales force in a CRN article, "Not Just Play Acting," by Chris Penttila.

March 2003

Scott Armstrong was interviewed about the war on terrorism by BBC radio in Manchester, England on March 14, 2003. Here is a summary of that interview.

September 2002:

Entrepreneur - Implications of findings on the relative accuracy of conflict forecasting methods for small businesses.

March 26 2002:

"Games or serious business?" Financial Times - Discussion of findings on the relative accuracy of game theorists' forecasts, including interviews with academics and practitioners.

February 2002:

Forecasting in Conflicts: How to Predict What Your Opponent Will Do, Knowledge@Wharton - Forecasting methods for the war on terrorism.

July 1982

"This Football Fan Hopes that NFL Players Do Strike," The Philadelphia Inquirer, July 1982

Contact an Expert

The following people are willing to be interviewed on the subject of forecasting decisions in conflicts:


If you would like to be placed on the list of experts willing to respond to media inquiries, please contact Kesten Green.


The material for this Special Interest Group is organized and submitted by Kesten Green. Please contact him for further information, and corrections, additions, or suggestions for these pages.

The following methods are used by practitioners or are recommended by experts:

  • Expert judgment

- unaided
- structured
- game theory

  • Structured analogies

  • Simulated interaction

Interestingly, evidence on the relative accuracy of the methods is contrary to experts’ expectations.

Combining forecasts from substantially different methods has been shown to improve forecasting accuracy. It may be best to combine only forecasts from methods that have been shown to be accurate, and to weight the forecasts based on evidence of their relative accuracy.
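As an illustration, a weighted combination of probability forecasts might be sketched as follows. The methods, weights, and probabilities below are hypothetical, chosen only to show the arithmetic:

```python
# Combine probability forecasts of the same outcome from different
# methods, weighting by evidence of each method's relative accuracy.
# All method names and numbers here are hypothetical illustrations.

def combine_forecasts(forecasts, weights=None):
    """Weighted average of probability forecasts (equal weights by default)."""
    if weights is None:
        weights = [1.0] * len(forecasts)
    total = sum(weights)
    return sum(f * w for f, w in zip(forecasts, weights)) / total

# Probability that a strike occurs, from three different methods:
p_simulated_interaction = 0.7
p_structured_analogies = 0.6
p_unaided_judgment = 0.3

# Weight the methods with better evidence of accuracy more heavily.
combined = combine_forecasts(
    [p_simulated_interaction, p_structured_analogies, p_unaided_judgment],
    weights=[3.0, 2.0, 1.0],
)
print(round(combined, 2))  # 0.6
```

With equal weights the function reduces to a simple average, the default when no evidence on relative accuracy is available.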

Unaided Judgment

The most commonly used method for forecasting decisions in conflicts is experts using their unaided judgment. This is not surprising, as unaided-judgment forecasts can often be derived quickly and cheaply.

Evidence on accuracy


Structured Judgment

One method for forecasting using structured expert judgment is the Delphi technique. Delphi panels challenge forecasts and forecasters' reasoning anonymously. The method has been shown to improve accuracy relative to aggregating the forecasts of individual experts and the forecasts of unstructured groups. There is, however, no direct evidence on the relative accuracy of Delphi for forecasting decisions in conflicts.

How to do it

A panel of experts provide forecasts and rationale anonymously. The panel is provided with a summary of their forecasts and reasons. This process is repeated until responses are somewhat stable – typically after one or two iterations. An unweighted average of the individual forecasts is adopted as the panel’s forecast. When panellists choose options from a list or assign probabilities to options, a set of aggregate probabilities for each option can be calculated from their responses.

Between five and 20 heterogeneous experts with relevant domain knowledge should be used.

Evidence on accuracy


Game Theory

Game theory is a method that has been advocated for developing strategy and for forecasting decisions in conflicts. For example, see Scott Armstrong’s review of Co-opetition. The method seems, however, to be infrequently used for forecasting decisions in real conflicts. Researchers have advocated various approaches or hyphenated game theories. For forecasting research, a useful definition of game theory is: that which game theorists do when they are asked to forecast decisions in conflicts.

How to do it

Evidence on accuracy


Structured Analogies

Reference to conflicts that are similar to a current conflict for predicting the outcome of that conflict appears to be common practice. This practice has been much documented and discussed in relation to foreign policy forecasting. Structured analogies is a method that forecasts from analogies in a formal way. Structured-analogies forecasts are likely to be accurate if experts are each able to think of two or more analogous conflicts from their own experience.

How to do it

Forecasting with structured analogies involves four steps: (1) describe the target situation, (2) identify and describe analogies, (3) rate similarity, and (4) derive forecasts. We describe the steps here.

(1) Describe the target situation

Prepare an accurate, comprehensive, and brief description. To do so, the administrator might seek advice from unbiased experts or from experts with opposing biases. Where feasible, include a list of possible outcomes for the target situation following the description. Doing so makes coding easier. It also makes it possible to ensure forecasts are relevant to the user.

(2) Identify and describe analogies

Recruit experts who are likely to know about situations that are similar to a target situation. The number of experts depends upon the knowledge that they have about analogous situations, variability in responses across experts, and the importance of having an accurate forecast. Drawing upon the research on the number of forecasts needed when combining, we suggest using at least five experts.

Ask the experts to describe as many analogies as they can without worrying about the extent of the similarities. Then, ask the experts to match their analogy outcomes with target outcomes.

(3) Rate similarity

Once the situations have been described, ask the experts to list similarities and differences between their analogies and the target situation. Then they should rate the similarity of their analogies to the target on the basis of this analysis.

(4) Derive forecasts

To promote logical consistency, use pre-determined rules to derive a forecast from experts’ analogies information. This also aids replicability.
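One simple pre-determined rule is to take, for each expert, the outcome of their most similar analogy, and then adopt the modal outcome across experts as the forecast. A minimal sketch of that rule follows; the analogies, similarity ratings, and outcome labels are all hypothetical:

```python
from collections import Counter

# Derive a structured-analogies forecast with a fixed rule:
# each expert's vote is the outcome of their top-rated analogy,
# and the forecast is the modal outcome across experts.
# All ratings and outcomes below are hypothetical illustrations.

def derive_forecast(experts_analogies):
    """Return (modal outcome, share of experts supporting it)."""
    votes = []
    for analogies in experts_analogies:
        # analogies: list of (similarity_rating, outcome) pairs
        top = max(analogies, key=lambda a: a[0])
        votes.append(top[1])
    outcome, count = Counter(votes).most_common(1)[0]
    return outcome, count / len(votes)

# Each inner list holds one expert's analogies with 0-10 ratings:
experts = [
    [(8, "negotiated settlement"), (5, "strike")],
    [(6, "strike"), (9, "negotiated settlement")],
    [(7, "strike")],
]

forecast, support = derive_forecast(experts)
print(forecast)  # negotiated settlement
```

Because the rule is fixed in advance, a different administrator applying it to the same analogies information would derive the same forecast, which is the replicability point made above.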

Evidence on accuracy

Kesten Green presented preliminary evidence on the relative accuracy of structured analogies forecasts at ISF 2002 in Dublin and at ISF 2003 in Merida. Kesten and Scott Armstrong are preparing a paper on "Structured Analogies for Forecasting" - full text.

Sources of analogies


Simulated Interaction

Simulating a conflict that involves interaction between parties using role players is referred to as simulated interaction. The simulation outcomes are used as forecasts of the actual outcome. Alternative strategies can be compared using independent simulations with different role players.

How to do it

Forecasting with simulated interaction involves four steps: (1) describe the roles of the people in the conflict, (2) describe the target situation, (3) simulate the situation, and (4) derive forecasts. We describe the steps here.

(1) Describe the roles of the people in the conflict

Describe the roles of all the people involved in the target conflict, or describe the roles of two people to represent each party to the conflict. Allocate roles to people who are somewhat similar to the actual people if the cost is not great, then ask them to read and adopt their allocated role for the duration of their simulation.

(2) Describe the target situation

Descriptions of about one page in length are sufficiently long to include information about the parties to the conflict and their goals, relevant history, current positions and expectations, the nature of the parties' interaction, and the issue to be forecast. Longer descriptions may overburden role players. A comprehensive and accurate description will help role players achieve realistic simulations. A good understanding of the situation and considerable care and effort are therefore advisable at this stage. To this end, obtain information from independent experts, have the description reviewed, and test the material. Conducting simulated interactions using independently prepared descriptions may be worthwhile.

Where knowledge of possible decisions is good, provide a list for role-players to choose from. This provides clarity for role players who are unfamiliar with the type of situation they are simulating, ensures forecasts are useful, and makes coding easier.

(3) Simulate the situation

Ask the role players to improvise and interact with others in ways that are consistent with the target situation. In practice, this appears readily achievable with careful preparation. Typically, interactions might involve face-to-face meetings and preparation for these. Exchanging information using notes or networked computers, or conducting telephone conversations might be more realistic for some situations. Provide realistic settings if this is affordable.

(4) Derive forecasts

For each simulation, the forecast is the final decision made by the role-players. Conduct as many as ten simulations and combine the predictions. For example, if seven simulations led to war, one would predict a 70% chance of war.
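The example above amounts to treating each simulation's final decision as one vote; a minimal sketch (the outcome labels are hypothetical):

```python
from collections import Counter

# Turn the final decisions of independent simulated interactions into
# probability forecasts: each simulation contributes one equal vote.
# The outcome labels below are hypothetical.

def forecast_from_simulations(decisions):
    """Map each observed outcome to its share of simulations."""
    counts = Counter(decisions)
    n = len(decisions)
    return {outcome: count / n for outcome, count in counts.items()}

# Final decisions from ten independent simulations:
decisions = ["war"] * 7 + ["negotiated peace"] * 3

print(forecast_from_simulations(decisions))
# {'war': 0.7, 'negotiated peace': 0.3}
```

Running independent simulations with different role players, as recommended above, is what makes treating the decisions as independent votes reasonable.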

Evidence on accuracy


Descriptions of Methods

Miller, K. & Brown, D. (2004). "Risk assessment war-game (RAW)." University of Virginia Department of Systems and Information Engineering Working Paper SIE-040009 - Full Text

Field Experiments - Small scale or short-term trials

Field trials are likely to be more realistic, and hence provide more accurate forecasts, than simulated interactions. Although they are recommended by forecasters, they are not often used. There are good reasons for this: experiments in the field will often be expensive; rivals may be alerted; comparing alternative strategies or policies may not be possible; and experimentation may be impossible if a strategy depends on an unusual coincidence of circumstances. The task of forecasting the German response to the Allied landings in Normandy illustrates these points.

Prediction Markets

Prediction markets may be useful tools for conflict forecasting when valid knowledge is dispersed among many people who are willing to reveal their knowledge through wagers. A paper on the subject by Wolfers and Zitzewitz is available here. Internet-based markets offer predictions on possible events such as a US attack on Iran, a terrorist attack in the eastern US (Foresight Exchange), the exile of Yasser Arafat, and the creation of a Palestinian state (Tradesports.com).



The early registration deadline for the MORS Symposium being held on 20-23 June in Monterey, CA is 1 April. See the Conferences Page for details.

The Intelligence Advanced Research Projects Activity (IARPA) has invited experts to take part in a multi-year competition to investigate the accuracy of individual and group predictions about global political, economic, and military developments.
More...

In a Wharton Magazine blog, Scott Armstrong and Kesten Green describe problems with much current foreign policy decision making based on expert judgments. More importantly, they provide solutions in the form of evidence-based methods that can provide better forecasts of the effects of alternative policies. The blog is here, and a printable version is available here.

Two additional resources for researchers and practitioners concerned with forecasting terrorism and domestic conflict: Predictive Societal Indicators of Radicalism, and CIRI Human Rights Data Project. Available from the
Resources Page.

Information on the International Association for Conflict Management's 2011 Conference in Istanbul and a call for papers can be found on the
Conferences page.

An update on IARPA's judgmental forecasting Aggregative Contingent Estimation (ACE) Program is available
here.

The U.S. Department of Defense has issued a request for information entitled "Retroactive Statistical Evaluation of Science and Technology (S&T) forecast and forecasting methods." For more information, go to the Federal Business Opportunities page
here.

IARPA is to host an information and planning conference for a program on group judgmental forecasting. The announcement in full can be found
here...

Evidence on the recommendation to think about roles to forecast decisions in conflict situations: Have you ever been told you should "stand in the other person's shoes" in order to predict the decisions they will make? Research by Green and Armstrong casts doubt on the usefulness of this recommendation.
more...

The Intelligence Advanced Research Projects Activity (IARPA) has issued a request for information on forecasting for national security decision makers.
more...

Copenhagen Business School and the University of Nottingham are organizing a Conference on International and Intercultural Negotiations on 8-9 April 2010 in Copenhagen.
more...

The International Association for Conflict Management's 23rd Conference will be held in Boston on 24-27 June 2010.
more...

The organising committee of the 4th IMA Conference on Analysing Conflict Transformation to be held on 28-30 June 2010 is calling for the submission of abstracts.
more...

The International Symposium on Forecasting in June and the Conflict and Complexity conference in September both have offerings of interest to the conflict and terrorism forecasting communities.
more...

The International Institute of Forecasters is now accepting applications for the 2008 SAS research grants valued at $5,000 each.
more ...

Forecasting with analogies made headline news when President Bush suggested that the consequences of the US withdrawal from Vietnam would be repeated in Iraq if troops were withdrawn before stability was achieved. Analogies can be useful for forecasting, and Green and Armstrong's paper on their proper use appears in the next issue of the International Journal of Forecasting and is also available as a working paper.

How good are experts at predicting the decisions people will make in conflicts? An article by Green and Armstrong that answers this question has now been published in Interfaces 37(3) with commentaries by Goodwin, Kirkpatrick, Koehler, and Tetlock. A working paper version is available in Full Text.

A handful of international conflicts has erupted in most years since the end of WWI. The International Crisis Behavior Project has released Version 7 of its database, which now includes descriptions and coding of conflicts to 2004.

Older News


Forecasts on the War in Iraq

  • Using a model based on statistics from previous wars, researchers Allan Stam and Scott Bennett forecast on April 2 that the war in Iraq would last 2.5 months.
  • Mathematician John Allen Paulos tentatively predicted on March 30 (using Lanchester's Law) that US and British forces could be at a disadvantage relative to Iraqi troops if they were to become involved in house-to-house fighting.
  • Four military historians were reported as predicting that the U.S. and British forces would fail to conquer Baghdad in a March 29 article in the Online Journal. Citing historical analogies, Dr. Manfred Messerschmidt, Dr. Gerd Krumeich, Dr. Bernhard Kroener, and Brigadier Helmut Hauff each dismissed the task as impossible. While most of the analogies cited by the experts involved the besieged city being taken, they argued that these cases were unusual or were different to the Baghdad situation in important respects.
  • Middle East Quarterly editor Martin Kramer found disagreement among experts about political outcomes in Iraq in his article titled "Professional Pundits Place Iraq Bets". While Georgetown professor John Esposito and Columbia University appointee Rashid Khalidi both predicted the occupation of Iraq would lead to inflamed anti-Americanism and no democracy, Princeton emeritus professor Bernard Lewis and Johns Hopkins professor Fouad Ajami predicted enlightenment, modernity, and democracy would prevail.
  • Before the war in Iraq had begun, George Friedman – from the consultancy Stratfor – forecast it would be over by mid-April. Commentators discuss Stratfor's record in the same item. Up-to-date forecasts on the war in Iraq and other conflicts are available on Stratfor's site.

Contact

Please send Kesten Green an e-mail message providing details if you know of a specific forecast of a current conflict that you think should be listed here. You should include:

  1. a brief description of the conflict

  2. the forecast

  3. the forecasting method

  4. the identity of the forecaster

  5. contact information for the forecaster.


A resource for managers, practitioners, and researchers concerned with forecasting the decisions of parties in conflict. Forecasting decisions in conflicts is forecasting for:

  • buyer-seller negotiations

  • negotiations among distribution channel members

  • competitor reactions to new product introductions

  • industrial disputes

  • corporate takeovers

  • inter-communal conflicts

  • political negotiations

  • diplomatic and military confrontations

  • countering terrorism.

These pages are part of the Forecasting Principles site. In keeping with the objectives of the site they present research findings that support evidence-based principles (guidelines, prescriptions, rules, conditions, action statements, or advice about what to do in given situations).

Find Out About ...

- Methods for forecasting in conflicts
- Current conflicts in the world
- Papers, commentary, and relevant journals
- Resources for practitioners, researchers, and educators
- Mass media coverage of forecasting decisions in conflicts
- Conferences on conflicts and forecasting
- News
