Being Quantitative and Scientific
Similar to the bias against probable foresight, there is a persistent bias in some corners of our field against assigning numbers and specific outcomes to our work. It’s as if we think the world of words, argument, and qualitative descriptions of the future is somehow superior to attempts to count, model, and predict. We are impressed by entertaining stories, entranced by the way words can influence others. But without specific falsifiable predictions, words remain subjective and populist opinions, speculations, and propaganda.
We foresighters do ourselves no good by stating things like “the purpose of forecasting is not to get the future ‘right’”, found on p. 127 of Bishop and Hines’ Thinking About the Future (2013), an otherwise-excellent practitioner’s guide. In reality, getting the probable aspects of the future right, in a numerical and confidence-interval-bounded way, is a major goal of forecasting, which is a key anticipation skill in foresight. Using quantitative methods and speaking in probabilistic answers, even when crudely derived, brings an accountability to our work that is rarely seen with qualitative approaches. We must use quantification whenever we usefully can.
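To make this accountability concrete, here is a minimal illustrative sketch of how probabilistic predictions can be scored after the fact. The Brier score (a standard calibration measure, not a method the text itself prescribes) is simply the mean squared error between the probabilities we assigned and what actually happened; the example forecasts and outcomes below are hypothetical.

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between forecast probabilities and binary outcomes.

    Lower is better: 0.0 is perfect foresight, and 0.25 is the score
    earned by always hedging at 50%.
    """
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical year-end review: probabilities we assigned to three
# predictions, and whether each event occurred (1) or not (0).
forecasts = [0.9, 0.7, 0.2]
outcomes = [1, 1, 0]
print(round(brier_score(forecasts, outcomes), 4))  # → 0.0467
```

A forecaster who publishes probabilities like these each year, then scores them against outcomes, gives critics a number to argue with rather than a story to admire.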
So we must welcome to our profession more of those who work to discover the quantitatively predictable aspects of the future, at whatever level of granularity. We must also welcome all those who seek to bring scientific knowledge and methods, formally and informally, to bear on our foresight problems. These individuals are as critical to the field as those who are interested in creatively exploring the possibility space and in finding preferred futures. A scientific and quantitative approach is also critical to building consensus for action, as the social-activism phrase “care, count, and act” (first we decide to care about a problem, then we try to count how bad it is, then we report our counts and generate action) attests.
We can also commit to making specific and quantitative predictions, and to publicly reviewing those predictions annually, just as The Economist, one of the best-written weeklies in the world, does in its annual The World in [Year] publication. Such accountability practices keep us grounded in evidence, build humility and conservatism, and help expose our hidden biases and poor mental models.