Six Eras of Foresight Practice
We have previously mentioned the ancient foresight of Delphic oracles, religious prophecy, and utopian thinkers. While undoubtedly useful for their time, these well-meaning speculative activities occurred in cultures that were missing critical elements of the modern mindset. We can call them precursors to strategic foresight.
Strategic foresight was born in the 17th century European Enlightenment, when modern approaches to freedom of thought and action, including individual rights, the scientific method, egalitarianism, democracy, and the separation of church and state, emerged. Read Nadler’s Spinoza (2001) for an overview of the life and works of the philosopher Baruch Spinoza, perhaps the pivotal Enlightenment mind behind the emergence of our modern democratic state, and of the new mindset that allowed a modern, evidence-based foresight to first emerge and flourish in individual minds.
While it was born in the Enlightenment, strategic foresight started growing well, as a set of empirical methods, only during the great wealth and productivity unleashed by the Industrial Revolution. Foresight as an academic field remains a late bloomer by comparison to related professions such as economics, sociology, political science, and science and technology studies, and also in relation to many of foresight’s specialty and related professions, such as forecasting, planning, risk management, and even leadership. Nevertheless, we can foresee several paths ahead to its practice maturity, as we will discuss throughout the Guide.
Era 1: European Enlightenment and Inductive Foresight (1650s to 1840s)
The European Enlightenment birthed modern science, reasoning, and foresight. A key development that allowed these fields to emerge was inductive reasoning, the experimental method of observing individual instances and finding a general rule to describe them. Inductive thinking became legitimate only with probability theory, which began with a collaboration between Blaise Pascal and Pierre de Fermat in 1654 on how to estimate the future in games of chance. Their key insights led to Huygens’s On Reasoning in Games of Chance (1657) and Bernoulli’s The Art of Conjecturing (1713); Cardano’s much earlier Book on Games of Chance, written nearly a century before, was finally published in 1663. Though this type of reasoning was described by Aristotle in Ancient Greece, we needed two additional millennia and probability theory to finally convince serious people that inductive methods deserved the same attention that deduction (logic) had always enjoyed. These key advances paved the way for the birth of statistics around 1750.
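Pascal and Fermat’s founding exchange concerned the “problem of points”: how to fairly divide the stakes of an interrupted game of chance. As a minimal modern sketch (our own illustration, obviously not period mathematics; the function name and parameters are ours), we can estimate by simulation the same answer they derived analytically:

```python
import random

def win_probability(points_needed_a, points_needed_b, trials=100_000, seed=42):
    """Estimate, by simulation, player A's chance of winning a fair game
    interrupted when A needs `points_needed_a` more rounds and B needs
    `points_needed_b` more, each round being a 50/50 toss."""
    rng = random.Random(seed)
    a_wins = 0
    for _ in range(trials):
        a, b = points_needed_a, points_needed_b
        while a > 0 and b > 0:
            if rng.random() < 0.5:
                a -= 1  # A wins this round
            else:
                b -= 1  # B wins this round
        if a == 0:
            a_wins += 1
    return a_wins / trials

# Pascal and Fermat's exact answer when A needs 1 point and B needs 3
# is 7/8 = 0.875; the simulation lands close to that value.
print(win_probability(1, 3))
```

The fair division of stakes, they argued, should be proportional to each player’s probability of winning had the game continued, which is exactly what the simulation estimates.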
From this era onward, the shortcomings of human logic and rationality could finally be balanced by observation, induction, and experiment. Perennial prisoners of our own mental models, we now had ways to get outside our heads, and to falsify and verify those models by conducting experiments and collecting environmental data. Another major foresight advance in this era came from philosophers of mechanics (later called physicists), in particular Isaac Newton’s invention of calculus (the mathematics of change, developed independently by Leibniz) and his landmark publication of universal laws of motion in the Principia (1687). After the Principia, people knew that by building instruments, collecting data, turning it into numbers, forming hypotheses, and doing experiments, both in mathematical simulations and in the physical world, they could develop probabilistic explanations of reality, past, present, and future.
Enlightenment optimism also birthed our first popular visionaries in qualitative foresight, most famously the writer Louis-Sébastien Mercier and his utopian novel The Year 2440 (1770), widely read across Europe in twenty-five print editions. This fertile era ended with the Belgian statistician Adolphe Quetelet and his Treatise on Man (1835), which introduced the idea of “social physics” (later called sociology): the collection of data on crime and social events to build predictive models of the “average man”, and to discover hidden statistical laws for social processes. This work led the philosopher Auguste Comte, another founder of modern sociology, to propose his Law of Three Stages (societies typically begin Theological, then become Metaphysical, then Scientific), one of the first developmental models of social change. Encouraged by Newton’s progress in the physical sciences, in A General View of Positivism (1844), Comte made the extreme proposal that all social knowledge must ultimately derive from logical and mathematical precepts and laws. But while logic and rationality have their uses, positivism was a very incomplete way to understand the world. Probability and statistics, and the weak statistical laws and correlations we find in many complex systems, have usually proven far better lenses for describing our environment than logic, rigorous causality, and equations. And in social systems, factors like culture, custom, instinct, intuition, vision, emotion, and possibility are generally far more important than rationality in ascertaining what humans will or may do next.
Era 2: Evolutionary, Industrial, and Statistical Foresight (1850s to 1920s)
The Second Era began with the advent of evolutionary theory, first published in Charles Darwin’s On the Origin of Species (1859). David Sloan Wilson’s Evolution for Everyone (2007) is a nice introduction to evolutionary theory. See Jonathan Losos’s Improbable Destinies (2017) for a great intro to evolutionary development (“evo devo”), a topic we’ll discuss in Chapter 11. One of the most interesting features of our natural world is that while many processes are unpredictable, chance-based, and divergent (“evolutionary”) over time, other processes are predictable, fated, and convergent, or “developmental”. Better understanding evo devo processes, and how they work in tension with each other in biology, society, and technology, is one of our great opportunities for improving evolutionary theory.
Evolutionary theory was quickly applied to social change, most notably by Herbert Spencer, who proposed that social evolution occurred by Lamarckian (use and disuse) methods. At the same time, as the Industrial Revolution began transforming larger English and European cities and creating a vast new class of factory workers, the public became more interested in change and the future than ever before.
The Industrial Age birthed a new generation of visionaries to extrapolate its wonders and problems into the future, most notably Edward Bellamy, H.G. Wells, Jules Verne, and Kurd Lasswitz. Bellamy’s utopian foresight novel, Looking Backward: 2000-1887 (1888), inspired 160 Bellamy Clubs across the US to advance his nationalist political ideals. Wells’s book Anticipations (1901) is perhaps the first widely read non-utopian social foresight work, a speculative book on the future that was as popular as his science fiction, making Wells the best-known futurist of this era. Jules Verne also attempted predictive social foresight work in Paris in the Twentieth Century (1863), but his first draft was rejected by his publisher as too dystopian and shelved, whereupon he turned to technologically optimistic science fiction for a career, producing such classics as Twenty Thousand Leagues Under the Sea (1870). Lasswitz, the father of German science fiction, stressed adherence to natural law and psychology in creating speculative fiction. His Gegen das Weltgesetz (Against the Law of the World, 1878) discusses megacities, overpopulation, synthetic food, and supercomputers.
For more on how science and science fiction have influenced each other over all six eras, and how each revitalizes the visionary strand of foresight, read Brake and Hook’s Different Engines: How Science Drives Fiction and Fiction Drives Science (2007). Visions can be dangerous as well, when we overrely on them without evidential support. We must include Karl Marx and Friedrich Engels among the authors of particularly deadly visions. Their theories of political utopia, beginning with The Communist Manifesto (1848), were perhaps the most harmful ever penned, as utopian communism was employed by opportunistic autocrats to justify tens of millions of deaths in revolution, famine, and imprisonment. Soviet gulags and Mao Zedong’s repressive Cultural Revolution are emblematic of this bleak era of 20th century life. At one time, almost half the world fell under communism’s dehumanizing and unsustainable vision of the state.
Statistical foresight also made great progress in this era, beginning with Henry Buckle and his impressive History of Civilization in England (1857), which argued that a true science of both history and the future must eventually be constructed via statistical approaches. Massive government projects to collect statistical data to find hidden constancies (averages, relations, and trends) in society began in the late 1800s, at first in Europe. The US government eventually joined in, commissioning the statistician Herman Hollerith to build the first punched card tabulator for the 1890 Census. Hollerith’s Tabulating Machine Company merged with others in 1911 to form CTR (the Computing-Tabulating-Recording Company), which was renamed IBM in 1924, moving us from the statistical era to our modern information age. For one of the frontiers of statistical foresight today, see E.T. Jaynes, Probability Theory: The Logic of Science (2003; free PDF drafts circulated from the mid-1990s) on Bayesian inference, our best current model for how humans and computers reason from incomplete information.
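To make the idea of Bayesian inference concrete, here is a minimal sketch of reasoning from incomplete information in the spirit Jaynes describes: inferring a coin’s unknown heads-rate by repeatedly updating a prior over candidate hypotheses. The grid of hypotheses, the data, and all names here are illustrative assumptions of ours, not from Jaynes.

```python
# Candidate values for the coin's unknown heads probability.
hypotheses = [i / 100 for i in range(1, 100)]
prior = [1 / len(hypotheses)] * len(hypotheses)  # start indifferent

def update(prior, heads):
    """One Bayes step: weight each hypothesis's prior by the likelihood
    of the observed flip, then renormalize to a proper distribution."""
    likelihood = [(h if heads else 1 - h) for h in hypotheses]
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

posterior = prior
for flip in [True, True, True, False, True, True, False, True]:  # 6 heads, 2 tails
    posterior = update(posterior, flip)

# The most probable hypothesis after 6 heads and 2 tails is 0.75,
# matching the observed frequency under an indifferent prior.
best = hypotheses[posterior.index(max(posterior))]
print(f"most probable heads-rate: {best:.2f}")
```

Each new observation reshapes the whole distribution of belief, which is the sense in which Bayesian inference models reasoning under uncertainty for both humans and machines.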
The period 1815-1914 is known as Britain’s Imperial Century. The American Century, with our global dominance in science, technology, economic, political, and cultural terms, began in earnest after World War I. It has now exceeded the “century” mark, and shows no signs of ending anytime soon. There are many reasons for America’s incredible rise, not the least of which were our protective geography and our constitution, but perhaps the most important has been our immigrant willingness to jettison much of the past in our search for a new present and a better future. America began to take leadership of the Industrial Revolution away from Britain, beginning with the American system of manufacturing in the late 19th century. Americans often forget this, but Germany was the other deeply adaptive country in the early 20th century. German industrialists and scientists had taken leadership of chemistry and other key industries from their British inventors, and by 1910, Germany was exporting more than all the other European countries combined. Germany was winning the global trade “battle” at the time, even over America. So why go to war in Europe? Unfortunately, European political leaders at the time had a fatal flaw in their ability to look to the future. They were deeply stuck in past traditions, which were zero-sum, militaristic, and Machiavellian. Germany fell into old martial habits, and the other leaders followed suit. Blissfully ignorant of Europe’s past, Americans could adapt to the many demands of the Industrial Revolution increasingly faster than Europe, and that superior adaptation rate continues in several important ways today. Our history shows that a selective forgetting of outmoded past habits and values, and a strong willingness to fail in the present, are important to continually reinventing the future.
Foresight work spread outward and downward from governments into organizations at the end of this period via strategic planning, first taught at Harvard Business School in the early 1920s. A foresight influence in the opposite direction occurred via the engineer Frederick Taylor and his theory of scientific management, an early precursor to operations research. Taylor’s work helped birth the Efficiency Movement of the 1890s to 1920s, which influenced first US and European businesses, and later their governments, to systematically discover ways to increase efficiency and reduce waste, another very valuable type of foresight.
At this time a particularly one-way and utopian form of political central planning emerged in the Soviet Union with the Gosplan in 1921, which evolved into successive five-year plans from 1928-1991, at which point the USSR collapsed. The German Nazis from 1920-1945 and Italian Fascists from 1922-1943 also engaged in particularly autocratic and utopian planning, and an escalating strategy of aggression which rapidly accelerated their collapse. Autocratic central planning strategies can show impressive public benefits in the short-term due to focus and scale, but they often also quickly produce morally corrupt and disastrous results, due to the appalling lack of representation, accountability and meaningful feedback (Skill 8 of the Do loop) in such political systems.
One technology visionary of this era, scholar Henry Adams, proposed in A Law of Acceleration (1904) and A Rule of Phase Applied to History (1909), that scientific, technological, and societal progress are actually built into the physics of the universe, and must accelerate in a manner as predictable as the law of gravity. But the catastrophe of World War I quickly dashed hopes for any simplistic quantitative predictability in the progress of social and economic systems, as did the Great Depression, WWII, the Cold War, and mass political atrocities in the 20th century. Nevertheless, by recognizing the predictably accelerating growth of science and technology’s aggregate capacities, and their increasingly central influence on civilization, Adams was among the first scholars to recognize that accelerating change is much less an evolutionary choice than it is a developmental constraint on Earth’s environment. It is a process we can expect to exist on all Earth-like planets in our universe. The second era closed with the birth of the foresight subdiscipline of science fiction, with the launch of the first SF-dedicated magazine, Amazing Stories, first published by the illustrious inventor-futurist Hugo Gernsback in 1926. Our dreams have never been the same since.
Era 3: Technocratic and Analytic Foresight (1930s to 1950s)
The Third Era began with a series of key papers on computability by logicians including Alan Turing, Kurt Gödel, Claude Shannon, and others that birthed our modern computer and information age. This era also saw the first major US government-funded study of social and demographic trends, Recent Social Trends (1933), led by the sociologist William Ogburn. H.G. Wells famously called for an academic field of foresight to emerge in this era in “Wanted: Professors of Foresight” (1932), but unfortunately no academics rallied to this call by establishing foresight programs until the 1970s. A number of early 20th century futurists continued to champion the idea of partial social predictability, perhaps most notably Ossip Flechtheim in Germany, who coined the term “futurology” in the mid-1940s and proposed it as a new science and discipline of social probability and statistics.
Most conspicuously, this era saw rapid advances in science, technology, and logistics during the Second World War. Statistical foresight was invigorated by analytical successes with code breaking (Bombe, 1940) and ballistics modeling on the world’s first programmable digital computers during WWII (Colossus, 1943) and shortly after (ENIAC, 1946), and by new analytical methods like the mathematician Stanislaw Ulam’s invention of Monte Carlo algorithms for simulation in the 1940s. As the war wound down, and its terrible costs and dangerous new atomic weapons were soberly assessed, government and military leaders saw the need to fund the emergence of the world’s first think tanks dedicated to long-term foresight work, including SRI (1946) and RAND (1948), each starting out, and still mostly run, on government contracts. RAND quickly moved to the forefront of this new foresight community, seeded with $10 million in funding by Air Force general Hap Arnold as a place for bright minds to “get to the future first” in key areas of science and technology.
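Ulam’s core insight was that questions too hard to solve analytically can often be answered statistically, by random sampling. A minimal sketch of the idea (our own toy example, not a wartime calculation) is the classic Monte Carlo estimate of pi: throw random points at a unit square and count how many land inside the quarter circle.

```python
import random

def estimate_pi(samples=1_000_000, seed=1):
    """Monte Carlo in Ulam's sense: answer a deterministic question
    (the area of a circle) by random sampling. The fraction of points
    in the unit square that land inside the quarter circle
    approximates pi/4."""
    rng = random.Random(seed)
    inside = sum(1 for _ in range(samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4 * inside / samples

print(estimate_pi())  # converges toward 3.14159... as samples grow
```

The same sample-and-count pattern, scaled up on early digital computers, is what made previously intractable problems in physics and logistics newly answerable.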
Our field then entered an overly technocratic and analytical phase of social foresight, with a good deal of simplistic, logical, positivist, optimization-seeking, government- and defense-funded foresight. Operations research and other quantitative foresight methods made good progress in technical fields, on various well-structured and narrowly defined problems in industrial engineering, logistics, and efficiency research. Some technology foresight work was quite prescient, including Theodore von Kármán’s Toward New Horizons (1944-46), Vannevar Bush’s Memex vision of a future Internet in As We May Think (1945), and Bush’s 1945 presidential report proposing the National Science Foundation, founded in 1950. But in many cases the futurists of this era, enamored with their new calculating machines and methods, greatly underestimated the complexity and uncertainties of social and economic systems in their models, and greatly overestimated the degree of logic and control that planners can exert over complex natural systems. They were continually surprised by unintended consequences of social and economic competition and growth, including overpopulation, environmental degradation, conflict, and social revolutions. Modeling and probabilistic prediction of most societal events turned out to be more complex than 20th century society had the computational tools, platforms, patience, and money to address. But many US technocrats, emboldened by WWII and its aftermath, overstated their claims and understated their difficulties, and allowed their unique position of wealth and global dominance to bias and oversimplify their foresight.
A film series on the excesses of technocratic foresight in this era is Adam Curtis’s Pandora’s Box (1992). The six episodes show the risks of oversimplifying complex problems, a mindset that persists in many governments and organizations today. Another great overview of the start of our modern professional foresight field in this era is sociologist Wendell Bell’s Foundations of Futures Studies V.1 (2010).
Era 4: Growth vs. Environment Foresight (1960s to 1970s)
Our Fourth Era of foresight was demarcated by two great social revolutions, one in Civil Rights and Women’s liberation, and another in Environmentalism, particularly in the US. With Rachel Carson’s publication of Silent Spring (1962), an Environmental-Limits-Rights school of social foresight emerged in the US. A succession of economic, environmental, and political crises (Cuban Missile Crisis, Stagflation, Three Mile Island, Love Canal, Ocean Pollution, the Vietnam War, the Watergate Scandal) in this era brought urgency to the new school. Environmentalism was greatly aided by such publications as the Whole Earth Catalog (1968), Limits to Growth (1972), and the establishment of Earth Day and the Environmental Protection Agency in 1970. The National Organization of Women, formed in 1966, also helped the US onto a long road toward better treatment and opportunities for the world’s largest subjugated class.
Disenfranchised sectors of the public (blacks, women, gays, ethnic minorities) began to view themselves as A Force More Powerful with respect to social change. See the book (2000) and film (2000) by this name for a great history of nonviolent social activism. At the same time the sexual revolution occurred, counterculture experiments bloomed, and on the violent fringe (Weathermen, Black Panthers, etc.) politically motivated terror events happened almost daily in the US, a rate that has steadily declined since, contrary to modern media hysteria. The U. Maryland START Global Terrorism Database cites 447 US domestic terror attacks in 1970 (the first year of their data), and their count has averaged under 15 a year since 2000.
As happened in the post-depression and post-war periods of the last era, many new rights (freedoms from undesirable social states, including many forms of discrimination based on race, sex, and wealth) and entitlements (guarantees of public benefits) emerged in this era. But this time, mass public activism (sit-ins, marches, protests) fueled by even more pervasive new mass media (television, film, periodicals) could be even more broadly credited with procuring many of these new rights and entitlements, while a failed war in Vietnam and a failed presidency with Nixon highlighted the limits of top-down power.
In response to these same events, the Technocratic school morphed into a Growth-Freedom-Technocratic school of social foresight, with major expansions of global trade and agreements and the rise of the neoconservatives and the Chicago school of (laissez-faire) economics in the mid-1970s. By the end of this era, two great schools of foresight thought had emerged, each defining many areas of uniqueness in values and goals. These remain our two dominant schools today.
Also spurring a renewed interest in technology and general foresight were stunning Apollo program technical achievements, and unifying views of the Earth from space as a single and potentially fragile system, a phenomenon futurists call topsight (as opposed to hindsight or insight) or the Overview Effect. Meanwhile, computers continued to make accelerating advances, as they moved from mainframes to minicomputers, and by the end of the era, our first personal computers. In the 1960s, pioneering computer scientists like Herbert Simon and Allen Newell also promised (and greatly overpromised, it turned out) rapid advances in artificial intelligence by the end of the century.
It was in this time of dramatic social, economic, and technological change that our first modern professional foresight practitioner books emerged, as did our first foresight graduate programs (U. of Houston, U. of Hawaii), and professional associations (WFS, WFSF). Many new paid governmental, corporate, and popular foresight projects originated in this era. By 1960, every Fortune 500 company had a corporate planner, though how planning should occur in the executive suite was far from clear. SWOT analysis was born at SRI in the early 1960s, to compare internal/current and external/future positive and negative business factors, and was popular in business by the 1970s.
My guess is that US spending on foresight work per dollar of GDP reached a recent maximum in this era. If so, we can call this era, particularly the 1970s, our latest period of Peak Foresight. Over five times as many people as at present attended the World Future Society conferences during this time, which were a heady blend of new technology optimism and new environmental concern. Top political and business leaders attended WFS conferences at the time (today, they have their own specialty foresight events, like the World Economic Forum’s Davos conference) and were interested in how the young field of professional foresight might help in business and government.
This era also produced several of the most insightful, comprehensive, and rigorous books on professional and global foresight yet printed, books like Fred Polak’s The Image of the Future, V1-2 (1961), Bertrand de Jouvenel’s The Art of Conjecture (1967), Kahn and Wiener’s The Year 2000 (1967), Alvin Toffler’s Future Shock (1970), Joseph Martino’s An Introduction to Technological Forecasting (1973), Jib Fowles’s Handbook of Futures Research (1978), and Freeman and Jahoda’s World Futures: The Great Debate (1978), to name just a few. See Chapter 6 and Appendix 3 for some starter lists of others. Don’t judge a book solely by its publication date, either. Several of these older works still contain insights that are yet to be topped by current publications.
Also in this era, the US Office of Technology Assessment, so far the only foresight research group created for the US Congress, was started in 1972, and this institution survived various budget cutters until 1995. Perhaps due to the novelty of professional foresight at the time, and a backdrop of particularly dramatic social, economic, and technical changes, strategic foresight was taken seriously by business, government, and the public in a way that is yet to be exceeded.
Era 5: Globalization vs. Sustainability Foresight (1980s to 1990s)
In the Fifth Era, both major schools of foresight grew up further. Growth-Freedom-Technocratic foresight evolved into the Globalization school, and Environmental-Limits-Rights foresight evolved into the Sustainability school. On the Globalization side, multinational corporations became even more rich and global relative to the governments that chartered them. Perhaps the single most useful way to understand how our sociopolitical environment has changed in the last half-century can be summed up in the following statistic: in the 1950s, roughly twenty of the top hundred revenue-generating entities on Earth were corporations, and the rest were nations. By the end of the 1990s, this ratio had inverted; there were only about twenty nations left on the Top 100 list. We could have sought to limit corporate size relative to the state, both individually and as a group, but such a response would have required a global political consensus impossible at the time. Those nations that individually attempted to regulate corporate size and scope tended to create economies that were less innovative and competitive. At the time, the average citizen was too poor, and there were too many basic human problems needing solutions, for corporate regulation to be a serious priority. That state of affairs will likely continue to be true for a while yet, though we can now foresee a coming time of abundance, sometime this century, when our democratic priorities may change.
Coordination and concentration of wealth in the largest conglomerates has been accelerated by ever better global communication, computing, automation, and control technologies. Major offshoring and deregulation occurred in the US and UK in the 1980s, and the Washington Consensus began driving global economic integration, unfortunately in ways more beneficial to the wealthiest than to the average citizen. Naomi Klein’s The Shock Doctrine (2007) offers one very readable, though left-leaning, view of this integration. Meanwhile, the sustainability futurists kept up their accounting of capitalism’s excesses, and continued their role, held since the 1960s, as the foresight field’s main social critics and activists.
To better understand the insights and limitations of both foresight schools, see Worldwatch’s State of the World reports (1984-present) for a leading set of foresight perspectives from the Sustainability school. See the Cato Institute’s Economic Freedom of the World reports (2001-present) for a leading set of foresight perspectives from the Globalization school. For two sets of reports that are more balanced perspectives from both schools, see the Economist’s The World in (Year), reports (2004-present), and The Millennium Project’s State of the Future reports (1997-present).
As our social wealth and technological capacities grew even faster during this era, public interest in the future continued to rise, and this period saw the launch of new foresight-centric magazines like Omni (1978-1995), Whole Earth Review (1985-2002), and Wired (1993+). But while corporate and institutional spending on foresight, forecasting, and strategic planning grew in absolute terms, it very likely declined relative to other spending priorities, even while factoring in the temporary surges caused by millennialist Y2K hype and the first internet boom. As Mintzberg outlines in The Rise and Fall of Strategic Planning (1994), a new realization of the limits of top-down strategy, planning and foresight emerged during this era, as the pace and complexity of the business world continued to accelerate.
The increasing velocity of business transactions and events in turn caused corporate planning horizons to shrink from a few years in the 1950s to a few quarters by the end of the century, and short-term thinking and speed of execution became paramount. In the 1980s, advances in nonlinear dynamics research (colloquially, “chaos theory”) also led some scholars and leaders to conclude quite mistakenly, as Stewart and Cohen explain in The Collapse of Chaos (1994), that most of the world’s complex systems, including social systems, were intrinsically unpredictable. The reality, continual partial predictability of complex systems, is more subtle to appreciate.
In this environment, our leading academic foresight programs and professional associations saw membership in stasis or decline in this era, even as global economies boomed, in what we can call a Foresight Winter. This lull in professional foresight interest and funding has parallels to the AI Winter of 1974-1993, which occurred after computer scientists overpromised AI’s capabilities during the 1960s and early 1970s, at a time when our scientific methods and machines were still in their most rudimentary forms.
Further technology-aided globalization and waves of deregulation, led by Reagan in the US and Thatcher in the UK in the 1980s, allowed income inequality to grow particularly fast in advanced Western nations. Income, asset, and power inequalities will often grow in the first phase of application of a new technology in business, if the holders of capital can use that technology to generate greater returns than others in society. This condition is known in economic shorthand as r > g: the rate of return to large holders of capital is greater than general economic growth. But in the second stage of wealth creation, income inequality tends to level off (it stays high, but stops growing), as rates of return drop with greater accumulated capital, as more workers are (usually) employed by the mature technology, and as broadly higher wages ensue. In the third stage of wealth creation, inequality finally declines, as the newly enriched and emboldened middle class gains greater freedom and political power, new entitlements and redistributive policies are enacted, and new regulation of the elite and empowerment of entrepreneurship swing the system back to a more equal state. This three-stage process is called the Kuznets curve, and it is a key bit of political-economic foresight, seen at times in all democracies since the industrial revolution began. We will discuss Kuznets curves in greater detail in Chapter 4 (Models).
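The first-stage r > g dynamic can be illustrated with a toy compounding model. All parameters below are made-up values for demonstration only; real economies are far messier, and this sketch is ours, not a model from the economics literature.

```python
def capital_income_ratio(years, r=0.05, g=0.02, capital=3.0, income=1.0):
    """Compound capital at rate r and national income at rate g for
    `years`, returning the capital-to-income ratio at each decade.
    All starting values and rates are illustrative assumptions."""
    ratios = []
    for year in range(years + 1):
        if year % 10 == 0:
            ratios.append(round(capital / income, 2))
        capital *= 1 + r   # returns to capital compound at r
        income *= 1 + g    # the broader economy grows at g
    return ratios

# With r at 5% and g at 2%, the capital-to-income ratio roughly
# doubles about every 25 years, starting from an assumed ratio of 3.
print(capital_income_ratio(50))
```

The widening gap between the two compounding curves, with capital pulling steadily ahead of income, is the mechanism behind first-stage inequality growth until second-stage forces (falling returns, broader employment) slow it down.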
We in the developed West have been seeing income inequality grow since the late 1960s, from an ever faster emergence of increasingly powerful new productive technologies, and from business and political changes that reinforce market and wealth concentration, so it is reasonable to expect global income, asset, and power inequalities to continue to grow for the next decade or two, at least. See Piketty’s Capital in the Twenty-First Century (2014) for more on that story. In some countries, broadly redistributive processes can fail to emerge for long periods of time, and persistent and stable forms of corporatism, cronyism, plutocracy, and economic, educational, and entrepreneurship decline can occur, as the economists Acemoglu and Robinson make clear in Why Nations Fail (2012). Let us make sure the US avoids this fate.
At the end of the twentieth century, the Globalization school was well summarized in Tom Friedman’s The Lexus and the Olive Tree (1999). We live now in a much flatter, faster, and far more wealthy, connected and innovative world. The Sustainability school, by contrast, was well presented in Paul Hawken’s Natural Capitalism (1999). Future industrial capitalism will have to account for and conserve our undervalued ecosystem resources, reduce wasteful consumption, greatly increase resource productivity and efficiency, and keep inequities of income, health, wealth, and well-being in a range that remains productive, rather than corrupting.
Era 6: Security, Sustainability, and Empowerment Foresight (2000s to 2010s)
In our current Sixth Era, the Globalization-Technocratic school has quite effectively promoted (some would say, greatly oversold) to the developed world the concept of a Global Security Crisis. This was for many a predictable result of the Al-Qaeda attacks of 9/11/2001, and the most likely response to those attacks by US defense leaders. I am all for growing global security to counter the threat posed by proliferating weapons technologies, but the challenge is mounting a proportional response. In this new political climate the US has funded multi-trillion dollar undeclared and privatized wars in Afghanistan and Iraq, and built vast new homeland security and intelligence industries of questionable efficiency and accountability. How much of this has been overreaction and how much proactive prevention is unclear. What is clear, unfortunately, are the recurring instances of public deception, waste, and mismanagement that have accompanied our current initiatives. See Ricks’s Fiasco (2007) and Maddow’s Drift (2013) for good recent accounts. Learning to build antifragile security systems, which get stronger and more efficient as they are challenged, and which react proportionately, like biological immune systems, is one of the key challenges of future global security systems. Sagarin’s Natural Security (2008) and Learning From the Octopus (2013) and Taleb’s Antifragile (2014) are three good introductions to this perspective.
Perhaps the most promising and predictable development in the Sustainability-Rights school in this era has been increasing global initiatives to advance the education, rights, and freedoms of women. Kristof and WuDunn’s Half the Sky (2010), and the film Half the Sky (2012) are great introductions to this story. Women and girls are by far the largest group of human beings who are currently subject to highly unequal social, educational, financial, and political treatment. It is an easy prediction that this state of affairs will be increasingly untenable the wealthier and more connected the world becomes. It is also obvious, though perhaps not to old white men, that the more ethnically representative and gender balanced our professions, boardrooms, and legislatures get, the less prejudicial and conflict-driven our social systems will become. What’s more, all the other issues of the Sustainability school (climate, environment and resource sustainability, population, social responsibility, conflict reduction, children’s rights, etc.) get more politically tractable the more freedom, education, opportunities, and independence women attain. For more data on this irreversible and quite positive global developmental trend read Paxton and Hughes’s excellent Women, Politics and Power (2013).
In this era the Sustainability school has also posed several of our still-growing environmental issues (climate, water, energy, population) as coming crises, particularly those presently being made worse by rapid economic development in emerging nations. Fortunately, many sustainability policy initiatives have emerged in this era, and we have seen global leadership in this regard from Nordic countries since the 1920s, Germany since the 1950s, and Japan, Singapore, Taiwan, and other Asian countries in more recent decades. In the US, we have usually been laggards on these issues on the governance side, though we presently remain among the leaders in sustainability business and technology.
Perhaps the Sustainability school's greatest recent governance success in the US has been the emergence of myriad smaller, local environmental and social justice non-governmental organizations (NGOs), each addressing sustainability, economic policy, and justice issues locally or in specialty projects, with increasingly powerful bottom-up funding and online support bases. See Paul Hawken's Blessed Unrest: How the Largest Social Movement in History is Restoring Grace, Justice, and Beauty to the World (2008) for a recent account of accelerating global NGO and social enterprise activity.
One surprise of this era has been the emergence of a third, and still smaller school of foresight, which we call empowerment foresight, and a political orientation in relation to it, which we call upwing, by contrast to the traditional camps of rightwing and leftwing. Empowerment foresight and upwing politics are generally concerned with personal empowerment, across all the STEEPS spheres (science, technology, economics, education, politics, and society). By far the most effective strand of this new school, because it leverages accelerating technological change, is concerned with issues of digital access and empowerment.
What we call digital empowerment foresight is prominent among information technology opinion leaders and entrepreneurs. It is a synthesis of the freedom-innovation-entrepreneurship and democracy-transparency-justice strands of the first two schools. Broadband growth and access, closing internet and mobile access divides, preserving net neutrality, expanding peer-to-peer sharing, open source software, fair use of intellectual property, DIY culture, citizen science initiatives, crowdfunding, hackerspaces, startup accelerators, open access research, privacy rights, open government and voter empowerment platforms, whistleblowing initiatives like Wikileaks, virtual non-state currencies like Bitcoin, new forms of digital collaboration, and web-fueled activism like the Arab Spring of 2010 are all examples of this new school's concerns.
We will discuss the empowerment foresight school in more depth in Chapters 7, 8, and 11. It may eventually merge with or be coopted by one or both of the two older and larger foresight schools, Globalization-Growth-Security or Environment-Sustainability-Rights foresight, or it may mature into its own cohesive third foresight community. Likewise, the upwing political movement may be coopted by the much older and larger political parties, or it may become increasingly identifiable as a counterforce to traditional right-spectrum and left-spectrum political groups. As we'll argue later, a successful upwinger platform would be an excellent development, and it would spur the existing parties, which will likely never go away, to treat personal and digital empowerment issues much more seriously. For the present, the growing popularity of the empowerment school of foresight with both online opinion leaders and new economy giants like Google, Facebook, Cisco, and others suggests it will continue to gain advocates for the foreseeable future.
Meanwhile, global scientific and technological advances have continued to accelerate, and there is so much new popular and professional interest in the future that we can call our current era another Foresight Spring, similar to that of the 1960s and 1970s. Public discussions of the future are rising again, particularly with respect to information technologies and their impacts (surveillance, privacy, education, robotics, GMOs, health care). In 2004 the European Commission funded the European Foresight Monitoring Network (EFMN), a small but prolific group that researched and reported on foresight activities in Europe and around the world, and produced many of their own reports. Like the US Office of Technology Assessment, which was defunded in 1995, the EFMN (see their unfortunately goofy logo at right) had only four years of EC support. That support was not renewed after 2008, making it another in a very long list of abandoned government efforts at open foresight initiatives. Here is the EFMN's 2009 final report, with a list of its many insightful briefs. Fortunately the European Commission continues to fund some high-quality foresight work via its ongoing annual Future-Oriented Technology Analysis Conferences, started in 2009.
There is also a tide of renewed public interest in foresight. In addition to foresight elicitation platforms like Futurium, social sharing platforms like Twitter, Facebook, and Google Plus, while still primitive today, have made it easy for us to share photos, videos, and opinions in increasingly exciting, information-rich, visual, and participatory online communities. In the last decade, new graduate foresight training programs have also emerged in Australia, Azerbaijan, Canada, Colombia, Denmark, Germany, India, Iran, Malta, Mexico, Pakistan, and the US, doubling our total number of academic programs over this time period (let's hope the next doubling takes even less than ten years). A new professional foresight body, the APF, was formed in 2002, and is steadily gaining membership. Our free introductory foresight network FERN was formed in 2007 and has also benefited greatly from renewed public interest in our field.
In another promising development, we have seen growth in effective new forecasting methods like predictive analytics, data mining, and statistical modeling to find hidden trends, associations, and patterns in all kinds of human systems. Predictive analytics now even has a 1,000+ person annual conference, Predictive Analytics World. Statistician Nate Silver and his journalism group at FiveThirtyEight.com surprised the global forecasting community by correctly predicting the presidential winner in 49 of 50 US states in 2008, and in all 50 states in 2012. See Silver's The Signal and The Noise (2012) for more on that story. Silver's Bayesian and experimental approach to building probabilistic models using large time-series data sets has inspired many in the forecasting community, as has the growing commercial success of web intelligence, machine learning, and quantitative and crowd prediction platforms like Palantir, Good Judgment, Recorded Future, Quid, Revolution Analytics, Wise.io, and many other new entrants to this space.
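To make the Bayesian idea behind this style of forecasting concrete, here is a minimal sketch, not Silver's actual model, of conjugate Beta-Binomial updating: we start with a prior belief about a candidate's support share and sharpen it as each new (hypothetical) poll arrives. The poll numbers below are invented for illustration.

```python
def beta_binomial_update(prior_a, prior_b, successes, trials):
    """Conjugate Beta-Binomial update: returns the posterior (a, b)
    after observing `successes` out of `trials`."""
    return prior_a + successes, prior_b + (trials - successes)

def beta_mean(a, b):
    """Posterior mean estimate of the candidate's support share."""
    return a / (a + b)

# Start from a uniform Beta(1, 1) prior, then fold in three
# hypothetical polls of 1,000 respondents each.
a, b = 1.0, 1.0
polls = [(520, 1000), (530, 1000), (515, 1000)]  # (supporters, sample size)
for supporters, size in polls:
    a, b = beta_binomial_update(a, b, supporters, size)

print(f"posterior mean support: {beta_mean(a, b):.3f}")
```

Each poll simply shifts the Beta parameters, so the estimate tightens as evidence accumulates; real forecasting models add pollster weighting, time decay, and correlated state-level errors on top of this core idea.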
If web analytics, sensors, and data continue their exponential growth, and these new statistical, evidence-based, and collective-judgment foresight methods continue to develop, we can predict that by the early 2020s public interest in foresight, and organizational and governmental funding of foresight relative to other budget priorities, will again exceed our 1970s peak. We suggest calling that coming era a Foresight Renaissance, a topic we'll turn to at the end of this chapter.