Chapter 2. Personal Foresight - Becoming an Effective Self-Leader

Emotional-Cognitive Biases – Seeing and Fixing Them

Our emotional-cognitive biases (ECBs) are feeling and thinking traits, habits, and styles that are, by definition, not adaptive. ECBs give us a false picture of reality, and skew us into predictable, inappropriate ways of acting.

These are commonly called cognitive biases, but the reality is that both our unconscious intuition and emotion (System 1) and our conscious deliberation and cognition (System 2) can express biases, and either or both systems can be in need of debiasing. Decomposing our biases into these two systems, and looking for how they are triggered, can be particularly helpful for debiasing ourselves. We can typically “talk our way out” of System 2 biases, but we can’t do that with System 1 biases, as they are largely unconscious.

We could call them Unconscious-Conscious Biases (UCBs) to be more accurate, but Feeling-Thinking Biases (Emotional-Cognitive Biases) is an easier-to-remember label. We talked about values, goals, models, and strategy biases in Chapter 1, as conscious (System 2) biases that we can’t avoid. When they are adaptive, we don’t call them biases, but simply good values, goals, models, and strategies. Again, we can talk or think our way out of many of our System 2 biases, and those aren’t the topic of this section.

ECBs, in our definition, always have at least some unconscious component, and per the 95/5 rule, that unconscious part is typically by far the largest contributor. We are typically unaware of that System 1 component. Fortunately, we can be aware of the System 2 (cognitive) component, and that awareness allows us to work in conscious ways to reduce both the System 2 component (talking our way out of our prejudices, which sometimes works) and the System 1 component, retraining our unconscious feelings and thoughts with various debiasing techniques.

Seeing our own and others’ ECBs is thus great anticipatory personal foresight. Working to fix them can greatly improve our strategic foresight and action, and so is a worthy and lifelong challenge.

The figure below (available as a wall poster to remind you of your humanity) is a beautiful compilation of 188 cognitive biases, in four main groups, brilliantly categorized by Buster Benson, and illustrated by John Manoogian III, using Wikipedia’s ever-growing List of (Emotional-)Cognitive Biases.

This is an excellent model. Let me offer another useful model, one that splits these biases into Less Recognized and more Socially Recognized biases. Fortunately, modern societies have singled out a number of biases, such as age, ability, gender, and race-ethnic bias, that we think are particularly damaging to our social fabric. Most countries are engaged in a variety of educational, social, and political efforts to reduce those biases. Unfortunately, those efforts will remain slow as long as we don’t effectively address the unconscious component of these ECBs. Worse, the great majority of biases fall into the Less Recognized category. In my view, many of these Less Recognized Biases will start to be significantly improved only in a world with Groupnets and Personal AIs, perhaps a generation from now, as we discuss in Chapter 8.

Here then is a small list of biases, sorted into these two categories.

Less Recognized Biases:

  • Audience/Marketability Bias (Overattention to marketable, sensational info. Many media stories are written with a marketability bias, such that complex, equivocal, or otherwise less interesting results are simply not reported.)
  • Biocentric Bias (seeing our future only through our own biological desires. Ignoring technology and society as learning systems.)
  • Chemophobia (an unjustified fear of chemicals, whether manufactured or natural, as with allergens, both in our diet and in our environment)
  • Confirmation/Selection Bias (Selective attention to information confirming one’s preconceptions. Aka a “filter bubble.” Compare to homogeneity bias.)
  • Deliberative-Logical (System 2) Bias (Overattention to or overvaluing of one’s slow, deliberative, logical intelligence)
  • Exteriorization/Somatization Bias (Overattributing events to exterior vs interior causes, or overattending to exterior information sources)
  • Groupthink/Response Bias (Always following the leader’s or the group’s response.)
  • Homogeneity Bias (Nonrandomness of a sample or nondiversity of a group, “echo chamber” or “filter bubble”)
  • Intuitive-Emotional (System 1) Bias (Overattention to or overvaluing of one’s fast, intuitive, emotional intelligence)
  • Interiorization/Asperger’s Bias (Overattributing events to interior vs exterior causes, or overattending to interior information sources)
  • Loss/Risk Aversion Bias (Greatly preferring avoiding losses or risks to acquiring gains or opportunities)
  • Omission/Inactivity Bias (Belief that a harmful action is always worse than a harmful omission)
  • Optimism Bias (Costs/risks/limits are systematically discounted. Tends to decrease w/ experience/age.)
  • Overdetermination Bias (Believing the future is more predictable, in more ways or precision, than it actually is.)
  • Pessimism Bias (Opportunities/benefits/freedoms are systematically discounted. Tends to increase w/ experience/age.)
  • Recall/Framing Bias (Recall/Overattention to recently “cognitively primed” language or events)
  • Rhyme-as-Reason Bias (Claims or aphorisms are judged as more accurate or truthful when they are rewritten to rhyme)
  • Safety/Conservatism Bias (Overconservative communication and behavior, fear of failure or criticism)
  • Superstition Bias (irrational belief or practice arising from ignorance, misunderstanding causality, supernatural thinking, or fear of the unknown)
  • Underdetermination Bias (Believing the future is freer, less predictable or constrained, than it actually is.)

Socially Recognized Biases:

  • Ability/Disability Bias (inequitable treatment or rights by physical ability/disability)
  • Age Bias (justifying or causing inequitable treatment or rights by age)
  • Class Bias (justifying or causing inequitable treatment or rights by social class)
  • Gender Bias (justifying or causing inequitable treatment or rights by gender)
  • Race/Ethnic Bias (justifying or causing inequitable treatment or rights by race/ethnic group)
  • Political Affiliation Bias (justifying or causing inequitable treatment or rights by political affiliation)

Let’s briefly discuss a few of these biases now. We’ll explore more in a later version of the Guide.

We will discuss Biocentric Bias in Chapter 7, as the source of many popular but mistaken fantasies about the human future.

Consider Superstition Bias, which is particularly common in societies dominated by orthodox (nonreform) religions, or where there is no strong rational, scientific education. In the late 1940s, the psychologist B.F. Skinner did experiments with pigeons demonstrating how easy it was to get them to build irrational models of cause and effect. People, unfortunately, are just as susceptible to irrational thinking, particularly in groups. See this lovely video excerpt, Experiment: Superstitious Behavior in Humans (YouTube, 3 min), from Season 2, Ep 6 of the TV show Trick or Treat (2008), in which a number of people are placed in a room with a counter, a video monitor, several objects, and a large sign telling them to score 100 points with the objects within 30 minutes to retrieve a reward. A less obvious sign indicated that a much larger reward was available by simply walking out of the room after a time, but the participants did not notice it, as they were so focused on figuring out the “puzzle” of scoring points with the objects. The points were awarded randomly by the researchers, yet at the end of the event, all the participants except one presumed that their behaviors had influenced the counter. They were “fooled by randomness,” attributing causal reward to their actions in irrational ways.
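
The randomness dynamic in that experiment is easy to simulate. In the sketch below (the `superstition_trial` function and all of its parameters are hypothetical illustrations, not the show’s actual protocol), rewards arrive entirely at random, independent of which action is chosen. Some action will still end up with the best-looking “success rate,” and that pure-noise spread is exactly the kind of pattern participants misread as causality:

```python
import random

def superstition_trial(n_steps=1000, n_actions=5, reward_p=0.1, seed=42):
    """Simulate random actions followed by random rewards.
    The rewards are entirely independent of the actions chosen."""
    rng = random.Random(seed)
    hits = [0] * n_actions    # times a reward immediately followed each action
    tries = [0] * n_actions   # times each action was attempted
    for _ in range(n_steps):
        action = rng.randrange(n_actions)
        tries[action] += 1
        if rng.random() < reward_p:   # reward is pure chance
            hits[action] += 1
    # Apparent "success rate" per action. Any differences are noise,
    # but a participant may read the best-looking action as causal.
    return [h / t if t else 0.0 for h, t in zip(hits, tries)]

rates = superstition_trial()
print("apparent best action:", rates.index(max(rates)), "spread:", max(rates) - min(rates))
```

Rerunning with a different seed shifts which action “looks” causal, which is the point: the pattern is in the observer, not the world.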

Consider Exteriorization/Somatization Bias, the overfocusing on bodily problems or symptoms (including one’s own body or behavior in relation to society), and the misattribution of the vast majority of them to exterior causes (physical things, other people’s behavior) versus interior, emotional-cognitive causes. This is a big bias afflicting all societies as their wealth and leisure grow, and it helps us understand many of the psychosomatic medical ills that affect people in the developed world.

Physicians talk about a mood and thinking disorder called somatization, where people with psychological pain focus on negative symptoms in various body systems other than their brain, and misattribute their natural depression or other temporary mental problems to problems with those body systems. The causes are usually assumed to be external physical actors (toxins, pathogens, traumas), often operating via hidden and improbable mechanisms. Misattributing one’s own problems to other people’s actions or behaviors, sometimes including their secret behaviors, is a closely related social bias that probably deserves its own name (perhaps Scheming/Conspiracy Bias?).

Somatization/Exteriorization bias occurs when we blame our present emotional-cognitive condition on exterior factors or actors impacting our body and brain health. An alcoholic who blames all their life problems on their environment, but not on their internal emotional-cognitive states, including their alcohol addiction, would be one example. Many physicians expect that maladies like Chronic Fatigue Syndrome (CFS), while perhaps caused by exterior viruses or pathogens in some fraction of cases, may more commonly be due to internal metabolic dysregulation, or to neural dysregulation such as atypical or subclinical depression, either of which may be amenable to interior focus and therapy.

For a balanced look at CFS, read this 2009 post on CFS by the late Wally Sampson. In the post he explores how even normally highly rational groups like the NIH and Science magazine fall victim to bias when they announce that “CFS, a debilitating disease that affects millions of people, may be associated with the retrovirus XMRV.” This conclusion may be an overgeneralization, an audience bias that fits nicely into the somatization bias of current wealthy cultures. Sampson reminds us that the best supported theory by far about CFS is that it is mostly a somatization (emotional-cognitive dysregulation) disorder, with viral and other physical causes affecting at most a small fraction of CFS sufferers. He also makes a few socially uncomfortable points. In his view, many CFS presenters have poor emotional self-awareness, and are resistant to introspection and cognitive behavioral therapy. I’d love to see a study to refute or validate that assessment.

We somatize/exteriorize when we look first for pills to cure our problems, enriching Big Pharma far more than it deserves (for how they make sure the legal and social systems enable that bias, see Jacky Law’s excellent Big Pharma, 2006). It’s why we fall victim to fad scam health care products like Baltic amber necklaces, hung around our infants’ necks in an attempt to minimize their teething pain. Read this nice post at Science-Based Medicine, John Snyder’s Amber Waves of Woo, 2014, explaining Amber Teething Necklace quackery. What our babies need instead is good sleep, social interaction, love and caring, and parental recognition that teething pain is endurable and temporary. What we need in such cases is a more interior focus, accepting that we, not the environment, are the major “problem,” and working to understand and mediate the sources of our psychic pain.

Making these kinds of assessments of others or ourselves, whether evidence-backed or not, is taboo in many otherwise-developed societies, even though we know that our mood and emotional states can be changed when we focus on them. Instead, we still too often prefer to look outward first, looking for any easy cure that requires no self-change. That kind of outward-focused belief in the sources of our problems has become a dogmatic bias, for all of us, for various issues or conditions, at various points in our lives. Yet we can improve our emotional and cognitive thinking at any and all points in our lives, if we only become aware of our issues and use good process.

With respect to socially recognized biases, social psychology tests like the implicit-association test (IAT) can quickly uncover our automatic (unconscious) mental associations, but it’s hard to know when those associations are harmful biases and when they are useful judgments, based on real evidence.

Uncovering our unconscious associations with the IAT

For a useful and, for some, controversial example: men in almost all cultures commit violent crimes roughly nine times more often than women. That is a useful mental association, not a bias. Likewise with criminal profiling, which is consciously and effectively done in some countries, while remaining unconscious in many others.
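
As an illustration of the timing logic behind tests like the IAT, here is a toy scoring sketch. Everything in it is hypothetical (the latencies, the function, the scaling), and the real IAT uses a more elaborate scoring algorithm with error penalties and trial filtering. The idea is simply that slower average responses on “incongruent” category pairings, scaled by overall response variability, suggest a stronger automatic association:

```python
from statistics import mean, stdev

def toy_iat_score(congruent_ms, incongruent_ms):
    """Toy IAT effect: mean latency difference between incongruent and
    congruent pairing blocks, scaled by the pooled standard deviation.
    A simplified sketch of the idea, not the official scoring algorithm."""
    diff = mean(incongruent_ms) - mean(congruent_ms)
    pooled_sd = stdev(congruent_ms + incongruent_ms)
    return diff / pooled_sd

# Hypothetical latencies (milliseconds): slower on incongruent pairings
congruent = [610, 580, 640, 600, 590, 620]
incongruent = [720, 690, 750, 700, 710, 730]
print(round(toy_iat_score(congruent, incongruent), 2))  # positive = stronger association
```

A positive score here only shows an automatic association exists; as the text notes, it cannot by itself tell a harmful bias from a useful, evidence-based judgment.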

Such group-specific biases are complicated to discuss and treat, because we often make the commendable social decision to try to become blind to these distinctions, even encoding such blindness in our law, as we’ve been so harmfully biased against so many social groups for so long. That can be a useful ideal, yet useful mental associations and judgments around social issues will continue to exist, despite our ideals. Figuring out which of these judgments are acceptable in a world seeking more social equity is the challenge.

Fortunately, our general emotional-cognitive biases typically don’t have this history of social injustice attached to them, so it should not be hard to build online IATs which help us discriminate between biased and useful judgments. Unfortunately, few such tests exist today to my knowledge, or at least aren’t in common use, and online general ECB unlearning and retraining tools are even harder to find. We may need a decade of further psychological research, and smarter and more pervasive digital systems before good general ECB tests and debiasing tools emerge.

In 2015 Facebook shared some of its unconscious social bias identification and unlearning tools at Managingbias.fb.com. Google did the same at Rework.withgoogle.com, their portal for sharing evidence-based human resources tools and practices. I’m proud to say my wife is on the Re:work team. These are great starts in addressing some of the more damaging social ECBs in the workplace. The Center for Applied Rationality (with the lovely acronym CFAR) is also doing good work trying to help us understand predictable failure patterns in human thinking and decision making. I’m hopeful we’ll see more free tools for managing all our ECBs in coming years.

Regarding general ECBs, intuitive-emotional (System 1) bias and deliberative-rational (System 2) bias may be the most obvious. If these are our two most central internal intelligences, it makes sense that one or the other will sometimes be overexpressed. We all know people, and clients, who are commonly too much “in their hearts” or “in their heads,” either on particular issues or in general. Being able to see this, and to meet these individuals where they are, yet also help them understand the other key intelligences, is a very important skill. As you gain their trust, you can help a person who doesn’t have enough deliberative-rational thought to see this, and to use that intelligence more, as well as help a hyperrational person who is out of touch with their intuitive-emotional intelligence to see it and express it better.

Optimism and pessimism biases are perhaps the next cognitive biases we should briefly consider. Adjusting for them is particularly foundational to doing good foresight. In Chapter 1 we discussed the importance of maintaining a healthy optimism/pessimism ratio in our feelings and thoughts, and the difficulty of doing that as we age, and in a culture and media with a strong fear, pessimism, and negativity reporting bias. Recall that neuroscientist Dilip Jeste recommends a roughly 2:1 ratio of internal (not communicated) optimism to pessimism in our thinking and feeling. In other words, he recommends having two strategic optimism (vision, opportunity) thoughts for every defensive pessimism (realism, caution) thought, on a moment-by-moment basis. Psychologist John Gottman has also published evidence that a 5:1 ratio of positive to negative external (communicated) statements and actions is necessary to keep intimate relationships healthy and happy, in the workplace and in life. These ratios are first approximations today, and they need more evidence to back them up. But I find them excellent initial rules of thumb, and recommend them to you.
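
As a back-of-the-envelope illustration of how one might track such a ratio over a day (the tally, thresholds, and function here are hypothetical, not from Jeste’s or Gottman’s work):

```python
def ratio_check(optimistic, pessimistic, target=2.0):
    """Compare a tally of optimistic vs pessimistic thoughts (or statements)
    against a target ratio, e.g. ~2:1 for internal thinking or ~5:1 for
    communicated statements. Counts and thresholds are illustrative only."""
    if pessimistic == 0:
        return float("inf"), "add some defensive pessimism"
    ratio = optimistic / pessimistic
    if ratio < target:
        note = "below target: seek more opportunity/vision thoughts"
    elif ratio > 2 * target:
        note = "well above target: add realism and caution checks"
    else:
        note = "near target"
    return ratio, note

ratio, note = ratio_check(optimistic=12, pessimistic=8, target=2.0)
print(f"{ratio:.1f}:1 -- {note}")
```

A simple journal or spreadsheet could do the same tally; the point is only that the target is a ratio, not an absolute count of positive thoughts.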

Of course, people are going to have ratios that fill a normal distribution around the population average, whatever that is in each culture. A good manager knows that people with naturally different ratios will be strongest at different tasks, and seeks to guide team members to tasks that best fit their strengths. People who are low-ratio strategic optimists (perhaps ranging from 2:1 to 4:1) are likely to make great managers, marketers, and salespeople, for example. Such folks may excel at growth and competitive strategy, and setting realistic stretch goals for their teams.

Naïve Pollyannas (say, a 10:1 ratio optimist) would be dangerous to have in such a position, and would benefit from professional help with their extremism. Folks who are low-ratio defensive pessimists (perhaps 1:2 to 1:4 in their typical optimism-to-pessimism ratio of thoughts) can make great deputy managers and legal and security staff. They often excel at supporting the leaders, uncovering all the ways things might go wrong (responsible for worrying and precaution), and generally running organizational defense. High-ratio pessimists, by contrast, would be too rigid and self-blinkered to be good deputy leaders. They would be collaboration killjoys that you’d want to quickly identify and move out of any critical position where team morale and dynamics matter.

Where do you presently fit on the optimism:pessimism ratio spectrum, in your internal thinking? In your external communications and actions with others? In a typical waking day, what would you say is your average ratio of opportunity to caution thoughts? Is it usually appropriate for your contexts? We must learn to see our biases, and adjust for them where they don’t help us. We also need to understand the larger social trends of optimism and pessimism in normal human populations and our particular culture, and the traps they create.

For example, with respect to ourselves, our relationships, and our children, at least up to our middle age, many of us are at least mildly overoptimistic. Some positive self-deceptions are healthy. We can recover from trauma faster, and set higher goals, as Shelley Taylor argues in Positive Illusions, 1991. But overoptimism can also be disempowering. When we think “I’m OK, you’re OK,” we don’t try to change, and wake up in a future where we’ve done nothing to better our lot. It doesn’t serve us well when most Americans see ourselves and our children as above average on almost all the key performance indicators, when at most half of us can be above the median in reality. When we are overly optimistic, we don’t know our weaknesses, risks, and threats, and may not see and change what isn’t working. Along with bias and fear, too much personal ego is one of many reasons we can neglect discussing things that need to be clarified, challenged, or changed.

We can even fall down a slippery slope into narcissism (considered a personality disorder, a particularly predictable and maladaptive collection of biases), becoming addicted to entertaining fantasies and self-images rather than to sometimes hard but useful realities. Twenge and Campbell’s The Narcissism Epidemic, 2010, is a great overview of this trap. Narcissists tend to avoid conflict, and anything that punctures their optimism story. Borderline personality disorder is another harmful set of biases. We won’t discuss personality disorders much in this edition of the Guide.

Some of us go the other way on this ratio, becoming overly pessimistic and neurotic with respect to ourselves, our children, or our teams. Smith’s The User’s Guide to the Human Mind, 2011, explores that trap and offers good advice for getting out of it. Getting the right balance of personal and relationship optimism and pessimism, for the context, is key. If too much pessimism or optimism is an issue, neuroscientist Elaine Fox’s Rainy Brain, Sunny Brain (2012) outlines evidence-based procedures to reliably dampen and retrain fear and negativity circuits for pessimists, and to dampen pleasure-seeking and risk-taking circuits for optimists, at any age. Such mental retraining takes time, commitment, and expertise, and is greatly aided by an empowered support network.

With respect to our organizations and societies, recall also that many of us, as we age, are overly pessimistic about the dangers and opportunities for positive change in our societies, due to our evolutionary, aging, and cultural pessimism biases. Jeste reminds us that we need to learn to see what’s working well with “two-thirds of our mind,” on average, and keep improving those things, rather than constantly complaining about what we don’t like, a convenient route to abdicating personal responsibility to drive change. Seeking to be a 2/3 organizational and social optimist and a 1/3 pessimist helps us see and use what’s working to fix those things that still need improving.

I’ve been accused of being too optimistic by a few folks, when I talk about accelerating technological change, and developmental immunity and morality. But in my defense, I’ve never said all this acceleration is going to be good for us. Only that it’s baked into the universe, and we apparently can’t stop it, as a universal process. What we can do is make its path and consequences better or worse for humanity. We have great moral choice over the path we take to this brave new world of technological life forms that apparently lies ahead, when we take an honest Big Picture view.

Patterson et al. (2011)

For two excellent primers on communication life skills that promote personal foresight, see Patterson et al.’s Crucial Conversations (2011) and Stone et al.’s Difficult Conversations (2010). These books make clear that both strategic optimism and defensive pessimism, combined with a strong desire to clarify probable, possible, and preferable futures, can greatly improve work and life outcomes. We must use difficult conversations, conducted with empathy, to move beyond bias, ego, fear, and other blocks to truth, to get to a more accurate set of future perceptions, in ourselves, our relationships, and our teams. We can also use these conversations to keep our teams balanced productively between strategic optimism and defensive pessimism, as we will discuss now.

Stone et al. (2010)

With respect to our organizations, we noted in Chapter 1 how toxic groupthink bias can be. Whether it’s the Yes-person (sycophant) variety (unquestioning obedience to the leader) or the herd behavior variety (unquestioning obedience to the crowd), groupthink kills cognitive diversity. Perhaps the most common and dangerous groupthink trap is the overly optimistic group. Everyone knows a few naïve Pollyannas, folks who regularly slide into too-sunny or even foolishly utopian ways of looking at things.

Klein (2015)

If a good psychological balance is to be twice as optimistic as we are pessimistic, on average, how do we stop ourselves from becoming too optimistic? Fortunately some good solutions exist. Let’s look at one now.

In the 1980s, psychologist Gary Klein, a leader in the field of decision making research, developed the premortem foresight method specifically to address overoptimistic groupthink. Whenever a group leader thinks their group is being overoptimistic or herd-like, their job is to challenge everyone to temporarily take the opposite position. The leader announces that the star project is “dead,” and asks everyone to imagine that it has failed in a major, public, and embarrassing way, and then, within the next few minutes, to give three plausible reasons why it failed.

Self-preventing prophecies (SPPs) are a milder variation of the premortem. An SPP is a story we tell about the possibility of failure ahead. When the failure and its risk are appropriately described, the SPP elevates our caution and risk awareness, and motivates people to make painful changes to prevent the prophecy, changes they would otherwise resist. Everyone uses SPPs. The danger for pessimists is that they like to turn SPPs into scare stories, greatly inflating the described severity and likelihood of the failure. They do this with good intentions, but the effect is often counterproductive, scaring people for no good reason and diverting their energy from the good work they were doing into discussing the silly scare story.

When SPPs and premortems are mentioned, by either leaders or team members, they allow the pessimists in a group to give feedback without fear of retribution. They also help the team to rekindle defensive pessimism, both in the individual and in the group. For more, see Klein’s Streetlights and Shadows: Searching for the Keys to Adaptive Decisionmaking, 2011, and Seeing What Others Don’t, 2015. Premortems, and their cousins, self-preventing prophecies, are very useful tools that leaders can use to prevent overoptimistic groupthink.

Overpessimism is another trap for some groups. We mentioned appreciative inquiry as one method to help groups start to move out of that trap. Leaders of such groups can also do visioning exercises, the opposite of premortems, tasking the group to imagine great success with current or new programs. Books like Rieger’s previously mentioned Breaking the Fear Barrier, 2011, and Ryan and Oestreich’s Driving Fear Out of the Workplace, 1998, can help diagnose fear, a common emotional reason for pessimism. There may be a number of cognitive reasons as well, including toxic workplaces and learned helplessness. Acknowledging the problem, then finding and addressing the most likely reasons, is key to improving foresight in such organizations.

Cognitive Diversity on Teams is a Powerful Way to Reduce the Impact of Individual Bias

In our next section, we’ll discuss cognitive diversity, and the way it reduces the impact of individual bias. As Scott Page describes in his deeply evidence-based book, The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies (2008), teams of people who think differently from each other, and who see the value of that different thinking, are far more adaptive than teams living in non-evidence-based filter bubbles, manufacturing group visions that conform not to reality but to their particular ways of looking at the world.

Thus two particularly helpful strategies for identifying and managing bias include cultivating emotional-cognitive diversity in your peers, and using emotional-cognitive behavioral therapy on yourself and in your relationships. Let’s briefly discuss each of these now.
