The Cognitive Biases Behind Society's Response to COVID-19


There’s an awful lot these days that can rightly be called “unprecedented.” Whether it’s the pandemic itself, the government lockdowns, or the massive bailout programs, it seems like everything we’re contending with is unexplored territory.

The hardest part about having no precedent is that it complicates the decision-making process. Normally, precedent acts as a guide that helps us determine our approach, and it plays a vital role in simplifying the decisions we face. In these times, however, the absence of precedent has made us desperate for simplicity, and in our desperation, I believe we have subconsciously succumbed to our natural predilection for mental shortcuts, also known as cognitive biases.

The study of cognitive biases is fascinating, but also unsettling. Increasingly, researchers are finding that our biases cause significant errors in judgment, often greater than we appreciate. In light of this, I think it’s worth taking some time to reflect on how our biases have influenced our response to this pandemic. To this end, let’s start by thinking back to when this whole fiasco began.

Before the Lockdowns

Back in mid-March, a number of biases played a role in shaping our initial response. One of the most significant biases in this regard was the availability heuristic. First described by Tversky and Kahneman in 1973, the availability heuristic is the tendency to misjudge the importance, frequency, or likelihood of events by giving excessive weight to events that are easier to recall (such as events that are more recent). This concept is closely related to salience bias, which is the tendency to focus on things that are more prominent or vivid and ignore things that are inconspicuous.

When considering the events leading up to the lockdowns, it’s not hard to see how these biases were at play. The imminent danger of a global pandemic dominated the media discourse in a matter of days, so it makes sense that we gave it a lot of attention.

But while COVID-19 infections and deaths were widely publicized and therefore highly salient, the negative impacts of the impending lockdowns went largely unnoticed. In fact, the disregard for these impacts eventually became so perilous that a group of over 600 physicians sent a letter to President Trump warning about “the exponentially growing negative health consequences of the national shutdown,” and pointing out that “the downstream health effects…are being massively underestimated and under-reported.”

The unintended consequences of the lockdowns have proven to be a sobering reminder of what can happen when we fail to look beyond the immediate intentions of narrow-minded policies. Indeed, this shortsightedness is the very thing Henry Hazlitt warned us about in his book Economics in One Lesson. But even though our blindness to these secondary effects is natural and understandable, it is by no means inevitable. We are perfectly capable of seeing the unseen so long as we remember to look.

Another consequence of the media’s preoccupation with COVID-19 is that we quickly became familiar with a host of ominous projections. This familiarity likely produced an illusory truth effect, which is the tendency to believe information to be true as a result of repeated exposure, even if it turns out to be false. On top of that, many people developed exaggerated expectations (arguably another bias) and displayed a great deal of overconfidence and hubris.

Finally, as more and more people bought into the alarmist narrative, there was an ever-increasing amount of availability, salience, repetition, and overconfidence. The ensuing positive feedback loop was inexorable, as was the corresponding bandwagon effect.

During the Lockdowns

A few weeks after the lockdowns were imposed, it became apparent that the curve had been flattened and the extreme risk had subsided. Yet as weeks turned into months, the bureaucrats were incredibly reluctant to lift the lockdowns, and when they finally did, it was a slow and gradual process. But why was that? Well, the official answer is that they were concerned for our health (unintended consequences be damned), but I think there’s another factor involved called status quo bias.

Status quo bias is our inclination to prefer the current state of affairs and avoid change. To be sure, it didn’t stop the politicians from imposing the lockdowns in the first place, largely because other factors took precedence. But once the lockdowns were imposed, they became the new status quo, the “new normal,” and this meant that there was now a psychological resistance to removing them.

Many explanations for status quo bias have been proposed, and it’s likely that they each contribute to varying degrees. Some of the more common explanations are loss aversion, omission bias, and the sunk-cost fallacy. Let’s briefly explore these in turn.

Loss aversion is the idea that we generally prefer avoiding losses to acquiring equivalent gains. What this means practically is that we often reject a change on account of its potential drawbacks, even if they are outweighed by its potential benefits.

Another explanation is omission bias, which is our proclivity to favor acts of omission over acts of commission. In the trolley problem, for instance, people feel more justified in allowing harm to happen than in actively causing harm. This preference for inaction seems to reflect an underlying moral sentiment, but it may also involve a fear of regret, since we might expect to regret our actions more than our inactions.

Lastly, the sunk-cost fallacy is the tendency to justify the status quo on account of past investments, even when it has become apparent that the investments were misguided and the approach should be revised. Under this framework, our resistance to change stems from our unwillingness to admit we were wrong. The temptation is that as long as we don’t change course, we don’t have to acknowledge that we made a mistake.

Even after we manage to deviate from the status quo, we are still reluctant to change too quickly. This hesitancy is likely a manifestation of conservatism bias, which is the tendency to insufficiently revise our beliefs when presented with new evidence. Although it’s impossible to ascertain the full extent of its influence, conservatism bias offers a compelling explanation for why the restrictions were eased so slowly even after the curve had been flattened. “The lockdowns are already in place,” we told ourselves. “Why not just a little longer, just to be safe?”

The common element in all of these concepts is fear. Fear of loss, fear of regret, and hence, fear of change. And it makes sense, because our natural reaction to fear is paralysis. It feels safer to stay the course than to begin moving in a different direction.

But the politicians aren’t just worried about loss and regret. They’re also worried about optics. And though they wouldn’t like to admit it, their concern for their reputation has probably played a considerable role in their decisions.

Just imagine how it would look if the lockdowns were lifted any faster. First, it would be an implicit admission that the initial lockdowns had been misguided and probably weren’t necessary in the first place. Second, the politicians would run the risk of getting blamed for a rise in infections. But as long as they kept the strict lockdowns in place, they could blame any outbreaks on us because they were “doing everything they could.”

In short, the optics were much better for them if they extended the lockdowns “for your safety” than if they admitted that they were wrong and had overreacted. And if a few million people had to lose their jobs so the politicians could save face, so be it.

After the Lockdowns

Now that the lockdowns are being lifted and life is slowly returning to normal, another slew of biases is taking hold. The most obvious example here is hindsight bias, which is the tendency to perceive historical events as being more predictable than they actually were. In the same way that many people displayed overconfidence before the lockdowns, many people are now patting themselves on the back, confident that they “knew it all along.”

This ties in closely with confirmation bias, which is our proclivity to search for, interpret, favor, and recall evidence in a way that confirms our pre-existing beliefs. As Sherlock Holmes puts it, we “twist facts to suit theories instead of theories to suit facts.” A great example of this has been our propensity to interpret declining infection rates as confirmation that the lockdowns “worked,” when in fact this is a textbook example of the post hoc fallacy.

The apparent success of the lockdowns has also led many people to conclude that the initial decisions were necessary and justified. But this conclusion is merely an example of outcome bias, whereby we judge the quality of the decision based on the quality of the outcome. In reality, the presence of a positive outcome doesn’t necessarily prove that the initial decisions were justified. And besides, if we consider the unintended consequences, there’s a decent chance that the lockdowns did more harm than good anyway.

The Most Dangerous Bias

Though all of the biases mentioned so far are important, there’s one that concerns me the most, and it’s called authority bias. Authority bias is the tendency to attribute greater accuracy to the opinion of an authority figure and to be more influenced by that opinion. It was most notably demonstrated by the infamous Milgram experiment, which began in 1961 and revealed people’s surprising propensity to trust and obey authority figures even to the point of violating their own conscience.

The pervasive influence of authority bias throughout the pandemic has been particularly disconcerting. From the very beginning, people have blindly trusted “the experts” even as they shut down our businesses, undermined our response efforts, and trampled our civil liberties.

And quite frankly, that scares me.

Whatever happened to having a healthy distrust of authority? When did we lose our skepticism and suspicion? Have we forgotten that they too are mere mortals? Have we forgotten how to think for ourselves and how to take responsibility for our own lives?

Perhaps. Or perhaps we have simply forgotten to be mindful of our biases. Perhaps we were unwittingly drawn to the easy answers, the herd mentality, the status quo, and the confirming evidence. If that’s the case, then I think there is still hope. But if we’re going to overcome our biases, we need to begin by learning to identify them.

And then we need to lead by example. We need to be honest about our own susceptibility to error. We need to model a healthy suspicion of our own predispositions. We need to take responsibility for our own blindness and be open to correction. And perhaps, in this way, we can show the world what true rationality looks like.