
If You Ask The Wrong Questions...

If you ask the wrong questions, don't be surprised if the answers you get are not much use to you.

Have you ever heard the proverb "There are none so blind as those who will not see"? We've all experienced it, usually when arguing with some idiot who disagrees with us on a perfectly obvious point. (See Don't Be An Idiot!, recently updated with some additional categories.) Today I will explain the psychological grounding behind the phenomenon and give you some tips for avoiding selective blindness yourself.

If you want to become well-informed on a topic, conducting broad research is essential. Looking at the first few links in a Google search will give you a certain set of information, true. This will likely be mainstream information, meaning that either (1) Google is promoting it, or (2) it is consistent with the majority view, or both. Either way, you will probably not be widely criticized for parroting what most people already believe or can find in a quick search.

There is a more pernicious problem in conducting research, which is this: we are our own worst enemies. Being confronted with information that conflicts with our existing beliefs makes us uncomfortable, a situation referred to as cognitive dissonance. Because we do not like feeling uncomfortable, our natural tendency is to seek out information that conforms to our existing beliefs and to ignore conflicting information.

One way to counteract this tendency is to leave open the possibility that we may be wrong. In other words, we should not tie our psychological well-being to being right all the time. I discuss ways to do this in Don't Mistake Certainty For Correctness. Not needing to be right is a helpful, if modest, step.

Now I hear some of you saying, "This doesn't apply to me, at least not all the time. When I am doing research, I don't usually have a preconceived idea of what the answer is. Thus, I can't be affected by cognitive bias." Not so fast, my friends. You may not have thought deeply about the issue you are researching, but you nonetheless are likely to have ideas about it. "What do you mean?" you ask. "How can that be?"

Our understanding of the world and how it works is a complex, multi-layered construct, built up over the whole of our experiences. Some beliefs are at the forefront of your consciousness, and so you think of them as your core beliefs or values. But you likely have many, many unexamined ideas and beliefs about why things happen the way they do. Humans are amazing pattern-recognition machines. So much so that we regularly and easily see patterns where none exist, finding causality in random correlation.

So even when you approach a topic with what you think is an open mind, you are coming with a lifetime of experiences that have shaped not only what you believe but the very process of how you form new opinions. Even if you don't cling to the need to be right, it can hurt when your belief system does not match up to new evidence.

A far more powerful way to counteract your cognitive biases is to actively seek out information that conflicts with your current view. You will not do this by accident. It requires deliberate effort. But that effort does not have to be burdensome. Some of the smartest people I know treat it as a kind of game. They love turning up contradictory information, because it means they've really learned something about the world, or about themselves. In this light, discovering you were wrong about something can be a gift. Framing it that way takes much of the sting out of cognitive dissonance.


I can offer you a simple interim test of what I'm talking about. Some of you may have noticed a little frisson of displeasure when I suggested above that Google promotes certain views (and thus, by definition, must suppress others). Because that is not consistent with what many of you think about Google, i.e., that they're just a search algorithm and they don't put a thumb on the scale, you either dismissed it without realizing it or you didn't even notice I wrote it.

Okay, don't believe me. Check for yourself. I'll wait. There is an alternative search engine called DuckDuckGo. Search for something in Google, and then do the exact same search in DuckDuckGo. See how much overlap there is, and whether you detect any skew in the results. It will be easier to see the effect if you use a more polarizing topic, say "Ivermectin." (If you haven't heard of that word, you may need to evaluate your sources of information, but that's a topic for another day.)
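If you want to go one step further and put a number on the overlap, a few lines of Python will do it. This is a minimal sketch under stated assumptions, not a scraper: the result lists below are placeholders that you would fill in by hand with the top URLs you actually see for the same query on each engine, and the comparison uses only the standard library.

    # Toy overlap check between two search engines' results.
    # Assumption: you paste in the top result URLs yourself; nothing is fetched.
    # Requires Python 3.9+ (for str.removeprefix).
    from urllib.parse import urlparse

    google_results = [
        "https://example.com/a",  # placeholders; use your real results
        "https://example.org/b",
    ]
    duckduckgo_results = [
        "https://example.com/a",  # placeholders; use your real results
        "https://example.net/c",
    ]

    def domains(urls):
        # Reduce each URL to its host so path differences don't hide overlap.
        return {urlparse(u).netloc.removeprefix("www.") for u in urls}

    g, d = domains(google_results), domains(duckduckgo_results)
    shared = g & d
    # Jaccard similarity: 1.0 means identical result sets, 0.0 means disjoint.
    print(f"Shared domains: {sorted(shared)}")
    print(f"Overlap: {len(shared) / len(g | d):.0%}")

A low overlap score on a polarizing query doesn't tell you which engine is skewed, but it does tell you the two are not showing you the same world.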

So, to sum up: we are biased, pattern-recognition machines, predisposed to confirm what we already believe, even if we don't know we believe it. By reminding ourselves that we are fallible, and making a game of trying to identify our blind spots, we can improve the chances that we are looking at the world with relatively clear eyes. It also helps us avoid inadvertently spreading propaganda. See Are You Spreading Propaganda Knowingly Or Unknowingly?

All this assumes you are genuinely interested in keeping an open mind and discovering objective truths. Sadly, there are many in the public sphere who have succumbed to the sweet temptation of partisan reporting. That is, they come to their research task with a preferred outcome in mind.

How to tell if what you are reading suffers from this defect? Here are two signs: (1) the discussion does not acknowledge the good arguments on the other side; and (2) the author uses data and statistics to explain why human incentives do not apply to a particular case.

To the first point: if you hear someone presenting strong arguments on one side, and none, or only obviously weak ones, on the other, you may rightly suspect their motives. (See Why Experts Are The Last People We Should Listen To.) Your shield against narrow thinking is to faithfully express both sides of an argument. As a lawyer, I admit arguing both sides comes to me more easily, to the great frustration of friends and family who sometimes ask me, "Yes, but what do you really believe?"

And as to the second point, incentives predict a great swath of human behavior. Way more than people realize. Identifying incentives, and understanding how they work, gives you a great lens for making sense of the world. So when someone uses data, no matter how impressive-looking, to explain why incentives don't work, you are usually safe to trust your incentive instincts.


Let me give you a concrete example: the so-called Ferguson Effect. The Ferguson Effect refers to a "hypothesized increase in violent crime rates in a community caused by reduced proactive policing due to the community's distrust and hostility towards police." This idea was first expressed in 2014 by the St. Louis Chief of Police in the wake of unrest following the police shooting of Michael Brown in Ferguson, Missouri.

Some pundits took to the idea immediately, and warned of the repercussions of tarring all police with the brush of a small number of wrongdoers. Many more dove into the fight seeking to "debunk" the idea, some of them out of a concern that the theory seemed to be blaming the victims for what happened to them. See for example this Vox article attempting to reframe the discussion.

The battle raged on in the wake of last year's BLM protests and calls to defund the police. On one side, you had people explaining that perverse incentives were being created that would likely lead to undesired outcomes. See for example Heather Mac Donald's City Journal article. On the other, you had people conducting research attempting to show that there is just nothing to see here: Nationwide search finds no evidence of 'Ferguson effect.'

Did we really need the Pew Research Center to survey thousands of police officers across the U.S. to learn that almost all of them are more concerned for their safety, and that a supermajority of them are less willing to stop and question suspicious people and are more reluctant to use force when appropriate? If you understand incentives, it was not a stretch at all to predict that police would engage less and crime levels would increase.

And yet I have come across many smart and well-meaning people who insist the effect is illusory. Why? Because it doesn't fit with their desired view of the world. I've lately taken to flagging when someone is talking not about reality, but about what I now think of as "Things We Wish Were True," or TWWWT for short.

Feel free to use the TWWWT label to remind yourself that while wishful thinking may feel good, it is no way to accurately interpret the world or make important decisions in your life.

Be well.