Why You Can’t Trust Your Brain

By Fsrcoin

I heard a talk by Dr. Caleb Lack, a clinical psychologist and author. His topic: Why You Can’t Trust Your Brain.

Well, whose brain can you trust? Actually, the brain is an extremely complex organ, with 86 billion neurons (give or take maybe a dozen), and 100 trillion connections. But it’s easily fooled — by itself.

Dr. Lack said “doubting yourself” has negative connotations, but it’s the hallmark of an enlightened mind. Being a critical thinker and skeptic is hard to actually do. The problem is the human brain being “logically illogical.” That is, there are reasons why it does what it does, programmed by evolution.

Two key factors are cognitive biases — predictable patterns of judgment — and mental heuristics — shortcuts or rules of thumb that reduce effort in decision-making. These tend to oversimplify reality and cause systematic errors. But they are not all bad; we don't always make bad decisions. In fact, there's a "less is more" effect — folding too many factors into a decision may impede a positive outcome. And we can never have access to all the information, so we must act on what we do have. That means "good enough" decision-making, as opposed to investing too much effort in a decision. That's why we developed these seeming cognitive quirks — they are actually adaptive, balancing effort against result.

I myself have come to believe that agonizing over a decision and trying to carefully weigh factors does not tend to improve upon one’s initial gut reaction. Indeed, there’s a lot going on, in the unconscious, to produce that first gut response. (I think I have an excellent gut. A certain president relied entirely on his gut, but the problem was that his gut was a snakepit of pathological bilge.)

Dr. Lack focused on two related mental biases: confirmation bias and belief perseverance. The former is the tendency to welcome information confirming beliefs we already hold. Such information sticks in memory, and we discount any problems undermining it. Information at odds with one's beliefs, by contrast, is discounted, nitpicked, and soon forgotten. The more emotionally charged and deeply held a belief, the more confirmation bias applies. This is why we developed the scientific method, whose raison d'être is subjecting hypotheses to attempts to disprove them.

Belief perseverance is the related tendency to stick with an initial belief despite disconfirming information. Which actually causes people to “dig in.” That’s why it’s generally useless to argue with persons adhering to a certain political party or personage. Not to mention religious believers.

Dr. Lack spoke about three manifestations of belief perseverance. One concerns self-impressions, beliefs we hold about ourselves. Another he called “social impressions,” beliefs about other groups of people — like, oh, I don’t know, maybe certain ethnicities. The third is “naive theories” about how the world works. As an example he gave the Sun appearing to move around the Earth. Though many of us have gotten wise to this.

He also spoke about illusory correlations — seeing relationships between things not actually connected. The word pareidolia applies to interpreting random stimuli as being something particular. An example was the "face on Mars," a geological feature which, photographed in certain light, looked like a human face. We are in fact especially apt to see faces everywhere, a biological adaptation, because interacting with other people is so important for our thriving. More generally, we are subject to patternicity, seeing all sorts of patterns where they don't exist. Also adaptive: you're better off wrongly seeing a bunch of pixels as a predator than making the reverse mistake. And agenticity is when you see patterns as having a cause. Like a deity. These cognitive quirks are big reasons why we have religion.

Another example Lack discussed was a '70s and '80s idea that rock music contained "backmasking" — Satanic messages audible when played backwards. Lack played an example. He deemed it pretty far-fetched to imagine musicians actually managing this trick — or anyone being influenced by messages almost impossible to perceive.

A final phenomenon he spoke about was priming — the influence of “implicit nonconscious memory” — stimuli in one context affecting behavior in another. He displayed a woman’s face. Then an image which could be seen as either a saxophone player or a woman’s face. Having been primed by the first image to see a woman’s face, that’s what we saw in the second.

Dr. Lack concluded by saying we can’t rid ourselves of cognitive biases but can decrease their effects. One must examine one’s own beliefs, and use tools like the scientific method. And humility, he said, is crucial to critical thinking.
