Steven Pinker on Rationality

By Fsrcoin

(This was my July 5 Albany Library book talk; slightly condensed)

Steven Pinker is a Harvard Professor of Psychology; a 600-pound intellectual gorilla of our times; author of a string of blockbuster books. His latest is Rationality: What It Is, Why It Seems Scarce, Why It Matters.

In 2011, he wrote The Better Angels of Our Nature: Why Violence Has Declined. And I recall a radio interviewer asking, in effect: Pinker, are you out of your mind? Violence declining? But of course the claim was well supported by evidence.

So now it’s Rationality. And many will similarly say, Pinker, are you out of your mind?

Evidence for human irrationality does abound. And this might seem the worst of times for a book celebrating rationality, with two big elephants in the room stomping on it.

One is American politics. Some voters have always behaved irrationally, yet the system functioned pretty well nevertheless. But now the inmates have taken over the asylum. Or at least one of our two parties; and recalling Yeats’s line: the best lack all conviction, the worst are full of passionate intensity.

Then there’s the international sphere. The Better Angels book emphasized three quarters of a century without wars among major nations. Russia’s Ukraine war blows that up. An assault on rationality.

But maybe, with the world seemingly gone mad, this book on rationality is actually timely.

The core of rationality is logic. Pinker gives the example of a logic puzzle, involving four coins. I’ll omit details; most people get it wrong. But Pinker says we’re better at applying logic when it “involves shoulds and shouldn’ts of human life rather than arbitrary symbols” like in the puzzle. He calls this our “ecological rationality,” our horse sense (though horses don’t have it to anything like our degree).

Here’s a simple logic problem that even many mathematicians, including Paul Erdős, have gotten wrong: the famous Monty Hall problem. On “Let’s Make a Deal,” there are three doors, one hiding a car and two hiding goats. You pick Door #1. Then Monty opens Door #3 to reveal a goat. Should you switch to Door #2? Most people say it makes no difference whether you switch. Wrong! Monty opened Door #3 knowing it had a goat. He didn’t open #2, which you therefore now know has a 2-in-3 chance of hiding the car. So you should switch.
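If the verbal argument doesn’t convince, a simulation will. Here’s a minimal sketch in Python (my own illustration, not from the book; the trial count is arbitrary):

```python
import random

def monty_hall(trials=100_000):
    """Simulate Let's Make a Deal: compare staying with switching."""
    stay_wins = switch_wins = 0
    for _ in range(trials):
        car = random.randint(1, 3)   # door hiding the car
        pick = random.randint(1, 3)  # contestant's initial choice
        # Monty opens a door that is neither the contestant's pick nor the car
        opened = random.choice([d for d in (1, 2, 3) if d != pick and d != car])
        # Switching means taking the one remaining closed door
        switched = next(d for d in (1, 2, 3) if d != pick and d != opened)
        stay_wins += (pick == car)
        switch_wins += (switched == car)
    print(f"stay wins:   {stay_wins / trials:.3f}")    # about 0.333
    print(f"switch wins: {switch_wins / trials:.3f}")  # about 0.667

monty_hall()
```

Run it and switching wins about two thirds of the time; staying wins only one third.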

Pinker emphasizes that rationality is goal oriented, saying “Do you want things or don’t you? If you do, rationality is what allows you to get them.” This entails using knowledge, which he defines as “justified true belief.” People are again logical and rational (generally) in everyday life, but too often fall down on the “justified true belief” thing.

Pinker suggests that seeking an ultimate philosophical reason for reason is misguided. Any postmodernist’s attempt to argue against reason implicitly concedes that rationality is the standard by which any arguments, even arguments against rationality itself, stand or fall. (Similarly, the assertion that nothing is really true would — if correct — apply to that assertion itself.)

And rationality is not just one among many alternative ways of seeing things. Not, as Pinker puts it, “a mysterious oracle that whispers truths in our ear.” Indeed, “reason is the only way we can know anything about anything.”

There’s a common idea that reason and emotion are separate, at odds with each other. Pinker quotes David Hume that reason is, and should be, “the slave of the passions.” And neuroscientist Antonio Damasio has shown that emotions supply the motivations for deploying reason, so the two are inextricably linked. Then Pinker notes that some of our goals can conflict with others; and “you can’t always get what you want.”

We furthermore have goals we don’t even choose, programmed into our genes by evolution. One rational goal may be a slim, healthy body, which also makes you more sexually attractive, thus advancing a gene-based goal of reproducing. That conflicts with the desire to eat a delicious dessert — which also serves an ancestral genetic goal: load up on calories when you can. We use our reasoning minds to mediate among such conflicting goals.

But of course not with perfect rationality. Scientific work, notably by Kahneman and Tversky, has revealed many seemingly irrational human cognitive biases, also programmed into us by evolution during our long hunter-gatherer past. For example, we fear potential losses more than we value equivalent gains. So we may pass up a chance to win $5 if it carries an equal chance to lose $4. Sounds irrational. But for our early ancestors, a “loss” could well mean death. And Pinker poses the question: what could happen to you today making you better off? What could happen making you worse off? A lot worse off? So maybe our loss-avoidance bias is not so irrational.
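To see how that bias can tip a concrete decision, here’s a tiny sketch using a Kahneman-Tversky-style value function. The loss-aversion coefficient (about 2.25) and the curvature (about 0.88) are the commonly cited prospect-theory estimates, used here purely for illustration:

```python
# Rough prospect-theory-style value function (illustrative parameters).
LAMBDA = 2.25   # losses loom roughly 2.25x larger than equivalent gains
ALPHA = 0.88    # diminishing sensitivity to magnitude

def subjective_value(x):
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** ALPHA)

# The bet from above: win $5 or lose $4, each with probability 0.5.
expected_dollars = 0.5 * 5 + 0.5 * (-4)                             # +$0.50
felt_value = 0.5 * subjective_value(5) + 0.5 * subjective_value(-4)

print(f"expected value in dollars: {expected_dollars:+.2f}")  # positive: take the bet
print(f"how the bet feels:         {felt_value:+.2f}")        # negative: pass it up
```

The bet is worth fifty cents on average, yet to a loss-averse mind it feels like a losing proposition, which is exactly the behavior described above.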

And even if, in isolation, some of our ancestral cognitive biases still seem irrational, realize that we’re talking about short-cut heuristics enabling us to make quick intuitive decisions about stuff coming at us every hour of the day. If you had to think your way rationally through all of it, you couldn’t even function. But using that repertoire of innate heuristics, we do function quite well. That makes their use quite rational from a broader perspective.

Now, what about morality? Hume famously said you can’t get an ought from an is — in other words, how things are (i.e., facts) can’t tell us how they should be (moral laws). Thus there can be no true moral laws, only opinions. Some solve this by invoking God as the source of morality. But that was knocked down by Socrates in the Euthyphro, asking whether something is moral because God says so, or whether God says so because it is moral. If the former, why submit to his arbitrary edicts? But if God does have reasons for his moral rules, why not just embrace those reasons and skip the middleman?

Meantime, Pinker says morality is all about how we behave in relation to others. And there we can rationally recognize everyone’s right not to be unjustifiably messed with. If you feel free to bash others, you can’t say they cannot bash you. Thus Pinker posits impartiality as key — nobody’s personal perspective can override those of others. Which is basically the golden rule.

And note that this does not mean self-sacrifice. It’s actually rational from the standpoint of self-interest. Because it makes you feel good about yourself, and also makes a world that’s better for everyone, including you.

There’s a chapter on critical thinking. Pinker catalogs a host of traps we fall into: the straw man argument, moving the goal posts, what-aboutism, ad hominem arguments, and so forth. Alas, such things “are becoming the coin of the realm” in modern intellectual life. And Pinker quotes Leibniz, in the 1600s, envisioning a world where all arguments would be resolved by saying, “let us calculate.” Lyndon Johnson liked to quote, “Come, let us reason together.” Yet Pinker comments that life is not that simple, and doesn’t work by formal logic. We know what words mean, but applying them in the real world can be challenging. You can get in a lot of trouble nowadays trying to define the word “woman.”

Another chapter deals with probability and randomness. Many people have only a vague sense of what probability really entails. Do you fault the weatherman who said there’s a 10% chance of rain, when you get soaked? Or the political analyst who gave Hillary a 70% probability of winning? And we tend to judge an event’s probability by the availability heuristic, another Kahneman-Tversky cognitive bias. That is, we judge how likely something is by how easily examples come to mind. Plane crashes, for example.

In a state of nature, lacking better information, that’s not necessarily irrational. But modernity does give us better information, telling us plane crashes are much rarer than car crashes. Yet many people operate on the opposite assumption. And the availability heuristic scares people off from nuclear power — we vividly recall a few high profile accidents (which actually killed very few) — while ignoring the tens of thousands of deaths caused annually by air pollution from conventional power plants. They don’t come to our attention. Pinker calls the news media an “availability machine,” serving up stories which feed our impression of what’s common in a way that’s sure to mislead. (It’s why people always think crime is rising.)

The book goes through many examples of how we commonly misjudge probabilities. For example, it’s reported that a third of fatal accidents occur at home. Does that mean homes are very dangerous? No; it’s just that we spend a lot of time there. We confuse the probability that a given fatal accident occurred at home with the probability that a fatal accident will occur while at home. Two very different things.

Or how about this? A majority of bicycle accidents involve boys. Does that suggest boys ride more recklessly? Or — that boys ride more than girls?
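The confusion in both cases is between two different conditional probabilities. A toy calculation makes the distinction concrete; every number below is invented purely for illustration:

```python
# Hypothetical numbers, purely for illustration.
hours_at_home_per_day = 16
hours_elsewhere_per_day = 8

# Suppose an hour at home is actually HALF as risky as an hour elsewhere.
risk_per_hour_home = 1e-8
risk_per_hour_elsewhere = 2e-8

expected_fatal_at_home = hours_at_home_per_day * risk_per_hour_home
expected_fatal_elsewhere = hours_elsewhere_per_day * risk_per_hour_elsewhere

share_occurring_at_home = expected_fatal_at_home / (
    expected_fatal_at_home + expected_fatal_elsewhere)

print(f"P(it happened at home | fatal accident) = {share_occurring_at_home:.0%}")  # 50%
print(f"P(fatal accident | one hour at home)    = {risk_per_hour_home}")
print(f"P(fatal accident | one hour elsewhere)  = {risk_per_hour_elsewhere}")
```

Even though home here is the safer place hour for hour, half of all fatal accidents still “occur at home,” simply because that’s where most of the hours are spent.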

We also overrate the significance of coincidences. I’m often at my computer typing, with the radio on. Is it spooky when I hear a word on the radio just as I’m typing the same word? Not really. I type a lot of words, and hear a lot of words. So such coincidences are bound to occur regularly. Even sometimes with obscure words. My favorite instance: Equatorial Guinea mentioned on the radio just as I was working up a coin from that country. What are the odds? Well, finite.
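Here’s a back-of-the-envelope sketch of why such coincidences should be expected. Every rate below is a guess I’m making up for illustration, and treating words as uniform random draws is crude (common words would match far more often), but the arithmetic shows the shape of the thing:

```python
# All numbers are guesses, purely to illustrate the arithmetic.
words_typed_per_hour = 1500
words_heard_per_hour = 6000       # talk radio in the background
vocabulary_size = 20000           # distinct words in active circulation
window_seconds = 2                # "same word within a couple of seconds"

# For each typed word: how many radio words fall in the window, and the
# (crude) chance that one of them is the same word.
radio_words_in_window = (words_heard_per_hour / 3600) * window_seconds
p_match_per_typed_word = radio_words_in_window / vocabulary_size

matches_per_hour = words_typed_per_hour * p_match_per_typed_word
print(f"expected coincidences: about one every {1 / matches_per_hour:.0f} hours")
```

Under these made-up numbers you’d expect a type-it-as-you-hear-it coincidence every few hours of radio-accompanied typing. Spooky coincidences are cheap.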

There’s a chapter on Bayesian reasoning, named for Thomas Bayes, an 18th century thinker. It’s all about how added information should modify our predictions. Like in the Monty Hall problem: Monty’s opening one door added information. A key concept is the “base rate.” Suppose 1% of women have a certain disease. There’s a test for it, 90% accurate. Suppose a woman tests positive. What is the chance she has the disease? Most people, including doctors, give it a high probability — forgetting the base rate, which is again only 1%. Bayesian math here tells us that with a disease that rare, a test 90% accurate will produce about ten times more false positives than true ones. So her likelihood of having the disease is only about 9%. In Bayesian lingo, the 1% is the “prior” — prior information giving us expectations, which we then modify with further information (here, the test result).
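Here’s the arithmetic behind that figure, written out. This assumes “90% accurate” means both 90% sensitivity and 90% specificity, which is how the stylized textbook example is usually framed:

```python
prior = 0.01          # base rate: 1% of women have the disease
sensitivity = 0.90    # P(positive test | disease)
specificity = 0.90    # P(negative test | no disease)

p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
posterior = sensitivity * prior / p_positive   # Bayes' rule

print(f"P(positive test)           = {p_positive:.3f}")  # about 0.108
print(f"P(disease | positive test) = {posterior:.3f}")   # about 0.083, roughly 9%
```

Per 1,000 women tested, that’s about 9 true positives against about 99 false positives: the roughly ten-to-one ratio of false alarms to real cases just mentioned.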

One of the most hated theories of our time, Pinker says, is “rational choice theory,” associated with Homo Economicus: the idea that people act to maximize self-interest. Well, of course we know they do; yet not always. Pinker cites an experiment where money-filled wallets were dropped, and most got returned. However — was that really against self-interest? Again, most people feel good about themselves when doing the right thing, and ashamed and guilty otherwise. And what is life about, if not feelings? Pinker comments that rational choice theory “doesn’t so much tell us how to act in accord with our values as how to discern our values by observing how we act.”

So far I’ve talked about making decisions and choices for ourselves. But it’s another thing when dealing with someone else who’s also trying to maximize their self-interest. This is game theory, which Pinker says explains a lot of behavior that might seem irrational. He mentions the game of chicken, which I once wrote a poem about:

Here’s the trick to playing chicken:

You just keep driving straight,

And don’t swerve, ever.

The other guy will always swerve first.

You’ve got to be crazier than the other guy.

And if the other guy is crazier than you,

And doesn’t swerve,

And you’re killed in a fiery crash,

So be it.

The classic illustration for game theory is “the prisoner’s dilemma.” Two partners in crime are interrogated separately. Each is told that if he rats on the other, he’ll go free, and the other gets ten years. If both talk, each gets six years. If neither talks, each gets six months. So collectively they’re better off staying mum, but only if both do, and neither knows what the other will do. Self-interest for each says talk. And if both talk, they’re screwed with six-year sentences.

There’s seemingly no good solution. But if the game is repeated, it turns out the best strategy is tit-for-tat — betraying a partner only if they betrayed you the previous round. And in fact much of human social life resembles this. We indeed behave toward others as if life were a repeated prisoner’s dilemma; and that’s why social cooperation tends to prevail. We still can get “the tragedy of the commons,” where individual self-interest ruins things for everybody. But that’s not actually so common. People mostly restrain themselves.
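For the curious, here’s a minimal sketch of that repeated game in code, using the jail terms above as (negative) payoffs. The simulation setup is my own illustration, not anything from the book:

```python
# Payoffs in years of jail (as negatives), indexed by (my_move, their_move).
# 'C' = stay mum (cooperate with your partner), 'D' = talk (defect).
PAYOFF = {
    ('C', 'C'): -0.5,   # both stay mum: six months each
    ('C', 'D'): -10,    # I stay mum, partner talks: ten years for me
    ('D', 'C'): 0,      # I talk, partner stays mum: I go free
    ('D', 'D'): -6,     # both talk: six years each
}

def tit_for_tat(my_history, their_history):
    """Cooperate on the first round, then copy the partner's last move."""
    return 'C' if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return 'D'

def play(strategy_a, strategy_b, rounds=20):
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strategy_a(hist_a, hist_b)
        b = strategy_b(hist_b, hist_a)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))      # (-10.0, -10.0): steady cooperation
print(play(always_defect, always_defect))  # (-120, -120): mutual ruin
print(play(tit_for_tat, always_defect))    # tit-for-tat gets burned once, then retaliates
```

Two tit-for-tat players settle into mutual cooperation and do far better than two habitual defectors, which is the gist of why cooperation tends to prevail.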

Next topic: correlation does not mean causation. The concept of causation, says Pinker, is at the heart of science — figuring out the true causes of things. So we can do something about them.

Pinker likes to put some humor in his books. A husband couldn’t satisfy his wife in bed. They consult a rabbi. He suggests they hire a buff young man to wave a towel over them in bed. It doesn’t work. So next the rabbi suggests switching: the young man shtups the wife while the husband waves the towel. And lo, great results. So the husband declares to the young man: “Schmuck! Now that’s how you wave a towel.”

And that’s how Pinker illustrates the concept of causation.

So finally he gets to the question: what’s wrong with people? Pinker says we have a “pandemic of poppycock”: belief in Satan, miracles, ESP, ghosts, astrology, UFOs, homeopathy, QAnon, 2020 election fraud, replacement theory. And when science produced one of its greatest near-miracles — Covid vaccines — a lot of Americans said no thanks.

Pinker acknowledges that all the logical and cognitive pitfalls he discussed play some role. But none of that could have predicted QAnon. He also won’t blame social media, pointing out that conspiracy theories and viral falsehoods are probably as old as language. Look at the Bible — talk about fake news. Meantime, even the most flagrant conspiracy mongers still behave, in mundane day-to-day life, with great rationality. So what indeed is going on?

For one thing, rationality can be a nuisance, producing unwelcome answers. Pinker quotes Upton Sinclair: it’s hard to get someone to understand something if their income depends upon their not understanding it. So we use motivated reasoning to reach a preferred conclusion. Indeed, Pinker says the true adaptive function of our reasoning ability may be to win arguments: “We evolved not as intuitive scientists but as intuitive lawyers.” Thus confirmation bias: we embrace any supposed information that confirms a cherished belief, while dismissing or disregarding anything discordant.

However, Pinker suggests the rational pursuit of goals needn’t necessarily encompass “an objective understanding of the world.” Which might conflict with, for example, a goal of fitting in with your peer group (a big propellant for confirmation bias). Pinker calls this “expressive rationality” — adopting beliefs based not on truth but as expressions of a person’s moral and cultural identity. (A related word perhaps strangely doesn’t appear in the book: groupthink.)

Pinker focuses here on our political polarization, between what have really become “sociocultural tribes.” Resembling religious sects “held together by faith in their moral superiority and contempt for opposing sects.” True of the woke left, but especially Republicans, now epitomizing members of a religious cult — whose sense of selfhood depends upon their not understanding that their deity is a stinking piece of shit.

But most Americans actually consider themselves less susceptible to cognitive biases than the average person. That’s the Dunning-Kruger effect — people with deficient thinking skills lack the thinking skill to recognize their own deficiency.

So Pinker says the paradox of how we can be both so rational and so irrational lies in self-aggrandizing motivation. Just as the core of morality is impartiality, likewise with rationality, one must transcend self-interest. I try to apply an ideology of reality — shaping my beliefs on the facts I see — rather than letting my beliefs shape the facts I see. But that does not come naturally to most people.

As Pinker notes, the most obvious counter-example is religion. Yet this book about rationality has relatively little to say about religion. Perhaps Pinker feared turning too many people away from his message. But “faith,” as Mark Twain put it, means believing what you know ain’t so. Believing things despite lack of evidence; even in defiance of evidence. And Pinker does say, “I don’t believe in anything you have to believe in.”

But what does it really mean to believe something anyway? An interesting question. Many religious people believe they’re going to Paradise. Yet few are in any hurry to depart. Pinker distinguishes beliefs consciously constructed versus intuitive convictions we feel in our bones. And we divide the world into two zones: hard factual reality, where our beliefs tend to be accurate and we act rationally; and a zone where reality is more elusive, a zone of mythology, not undermining our day-to-day functioning. There, even holding a false belief can be rational in the sense of serving certain goals — making one feel good, tribal solidarity again, or avoiding fear of death.

Pinker does fault our society for failing to sufficiently inculcate some of science’s foundational principles (which contradict religion): that the universe is indifferent to human concerns, that everything is governed by basic laws and forces, that the mind is something happening in the brain. Thus ruling out an immortal soul.

But, ever the optimist, he also reminds us how much rationality is actually out there. Some people distrust vaccines, but not antibiotics (and so much else in modern medicine and science). And culture can evolve. Ours has evolved tremendously; a lot of what was acceptable not so long ago is no longer acceptable. (There may be some overcorrection.)

It’s a battle against what Pinker sees as a “tragedy of the rationality commons.” Wherein self-interested and self-motivated argumentation gobbles up all the space. Yet he thinks the greater community can mobilize against this; for example, internet media in particular have awakened to the problems, roused by two big recent alarm bells: misinformation about Covid, threatening public health, and about the 2020 election result, threatening our democracy.

The final chapter is titled “Why Rationality Matters.” As if that still needs answering. Pinker presents a whole catalog of how common mistakes of rationality cause concrete harm. He cites one study identifying 368,000 people killed between 1970 and 2009 from blunders in critical thinking. I said to myself: really? Only 368,000? And of course countless Americans died from Covid irrationality.

Yet still, immense technological progress, improving quality of life (and its length) has been achieved through rationality. Likewise our moral progress, in a great roll-back of cruel unjust practices. Pinker says that in researching this, his greatest surprise was how often the first domino was reasoned argument. Very powerful after all.

I would add that globally speaking, a huge factor propelling human rationality has been the spread of education (the Dunning-Kruger effect notwithstanding).

Well, it might seem like I’ve veered back and forth between positive and negative. But I’ll conclude with the book’s final words: “The power of rationality to guide moral progress is of a piece with its power to guide material progress and wise choices in our lives. Our ability to eke increments of well-being out of a pitiless cosmos and to be good to others despite our flawed nature depends on grasping impartial principles that transcend our parochial experience.”

That is, rationality. Humanity’s best idea.