Academic Freedom Conference: Rationality and Academic Freedom with Steven Pinker

Speaker 1:

Welcome to our afternoon session. It’s a great pleasure and an honor to introduce our next speaker, Dr. Steven Pinker. Dr. Pinker is the Johnstone Family Professor of Psychology at Harvard University. He studies how our minds interact with language and the visual world around us, how children acquire language, and how we communicate and cooperate. He has also looked into such fascinating questions as how we came to be as we are, that is, the role of evolution in shaping our language and our ability to communicate. Dr. Pinker’s work has been recognized with numerous awards and accolades, and he is also well known beyond the domain of his field.

He has written several books for general audiences; I have one here with me. Most relevant to our conference are his books The Better Angels of Our Nature and Enlightenment Now, where Steven Pinker explains that rationality and liberal Enlightenment principles are good for humanity. He uses historical evidence and hard data to show how these principles made the world a better place for everyone through science, technology, and, most importantly, humanism. This is why we should defend these principles, and that is what we are here for. Pinker was named one of TIME’s 100 Most Influential People in the World and has made it several times onto Foreign Policy’s list of Top 100 Global Thinkers. Without further ado, please join me in welcoming Steven Pinker.

Steven Pinker:

Thank you. On the one hand, our species can claim astonishing feats of rationality, such as walking on the moon and taking pictures of our home planet, plumbing the mysteries of the cosmos, of life, of our own minds. We have pushed back against the horsemen of the apocalypse, blunting the scourges that have immiserated our species for hundreds of millennia, such as war, famine, poverty, and child mortality. At the same time, a majority of Americans between the ages of 18 and 24 think that astrology is “very” or “sort of” scientific. Large proportions believe in conspiracy theories, such as that the COVID-19 vaccines were actually microchips that Bill Gates wants to implant into our bodies to surveil us. Many people circulate fake news, such as “Joe Biden Calls Trump Supporters ‘Dregs of Society’” or “Yoko Ono: I Had an Affair with Hillary Clinton in the ’70s.” And many believe in forms of paranormal woo-woo, such as possession by the devil, extrasensory perception, ghosts and spirits, witches, and spiritual energy in mountains, trees, and crystals.

So how do we resolve this paradox, this contradiction? This is the subject of my most recent book, Rationality: What It Is, Why It Seems Scarce, Why It Matters, and I’m going to draw out some of its implications for academic freedom this afternoon. Well, are we rational or aren’t we? How do we even ask or address that question? One way is to ask whether people follow some of our benchmarks of rationality, normative models like logic and probability. Let me give you a couple of examples. These are demonstrations that have been in the psychology literature for decades, highly replicable, and I’ll throw in a couple of puzzles for you to ponder. Here’s a simple example from logic. Every card in a deck has a number on one side and a letter on the other. The challenge is: does the deck obey this rule? If a card has a D on one side, then it has a 3 on the other. Which cards do you have to turn over to verify whether the rule is true? We’ve got a D, we’ve got an F, we’ve got a 3, we’ve got a 7.

This experiment has been done since the 1960s, and the vast majority of people choose either the D card or the D and the 3 cards. The correct answer is the D and the 7 cards. Why D and 7? Everyone knows that you’ve got to turn over the D card, because if it doesn’t have a 3 on the other side, the rule is falsified. Everyone knows you don’t have to turn over the F card. A lot of people think you have to turn over the 3 card, but when you think about it, that is irrelevant. The rule says “if D, then 3,” not “if 3, then D.” This is an example of the fallacy of affirming the consequent. And you do have to turn over the 7 card, because if you turned it over and there was a D on the other side, it would falsify the rule “if D, then 3.”
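
The selection logic is compact enough to check mechanically. Here is a minimal Python sketch, with the card labels and the rule encoding chosen for illustration rather than taken from the talk:

```python
# The rule under test: "If a card has a D on one side, then it has a 3
# on the other." A card must be turned over only if its hidden side
# could falsify that rule.

LETTERS = {"D", "F"}

def must_check(visible: str) -> bool:
    if visible in LETTERS:
        # Hidden side is a number; only a visible D can be falsified,
        # by a hidden number other than 3.
        return visible == "D"
    # Visible side is a number, so the hidden side is a letter. A
    # visible 3 can never falsify the rule, but any other number can,
    # if it hides a D.
    return visible != "3"

print([card for card in ["D", "F", "3", "7"] if must_check(card)])
# -> ['D', '7']
```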

The common explanation is that people are vulnerable to confirmation bias. Namely, we are all pretty good at seeking out evidence that confirms our hypotheses; we’re not so good at figuring out what would disconfirm them. Here’s another example, an application of Bayes’ rule to medical decision making. In fact, I think this is sometimes called the Stanford medical diagnosis problem because it was first given to medical students at Stanford. In a population-wide screen, we find that the probability that a woman has breast cancer is 1%. If a woman does have breast cancer, the probability that she tests positive is 90%; that’s the sensitivity of the test. If she doesn’t, the probability that she nevertheless tests positive is 9%; that’s the false-positive rate. A woman tests positive; what is the chance that she actually has the disease? Again, this has been given over many decades. The most popular answer in a sample of physicians is 80% to 90%. The correct answer according to Bayes’ theorem is 9%. Yes, your doctor may tell you that you are almost certain to have a disease when you’re almost certain not to have it.
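
The 9% answer is just Bayes’ theorem applied to the numbers quoted in the problem. A small Python check, offered as an illustrative sketch rather than anything from the talk:

```python
# Bayes' theorem with the quoted screening numbers:
# prior 1%, sensitivity 90%, false-positive rate 9%.

prior = 0.01           # P(cancer)
sensitivity = 0.90     # P(positive | cancer)
false_positive = 0.09  # P(positive | no cancer)

p_positive = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / p_positive
print(f"P(cancer | positive) = {posterior:.3f}")  # ~0.092, about 9%
```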

The standard explanation, from Amos Tversky and Daniel Kahneman among others, is that we are vulnerable to base-rate neglect. That is, people tend to ignore the priors in the Bayesian equation and base their judgments on representative stereotypes. So what do we conclude from these demonstrations? What do these fallacies show? Well, one conclusion is that humans are just irrational, and this is a popular interpretation of the literature on judgment and decision making from cognitive psychology and behavioral economics. But not so fast. Here’s a twist on the logic problem. Consider this rule: if a bar patron is drinking beer, he must be over 21. You are a bouncer in a bar and you have to enforce the rule. Which of the following do you have to check? There’s a guy drinking beer; do you have to card him and find out how old he is? There’s a guy drinking Coke; do you have to check his ID? There’s a guy who’s obviously over 21; do you have to peer into his cup to see what he’s drinking? There’s a guy who’s obviously under 21; do you have to check what he’s drinking?

Well, now everyone will answer that you’ve got to check the age of the guy who’s drinking beer and the beverage of the guy who’s under 21. But this is logically isomorphic to the card-selection task, “if D, then 3,” and all of a sudden, everyone is a logician. It’s called the content effect, not particularly enlighteningly, but circularly. People are illogical with problems couched in abstract symbols, “if P, then Q,” D and 3, but they can be logical with certain kinds of meaningful content, particularly obligations and precautions, social contracts. Here’s a twist on the probability problem. Ten in every 1,000 women have breast cancer. Of these 10, 9 will test positive. Of the 990 women without breast cancer, about 89 will nevertheless test positive. A woman tests positive; what is the chance that she actually has the disease? Well, this time people can think: 98 test positive in all, 9 of those have cancer, 9 out of 98, about 9%. Now 87% of doctors get it correct. Even a majority of 10-year-olds get it right.
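
In the natural-frequency framing, the same posterior collapses to simple arithmetic over counts. A minimal sketch using the figures above:

```python
# Of 1,000 women: 10 have cancer, of whom 9 test positive; of the
# 990 without cancer, about 89 also test positive.

true_positives = 9
false_positives = 89
share = true_positives / (true_positives + false_positives)
print(f"{true_positives} of {true_positives + false_positives} "
      f"positives actually have cancer: {share:.0%}")  # ~9%
```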

But of course, this is identical to the Bayesian medical decision-making problem that I presented a few slides earlier, where everyone got it wrong. The explanation is that there’s a cognitive difference between natural frequencies, that is, proportions of events that we encounter in our lives, and single-event probabilities, the probability that that woman has cancer. Which, when you think about it, is something of a strange concept. Either she has cancer or she doesn’t. What does it even mean to say “the probability that she has cancer”? And indeed, it is not an intuitive concept, but a proportion of a larger sample is an intuitive concept. So a better conclusion about human rationality in general is that people use ecological rationality, ecological in the sense of adapted to their natural environment. That is, we can reason about content that’s relevant to our lives, commingled with our subject-matter knowledge. We can estimate probabilities as we encounter sequences of events in our lives.

People have more trouble with formal rationality, that is, abstract rules and formulas: “P implies Q,” “if D, then 3.” These are extremely powerful because you can plug in any content; you don’t have to be familiar with the subject matter. But they are not intuitive. They must be learned and consciously deployed. Now, I spend a lot of time in Rationality talking about these normative models or benchmarks for rationality, how people fall short, and what it means. But I have learned that that is not the question about rationality that people are curious about. Are we Bayesians? Are we intuitive logicians? The question that everyone is curious about is: if people can be rational, why does humanity seem to be losing its mind? Why the conspiracy theories and the quack cures and the fake news and so on? It’s a challenging problem, and I think there are at least four parts to the answer.

The first is a robust phenomenon known as motivated reasoning. Rationality is always in service of a goal. Just thinking true thoughts is no one’s definition of rationality; it’s deployed as a means toward an end, and that end is not necessarily objective truth. You could deploy your rationality to win an argument in which the stakes matter to you. As the journalist Upton Sinclair once said, “It’s very hard to get a man to understand something when his livelihood depends on not understanding it.” Or to show how wise and moral your group is, your religion, your tribe, your political sect, and how stupid and evil the opposing one is, and, complementarily, to gain status and avoid ostracism as a hero for your side. This bias is sometimes called the myside bias, the subject of an excellent recent book by Keith Stanovich; the meaning is obvious. It is perhaps the most robust of all of the several hundred biases documented by cognitive science and behavioral economics. And by the way, it is one that, unlike the other biases, is not correlated with IQ. That is, being smart is no protection against the myside bias.

Let me give you an example. I’m going to present a syllogism, and the question is: is the syllogism valid? That is, does the conclusion follow logically from the premises? If college admissions are fair, then affirmative action laws are no longer necessary. College admissions are not fair. Therefore, affirmative action laws are necessary. Okay, is this a valid syllogism? The answer is: it is not. It is the fallacy of denying the antecedent. “P implies Q” does not mean that not-P implies not-Q, which is the form of this argument. Now, empirically, a majority of liberals commit the fallacy, while a majority of conservatives are immune to it. If you ask conservatives for the explanation, it would be, “Well, we told you all along: liberals are irrational.”
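
A brute-force truth table makes the invalidity concrete. This short Python sketch encodes “P implies Q” in the standard way, as “not P or Q,” and searches for a counterexample:

```python
# Denying the antecedent: from "P implies Q" and "not P", conclude
# "not Q". The argument is invalid if some assignment makes the
# premises true and the conclusion false.
from itertools import product

for p, q in product([True, False], repeat=2):
    premises = (not p or q) and not p  # P -> Q, and not-P
    conclusion = not q
    if premises and not conclusion:
        print(f"Counterexample: P={p}, Q={q}")
# -> Counterexample: P=False, Q=True
```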

But here’s another syllogism. If less severe punishments deter people from committing crime, then capital punishment should not be used. Less severe punishments do not deter people from committing crime. Therefore, capital punishment should be used. Well, this time a majority of conservatives commit the fallacy, and the liberals turn out to be the logicians. Of course, what’s happening in both cases is that everyone wants to ratify the conclusion that their side endorsed all along, and they don’t follow through the logical implications. Okay, I think that’s one part of the explanation of why people believe weird things. Another is that we are all equipped with some deeply rooted folk intuitions, ways in which we make sense of the world that are probably as old as our species, if not older, and that probably made some adaptive evolutionary sense, such as the intuition of dualism: that every person has a body and a mind.

If you’re a normal person, when you deal with someone else, you don’t treat them as just a hunk of meat. You don’t treat them as a robot. You impute a mind to them, beliefs and desires. Even though you can’t experience those directly, you can’t deal with another person without treating them as someone who has a body and a mind. Well, from there, it’s a short step to imagining that there can be minds without bodies, and so you have beliefs in spirits and souls and ghosts and an afterlife and reincarnation and ESP. Another of these deeply rooted intuitions is essentialism: that living things contain some kind of invisible essence or stuff that gives them their form and powers.

Well, from there, it’s a short step to thinking that disease must be caused by some kind of adulteration or contamination of one’s essence by a foreign substance. And so you get the intuitive resistance to vaccines, as old as vaccines themselves. Because when you think about it, what is a vaccine but taking the very germ that gives you the disease and injecting it into your own flesh? That is intuitively, deeply creepy. You also get receptivity to homeopathy and herbal remedies, which seem to transfer a healthful essence of some other living thing into you. And you get the widespread practices, discovered not just in our culture but in many cultures independently, of purging and bloodletting and fasting and getting rid of toxins, all of which satisfy the intuition of somehow squeezing or purging or expelling the contaminant that made you sick in the first place.

And then there’s the intuition of teleology. We know that our own artifacts, our own plans, are designed with a purpose. We build things and do things in order to achieve some goal. Well, from there, it’s a short step to imagining that the whole world is designed with a purpose. And so you have belief in creationism and astrology and synchronicity and the vague Oprah-esque philosophy that everything happens for a reason. Okay, part three of the explanation: these folk intuitions are unlearned, at least among those who have unlearned them, and objective truths are acquired only by trusting legitimate expertise: scientists, historians, journalists, government record keepers, mainstream authors. Few of us really can justify our beliefs, including our true beliefs. Many people deny the scientific consensus on anthropogenic climate change or on evolution. You ask a scientist why, and the universal reaction is, “Oh, we need more scientific literacy in this country. These people just don’t understand the science.”

That is itself a dogma, an empirical claim without any scientific support, because if you actually survey the scientific literacy of people who believe or don’t believe in evolution, who believe or don’t believe in human-caused climate change, their levels of scientific literacy are the same. If you ask someone who believes in human-made climate change what causes it, they’ll say, “Oh, doesn’t it have something to do with the ozone hole and toxic waste dumps and plastic straws in the ocean?” They’ll have a vague sense of green versus polluting, and that’s good enough for them. And likewise, on tests of understanding of Darwin’s theory of natural selection, people who believe in evolution are equally clueless. What does predict people’s belief in climate change and evolution is their politics, pure and simple. The farther you are to the right, the more you deny climate change and the more you deny evolution. Flaky beliefs persist in people who don’t trust the scientific establishment. Some people do, some people don’t, and that’s what predicts their beliefs.

The final part of the explanation comes in posing the question: why do people believe outlandish fake news and conspiracy theories? Part of the answer is, it depends on what you mean by “believe.” George Carlin once quipped, “Tell people there’s an invisible man in the sky who created the universe, and the vast majority will believe you. Tell them the paint is wet, and they have to touch it to be sure.” So it seems to me that people hold two kinds of beliefs. There’s what I call the reality zone: the physical objects around them, the other people they deal with face-to-face, the memory of their interactions. Here, people treat beliefs as testable and hold them only if there is reason to believe they’re true. They have to: to keep gas in the car and food in the fridge and the kids fed and clothed and off to school on time, you really have to respect the laws of reality, because they are unforgiving. Your life will go much worse if you don’t. And indeed, people are pretty logical in this zone.

Then there’s what I call the mythology zone: the distant past, the unknowable future, what happens in remote corridors of power, the White House, 10 Downing Street, the Kremlin, the microscopic, the cosmic, the counterfactual, the metaphysical. Here, beliefs are held not because they’re provably true or false, but because they’re entertaining, they’re uplifting, they’re empowering, they’re morally edifying. Whether they are true or false is unknowable and hence irrelevant; that’s not why you believe them. And it’s easy to see why people would have this intuition. Until the modern era, with the scientific revolution, the Enlightenment, and modern history and journalism, the truth of beliefs in the mythology zone really was unknowable. You really did not have any basis for understanding what caused disease or what was the origin of life.

And this, I submit, is the default human intuition when it comes to beliefs in these more cosmic, historic, microscopic, macroscopic realms. Our minds are adapted to the vast majority of human history. You don’t even have to think about the hunter-gatherers on the Pleistocene savanna. You just have to think about humanity for 99.9% of its existence, and 100% of it for people who have not signed on to Enlightenment science and history. So what do I mean by beliefs in the mythology zone? Well, religion, which almost by definition is a matter of faith, not evidence. National myths, the glorious martyrs and heroes who founded our great nation; it’s always a nuisance when real historians expose their feet of clay. And historical fiction.

Does anyone really care whether Henry V uttered those stirring words that Shakespeare attributed to him? Or when you watch the TV series The Crown, which purports to be nonfiction, do we really care whether Charles and Di really had the dialogue that the scriptwriters wrote? Some historians called on the BBC to post a warning that many of the dialogues in the series were fictitious. The BBC refused, although they did post a trigger warning about bulimia. Then there are fake news and conspiracy theories, where it’s not clear in what sense the people who believe them believe them. Take, for example, Pizzagate, the predecessor to QAnon, according to which Hillary Clinton ran a child sex ring out of the basement of a Washington, DC pizzeria, which, in fact, turned out not even to have a basement. But what was the typical response of a believer in Pizzagate? An example is to leave a one-star review on Yelp saying the dough was incredibly underbaked and there were some suspicious-looking men giving funny looks to my five-year-old son.

Now, this isn’t the response you’d expect of someone who literally believed that children were being raped in the basement; you’d expect them to call the police. So what does “I believe that Hillary Clinton ran a child sex ring” really mean? What it really means is: I believe that Hillary Clinton is so depraved that she’s capable of running a child sex ring, and how do we know she isn’t? Or probably a more accurate paraphrase would be, “Hillary, boo.” In other words, beliefs can be expressions of moral convictions. Well, bringing it all back home, here’s another candidate for a mythology zone: the sacred beliefs of intellectual elites. Belief in the blank slate, in the permanence of bigotry and misogyny, in the iniquity of the West, in the independence of gender identity from biological sex, in the ubiquity of abuse and trauma. And I’m sure you could extend the list.

It’s not clear that these are treated as empirical hypotheses so much as things that any decent person has to believe. Indeed, the mythological status of these elite beliefs, which I suggest is a human intuitive default, gets additional fortification from epistemologies like relativism, postmodernism, critical theory, and social constructionism, according to which objectivity and truth are mere pretexts for power. That can easily merge with our default intuition that these cosmic moral beliefs are not empirical but are signs of your moral commitment and purity. So this, I think, helps us to understand the complete mutual unintelligibility between Enlightenment liberal science, according to which hypotheses about the causation of major events are in principle testable, there are ways of finding out, and one ought to ground one’s beliefs in the best evidence, and the family of beliefs of postmodern critical wokeism, according to which it just doesn’t compute that these are things whose truth or falsity we could and ought to determine.

Now, I don’t want to let the right wing off the hook, because there are counterparts to this skepticism about our ability to ascertain the truth of politically tinged statements in right-wing populist post-truth rhetoric, according to which intellectual elites on the one hand and strong, authentic leaders on the other are mere rival claimants to prevailing beliefs. There’s no fact of the matter as to whether things are true; it’s just a matter of which powerful elite you align with. And “flooding the zone with shit,” as Steve Bannon admiringly put it, can be a legitimate mode of discourse to undermine elite claims to truth, leaving the claims of the charismatic authoritarian leader as the only thing that people can put stock in. We’ve seen another example of this in Vladimir Putin saying things that are patently preposterous, such as that he’s rescuing Ukraine from Nazis like Volodymyr Zelenskyy, where it’s not clear how much he expects people to believe it as a verifiable proposition. Rather, by flooding the zone with manure, he undermines the credibility of any belief, leaving the official doctrines of the charismatic leader as the remaining default.

Bertrand Russell once said, “It is undesirable to believe a proposition when there is no ground whatsoever for supposing it is true.” And I used to think this was, well, obvious, trite, banal. It turns out that this is a radical, unnatural manifesto. If it seems banal and trite and obvious to you, then you are a child of the Enlightenment who has signed on to a belief that is quite unusual in human history. Well, how can we become more rational? A starting point is that the tools of formal rationality, the normative models of logic, probability, and game theory, should become second nature. Rationality should be the fourth R, together with reading, writing, and arithmetic, as captured in a headline from the satirical newspaper The Onion: “CDC Announces Plan To Send Every US Household Pamphlet On Probabilistic Thinking.” However, it is a depressing truth, sooner or later understood by every teacher and every university professor, that most students forget most of what they are taught as soon as the ink is dry on their exam. So it can’t just be that there is a course on critical thinking and then you’re done with it.

Norms of rationality should just be part of our conventional wisdom, our norms of propriety in any kind of intellectual exchange. There should be widespread awareness of common fallacies like the availability bias, basing estimates of risk and probability on anecdotes rather than data; the myside bias; arguing ad hominem; and so on. There should be respect given to basing your beliefs on evidence and changing your mind when the evidence changes. This, too, is a highly exotic stance. Most people treat beliefs as signs of fortitude and integrity; people will actually say that it’s important to stick with your beliefs even when the evidence goes against them. That’s a human tendency we should try to push back against.

And there should be an understanding that, because humans are fallible, conjecture and refutation, offering a hypothesis and subjecting it to criticism, to attempts to falsify it, are the only routes to knowledge. No one is infallible; no one is omniscient. The only way that our pathetic species can approach the truth is by broaching ideas and determining whether they are true or false, with obvious relevance to freedom of speech and inquiry. Namely, when we suppress free speech, we are disabling our species’ only mechanism for ascertaining the truth. Which leads to part three of how we can become more rational. This gets back to the paradox of how we have collectively managed to attain such astonishing feats of rationality even though each one of us is vulnerable to fallacies and biases: we get together and form institutions with rules explicitly designed to promote rationality.

The general idea is that one person can notice and make up for another person’s biases. Each one of us thinks that we are correct; we are victims of motivated reasoning and myside bias. But we’re much, much better at spotting the flaws in someone else’s argument, and that is something we can put to use. It makes us collectively more rational than any of us can be individually. Just to be concrete, remember the card-selection task, “if D, then 3”? And remember that the typical performance of a member of our species is not a pretty sight: about 1 out of 10 gets it right. Put people in small groups and ask them to solve the problem, and 7 out of 10 groups get it right. All it takes is for one person in the group to spot the correct answer, and they almost always convince their group mates of it.

What do I mean by rationality-promoting institutions? Well, an obvious one would be science, with its demands for empirical testing and peer review. Democratic governance, with its checks and balances and open debate. Journalism, with its demands for editing and fact-checking. The adversarial proceedings of the judicial system. Even Wikipedia, which to everyone’s surprise has turned out to be quite reliable, has achieved that with a community of editors who can correct each other’s work, all of whom have to sign on to commitments to neutrality and objectivity. Compare that, unfavorably, with most other electronic sources of information, which don’t have that commitment or those feedback mechanisms. And of course, kind of, ish, academia, with its commitment, when it does have that commitment, to freedom of inquiry and open debate.

What this means is that if we have any hope of enhancing rationality, we’ve got to secure the credibility and objectivity of these rationality-promoting institutions. Self-appointed or otherwise-appointed experts should be prepared to show their work and not deliver pronouncements as if they were just another priesthood. Fallibility should be acknowledged. We all start out ignorant of everything, and that should be acknowledged, so that if advice does change in response to evidence, that’s not a sign of weakness or a lack of expertise, but quite the contrary: that is our only way of determining what the truth is.

And of course, gratuitous politicization should be avoided. We should not brand issues with the tribal identity of the left or the right if we hope to bring out the greatest possible rationality, because myside bias is one of the greatest threats to rationality. So climate change, vaccines, and public health measures should not be branded as left-wing causes, and academic freedom should not be branded as a right-wing cause. This is a point made by others, including Nadine Strossen: we should have at our fingertips all of the examples in which free speech has been essential to progressive causes, such as the abolition of slavery, the civil rights movement, and opposition to the war in Vietnam. We should not confine ourselves to examples where it’s been people right of center who have depended on free speech. Indeed, given the literature on human biases and fallacies, this conclusion is important enough that I will say it again: academic freedom should not be branded as a right-wing cause.

Speaker 1:

Thank you so much for this wonderful lecture. The question I have is, how do you talk to manifestly irrational people? Is there a strategy?

Steven Pinker:

How do you talk to-

Speaker 1:

To manifestly irrational people.

Steven Pinker:

Manifestly irrational people. Okay. Well, it’s a little bit like the question, how would you convince the pope that Jesus was not the son of God? The answer is, you can’t. You can’t convince everyone of anything, but there are people who are not yet committed to one side or the other. There are new babies born all the time who aren’t born believing in QAnon, and you can try to persuade them. You can also try to mobilize some of the, perhaps, illicit contributors to belief just to right the scale. In the case of climate change, for example, I think it’s extremely important to get people on the right to endorse the very idea that there is human-made climate change, and to emphasize possible ways of dealing with climate change that don’t feel too lefty-greeny, like a carbon tax, like nuclear power.

Not necessarily because they are the best responses. I happen to think they are, but the point is that it shouldn’t feel like, if you believe in climate change, then you’ve got to endorse undoing economic growth or heavy-handed government regulation. Try to separate the possible solutions, with their political colorings, from the empirical hypothesis that there is a phenomenon in the first place. And there are other ways in which you could try to bend opinion, through persuasion ideally, but also through association, realizing that you’re not going to be able to convince everyone of everything.

Speaker 1:

So it’s not hopeless.

Steven Pinker:

So I think it’s not hopeless.

Speaker 3:

Steven, can you comment on what it is you feel has caused this resurgence of, perhaps, irrationality in recent years? Speaking as someone who doesn’t know anything about psychology, I’ve heard the phrase “mass psychosis” bandied about. Could you explain that to us, and does it have a role?

Steven Pinker:

Yeah, so we shouldn’t take it for granted that there has been an increase in irrationality, because when it comes to domains other than people’s day-to-day lives, it’s the human default. We’ve always had conspiracy theories; they’ve resulted in wars and pogroms and genocides. We’ve had fake news about paranormal phenomena; it’s called holy scripture. We had journalism in the 19th century that, before the newspapers got their act together, was almost entirely fake news: extraterrestrial civilizations, sea monsters, two-headed babies. Abraham Lincoln had seances in the White House to commune with the spirits of dead ancestors. Many scientists were dualists; they convened seances. So there’s a lot of background irrationality. The way I think about it is that we’re living through an era of irrationality inequality.

At the top end, we’ve never been more rational, in the sense that we’ve applied tools of rationality to domains that were formerly confined entirely to intuition and hunches and tradition: everything from evidence-based medicine to data-driven policing to moneyball in sports, where now every sports team has a statistician, to feedback-informed psychotherapy. And that’s to say nothing of the advances in technology and biotechnology. So at the top end, we’re getting very, very rational, but, as is the human default, many of us continue to indulge in this fantastic mythological thinking. And probably the internet and social media have helped. Formerly, there were supermarket tabloids with sightings of Elvis and babies born talking, but such stories can clearly propagate more readily now. Even though social media do play a role, I think it would be unwise to attribute it all to social media; that’s just a little too pat. There were other democratized media, like AM talk radio and cable news, prior to social media.

There’s also the fact that we are more residentially segregated by education and social class. You have college-educated people staying in college towns or big cities, leaving behind rural, outer-suburban, and exurban areas, and they’re less likely to rub shoulders, so they’re more likely to develop polarized beliefs. There are fewer institutions that cut across social classes, including classes with different levels of education: churches, bowling leagues, the armed forces. And so there’s more segregation into self-reinforcing echo chambers.

And also, with growing overall affluence, people differentiate themselves in status competition not by flaunting luxuries, as they did during the Gilded Age; it would be tacky now to wear an ostentatious fur coat or a big diamond pinky ring. Instead, they differentiate themselves by their beliefs. People’s beliefs become more central to their identity. They care more about their beliefs about equity and justice and meaning and purpose than about just keeping a roof over their heads and food on the table. That’s a general trend in modernity: beliefs become a greater concern. And that can include beliefs that differentiate your class, your kind of people, from the kind of people you don’t want to be associated with. And so you get a proliferation of beliefs for their own sake.

Christopher Nadon:

My name’s Christopher Nadon, and I’m hoping you can diagnose my fallacy. I didn’t know that free speech was a right-wing or a left-wing issue, but you say that believing it’s a right-wing issue is a fallacy. I take it that doesn’t mean that believing it’s a left-wing issue is true. That’s also a fallacy, right?

Steven Pinker:

Yeah.

Christopher Nadon:

Okay. So I’m good okay? So far I’m okay?

Steven Pinker:

So far so good.

Christopher Nadon:

Okay.

Steven Pinker:

Yeah, because speech is what you do to establish anything about anything. So almost by definition it’s not politically fallacious.

Christopher Nadon:

That’s what I was hoping, okay. It’s been a while since I’ve been in that logic class. Actually, I don’t think I was ever in a logic class, as I’m about to demonstrate. Now maybe here’s the fallacy. So it’s a 50/50 chance as to whether it’s the left-wing fallacy or the right-wing fallacy that I’m going to come to. Does your understanding of rationality help us understand why so many people today, at least on American campuses, believe that it’s a right-wing cause?

Steven Pinker:

Well, partly because, as we’ve heard from a number of speakers, a lot of elite zones of discourse, particularly universities, have drifted, or perhaps lurched, leftward. And many of the attempts to impose a consensus of elite beliefs consist of punishing people who don’t have hard-left beliefs, which have become the reigning belief system, equated with decency, in these institutions that have lurched leftward. But as Greg Lukianoff has shown, in reality there are, of course, also threats from the right. When I was at MIT for many years, the president, Chuck Vest, once told me that he would get, every week, some enraged alumnus insisting that he fire Noam Chomsky. So the pressure can certainly come from the right as well.

Speaker 5:

I’m surprised it was only once a week.

Christopher Nadon:

Two questions. Again, thank you. But you’ve just said that it’s seen as a left-wing cause because that’s what left-wing people do, and I haven’t quite figured out, I mean, isn’t that a circular explanation? Because the university is left wing, the belief has been adopted; but how did that happen, I guess? Can your understanding of rationality help us understand how that happened?

Steven Pinker:

So wait, why universities and other elite institutions tend to drift leftward? Is that the question?

Christopher Nadon:

Sorry.

Steven Pinker:

That is the question or not the question?

Christopher Nadon:

I’d like to let someone else speak.

Steven Pinker:

Okay.

Eric Rasmussen:

Two short questions, one on style, one on substance. Eric Rasmussen, Academic Freedom Alliance, paying for my t-shirt. The style one: is there some psychological reason for using the colors at the end of the sentences? Will we understand it better? The substance one: as an economist, I shouldn’t say this, maybe, but most people aren’t very smart, so wouldn’t it be more rational for them to concentrate on deciding whom to trust rather than trying to think logically, which they’re really bad at?

Steven Pinker:

Yeah. The answer to the first question is that I, perhaps foolishly, tried to replicate in the graphics of the presentation the color scheme of the jacket of the book Rationality, which had a fade from warm to cool colors. As for the other: well, it’s not just that most people aren’t smart. I guess, by definition, half the people are below average in intelligence, and the majority of people are not in the highest quantile. But it’s not just the people who aren’t so smart; none of us can have expertise in everything. So we all do have to trust authority to a certain degree. What we ought to do is calibrate that trust according to the earned credibility of those institutions. That is, don’t trust someone who just says “I’m an expert,” but look at the basis on which they make claims to expertise.

Namely, are they part of an institution whose rules tend to push toward truth and militate against error, such as peer review, open criticism, fact-checking, a reputation for accuracy, error-correction mechanisms? Which is, needless to say, one of the reasons we all should be defending academic freedom. Not only is it the only route to truth that exists for us, and perhaps for any cognitive agent short of being vouchsafed the truth by an angel: the only way to achieve truth is to try out ideas and then allow them to be evaluated. But also, in order to earn the credibility of the public, we’ve got to give them reason to believe that our newspapers, our government agencies, our universities, and above all our professional societies really are run by rules that tend to push them in the direction of truth, which is why the suppression of speech, the punishment of dissent, is so corrosive.

It gives people reason to distrust those institutions, although I think they go overboard. People say to me, “Well, you say in one of your books that there’s a scientific consensus that human activity is warming the planet, but why should I be impressed by the scientific consensus? Everyone knows that if you were to voice a dissenting view in a university, you’d be canceled. So I’ll just blow off the consensus.” Now, that is an overreaction, because in general, for all of the horrors that we’ve heard about over the last couple of days, still, on average, you’d be better off trusting something that came out of a university than a randomly selected Facebook page or tweet. And it’s important that we keep things in perspective. There are horror stories; there are systematic areas in which there has been error. But we need the control group, and if the control group is a community with no review, where anyone can say anything, well, we actually know what that control group is: it’s Twitter and Facebook.

So I’m saying it’s an overreaction, but nonetheless, it remains true that our greatest imperative for preserving academic freedom, and the impression of academic freedom, is to earn the trust of people in institutions like scientific societies, the mainstream press, and universities.

Eric Kaufmann:

All right, yeah. Eric Kaufmann, Birkbeck College, Department of Politics. My question is this: it seems to me that rationality fits into your mythological category. In other words, whether I believe in it or not doesn’t really affect my life too much, unlike, say, burning myself on a pan. So the natural follow-up question is, does rationality need something like a cancel culture to survive and to thrive, or a set of myths behind it? Because rationality alone: you’ve laid out some very rational reasons to believe in it, but that’s not necessarily going to make it win. What are your thoughts on that?

Steven Pinker:

Well, no, because one of the great trends of modern life is that the tools of rationality really can lead us to live better lives. That is, you really are better off if you get vaccinated. You are better off if you fasten your seatbelt, if you worry about falling off a ladder more than about getting eaten by a shark. And there are actually data behind this: people who are less susceptible to the classic fallacies in the judgment-and-decision-making literature, the availability bias, base-rate neglect, the sunk-cost fallacy, all of the standard behavioral-economics and cognitive-psychology tasks, on average actually do have better life outcomes. They’re less likely to get sick, less likely to get fired, less likely to have accidents. So we’ve got these tools, and there’s a reason to endorse rationality: not to rely on your own experience, but to avail yourself of our collective expertise in your own life decisions.

Eric Kaufmann:

But just very quickly: as a project, rationality as an ideology, let’s say, for its success in an evolutionary battle against other ideologies, surely needs to have something emotional behind it to compete. I mean, isn’t that part of what’s going to make it win in a purely evolutionary sense?

Steven Pinker:

Well, I think there is something to it, in that if the aura of rationality became more cool, that would be a good thing. And it might seem like there’s a paradox in arguing for rationality: what are you using to argue for it, if not rationality? On the other hand, that cuts both ways. If anyone casts doubt on rationality, you can always say, “Well, is what you just said rational? And if it isn’t, why should I believe it? Why should you expect us to believe it?” As soon as you say anything about anything and hope to persuade or argue, you’ve already lost any argument against rationality; you’re already committed to it.

Now, how do you make that more intuitive? How do you make it so that it’s more cool or hip? Or, another way of putting it, how do you make it so that it’s embarrassing to commit a statistical or logical fallacy? Well, this is what the so-called rationality community tries to do, at least among themselves. Now, the problem with the rationality community is not the rationality; the problem is the community. They’ve developed their own local mores and norms that are sometimes dubiously rational. But the original idea is that there should be a tacit understanding, standards of what you do and don’t do, of what earns you brownie points within your group, and those should include things like being epistemically humble and having some Bayesian intuitions. If we could export that to the culture at large, it would probably be a good thing for everyone. But I don’t have a recipe for doing that.

Speaker 1:

Unfortunately, [inaudible 00:52:15]

Speaker 10:

Yeah, we are late.

Steven Pinker:

Thank you guys. Thank you, everyone.