Oh yeah, if the Canadians in the crowd start feeling morally superior due to the title, please take note of your own cognitive biases. They are not limited to our southern neighbours!
Here's the link to the original article in Wired:
https://www.wired.com/story/why-pure-reason-wont-end-american-tribalism
WHY PURE REASON WON’T END AMERICAN TRIBALISM
AUTHOR: Robert Wright (@robertwrighter) is an Ideas contributor for WIRED and the author of Nonzero, The Moral Animal, The Evolution of God, and, most recently, Why Buddhism Is True. A visiting professor of science and religion at Union Theological Seminary, he publishes The Mindful Resistance Newsletter.
If you haven’t encountered any reviews of Harvard
psychologist Steven Pinker’s new bestseller Enlightenment Now—which
would be amazing, given how many there have been—don’t worry. I can summarize
them in two paragraphs.
The positive ones say Pinker argues convincingly that we should be deeply grateful
for the Enlightenment and should put our stock in its legacy. A handful of
European thinkers who were born a few centuries ago set our species firmly on
the path of progress with their compelling commitment to science,
reason, and humanism (where humanism means “maximizing human flourishing”).
Things have indeed, as Pinker documents in great detail, gotten better in pretty
much every way—materially, morally, politically—since then. And if we stay true
to Enlightenment values, they’ll keep getting better.
The negative reviews say things like
this: Pinker attributes too much of our past progress to Enlightenment thought
(giving short shrift, for example, to the role of Christian thinkers and
activists in ending slavery); his faith in science and reason is naive, given
how often they’ve been misused; his assumption that scientifically powered progress
will bring happiness betrays a misunderstanding of our deepest needs; his
apparent belief that secular humanism can fill the spiritual void left by
rationalism’s erosion of religion only underscores that misunderstanding; and
so on. In short: In one sense or another, Pinker overdoes this whole
enlightenment thing.
My own problem with the book is the sense in which Pinker underdoes the enlightenment thing. In describing the
path that will lead humankind to a bright future, he ignores the importance of
enlightenment in the Eastern sense of the term. If the power of science and
reason aren’t paired with a more contemplative kind of insight, I think the
whole Enlightenment project, and maybe even the whole human experiment, could
fail.
If you fear I’m heading in a deeply
spiritual or excruciatingly mushy direction—toward a sermon on the oneness of
all beings or the need for loving kindness—I have good news: I’ve delivered
such sermons, but this isn’t one of them. Eastern enlightenment has multiple
meanings and dimensions, and some of those involve more logical rigor than you
might think. In the end, an Eastern view of the mind can mesh well with modern
cognitive science—a fact that Pinker could have usefully pondered before
writing this book.
Pinker’s argument is more sophisticated than some caricatures of it would have you
believe. In particular, he recognizes the big kink in his famously optimistic
take on the future: Though reason can help us solve the problems facing
humankind, our species isn’t great at reasoning. We have “cognitive
biases”—like, for example, confirmation bias, which inclines us to notice and
welcome evidence that supports our views and to not notice, or reject, evidence
at odds with them. Remember how unseasonably warm it was a few months ago? The
answer may depend on your position on the climate change question—and that
fact makes it hard to change people’s minds about climate change and thus build
the consensus needed to address the problem.
Pinker also understands that cognitive
biases can be activated by tribalism. “We all identify with particular tribes
or subcultures,” he notes—and we’re all drawn to opinions that are favored by
the tribe.
So far so good: These insights would
seem to prepare the ground for a trenchant analysis of what ails the
world—certainly including what ails an America now famously beset by political
polarization, by ideological warfare that seems less and less metaphorical.
But Pinker’s treatment of the psychology of tribalism falls short, and it does so
in a surprising way. He pays almost no attention to one of the first things
that springs to mind when you hear the word “tribalism.” Namely: People in
opposing tribes don’t like each other. More than Pinker
seems to realize, the fact of tribal antagonism challenges his sunny view of
the future and calls into question his prescriptions for dispelling some of the
clouds he does see on the horizon.
I’m not talking about the obvious
downside of tribal antagonism—the way it leads nations to go to war or dissolve
in civil strife, the way it fosters conflict along ethnic or religious lines. I
do think this form of antagonism is a bigger problem for Pinker’s thesis than
he realizes, but that’s a story for another day. For now the point is that
tribal antagonism also poses a subtler challenge to his thesis. Namely, it
shapes and drives some of the cognitive distortions that muddy our thinking
about critical issues; it warps reason.
Consider, again, climate change. Pinker is not under
the illusion that many members of his (and my) climate-change tribe are under:
that people in our tribe have objectively assessed the evidence, whereas
climate change skeptics have for some reason failed to do that. As with most
issues, few people in either tribe have looked closely at the actual evidence.
On both sides, most people are just trusting their tribe’s designated experts.
And what energizes this trust? Often, I
think, the answer is antagonism. The more you dislike the other tribe, the more
uncritically you trust your experts and the more suspiciously you view the
other tribe’s experts.
For purposes of addressing this
problem, a key link in the tribalism-to-cognitive-distortion chain is this: The
antagonism is directed not just toward the other tribe’s experts but toward
their evidence. Seeing evidence inimical to your views arouses feelings of
aversion, suspicion, perhaps even outrage.
If you don’t believe me, just observe yourself while on social
media. Pay close attention to your feelings as you encounter,
respectively, evidence at odds with your views and evidence supportive of them.
It’s not easy to do this. Feelings are designed by natural selection to guide
your behavior automatically, without you
reflecting on them dispassionately. But it’s doable.
And, by the way, if you manage to do it, you’re being “mindful,” as
they say in Buddhist circles. Mindfulness involves being acutely aware of,
among other things, your feelings and how they guide your thought—an awareness
that in principle can let you decide whether to follow this guidance.
If earning the label “mindful” isn’t enough of an incentive for you, how about
this: The foundational Buddhist text on mindfulness, the Satipatthana Sutta, says that complete and all-encompassing
mindfulness (of feelings, physical sensations, sounds, and much more) brings
full-on enlightenment—the utter clarity of apprehension that is said to entail
liberation from suffering. So to become a bit more mindful as you peruse social
media is to realize an increment, however small, of enlightenment in the
Buddhist sense of the term.
Or, to translate this back into Western
talk: an increment of making-inroads-against-cognitive-biases. So long as you
remain truly mindful, you will be less inclined to reflexively reject evidence
at odds with your views, less inclined to uncritically embrace—and impulsively
retweet—evidence supportive of your views.
One take-home lesson from this mindfulness exercise is that the term “cognitive bias” is misleading. Confirmation bias isn’t
just a product of the cognitive machinery, a purely computational phenomenon.
It is driven by feeling, by affect. You reject
evidence inconsistent with your views the way you reject food you don’t like or
the way you recoil at the sight of a spider. The thought of embracing unwelcome
evidence makes you feel bad. You may even have an
urge to, in a sense, attack it—find the critical factual error or logical flaw
that you know must be propping it up. Evidence that supports your views is, on
the other hand, attractive, appealing—so much so that you’re happy to
promulgate it without pausing to fully evaluate it; you love it just the way it
is.
This view of cognitive biases is
consistent with a decades-long trend in psychology and neuroscience (a trend
that was anticipated by Buddhist psychology eons ago): the growing recognition
that the once-sharp distinction between cognition and affect, between thinking
and feeling, is untenable; thinking and feeling inform one another in a
fine-grained and ongoing way. I assume Pinker knows this at an abstract level,
but he doesn’t seem to have really taken it on board.
That, at least, could help explain why
his prescriptions for combating cognitive biases sound less than potent.
He wants schools to do more effective
“cognitive debiasing”—to cultivate “logical and critical thinking” by
encouraging “students to spot, name, and correct fallacies across a wide range
of contexts.” Back when I was in high school, we did exercises very much like
this in English class, and they blessed me with an enduring tendency to … look
for such fallacies in arguments made by people I disagree with. Period.
And, actually, human beings are pretty good at that even without special
instruction. The problem isn’t that natural selection didn’t bless us with
critical faculties; it’s that our feelings tell us
when to use those faculties and when not to use them, and they do this in a way
that typically escapes our conscious awareness.
Pinker also has some ideas specifically
geared to cognitive biases that surface in a tribal context. He suggests
changing the “rules of discourse in workplaces, social circles, and arenas of
debate and decisionmaking.” Maybe we can “have people try to reach a consensus
in a small discussion group; this forces them to defend their opinions to their
groupmates, and the truth usually wins.” Or get “mortal enemies” to “work
together to get to the bottom of an issue, setting up empirical tests that they
agree beforehand will settle it.”
These things don’t sound very
scalable—even leaving aside the question of how long any of the supposed
benefits would last in the wild.
I’m not saying these proposals are
worthless. And I certainly agree with Pinker that no student should graduate
from college without learning about cognitive biases. (I’d also encourage
college students to read this book, which, like all of Pinker’s books, is a
model of sharp analysis and clear exposition, an example worthy of emulation
whether or not you agree with all of it.)
Still, if I could implement only one policy to solve the problem Pinker wants to
solve, it would be the teaching of mindfulness meditation in
public schools. One virtue of this approach is that it doesn’t involve
convincing participants to buy into some high-minded goal like collaborating
with “mortal enemies.” Indeed, the practice of mindfulness meditation often
starts as simple self-help—a way to deal with stress, anxiety, sadness. The
path from that to, say, more mindful engagement on social media is, if not
quite seamless, pretty straightforward.
The path to full-on enlightenment is, of course, a bit more tortuous. Happily, the
salvation of humankind doesn’t depend on anyone in the next generation actually
getting there. (I’m agnostic on the question of whether anyone ever has gotten there.) One of the most underappreciated
aspects of full-on Buddhist enlightenment is the sense in which it is a state
of complete objectivity, an absolute transcendence of the perspective of the
self: a kind of view from nowhere. And from the very beginning of a mindfulness
meditation practice, there can be gains along that dimension; as you get less
reactive and more reflective, you can, in principle, get better at objectivity,
bit by bit by bit.
And note one bonus of this approach to
combating cognitive biases. By addressing the antagonism that underlies them,
mindfulness can make direct inroads on the more obvious threat posed by
tribalism—the conflicts that kill people, the simmering tensions that keep them
from getting together and solving problems like, say, the proliferation of
nuclear and biological weapons.
The great Enlightenment philosopher
David Hume—who used careful introspection as part of his methodology—famously
wrote that reason is “the slave of the passions.” Pinker doesn’t quote this
line. He does note that Hume and other Enlightenment thinkers were “aware of
our irrational passions and foibles” and says they saw that “the deliberate
application of reason was necessary precisely because our common habits of
thought are not particularly reasonable.”
But I don’t think the “deliberate application of reason” is by itself up to the
challenge. After all, our minds are designed to delude us into thinking
we are being reasonable when we’re not. It is only when
we make it a practice to look carefully at the mechanics of the delusion—look
at the way affect steers reason, the dynamic Hume so vividly described—that we
have much hope of solving the problem. And if you want to do that, if you want
to actually look at those mechanics and see them at work in yourself, then reason alone isn’t going to do the trick.
If you really want to see these things, I recommend that you start by sitting
down and closing your eyes.
Meeting of the Minds
· Is mindfulness meditation a capitalist tool or a path to enlightenment? Or is it both?
· Either way, it might just be able to save America from its plight of political polarization.
· And speaking of polarization: If you really want to understand people who think differently than you, you would do well to have that conversation offline.