Politics, Science

Tribal cognition: a few additional notes

In response to my recent post on tribal cognition as a barrier to reason-based political deliberation, a reader draws my attention to a 2012 New York Times Op-Ed in which Cass Sunstein proposes a theory very close to what I called “tribal cognition”:

In the face of entrenched social divisions, there’s a risk that presentations that carefully explore both sides will be counterproductive. And when a group, responding to false information, becomes more strident, efforts to correct the record may make things worse.

Can anything be done? There is no simple term for the answer, so let’s make one up: surprising validators.

People tend to dismiss information that would falsify their convictions. But they may reconsider if the information comes from a source they cannot dismiss. People are most likely to find a source credible if they closely identify with it or begin in essential agreement with it. In such cases, their reaction is not, “how predictable and uninformative that someone like that would think something so evil and foolish,” but instead, “if someone like that disagrees with me, maybe I had better rethink.”

It follows that turncoats, real or apparent, can be immensely persuasive. If civil rights leaders oppose affirmative action, or if well-known climate change skeptics say that they were wrong, people are more likely to change their views.

In fact, a recent Vox interview with Stephan Lewandowsky, author of The Debunking Handbook, suggests that many psychologists have already embraced Sunstein’s proposal: they recognize that the perceived political identity of both messenger and message can influence whether someone is receptive to an evidence-based argument. In other words, it appears that psychologists studying political communication already treat “cultural cognition” and (what I called) “tribal cognition” as distinct, and recognize that both can play important roles in thwarting reason-based deliberation. (Indeed, the idea that people are more open to persuasion by experts they perceive as sharing their values already appears in the Kahan et al. “HPV Vaccine” article from 2008, before the Sunstein Op-Ed!)

Politics, Science

Sources of political disagreement: “tribal cognition” versus “cultural cognition”

Why does the presentation of persuasive evidence — even evidence of a scientific consensus — so often fail to resolve political debates? How is it, for example, that so much of the American public on the right refuses to accept the scientific consensus regarding the causes and risks of climate change?

For a while now, I’ve thought that Dan Kahan’s theory of “cultural cognition” offered the most persuasive answer to these questions. Kahan rejects the idea that the problem lies in Republicans’ lack of information about climate science: offering more evidence isn’t going to resolve the issue at this point, and it might even aggravate the problem.

Rather, Kahan offers empirical evidence that the Republican resistance to climate science is an example of a more general phenomenon: the human tendency to arrive at conclusions that are congenial to our cultural values, and to resist, dismiss, or attack conclusions that threaten our values and identities.

But the more I’ve learned about the specifics of the cultural cognition theory, the more I’ve felt like it leaves something out.

In this post, I’d like to propose a hypothesis that complements cultural cognition’s explanation for the frequent failures of evidence-based discussion to lead to increased agreement on politically charged issues. When I first heard about Kahan’s work, I thought that the theory I’m about to present was what he meant by “cultural cognition.” But as I’ve read more about his work, it’s become clear to me that the idea I have in mind is a distinct one.

I’ll call the hypothesis “tribal cognition.”

Literature, Politics

STEM education, imagination, and political failures

One reader responds to my recent attempts to defend the humanities in terms that those with power over universities might find persuasive:

One thing that strikes me [about arguments to promote STEM education] is that we’ve fallen into the trap of promoting one story so hard that we’ve become blind to its limitations. What do I mean? STEM education is valuable to our economy: it promises to develop the technical skills that our workers need to be competitive as unskilled labor is increasingly automated. But everyone got so caught up in promoting that story that we lost sight of its limits, namely, that the greatest problems facing us today are not engineering problems.

Yes, it’s true that breakthroughs in science and engineering will be required if we’re going to mitigate (forget stopping) the effects of global climate change. But we wouldn’t need those breakthroughs (at least, not as much as we do now) if we’d been able to reach the political agreement that would have been required to implement a policy solution years ago. I hate to repeat a cliché, but the problems of the last thirty years have been failures of the imagination, rather than failures of engineering and implementation, e.g., the slow reaction to the HIV outbreak in the gay community, the consequences of deregulation, the reaction to 9/11, the challenges of creating a social safety net in an increasingly tribal society…

So if the humanities require a defense, part of that defense is certainly that a democracy is not sustained on STEM alone: a competitive economy on a warming planet makes us nothing more than the tribe that’s putting up the most moai on Easter Island. … [T]he very imagination that a democracy needs to survive is what a liberal education, a critical component of which is the humanities, is supposed to foster.