Citizen Science Musings: It’s Not the End of the World

By Sheldon Greaves

Not that there was much doubt of the outcome, but I did read around to see what various people had to say about the so-called "end of the world" attributed to an ancient Mayan calendar. The result is just like many other such scenarios: with expectations dashed, rationalization sets in to create a new explanation, or even a new reality, to explain why what was certain to happen did not.

Having taught undergraduate classes in Religious Extremism in years past, I find eschatology (the study of the end of the world) intriguing. Lately, I've also been reading about the roles of bias and emotion, and how they interact with our ability to make rational decisions and handle factual evidence.

It turns out that we humans are not nearly as rational as we like to believe. Anyone who works in marketing can tell you that most buying decisions are ultimately driven as much by emotion as by reason, if not more. Our brains are wired to react quickly rather than to stop and think things through. In a survival situation this makes perfect sense. But that reflex does not stop at physical dangers. As one article recently noted, "We apply fight-or-flight reflexes not only to predators, but to data itself." The author, Chris Mooney, continues:

In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers. Our "reasoning" is a means to a predetermined end—winning our "case"—and is shot through with biases. They include "confirmation bias," in which we give greater heed to evidence and arguments that bolster our beliefs, and "disconfirmation bias," in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

All kinds of people are susceptible to this behavior, including scientists, which is why they put such a premium on being careful: not overstating their positions, taking their time, and letting the data speak for themselves as much as possible. Sometimes even that isn't enough. Psychological experiments have demonstrated that moral standards and political ideologies have a great deal of influence over whom people consider to be an expert in the first place. Even the best scientific evidence can be badly misinterpreted if ideology is given too much play.

The recent expectation of the end, and the subsequent disappointment, is a case in point. For some, the sense of disconnect, apocalyptic hope, or alienation from the world was so strong that the vague wording of an ancient text they could not read, let alone understand, became the basis for believing an extraordinary prediction, and no amount of factual evidence could persuade them otherwise. Other studies show a "backfire effect" that occurs when you confront ideologues with facts that refute their claims: they dig in even deeper and cling to their ideas with even greater tenacity.

An explanation of why this happens is beyond the scope of this post, but I mention it as a cautionary tale as we head into the new year. To think like a scientist, one has to strive for a somewhat unnatural self-awareness and a willingness to put aside preconceived notions when the evidence demands it. This, more than anything else, is what defines a scientist to me.

And, from all of us at CSL, our warmest holiday greetings to all of our readers and friends, and our hope for a safe and prosperous New Year. -sg
