In case you haven’t heard, the world is about to end on May 21. According to 89-year-old radio host Harold Camping and his followers, who have been placing billboards and subway ads across the country, people must repent now and “cry mightily” for God’s mercy in order to be “raptured” into heaven on Saturday, or risk being left behind as the world is wracked by earthquakes, plagues and, ultimately, complete destruction.
Psychology is typically lousy at predictions. But it can predict, with the weight of some pretty strong evidence, that if the world doesn’t in fact end this weekend, most true believers of the apocalypse will not lose their religion. Indeed, the failure of the apocalypse to materialize will only strengthen believers’ convictions.
The psychology behind this phenomenon applies to far more than fanatical religious behavior. It also helps explain everything from cults to the insidious influence of drug company paraphernalia.
Previous studies of apocalyptic religions have explored believers’ reactions when the world doesn’t end as predicted. For his 1956 classic, When Prophecy Fails, for instance, Leon Festinger followed members of a UFO cult who believed that they would be rescued by a spaceship on Dec. 21, 1954, after which the world would end in a catastrophic flood.
Before the prophecy foundered, believers had already begun preparing for the world’s end, quietly quitting jobs or school and giving away their money and possessions. Afterward — when their prophet, Dorothy Martin, explained that their faith had been the key to preventing the flood — the followers began a blitz of proselytizing, more convinced than ever that their beliefs were correct.
What accounts for this “irrational” behavior? Shouldn’t the failure of a very precise prediction for which they had made extreme sacrifices have prompted disillusionment and disgust — not greater commitment?
Not according to Festinger’s theory of cognitive dissonance, which predicts that the more we have given and invested in a particular point of view, the less likely we will be to abandon it in the face of contrary evidence. It’s the same cognitive process that kicks in when we are made to behave in ways that are inconsistent with our beliefs; in the face of that disharmony, we often change our beliefs to be congruent with our behaviors and self-perception. Cognitive dissonance is uncomfortable and leads people to seek resolution.
So, if we have demonstrated commitment to something or someone — the way the UFO cult members did — we become even more committed and will go so far as to change our ideas to remain consistent with that commitment and avoid any regret or sense of failure related to earlier choices. In other words, if we’ve paid a lot for a bunch of grapes, we will find them sweet — but if someone else gets them first, we will believe they were probably sour anyway.
Experimental studies and real-world observations have consistently supported this psychological theory. For example, when research participants were asked to perform an embarrassing task — reading explicit sexual descriptions aloud, including obscene words — in order to gain admission to what turned out to be a boring group, they rated the group as far more stimulating than did participants who had paid a lower price of admission. Fraternities and military organizations have used this tactic for centuries, of course; that’s why hazing will probably never be eliminated: it genuinely enhances group commitment.
Similarly, studies have found that when people pay more for a bad wine or meal, they are more likely to report that it was tasty, compared with people who paid less for the exact same crappy dinner or drink. No one wants to be a sucker.
Festinger himself led another study that offers more insight into this phenomenon. He had students do incredibly dull tasks: repeatedly turning pegs or placing spools on a tray. After they were lulled into sufficient boredom, the students were asked to do the researchers “a favor.” Some were offered $20 to do the favor, some were offered $1 and the rest nothing at all. The favor involved trying to convince another student (actually a researcher) that the task was not boring.
Later, another researcher asked the participants what they really thought of the task. Those who had been paid $20 or nothing said it was excruciatingly tedious. But those who were paid $1 mainly reported that it wasn’t that bad. Why? People who had either been paid handsomely or not at all had no reason to rationalize their lies; they knew the task was boring, but they were either getting paid enough to say otherwise or they were doing the researchers a genuine favor.
But those who were paid only $1 were in a bind: they either had to see themselves as having been bought off cheaply or had to convince themselves of what they were telling the other student — that the task was not boring. They tended to fool themselves into believing the lie.
This is why, when proselytizers are sent out to seek religious converts — an undertaking with an extremely low rate of success — the effort still works to reinforce religion. Even if the believers convince no one else to join, the act of proselytizing strengthens their own faith.
Cognitive-dissonance research has since been replicated in numerous settings and variations. And it has a bunch of disturbing implications. One has to do with marketing: people think products are better when they are priced higher. High cost not only makes bad food and wine taste better; it even works with medicines, with studies finding that brand-name drugs have a stronger placebo effect. Another has to do with social influence: if we’ve paid a lot for something, we tend to try to convince others of its value — not because it’s great, but because we don’t want to look like fools.
This is how people wind up pushing their friends and families to take expensive “life changing” seminars. What’s more, the worse they are treated in these seminars, the more likely they are to try to convince others to join. A similar phenomenon happens when teens say that boot camps “saved their lives,” even as research shows such programs don’t reduce bad behavior.
Going back to the brand-name drug example, imagine the situation from a medical researcher’s perspective. Consider a doctor who is paid $1 million to promote a drug company’s products, and another doctor who gets only a few pens and free samples to promote the same drug. Based on Festinger’s findings, the undercompensated doctor — the one who technically would have no “conflict of interest” to report — may actually be more compromised than the one who is paid well to speak on behalf of the product.
Whether or not the world ends on Saturday, the effects described by cognitive-dissonance theory are worth keeping in mind. Incidentally, Camping previously predicted that the world would end in 1994, so his die-hard followers have already been through this process once before.