Rank #1: Living With False Prophets
This week, I talk about the doomsday religion that social psychologist Leon Festinger studied in the 1950s. His studies led him to develop his theory of cognitive dissonance. The concept of cognitive dissonance has been with us for a long time (Wikipedia points out an example from Aesop's Fables), but Festinger was the first to coin the term. In doing so, he launched a fascinating line of research.
Festinger used his ideas about cognitive dissonance to predict that the failure of Martin’s prophecy would leave the Seekers conflicted. This conflict would drive them to add consonant elements to their situation, things that would make them feel better about their beliefs. More specifically, he predicted that the Seekers would try to convert others to their cause.
Obviously, people don’t always continue to believe something after it’s been disproved. So what determines the stubborn belief and proselytization shown by the Seekers? Festinger suggests there are five conditions that need to be present.
- A person must be committed to their belief to the point that it influences their actions.
- These actions must be important and difficult to undo (such as leaving your job or spouse).
- The belief must be specific and concrete, so that there can be no doubt when it is proved wrong.
- The belief must be proved wrong, and this disproving must be recognized by the person holding the belief.
- The person holding the belief must have social support, made up of people who believe the same thing.
In discussing Festinger’s theory, researchers Hardyck and Braden (see the references below) say:
“These five conditions define a situation in which the believer has two sets of cognitions that clearly do not fit together. That is, he is experiencing a great deal of dissonance between the cognitions corresponding to his belief and the cognitions concerning the failure of the predicted event to occur. This situation, however, is one in which it is almost impossible for the individual to reduce his dissonance. He cannot give up his strongly held beliefs, and he cannot deny that the predicted event has failed to occur. He is also unable either to reduce the importance of his commitment to his beliefs or to make the disconfirmation irrelevant to them. Therefore, the believer who holds to his belief under these conditions has but one recourse if he is to reduce the dissonance; he must seek new information consonant with his beliefs. One of the best sources of new consonant cognitions is the knowledge that others’ beliefs are the same. The authors suggest, then, that the need for new supporting cognitions will lead the believer to try to convince others of the validity of his beliefs.”
But here’s the really interesting thing: studies of other groups built around apocalyptic predictions have found different patterns of behavior. In the same article quoted above, Hardyck and Braden talk about The Church of the True Word, which predicted nuclear war on a specific date. Its members retreated to bunkers they had built in the desert, in the hopes that they would emerge several weeks later and convert the fraction of the population spared by the fallout.
If you remember anything from history class, you’ll probably recall that nuclear war never did happen. Eventually, the Church of the True Word came out of the ground, excited to be alive. They decided that they had misinterpreted the messages from God, and were pleased that they had passed the test of faith. However, they didn’t try to convert others to their cause, and even got a little camera-shy around all the reporters who came to record their emergence. All this, despite the fact that they met the five conditions that Festinger said would lead to proselytization.
Hardyck and Braden suggest some possible explanations for this. They say that maybe True Word members had more social support to begin with, so they didn’t have as much of a need to convert others. They also point out that the Church of the True Word wasn’t ridiculed in the media the way the Seekers were, so maybe there wasn’t as much dissonance for them. But I think the broader point is that even though the True Word members didn’t feel the need to proselytize, they still needed to do something to resolve their cognitive dissonance. In this case, they decided that they had misunderstood the original prophecy.
There are a lot of other studies that build off of Festinger’s theories, and many that criticize them. This article by Dawson provides a great overview. It’s an important line of research, and not only because doomsday prophecies are surprisingly common. By understanding how people handle conflicting information, we can better understand ourselves and others as we navigate the complications of everyday life.
When Prophecy Fails– Leon Festinger, Henry W. Riecken and Stanley Schachter; Wilder Publications
Prophecy Fails Again: A Report of Failure to Replicate– Jane Allyn Hardyck and Marcia Braden; Journal of Abnormal and Social Psychology
Nov 15 2013
Rank #2: The Case of the Lemon-Fresh Crook
This week, I talked about the incompetent bank robber that inspired the research behind the Dunning-Kruger effect. As I explain in the podcast, the Dunning-Kruger effect describes the inability of stupid people to realize their own stupidity. Or perhaps that’s overstating it a bit. When someone is inept in a particular domain, they’re often unable to realize just how inept they are.
As is often the case in psychology, David Dunning and Justin Kruger weren’t the first people to note the phenomenon they described. There are a number of wonderful quotes about this quirk of the mind from such thinkers as Bertrand Russell (“The fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt.”), Charles Darwin (“Ignorance more frequently begets confidence than does knowledge.”), and even Shakespeare (“The fool doth think he is wise, but the wise man knows himself to be a fool.”). Still, Dunning and Kruger were from a prestigious university, and they formally described the phenomenon and supported it with evidence. So in the end, they’re the ones after whom the effect is named.
One of the compelling things about the Dunning-Kruger effect is that it seems to apply within so many different contexts. In the original paper (linked below), the authors tested the ability to assess how funny a joke was (they compared participant ratings to the averaged ratings of professional comedians), the ability to find grammar mistakes in a piece of writing, and the ability to answer questions of logic. They were careful to design tasks that relied on different kinds of skills for success. The results were strikingly similar in each case: people who did poorly on the tests also did poorly at assessing their own performance. More specifically, the worst performers almost always thought that they were above average.
This is because, according to Dunning and Kruger, the skills that you need to perform a task well are the exact same skills you need to assess your own performance. These participants weren’t deluding themselves, and they weren’t in denial. They simply had a failure of metacognition. Metacognition is usually defined as “thinking about thinking,” and if I understand the Dunning-Kruger effect correctly*, it’s not some separate program running through the brain. It’s simply comparing the skills that you’ve already learned to your actual performance. If you haven’t learned any skills, it’s really difficult to assess how you’re doing.
Dunning and Kruger tested this in their original paper. They gave some of their participants a chance to come back and grade the performance of some of their peers. Then they gave them a chance to re-assess their prior performance.
We reasoned that if the incompetent cannot recognize competence in [themselves or] others, then they will be unable to make use of this social comparison opportunity… We predicted that despite seeing the superior test performances of their classmates, bottom-quartile participants would continue to believe that they had performed competently.
In contrast, we expected that top-quartile participants, because they have the metacognitive skill to recognize competence and incompetence in others, would revise their self-ratings after the grading task. In particular, we predicted that they would recognize that the performances of the five individuals they evaluated were inferior to their own, and thus would raise their estimates of their percentile ranking accordingly.
Their predictions were confirmed. While the highly skilled participants corrected their overly modest self-assessments, the incompetent participants did nothing to check their egos. (The incompetent were also worse at evaluating the tests of others.) They simply didn’t have the metacognitive skill to learn from the examples right in front of them. And that, I think, rather nicely explains a wide range of human behavior.
*If I misunderstood the Dunning-Kruger effect, but wrote a whole blog post about it anyway, the universe would probably explode with irony.
The Anosognosic’s Dilemma: Something’s Wrong but You’ll Never Know What It Is (Part 1)– Errol Morris; The New York Times
Dunning–Kruger effect– Wikipedia
Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments– Justin Kruger, David Dunning; Journal of Personality and Social Psychology
Image compliments of Maarten van Maanen.
Feb 04 2014
Rank #3: Regarding Henry
This week’s podcast is about the amnesic patient, H.M. I’d wager that H.M. (Henry Gustav Molaison) is the single most famous person in neuroscience. Pretty impressive for someone who couldn’t hold down a job after the age of 27. But as much as this is the story of H.M., it is also the story of the brilliant Dr. Brenda Milner. Dr. Milner used the polite, young man with amnesia to teach the world about how memory worked.
You don’t have to look very far before you realize that we’re all saturated with references to patient H.M. and his curiously disordered memory. He’s written about in almost every Intro to Psych textbook. The media makes throwaway references to him whenever it talks about memory research. His story was probably the inspiration for the movies Memento and 50 First Dates. Something about his inability to form new memories, about the way that he was constantly and helplessly living in the moment, seems to have captured our collective imagination.
Information about Dr. Brenda Milner, the brilliant woman who actually put H.M. on the map, seems to be a lot less popular. That’s a damn shame, because in addition to being a genius, Milner seems like a personable, funny and highly quotable thinker. In doing research for this episode, I was reminded more than once of Richard Feynman. Of course, Milner doesn’t have a memoir in which she talks about forays into professional gambling or preparing academic papers in strip clubs. But the autobiographical chapter on her life commissioned by the Society for Neuroscience displays the same honesty and love of science that’s found in the best of Feynman’s writing.
One of the things that makes H.M.’s fame relative to Milner’s so peculiar is that his case is not nearly as unique as most people think. As I mention in the episode, H.M.’s surgeon was Dr. William Scoville. Scoville performed an experimental surgery in which he removed both of H.M.’s hippocampi. He was, by all accounts, surprised to discover his patient’s post-operative memory loss, and he called Milner for her expertise. While she was studying H.M., she discovered that Scoville had actually performed this experimental surgery before, on severely schizophrenic patients. For reasons that aren’t clear, Scoville never followed up on any of these cases. But Milner did.
Schizophrenia care in the 1950s was not a pretty thing. Antipsychotic medication had just been invented, but lots of physicians still used the old treatment paradigm. They drugged the patients into a stupor and strapped them to their gurneys. Sometimes the patients were put into isolation. Some “forward thinking” surgeons, including Scoville, jumped on the frontal lobotomy bandwagon before that proved to be a dead end. Amongst all of this, there was very little patient advocacy. So perhaps we shouldn’t be surprised that a Connecticut neurosurgeon was able to round up schizophrenic patients for an experimental surgery with little scientific basis. And perhaps we shouldn’t be surprised that after months of sending people home with brand new memory problems, he still didn’t know anything was wrong. If it weren’t for Dr. Milner’s follow-up testing, I’m not sure we would know about these anonymous patients at all.
At 95 years of age, Milner is in the (extremely protracted) twilight of her career. But hopefully there are other people waiting in the wings. People who won’t only make new discoveries or keep us fascinated, but who will remind us of the incredible power that scientists wield. Hopefully they’ll remind all of us that such power has to be used in service of others, and for the betterment of the world.
H. M., an Unforgettable Amnesiac, Dies at 82– Benedict Carey; The New York Times
Obituary: Professor William Beecher Scoville– World Federation of Neurological Societies
Still Charting Memory’s Depths– Claudia Dreifus; The New York Times
Understanding the human brain: A lifetime of dedicated pursuit. Interview with Dr. Brenda Milner– Chenjie Xia; McGill Journal of Medicine
Jan 18 2014
Rank #4: A Holy Trinity
This week, I talk about a surprisingly common delusion in which people believe themselves to be Jesus Christ. This is actually a specific version of the more general condition in which a person comes to believe that they’re someone of enormous religious significance, often God himself. And these God delusions can themselves be considered a variety of delusions of grandeur.
We know that religious delusions have been around for a long time. The ancient Roman emperor Caligula, for example, referred to himself as the god Jupiter. A 2012 study from Poland analyzed a random selection of schizophrenic case studies from throughout the twentieth century, and found that religious imagery factored into about half of them. They also found that religious delusions and hallucinations have been decreasing over time, maybe due to a decrease in how religious the surrounding society is. A British study of modern patients suffering from schizophrenia found that only 24% of them had religious delusions. It also found that religious delusions are associated with more severe cases of mental illness. Other differences exist as well. Men seem most likely to think of themselves as God, while women more humbly believe themselves to be saints. The same study found that religious delusions aren’t connected to a person’s religiosity (how devout they are).
So maybe the religiosity of a society, but not the religiosity of an individual, determines the likeliness that a mentally ill person will have religious delusions. Maybe that explains why so many people who visit Jerusalem start believing that they’re Jesus.
Regardless of how one comes to the delusion, the actual experience of believing that you’re Jesus Christ seems to be no less incredible than you’d expect. After interviewing several of his patients who suffered from the condition, psychologist Alan Gettis described it as the ultimate experience of awesome grandeur and terrible persecution, every waking moment, at the same time. One of his patients, Larry, was exhausted by the overwhelming obligation he felt to the multitudes in his flock.
It’s worth mentioning that none of the three Christs I talk about in this week’s podcast were cured by Dr. Rokeach’s experiments. Identity is a complicated thing, and delusional identities even more so. We need to continue researching how the mind goes wrong and how to fix it so we can help people like Larry.
An Essay on Crimes and Punishments– Cesare Beccaria, Voltaire; Google eBook
The Jesus delusion: A theoretical and phenomenological look– Alan Gettis; Journal of Religion and Health
Jesus, Jesus, Jesus– Vaughan Bell; Slate [I owe a huge debt to Dr. Bell’s article, one of the few pieces in the popular press about The Three Christs of Ypsilanti]
The Three Christs of Ypsilanti– Milton Rokeach; The New York Review of Books
Dec 14 2013
Rank #5: The Dreams of Otto Loewi
I’ve been holding onto this story since before I even started this podcast. A professor of mine used the story of Otto Loewi’s dreams (in his version, there were three dreams) to introduce a unit on neurotransmitters. When I sat down to research this week’s episode, I was surprised how much other interesting stuff was in Loewi’s life as well. The experiment he dreamed up got him the Nobel prize, and then the Nobel prize money allowed him to buy his freedom from the Nazis. In fact, the half of the money that went to his research partner and friend ended up coming back to him, and his family survived off of that money while they rebuilt the life that had been shattered by Nazi invasion. So in a way, Otto Loewi had a dream that, years later, saved his life.
It’s even more incredible when you consider that there are about a thousand reasons why Loewi’s experiment shouldn’t have worked, even if his hypothesis about chemical communication was correct. By his own admission, Loewi never would have carried out the research if he had taken time to think about it.
“On mature consideration, in the cold light of the morning, I would not have done it. After all, it was an unlikely enough assumption that the vagus should secrete an inhibitory substance; it was still more unlikely that a chemical substance that was supposed to be effective at very close range between nerve terminal and muscle be secreted in such large amounts that it would spill over and, after being diluted by the perfusion fluid, still be able to inhibit another heart.” (Quoted from Wikipedia, which in turn is quoting Otto Loewi)
I want to clarify something about what Loewi’s research meant. His findings settled a major scientific debate of his day, showing that nerve cells use chemicals, and not electricity, to communicate. But a lot of you reading this have probably heard that nerve cells do use electricity. So what gives? Nerve cells, like muscle cells, are “excitable,” which means that they do something when they’re stimulated. If you zap a bone cell, nothing will happen, but if you zap a nerve cell in just the right way, a little pulse of electricity will run down its length. So Loewi’s contemporaries knew that electricity was involved in nervous system function, and they knew that there were tiny gaps between nerve cells. When they had to figure out how signals got from one cell to another, it seemed rather obvious to most people that the cells would use their electricity to send a little spark across the gap. Otto Loewi demonstrated that electrical signals are only used within individual cells, and that chemicals are used to transmit a signal from one cell to another.
Alas, nature loves exceptions to the rules. We now know that Loewi’s ideas about chemical communication are correct most of the time. In the 1950s, shortly before Loewi’s death, we discovered that some nerve cells are coupled directly to each other, with specialized proteins that anchor them in place. These unions of cells are called “gap junctions,” and they actually do send electrical information from one cell to another.
So don’t be afraid to turn to your dreams for inspiration, but don’t be surprised if people later find that they’re only partially correct.
Clinical Neuroscience– Kelly Lambert, Craig H. Kinsley; Macmillan Publishing
Inspiration From Dreams in Neuroscience Research– D. Todman; The Internet Journal of Neurology
Otto Loewi– Wikipedia
Otto Loewi’s Great Dreams– Renate G. Justin, MD; The Permanente Journal
Dec 06 2013