For those who are interested, I have a new study out in Medical Care titled "The Hazards of Correcting Myths About Health Care Reform" (gated) with Jason Reifler and Peter Ubel.
Here's the abstract (see also this morning's press release):
Context: Misperceptions are a major problem in debates about health care reform and other controversial health issues.
Methods: We conducted an experiment to determine if more aggressive media fact-checking could correct the false belief that the Affordable Care Act would create "death panels." Participants from an opt-in Internet panel were randomly assigned to either a control group in which they read an article on Sarah Palin's claims about "death panels" or an intervention group in which the article also contained corrective information refuting Palin.
Findings: The correction reduced belief in death panels and strong opposition to the reform bill among those who view Palin unfavorably and those who view her favorably but have low political knowledge. However, it backfired among politically knowledgeable Palin supporters, who were more likely to believe in death panels and to strongly oppose reform if they received the correction.
Conclusions: These results underscore the difficulty of reducing misperceptions about health care reform among individuals with the motivation and sophistication to reject corrective information.
And here's the key graph showing how the correction was effective among less knowledgeable Palin supporters but backfired with those who were more knowledgeable:
For more, see my previous articles on misperceptions and factual beliefs:
-Beliefs Don't Always Persevere: How political figures are punished when positive information about them is discredited (pre-publication version) (with Michael Cobb and Jason Reifler)
-Misinformation and Fact-checking: Research Findings from Social Science (with Jason Reifler)
-Why the "Death Panel" Myth Wouldn't Die: Misinformation in the Health Care Reform Debate (ungated copy)
-When Corrections Fail: The Persistence of Political Misperceptions (pre-publication version) (with Jason Reifler)
-The Limited Effects of Testimony on Political Persuasion (pre-publication version)
I wonder which Death Panel Myth Brendan was trying to correct:
1. The ObamaCare bill establishes a government body explicitly named "Death Panel".
2. The ObamaCare bill includes a government body that will decide on a case-by-case basis who will or will not receive life-saving treatment.
3. The ObamaCare bill includes a government body that will decide on an overall basis who will or will not receive life-saving treatments.
4. Financial considerations will force ObamaCare inevitably to include a panel that will decide to limit care in some cases, even if current wording doesn't specifically define such a panel.
I think #1 and #2 are false, #3 is true and #4 is neither true nor false, since it's a guess about the future.
IMHO the vagueness of just what is meant by "death panel" makes this particular myth a problematic choice for the purpose of this research.
P.S. I haven't read the full paper. Perhaps the questions used by the researchers distinguished between various possible meanings of "death panel." If so, then I withdraw this criticism.
Posted by: David in Cal | January 08, 2013 at 09:29 PM
#2 - here's the question we asked (agree/disagree):
President Obama’s original health care reform proposal would have created government panels with the power to deny care to individual elderly patients.
Posted by: bnyhan | January 08, 2013 at 09:34 PM
These results may appear counterintuitive, disappointing, or surprising, but it is actually normal and expected that the "politically knowledgeable" would be less likely to change their minds in the face of fact-checking. These results are almost perfectly predicted by Quasi-Rational or Behavioral Economics (see Kahneman, Tversky, and Thaler). And although the terms didn't exist yet, it's perfectly in line with Festinger's classic "When Prophecy Fails". It's a wonderful demonstration of our inability to ignore sunk costs, and I thank you for conducting it.
A "sunk cost" is any irretrievable expenditure. Rationally, we should ignore any sunk cost. And we have often expected ourselves to behave rationally. But we know now we don't, and this has the kind of policy and debate implications the article points out.
To give a real-world example: I'm going to visit a friend this weekend. Several weeks ago, I purchased my ticket. In the intervening time, other opportunities for entertainment have arisen. If I were a perfectly rational being (the morality of disappointing my friend aside), I should, according to classical economics and, I dare say, the expectations of the authors of this study, base my decision about whether to go or stay on what will bring me the most pleasure this weekend.
But that is not the case. We know, now, that my decision will be highly influenced by the price of my ticket. I paid $3.50 round trip on the bus, and I am likely not to go. Had I paid $250 for a flight, I almost certainly would go.
The crucial point is that the price of the ticket "should" have no bearing on my decision about what to do this weekend. The money (be it $3.50 for the bus or $250 for the plane) has already been spent, is gone, and is irretrievable regardless of what I choose. But it does influence my decision (and I'm even aware of this phenomenon!).
We only have to change the term "knowledgeable" to "invested" to see how this is replayed in this study. A "low information" person has invested less time, thought, and money in their position. To change their mind is the equivalent of forgoing a $3.50 bus fare. But a person who is highly invested, who has spent time and perhaps money acquiring knowledge of the "death panels," will be as beholden to those sunk costs as I would be to a $250 plane ticket. And the more expensive the ticket, the more likely I am to make the trip, no matter what other events have become available if I stay.
And this is exactly what happens in this study.
Posted by: Matt Garbett | January 12, 2013 at 02:58 AM
In a way it's logical that, among those holding false beliefs, the more knowledgeable are less likely to change their minds in the face of fact-checking. The knowledgeable ones have already heard the facts, but they've found a way to rationalize their beliefs anyway. Repeating the facts gives them something they've already seen and (wrongly) discounted. Also, their high level of knowledge may give them more confidence in their (false) beliefs.
Posted by: David in Cal | January 14, 2013 at 11:12 AM