Mindful : October 2015
of an existing folder, it’s much easier to file—and remember—than if it doesn’t. Similarly, if many Americans had not already been primed with the idea that Obama is an outsider and a threat to “people like them,” the birther and death-panel assertions would not have gained the traction they did.

So now we have widely believed falsehoods. Let’s debunk them. MIT’s Berinsky tried. In a 2015 study, he asked nearly 2,000 US voters whether the 2010 Affordable Care Act (“Obamacare”) established death panels that would decide whether treatment should be withdrawn from elderly patients. Among voters who said they follow political news, 57% said the death-panel claim was untrue, Berinsky reported in the British Journal of Political Science.

Fifty-seven percent might seem like cause to despair (“only 57% knew the truth?!”). But wait, it got worse. When Berinsky showed people information from nonpartisan sources such as the American Medical Association correcting the death-panel claim, it made little difference in the ranks of believers. “Rumors acquire their power through familiarity,” he said. “Merely repeating a rumor”—including to debunk it—“increases its strength” because our fallible brains conflate familiarity (“I’ve heard that before”) with veracity (“...so it must be true”). As a result, “confronting citizens with the truth can sometimes backfire and reinforce existing misperceptions.”

His findings reinforced something scientists had seen before: the “fluency effect.” The term refers to the fact that people judge the accuracy of information by how easy it is to recall or process. The more we hear something, the more familiar we are with it, so the more likely we are to accept it as true. That’s why a “myths vs. facts” approach to correcting beliefs about, say, vaccinations often fails.
Right after reading such correctives, many people accept that something they believed to be true (that the flu vaccine can cause the flu, to take an example from one recent study) isn’t. But the effect fades. Just hours later, people believe the myth as strongly as ever, studies find. Repeating false information, even in a context of “this is wrong,” makes it more familiar. Familiarity = fluency, and fluency = veracity. The Internet, of course, has exponentially increased the amount of misinformation available to us all, which means that we are “fluent” in ever more fallacious rumors and claims.

Debunking faces another hurdle: if misinformation fits with our worldview, then obviously the debunking clashes with that view. Earlier studies have shown that when self-described political conservatives were shown information that Iraq did not possess weapons of mass destruction (WMDs) at the time of the 2003 invasion, they became more likely to believe Iraq had those weapons. Challenging a core conservative belief—that the invasion was justified on those grounds, that the George W. Bush administration was correct in claiming those weapons existed—caused them to double down on their beliefs. It is harder to accept that the report of WMDs in Iraq was false if one supported the 2003 invasion and the president who ordered it. WMD debunking worked, correcting erroneous beliefs, only among opponents of the invasion and others whose political beliefs meshed with the retraction, a 2010 study found.

Now, to switch presidents: relinquishing belief in Obamacare’s death panels challenges the mental model of the president as a nefarious schemer who hates People Like Me. If that’s my cognitive model, then removing the fact (sic) of death panels weakens it. Challenging my mental model makes me have to pause and think: wait, which negative rumors about Obama are correct and which are myths? Easier to believe they’re all true.
Misinformation is sticky because evicting it from our belief system requires cognitive effort. Remember the situation: our mind holds an assertion that likely went down easy, cognitively speaking; we assumed the veracity of the source and fluently slotted the claim into our mental worldview. Now here comes contrary information. It makes us feel cognitively uneasy and requires more mental processing power to absorb. That’s the very definition of non-fluent: the information does not flow easily into our consciousness or memory.

All is not lost, however. In Berinsky’s death-panels study, he followed the AMA debunking with something quite different: quotes from a Republican senator slamming the rumors as a pack of lies. Now 69% agreed it was a fabrication—a significant uptick—with more disbelievers among both Democrats and Republicans. When an “unlikely source” refutes a rumor, Berinsky explained, and the debunking runs contrary to the debunker’s interests (a Republican defending Obamacare?!), “it can increase citizens’ willingness to reject rumors.”

If the most effective way to debunk false rumors is to get a politician to speak against his or her own interests...well, I leave it to you, reader, to decide if, in our hyperpartisan world, this is more likely to happen than pigs flying. ●