Back in April, I hailed the founding of The Analyst Institute as a sign of the experimental revolution that is taking place in campaign tactics. On Friday, Pollster.com's Mark Blumenthal reported on a press briefing by TAI's Todd Rogers describing the "record use" of experiments in 2008 by Democratic campaigns that were seeking "to figure out exactly what impact their voter contact activities were having":
The Catalist data make a strong case that Obama gained most among the voters that the Obama campaign and its allies targeted. But a complicated question remains: Did the field and media campaign activity cause the shift, or were the campaigns effectively piling on among voters that were the most likely to shift anyway? That question, as the Catalist analysts concede, is more difficult to answer with observational data, although within a few months they will add to their database records of which individuals actually voted in 2008, allowing for more analysis of which efforts helped boost turnout and which did not.
Of course, as all pollsters know, proving that sort of causation with this sort of data analysis is very difficult if not impossible. What works better are "randomized controlled experiments" that compare how randomly sampled voters exposed to an experimental "treatment" (in this case various campaign activities) compare to randomly sampled voters in "control groups" with no such exposure. Nine months ago, Brendan Nyhan blogged about the founding of a new Democratic organization called the Analyst Institute, directed by a Harvard PhD named Todd Rogers. This development, Nyhan wrote, signaled that political operatives were "finally catching on" to the experimental work by Yale's Alan Gerber and Donald Green on the effectiveness of campaign techniques.
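To make that logic concrete, here is a minimal sketch (in Python, with a hypothetical member list and a hypothetical `control_fraction` parameter, not any campaign's actual procedure) of the key step: randomly holding out a control group from a contact list before any outreach begins, which is what later licenses a causal comparison between contacted and uncontacted voters.

```python
import random

def assign_treatment(voter_ids, control_fraction=0.1, seed=2008):
    """Randomly split a voter file into a treatment group (to be contacted)
    and a held-out control group (no contact), before any outreach begins."""
    rng = random.Random(seed)
    shuffled = list(voter_ids)
    rng.shuffle(shuffled)
    n_control = int(len(shuffled) * control_fraction)
    control = set(shuffled[:n_control])
    treatment = set(shuffled[n_control:])
    return treatment, control

# Illustrative only: hold back 10% of a made-up member list as the control group.
members = [f"voter_{i}" for i in range(100_000)]
treatment, control = assign_treatment(members, control_fraction=0.10)
print(len(treatment), len(control))  # 90000 10000
```

Because assignment is random, any later difference between the two groups can be attributed to the contact itself rather than to who was targeted.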
Yesterday, Rogers confirmed Nyhan's intuition. He drew back the curtain and provided a few examples of what he described as a "record use" of controlled experiments by the Democrats in 2008, used as they "had never been used before . . . to figure out exactly what impact their voter contact activities were having."
One such experiment involved post-election survey work conducted in 11 states by the Service Employees International Union (SEIU) on both experimental and control groups of their members. In this case they held back a random sample "control group" of voters who received no contact from SEIU during the campaign. They then surveyed both the control group of non-contacts and a random sample of all the other voters who received campaign mail and other contact by SEIU.
What impact did the "hundreds of thousands" of targeted contacts SEIU made during the election have in "actually changing support for Obama?" According to Rogers, their post-election survey found the "surprisingly positive effects" illustrated in the slide below. The campaign contacts "undermined McCain favorability, increased Obama favorability" and convinced voters that "Obama was better on jobs, the economy and health care," exactly the messages communicated by the SEIU campaign.
[Slide: SEIU post-election survey results on candidate favorability and issue handling]
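As a rough illustration of how a comparison like this can be analyzed (a sketch with invented survey counts, not SEIU's actual data or methodology), the effect of contact on, say, Obama favorability is simply the difference in favorability rates between the contacted sample and the held-out control sample, with a standard error computed from the two proportions:

```python
import math

def diff_in_proportions(favorable_treated, n_treated, favorable_control, n_control):
    """Estimate the treatment effect as the difference in favorability rates
    between surveyed contacted voters and the held-out control group,
    with an approximate 95% confidence interval."""
    p_t = favorable_treated / n_treated
    p_c = favorable_control / n_control
    diff = p_t - p_c
    se = math.sqrt(p_t * (1 - p_t) / n_treated + p_c * (1 - p_c) / n_control)
    return diff, (diff - 1.96 * se, diff + 1.96 * se)

# Invented numbers, purely for illustration.
effect, ci = diff_in_proportions(favorable_treated=620, n_treated=1000,
                                 favorable_control=550, n_control=1000)
print(f"Estimated effect: {effect:.3f}, 95% CI: ({ci[0]:.3f}, {ci[1]:.3f})")
```

The point of the design is that this simple difference is itself the causal estimate; no statistical modeling of who was likely to shift anyway is required.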
As someone who worked as a Democratic pollster for twenty years (until turning to blogging full time in 2006), I can confirm the unprecedented nature of the experimental work that Rogers describes. Similar experiments had been conducted before (I worked on a few), but these previous efforts were typically sporadic and scattershot. What is different now is both the scale and sophistication of the work and also -- in one of the least understood aspects of campaign 2008 -- the increased cooperation now occurring among Democratic party organizations, campaigns, and consultants to systematically study which campaign techniques work, and which do not.
These tactics will not only revolutionize campaigns but also allow political scientists to answer all sorts of previously unanswerable questions about what works, what doesn't, and why.
Not only is this a welcome advance for campaigns and political scientists, it's a promising new approach for the Zagat Guide.
Posted by: Rob | January 19, 2009 at 10:43 AM
Or have I confused this field study with the Yale study of restaurants?
Posted by: Rob | January 19, 2009 at 10:44 AM
I can't fault Brendan for cheering advances in his field. Still, I wonder about the ethics of scientifically manipulating voters. Fifty years ago, "The Hidden Persuaders" explored how advertising manipulates consumers. Unlike Brendan, that book took a rather negative view of the manipulators.
Posted by: David | January 20, 2009 at 07:32 AM