Thanks to Malcolm Gladwell (Blink) and the one-word-titled books he has inspired, we are being exposed to more social science research than ever before, much of it with relevance to ethics. I’ll admit to using some of these studies when they support my point of view, and that is the problem: what such studies supposedly signify often tells us more about the biases of the analysts than about the behavior of the subjects. Two recent studies illustrate the point.
The March 2010 issue of “American Politics Research” has a study by Yale researchers called “Were Newspapers More Interested in Pro-Obama Letters to the Editor in 2008? Evidence from a Field Experiment.” It was supposedly inspired by an incident during the 2008 campaign in which Republicans screamed “Bias!” at the New York Times after it refused to print Sen. John McCain’s op-ed response to a Barack Obama by-lined column. To test the oft-stated accusation that the mainstream media is biased leftward, the researchers randomly sent a pro-Obama or pro-McCain letter to the editor, identically worded except for the candidate’s name, to 100 large newspapers. The results: one-third of the newspapers receiving the pro-McCain letter expressed interest, but only one-fifth of the newspapers receiving the pro-Obama letter did. The researchers reasoned that this showed papers were more likely to express interest in a letter if it went against the position of their editorial page, and since more of the newspapers had endorsed Obama, the McCain letter was more likely to be printed. Thus, they concluded, the GOP was probably wrong about the Times: editorial page editors are driven more by a desire for balanced and contrarian coverage than by sheer bias.
A disclaimer here: I have not read the study itself, only second-hand accounts of it, and since those accounts come from newspapers desperate not to be seen as biased, they may be misrepresenting the conclusions. But if the accounts are accurate, this study proves nothing at all, and tells us absolutely zilch about whether the Times was unfair to John McCain out of ideological favoritism:
1. Turning down a rebuttal op-ed by a presidential candidate is not by any means a fair equivalent to accepting some letters to the editor.
2. The fact that other papers might try to balance their op-ed pages by accepting conservative letters to the editor hardly proves that the Times wasn’t showing bias when it turned down McCain’s op-ed.
3. Using letters to the editor to “balance” op-eds, if that is indeed the intent, is itself an unfair and disingenuous practice. Does a 120-word letter by amateur pundit Joe Blow really balance a George Will column? Is a rant by Moe the Mummer really a fair counterweight to Paul Krugman?
4. Liberal papers have liberal readers, and thus receive more liberal letters to the editor. Since most of the papers, according to the researchers, endorsed Obama, that means they were probably also receiving more pro-Obama letters. The researchers’ pro-Obama letters therefore had to overcome more competition to make it into print than the otherwise identical pro-McCain letters. This alone could account for the larger proportion of McCain letters published.
5. If the letter was generic, vague and unconvincing, as seems likely since it had to fit either candidate’s position, printing one was hardly much of a gift to that candidate’s cause. Indeed, a liberally biased editor might try to achieve false “balance” by printing the weakest conservative commentary available, while being more selective with letters supporting the paper’s point of view. Nor does a hundred papers provide much statistical certainty. Assuming that only one of the letters (and not both) was sent to each paper, only 50 papers received each version. One-fifth of those receiving the pro-Obama letter, or ten, printed it, while one-third, or about seventeen, printed the McCain letter. That’s a difference of just seven letters…not much, and I would argue not enough, to support any conclusions about the presence of bias in the media (a rough check of these numbers follows the list).
6. Finally, isn’t it a bit bizarre to use the fact that the vast majority of the papers studied endorsed the Democratic candidate to argue that there isn’t a liberal bias in the media?
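To put a rough number on the point in item 5, here is a minimal sketch, in Python with scipy, of a Fisher exact test run on the back-of-envelope counts above: 17 of 50 papers interested in the pro-McCain letter versus 10 of 50 for the pro-Obama letter. Those counts are my own approximation of the reported fractions, not the study’s published figures.

```python
# Rough significance check on the back-of-envelope counts from item 5.
# Assumed counts (not the study's published figures): 17 of 50 papers
# interested in the pro-McCain letter vs. 10 of 50 for the pro-Obama letter.
from scipy.stats import fisher_exact

mccain_interested, mccain_total = 17, 50
obama_interested, obama_total = 10, 50

# 2x2 contingency table: rows = letter version, columns = interested / not interested
table = [
    [mccain_interested, mccain_total - mccain_interested],
    [obama_interested, obama_total - obama_interested],
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, two-sided p = {p_value:.2f}")
```

On these assumed counts, the two-sided p-value comes out well above the conventional 0.05 cutoff, which is consistent with the argument that a seven-letter gap across 100 papers is thin evidence either way.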
My analysis is that this was a badly designed experiment with a conclusion unsupported by its results. And, incidentally, tricking newspapers into printing fake letters to the editor is unethical: it squeezes out the real opinions of real citizens and interferes with their efforts to participate in the civic dialogue surrounding a historic election (and, in many cases, to counter the liberal biases of their newspapers). It is especially unethical when the study is as sloppy as this one.
I’ll discuss the second ethics research study in the next post.
I misread the title as Dubious Studies Ethics. I would like to see that guide for several fields.
It would be very, very interesting to see what papers received the letters and what the letters said. It’s kind of like wanting to see the original data on global warming, not just the summary conclusions.
And you’re right: do a study next on what percentage of readers read the letters to the editor as assiduously as they read the op-eds. Kind of like printing a headline retraction on page 18 below the fold, isn’t it?