December 29 is one of the bad days in ethics history, beginning with the 1170 murder of England’s Archbishop Thomas Becket, slain by four knights of King Henry II as he knelt in prayer in Canterbury Cathedral. The knights were not explicitly ordered to kill Becket, the King’s friend who had become a problem when he took his role as Archbishop of Canterbury to be a calling to defend the Church against royal efforts to constrain its power. Instead, Henry made his wishes known with a public plea to his court,
“What a parcel of fools and dastards have I nourished in my house, and not one of them will avenge me of this one upstart clerk.”
This is often quoted as “Will no one rid me of this troublesome priest?” Either way, the idea of such an oblique request is to relieve a leader of responsibility for the actions of subordinates, giving the leader plausible deniability. It didn’t work for Henry, but it may have worked for President Obama, for example, whose Internal Revenue Service illegally sabotaged Tea Party groups in advance of the 2012 election, greatly assisting Obama’s efforts to defeat challenger Mitt Romney. In truth, when a powerful superior makes his or her desires known, it may as well be an order. An order is more ethical, however, because it does not require the subordinate to take the responsibility upon himself.
But the worst example of a U.S. ethical breach on this date is the Massacre at Wounded Knee in 1890, when the U.S. Cavalry killed at least 146 Sioux at the Pine Ridge reservation in South Dakota. It is surely the most people ever killed because of a dance: the government was worried about a growing Sioux cult performing the “Ghost Dance,” which symbolized opposition to peaceful relations with whites and was seen as inciting violence. On December 29, the U.S. Army’s 7th Cavalry surrounded a band of Ghost Dancers under the Sioux Chief Big Foot near Wounded Knee Creek and demanded that they surrender their weapons. A fight broke out between an Indian and a U.S. soldier, a shot was fired, and an unrestrained massacre followed. An estimated 150 Native Americans were killed (some historians put the number at double that), nearly half of them women and children; the cavalry lost only 25 men. Many believe that the tragedy was deliberately staged as revenge for Custer’s Last Stand 14 years earlier, which seems like a stretch to me.
Facebook is so out.
“Meh. Look at this neat picture of my dog!”
Ethics Strike One was the research itself, which used Facebook’s own trusting users as guinea pigs in a mad-scientist experiment to determine whether their moods could be manipulated by secretly managing the kind of posts they read from Facebook friends.
Ethics Strike Two was the lack of its subjects’ informed consent for the study, violating the basic standards of human-subject research. A boilerplate user agreement that makes a vague reference to using data for “research” in no way meets the requirements of informed consent for this kind of study.
This brings us to Ethics Strike Three. In justifying the legality and ethics of the research, Facebook’s researchers explained that leave to perform such experiments was consistent with the user agreement (see Strike Two): “[the experiment] was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.” As I pointed out above and in my previous post on this topic, this isn’t informed consent as the research field and various ethics codes define it. But even if it were, this statement is a lie.
Facebook apparently has been manipulating the feeds that some users get to see in order to measure how the content affects the tone of their own posts.
You can read about the research here; I’m not publicizing it, because Facebook’s research is an abuse of users and their trust. I don’t mind them reading my posts, for they own the service, and the service is in their name. I assume they will use my data and content to make money, but I didn’t agree to allow them to manipulate me, or what I write, feel, or think. I’m also not especially optimistic about the uses to which the results of such research might be applied.
The researchers claim that the research is ethical because a computer program scanned for words considered either “positive” or “negative”; the Facebook content wasn’t actually read by a human. Facebook’s terms of service state that user data may be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
Since Facebook users agree to the terms of service, the researchers argue that this constitutes “informed consent” for their experiment.