Boy, there’s a lot of pro-censorship sentiment going around these days. I wonder why?
The latest comes from Facebook, which now is going to attempt to shield us from “hoaxes.” I don’t trust the government to decide what I should read and I don’t trust Facebook to do it either. Nobody should.
Back in the sixties, the economist John Kenneth Galbraith wrote papers and books asserting that large corporations were becoming the new nations and states, and that it was their power, not elected governments, that would decide how we lived. Galbraith wasn’t the best professor I ever had (he was the tallest), and his assertions in this realm were certainly exaggerated, but a lot of what he foresaw has come to pass.

It is true that the First Amendment prohibition against government censorship of expressive speech doesn’t apply to private entities, but it is also true that huge corporations like Facebook weren’t even a twinkle in the eye of the Founders when that core American value was articulated. Any corporate entity that has the power to decide what millions of Americans get to post on the web is ethically obligated to embrace the same balance of rights over expediency that the Constitution demands of the state: free speech over expediency, period, exclamation point, no exceptions. Embodying Clarence Darrow’s statement that in order for us to have enough freedom, it is necessary to have too much, the Supreme Court has even pronounced outright lies to be protected speech.
For this reason, Facebook’s well-intentioned anti-hoax policies—boy, there are also a lot of well-intentioned lousy policies going around these days, applauded for their goals whether they work or not. I wonder why?—add one more offense to core American ideals.
You can read Facebook’s new policy here. The key section:
“To reduce the number of these types of posts, News Feed will take into account when many people flag a post as false. News Feed will also take into account when many people choose to delete posts. This means a post with a link to an article that many people have reported as a hoax or chosen to delete will get reduced distribution in News Feed. This update will apply to posts including links, photos, videos and status updates. Posts that receive lots of reports will be annotated with a message warning people that many others on Facebook have reported it.”
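Stripped of the announcement language, the mechanism described above is crowd-count thresholding: nothing evaluates whether a post is true, only how many users clicked “report.” Here is a minimal Python sketch of how such a system could work; the class, threshold, and penalty factor are all invented for illustration, since Facebook has published none of these details:

```python
# Illustrative sketch of crowd-flag-based "reduced distribution."
# All names and numbers here are hypothetical, not Facebook's actual code.

FLAG_THRESHOLD = 100      # assumed number of reports that triggers action
REDUCTION_FACTOR = 0.2    # assumed cut to the post's News Feed reach

class Post:
    def __init__(self, post_id):
        self.post_id = post_id
        self.hoax_flags = 0       # count of "this is false" reports
        self.annotated = False    # carries the warning banner?

def flag_as_hoax(post):
    """Record one user report claiming the post is false."""
    post.hoax_flags += 1
    if post.hoax_flags >= FLAG_THRESHOLD:
        # "annotated with a message warning people that many others
        # on Facebook have reported it"
        post.annotated = True

def distribution_weight(post):
    """Scale down a post's reach once enough users have flagged it."""
    if post.hoax_flags >= FLAG_THRESHOLD:
        return REDUCTION_FACTOR
    return 1.0
```

Note what is absent: no step in this sketch checks the post against any fact. The only input is the report count, which is exactly why coordinated flagging campaigns can weaponize it.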
“Reduced distribution in News Feed” means that Facebook will decide what content should get maximum circulation and what won’t. Its earlier promise in the policy announcement that “We are not removing stories people report as false and we are not reviewing content and making a determination on its accuracy” is disingenuous. Facebook might be removing them from the content I get to see, robbing me of the opportunity to tell the friend who posted it to grow a brain.
Anyone who reads Ethics Alarms with any regularity knows how much I detest web hoaxes. I view them as the web equivalent of poisoning the water supply, and those sites, like the despicable News Nerd, that intentionally set out to sneak fake stories under the (pathetically inadequate) story-checking radar of legitimate news organizations, are no better than the purveyors of internet viruses. What they do, however irresponsible it may be, is still speech. Like all abuses of speech, the democratic way to respond to it is to rebut it, condemn it, and make the irresponsible speakers objects of contempt and scorn.
Facebook says that satire will be safe, but that “false and misleading” news stories will be the object of scrutiny. Most news stories are false and misleading. A recent Denver Post column about proposed so-called “personhood” legislation stated, “Personhood legislation reflects rigid religious fundamentalist thinking that violates separation of church and state and basic human rights for women.” That statement contains (let’s see…) at least three assertions that I would say are false or misleading: one requires no religious beliefs at all to conclude that life begins at conception; nothing in the law violates the separation of church and state; and there is no fundamental human right to kill other human beings, if the public decides that’s what the unborn are. This fraudulent garbage causes a lot more harm than the kind of fake story Facebook uses as its example (“Scientists irrefutably prove the existence of Santa Claus”), if for no other reason than that you have to be an idiot to believe the Santa hoax, but only ignorant to buy the column’s inflammatory nonsense. (Note: I am not in favor of the personhood bills.)
Deceit and misrepresentation are standard, if unethical, tools of political persuasion. Take last night’s State of the Union speech: President Obama said, “Tonight, we turn the page. The shadow of crisis has passed.” I don’t care which of the many crises you choose to apply that statement to, though he was discussing the economy at the time: it’s utter, utter, utter nonsense, and I pray to the skies that he knows it. Presumably Presidential deceit is immune from limited circulation on Facebook newsfeeds, but why is it any more protected than a typical dishonest meme from the Daily Kos torturing numbers to show that Obama’s budgeting has been a model of fiscal responsibility, or the News Nerd’s latest, claiming that the government is disabling cell phones in moving cars? (Have you split your sides laughing at that one? Let me reiterate: the News Nerd is not satire, as it claims. The News Nerd’s goal is to get news organizations to believe its made-up stories and spread false news.) This is still protected by the First Amendment.
As are all such hoaxes. Facebook needs to let its users learn not to be gullible. One person’s hoax is another’s inspiration.
Back off.

I don’t disagree with your central argument, but I absolutely disagree with your headline. Facebook is not a government entity, nor are its users required to participate. You have the right to delete and block comments here – which isn’t that dissimilar to what Facebook is doing. In fact, Facebook is being less intrusive than you are when you banninate someone (which you have EVERY right to do).
Is this a bad idea? Probably. Unethical? The case could be made. But unconstitutional? I don’t see how you can possibly argue that. The First Amendment reads “CONGRESS shall make no law…” It doesn’t say boo about what private citizens or companies can do.
And by the way – look for a text message from me.
I’ve argued it before, and the argument is this: the Constitution sets out core values for the nation. It tells the government what it can’t do; it tells us what we should do. We should respect the religion of others. We should not take another’s property. We should respect privacy. We should avoid racial discrimination. For the government, this is a legal document. For the public, it’s a statement of ethical principles. Facebook is defying one of those, and thus going against the Constitution.
Unconstitutional.
Would I agree that it’s going against the principles of the Constitution? Perhaps. But the term “unconstitutional” has a very specific legal meaning. I can agree with you that the Constitution may serve as something of a statement of ethical principle for the general population, though this was not its intent.
Perhaps I’m more sensitive to this than many – though certainly not you – in that I was once responsible for moderating roughly a dozen very active forums. There were times when I had to delete posts and, occasionally, block users. Just as I’m sure you’ve seen yourself, such actions result in cries of censorship and violation of first-amendment rights.
As you also know, those protests were bullshit. No one was forced to use our forums; they participated with our permission. We were no more violating someone’s rights to speech than a homeowner does when telling an obnoxious drunk it’s time for them to go home.
Unless I’m misreading something here, it appears to me that you’re essentially arguing that the fact that it’s Facebook we’re talking about – with all its wealth and power – puts it into some sort of different category. But I don’t see how that argument works. Facebook has a long track record of dubious ethics, and I can easily see how this could backfire (example: millions of progressives automatically flagging any story originating from Fox as a hoax).
If Facebook were owned and operated by the government, I’d completely agree that this was unconstitutional. But Facebook is its own entity – a powerful one, certainly. And anyone who doesn’t question their values on a lot of things hasn’t been paying attention. But I think labeling their actions as unconstitutional weakens that very important word, which at its core is a LEGAL concept, not an ethical one.
Millions of so-called progressives will indeed be automatically flagging Fox News stories as hoaxes. Some are already gloating about it online in direct response to Facebook’s new policy.
And as I read the policy, it will be based on complaints.
To be “fair”, I would predict that unethical conservatives, when they get wind of what’s going on, will come around and report every news story from left-wing sources in retaliation. One might expect this to cause the whole news hoax policy to bomb, but I expect Facebook will come up with ways to filter out hoax reports it considers false. How they would possibly do so without introducing their own bias is another question.
Unconstitutional has a legal meaning in a legal context. It has meaning in an ethical context too. The too narrow application of principles set out in the founding documents—I think they were very much intended as societal guidelines, especially the Declaration—is used to justify organizational bad conduct. The laws, like the Civil Rights Acts, were based on the idea that citizens also have an obligation to live by Constitutional principles. Thus private businesses can’t refuse to serve minorities. That’s why I object when businesses say they will only serve Romney voters. It’s un-American.
I never kick anyone off the comments for content, unless they disrupt discourse, if they stay on topic and follow the rules. I even give the racists one free rant. And the reason is the First Amendment. In ethics, it isn’t about the law, but what’s right. The Constitution tells us what Americans believe is right.
I think that’s why he argued that even though it is a private entity, once it is as big as it is, it is ethically obligated to adopt the values of the nation…
Which is why he would also get an out of the nature you protest.
“Facebook needs to let its users learn not to be gullible.” In the meantime, those gullible people re-post and re-post and re-post until a presumption of truth emerges. I’m not in favor of censorship, but I have had to debunk so many posts by friends and relatives since the advent of the internet that some of them think I’m just a busybody killjoy. They don’t WANT to know that the thing they were so proud of reporting out to their e-mail and other social media friends is not true. Some of us will always check stories. Others will just be SO AMAZED/OUTRAGED/DELIGHTED/DEPRESSED/YOU-NAME-IT that they will keep spreading untruths gleefully, gleefully uninformed of the harm they are doing.
I guess I don’t agree. Facebook makes no bones already about carefully curating what its users see in their newsfeed. This is just one more item added to its algorithms that decide whether you should know about cousin Susie’s new baby, or whether someone you haven’t spoken to in ten years “woke up feeling great today!”. Facebook is a vehicle designed to sell eyeballs for advertisers. If fewer people are logging in because of the proliferation of viral hoaxes, then yes, if only for the sake of the bottom line, it will probably filter those out.
If people don’t like the policies, they should take their eyeballs elsewhere, so I do think the “unconstitutional” argument is more than a bit of a stretch. Facebook has never been, nor has it ever billed itself as, a free-for-all speech-wise. It routinely bans and edits photographs and posts that people put up for offensiveness, vulgarity and the like. MySpace had more of a free rein, but people hated it. Apparently they like the constrictions that Facebook puts up. Free market has spoken. Yah!???
Somebody count the rationalizations in deery’s argument. I stopped at four, and I wasn’t through reading.
Eh, no more than you torturing logic to make what Facebook has announced into some Constitutional crisis.
It is weird to me to now decry Facebook’s policy of deciding what you see and don’t see, when their whole business model is based on precisely that. What is the difference between deciding you shouldn’t see a post because they think it is a hoax, and you shouldn’t see a post because they don’t feel the subject matter is germane to you? Has Facebook always been an unethical enterprise, and you just realized it now with the hoax thing, or how is the hoax thing distinguishable from how Facebook runs things in general?
And now come the Logical Fallacies. First, a straw man: nothing, but nothing, in the post or comments suggests or hints that I regard this as a Constitutional crisis.
What their business is based on is irrelevant. It is USED for political announcements; what is written there gets people fired and loses elections. Facebook’s narrow definition of the role it plays in US discourse and society is disingenuous, as is yours.
Hmm, so you don’t have a problem with the hoax thing, if they didn’t use it for political posts?
Like I said above, and what you have failed to answer is: Has Facebook always been an unethical enterprise, and you just realized it now with the hoax thing, or how is the hoax thing distinguishable from how Facebook runs things in general?
No, I have a problem with the hoax thing because I don’t trust Facebook, and because even hoaxes are protected speech, unless they rise to the level of fraud. The whole Obama administration has been a hoax.
Facebook isn’t unethical; Facebook is repeatedly operated unethically, as in last year’s use of Facebook users as unconsenting experimental subjects.
I agreed downthread that Facebook’s unconsenting experiments were unethical, mostly due to the hidden nature of it.
But that isn’t the case here. If you don’t trust Facebook with the hoax thing, why are you trusting it with all the other stories it decides to hide from you before you ever get the chance to see them? All of which would be protected speech, so I am curious as to why the hoax thing has tipped you over the edge. Facebook, with its heavy ban button and hair-trigger post removal policies, has never made much of a secret about how it operates in that realm. As described, the hoax thing seems to be a somewhat milder version of the way Facebook normally removes and hides posts it feels are offensive, ungermane, provocative, etc. It has never been, or pretended to be, a bastion of free speech. Trying to hold it to that standard now seems pointless and futile, not to mention quite contrary to its business interests, regardless of its size. People go to Facebook precisely because it shows them things they want to see, and hides things from them they would prefer not to see. That’s what all of its algorithms are dedicated to doing. If the hoax thing falls under that category, so be it. The article makes it sound like the hoaxes would just be harder to see, but you can still seek them out on Facebook if that is your thing.
I said I didn’t trust Facebook. I didn’t limit that statement. If there was an alternative to keeping up with the theater community, which is the main reason I’m on it, I’d junk Facebook in a minute. You do see the ethical and free speech difference between hiding posts based on various schemes to make me pay more money, or because of the level of “friend,” and the content of the message, with Facebook deciding what is “appropriate,” right?
Maybe not…
You do see the ethical and free speech difference between hiding posts based on various schemes to make me pay more money, or because of the level of “friend,” and the content of the message, with Facebook deciding what is “appropriate,” right?
Except Facebook already definitely decides to upgrade or downgrade posts based on the contents of the post, what you have posted and commented on before, and, somewhat more bizarrely, what other people comment on a particular post. If you use the words “happy” or “congratulations”, it will promote the post more, and even more if people commenting on the post do as well. If people commenting use the words “false” or “untrue”, it downgrades the post, and won’t show it to as many people. That is a very basic Facebook algorithm that people have noticed and Facebook has admitted to in the past. It obviously doesn’t want all of its algorithms known, because it doesn’t want to be swamped with spambots and people gaming the system (without paying to do so).
But yes, like it or not, Facebook decides for you what is “appropriate”, and has done that from the very beginning when you first signed up. This is absolutely no different than what it has always done.
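If the comment-keyword signal described here is accurate, it amounts to a simple weighting rule. A toy Python version, where the word lists, multipliers, and function name are all guesses for illustration (the real algorithm is secret):

```python
# Toy model of the comment-keyword ranking signal described above.
# Word lists and weights are invented; Facebook's actual values are not public.

POSITIVE_WORDS = {"happy", "congratulations"}   # promote the post
NEGATIVE_WORDS = {"false", "untrue"}            # demote the post

def rank_score(base_score, comments):
    """Adjust a post's ranking score from the words in its comments."""
    score = base_score
    for comment in comments:
        # Lowercase and strip trailing punctuation so "Congratulations!" matches.
        words = {w.strip(".,!?") for w in comment.lower().split()}
        if words & POSITIVE_WORDS:
            score *= 1.5   # assumed boost per positive comment
        if words & NEGATIVE_WORDS:
            score *= 0.5   # assumed penalty per negative comment
    return score
```

Even in this toy, the signal keys on words, not on whether the underlying claim is true, which is the objection being debated above.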
1. “It’s always done this way” is not an excuse.
2. It is obviously different. If it wasn’t different, Facebook wouldn’t have announced it as a change.
3. That “happy”/”false” stuff, as I understand it, was part of the criticized unethical research, and was not applied to all users, and has been suspended. If not, it’s still unethical.
4. You are making the rationalization that if one tolerates some unethical conduct, you are bound to tolerate all of it. Wrong.
3. That “happy”/”false” stuff, as I understand it, was part of the criticized unethical research, and was not applied to all users, and has been suspended. If not, it’s still unethical.
As far as I can tell, people using the words “congratulations” while commenting on your post will send it straight up to the top of your news feed. So that is still in effect. It is harder to tell with the “false” thing, but that appears to be in effect as well.
I guess I am having a hard time comprehending your logic. Facebook has always been upfront, prideful even, about promoting and hiding posts based on an algorithm that is mostly secret and constantly evolving. They actually reveal part of their algorithm to the public, and you find that unethical?
You are making the rationalization that if one tolerates some unethical conduct, you are bound to tolerate all of it. Wrong.
I am struggling to see the difference between Facebook hiding and promoting posts based on appropriateness (as decided by them), apparently ethical, and Facebook hiding and promoting posts based on appropriateness (as decided by them), apparently unethical. Nothing has changed. Facebook is still making a semi-opaque business decision in what you see and what you won’t.
As proposed, I don’t see any political or other discrimination, so even then I don’t see how it is unethical. The free speech thing I think is a huge nonstarter, because Facebook would probably not be a very viable, or valuable, entity if it allowed unfettered free speech. Indeed, its whole business model rests on the fact that it charges for speech and bans speech that it feels is inappropriate or does not fit in with its business model. It is an awkward fit, at best, trying to put free speech obligations on a for-profit entity. I really don’t think it works at all in this case.
As with all conduct, there are lines. A hosting service that bans incivility, or inciting violence, or libel, is acting well within its purview, and infringes little on free expression. Hiding content because it’s unpopular (complaints) or “false” (which is a value judgment, not susceptible to control by algorithm) is over the line into unethical conduct. Sure, nobody has to tolerate it—that doesn’t excuse the conduct.
Your argument is essentially “it is what it is”—rationalization #41, The Evasive Tautology.
As with all conduct, there are lines. A hosting service that bans incivility, or inciting violence, or libel, is acting well within its purview, and infringes little on free expression. Hiding content because it’s unpopular (complaints) or “false” (which is a value judgment, not susceptible to control by algorithm) is over the line into unethical conduct.
But Facebook already hides content because it is unpopular or because of complaints. I doubt very much that Facebook is employing people to personally look over the billions of posts that people put up every day and decide which ones to promote and which ones to downplay. That is precisely what the algorithm is for. All Facebook did with this announcement was provide a glimpse of the man behind the curtain, into a slice of how their algorithm works. If too many people game the algorithm so it is no longer useful, they will change it; otherwise it is useless.
But using your reasoning, the whole Facebook business model seems to be unethical from the very beginning, which doesn’t seem to be your argument. This action seems to be directly in line with all their other actions in this area. They want to put things in front of people that are germane and interesting, and keep away things which annoy people and cause them to stop visiting the site. I fail to see how you’ve properly distinguished this action from a business that has been unethical from the very beginning.
You just said the same thing you said before. This is different in kind from what Facebook has been doing. This is making value judgments on truth, not words. Seriously, you can’t see that this is a new and unacceptable level, just as the effort to manipulate Facebook users’ moods was?
“To reduce the number of these types of posts, News Feed will take into account when many people flag a post as false. News Feed will also take into account when many people choose to delete posts. This means a post with a link to an article that many people have reported as a hoax or chosen to delete will get reduced distribution in News Feed. This update will apply to posts including links, photos, videos and status updates. Posts that receive lots of reports will be annotated with a message warning people that many others on Facebook have reported it.”
Facebook doesn’t seem to be making any personal judgments about whether a particular post is false or not, only about how people are reacting to a particular post. If they react in a negative way to a post, then the algorithm shows that post to fewer people. It seems to be more of the same as to how FB works now, except they are acknowledging it, and explicitly describing how this particular aspect works.
You could try to get the theater community onto google+ instead… :p
How’s this list? Some may be incorrect. I am loath to do this, but there is a big caveat at the bottom that applies to all of the identifications:
“I guess I don’t agree. Facebook makes no bones already about carefully curating what its users see in their newsfeed.”
#36. Victim Blindness, or “They/He/She/ You should have seen it coming”
#47 Contrived Consent, or “The Rapist’s Defense,” aims to cleanse unethical conduct by imagining that the victim consented to it, or secretly sought the result of the wrongful act.
“This is just one more item added to its algorithms that decide whether you should know about cousin Susie’s new baby, or whether someone you haven’t spoken to in ten years “woke up feeling great today!”. “
#22. The Comparative Virtue Excuse: “There are worse things.”
#44. The Unethical Precedent, or “It’s Not The First Time”
#29. The Altruistic Switcheroo: “It’s for his own good”
“Facebook is a vehicle designed to sell eyeballs for advertisers. If fewer people are logging in because of the proliferation of viral hoaxes, then yes, if only for the sake of the bottom line, it will probably filter those out.”
#29. The Altruistic Switcheroo: “It’s for his own good”
“If people don’t like the policies, they should take their eyeballs elsewhere, so I do think the ‘unconstitutional’ argument is more than a bit of a stretch.”
#47 Contrived Consent, or “The Rapist’s Defense,” aims to cleanse unethical conduct by imagining that the victim consented to it, or secretly sought the result of the wrongful act.
“Facebook has never been, nor has it ever billed itself as, a free-for-all speech-wise. It routinely bans and edits photographs and posts that people put up for offensiveness, vulgarity and the like.”
33. The Management Shrug: “Don’t sweat the small stuff!”
43. Vin’s Punchline, or “We’ve never had a problem with it!”
“MySpace had more of a free rein, but people hated it. Apparently they like the constrictions that Facebook puts up. Free market has spoken. Yah!???”
34. Success Immunity, or “They must be doing something right!”
CAVEAT
Most of those rationalizations are rationalizations ONLY IF we accept the premise that, as organizations become large enough, monopolistic enough, and endemic enough, changes to their methods and policies can actually be seen to have an impact on people’s lives similar to that of changes to political methods and governmental policies, and that they therefore ought to adopt the ethical framework which limits the government as well. (When I have time, I’ll post on that idea… hopefully soon.)
A concept I have been quietly meditating on for the better part of a year.
Excellent. This is the Tex commentary I am used to. Invaluably educational to me for its insightfulness. Brilliantly caveated. (And I even understand it all, this time!) That poor bastard, Deery…probably doesn’t even realize that Facebook, with its odder-than-ever policies and practices, is imitating so-called legitimate governments, not the other way around.
GOOD job.
“Facebook makes no bones already about carefully curating what its users see in their newsfeed.”
Sure, to the point of trying to manipulate user behavior in unethical experiments. Thanks for the reminder!
Oh, I think that was unethical, because Facebook was trying to hide what it was doing, thereby performing experiments on people who did not have an opportunity to opt out or decline to be experimented on. But as this is properly announced so everyone can be aware, I don’t see the ethical problem. This seems less like an experiment, and more like a pure business decision.
Mob-based fact-checking. Who’d’a thunk it?
Exactly.
When I saw this tidbit on the news this morning, I figured it would last a week. I figure that today, 500,000 people will vote all articles about the State of the Union address as a hoax, as well as ALL articles about Obamacare. As the ideological war heats up, all articles are deemed hoaxes and no news at all is allowed on Facebook. Then, they will rescind the policy.
All Facebook needs to do to make any of these “behind the scenes” decisions for readers ethical is to create a robust filtering system that allows individual users to customize what is bumped up on their feed vs. what isn’t.
As it sits right now, this is like a monopoly on the mail sorting your letters for you and trashing, in advance, the ones it assumes you “shouldn’t” see, because enough other people have already trashed theirs, making the decision for you.
Were Facebook to hand the reins over to the individual user to customize what the algorithm prioritizes/deprioritizes, that would be like allowing the recipient of the mail in the analogy above to sort their own mail by preference.
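That proposal could be as thin as a per-user preference the feed code consults before hiding anything. A minimal Python sketch, with every name and the default threshold invented for illustration:

```python
# Sketch of a user-controlled hoax filter: the reader, not the platform,
# decides whether crowd reports hide a post. All names are hypothetical.

def visible_posts(posts, hide_reported, report_threshold=100):
    """Return the posts a given user sees, honoring that user's preference.

    posts            -- iterable of dicts with a "reports" count
    hide_reported    -- this user's own opt-in to crowd filtering
    report_threshold -- how many reports this user considers disqualifying
    """
    if not hide_reported:
        return list(posts)  # user opted out: the mail arrives unsorted
    return [p for p in posts if p.get("reports", 0) < report_threshold]
```

In the mail analogy, `hide_reported` is the recipient choosing whether the sorting happens at all, and `report_threshold` is how aggressively their own mail gets sorted.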