Yesterday, I asked Facebook to take down a fraudulent page. Within a few hours it was gone, just a shadowy image of a bandaged thumb where it used to be. Very effective. It’s highly likely I get better treatment from Facebook than your average Joe (because, um, they know me). When it comes to pages of questionable taste, Facebook moves a little more gingerly.
Too gingerly for some. There’s a lot of what is commonly called “disturbing” stuff on Facebook. It’s not Reddit or even Tumblr, where the hinterlands are torrid and vast, but for a big corporate entity that prides itself on making the world a better place, it sure hosts a lot of nasty. Let’s not link and bring these pages’ creators more jollies, but if your taste in humor runs to rape, domestic violence, dead babies, 9/11 victims or Oklahoma tornado misery, there’s a Facebook page for you. Generally, Facebook will take down what it deems harmful — hate speech, pornography, calls for violence — but not what it considers controversial. A lot of jokes about beating up women or rape seem to fall under that.
However, after a campaign by a number of women’s groups, which shrewdly targeted female-friendly Facebook advertisers like Dove, Facebook announced a change in policy regarding humor based on violence against women on May 28. “In recent days, it has become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate,” wrote Marne Levine, vice president of global public policy at Facebook. Specifically, Facebook monitors will be trained in this area, women’s groups will be given an easier way to communicate with Facebook HQ, and some of the other anti-hate groups advising Facebook will be asked to offer counsel on violence against women as well.
In the past, Facebook has tried to choke offensive, spammy pages by limiting the number of administrators they can have, or by waiting for complaints. In some instances, this was almost too effective; Facebook’s policy against the display of breasts, in any context, has been widely discussed. It was a particularly sore point in the recent protests: you can’t post a picture of a woman breast-feeding, yet you can post an image of a woman hogtied captioned with a rape joke.
But getting the offensive pages removed is all a bit of a game of whack-a-troll. As soon as one page is taken down, another goes up in its place. So the social-media giant is trying something new: making content creators own their posts. “… if an individual decides to publicly share cruel and insensitive content, users can hold the author accountable and directly object to the content,” wrote Levine.
This idea has two merits. One is that people who are held accountable for what they do and say are less likely to do and say repulsive things. (There are exceptions, of course, but they tend to be professional comedians or radio talk-show hosts.) The second, more crucial for Facebook, is that it no longer has to take the heat for attention-seeking shut-ins with nothing better to do than pretend they’re sociopaths. This is kind of the approach Goldman Sachs took when it divested from BackPage.com, which may or may not have been pimping trafficked women but was definitely a family-unfriendly environment. Only Facebook is divesting from its primary resource: users.
Making people show their faces, however, is a little anti-Internet. For a start, it’s a difficult thing to pull off; it’s not like you can write an algorithm for honesty. If people want to disguise their identity, the Internet can do very little to prevent that. Indeed, for many, anonymity is one of the Internet’s chief thrills. In some countries, the ability to speak freely and not be identified is lifesaving. Moreover, up-and-coming sites like Tumblr and Reddit make no such demands for identification, which may make them more attractive. Facebook says it’s testing the waters. In the meantime, the troll watching continues.