Facebook’s New Content Policy
Facebook is constantly in the news for flagging content or removing posts without good reason. That may be about to change.
Despite the controversy, Facebook has until now seemed determined to present itself as a technology company rather than a media company. That framing lets Facebook wash its hands of the heavy-duty curation one would expect from a media company; instead, it focuses on building and relying on algorithms that filter and police content without much human involvement.
For Facebook to concede that it IS a media company means wading into ethical territory that can be hard to parse and define. It seems, however, that the company is now starting to see how and where that tech-focused approach falls short, especially when it comes to providing the best experience for its users. Let’s face it: user experience is everything these days.
“Public Interest”
Proof of that realization came recently, when Facebook publicly announced that it would begin allowing certain posts that violate its community guidelines to stay on the social network. The reason? The content is deemed “important to the public interest.”
If that sounds remarkably like what a journalist would do, that’s because it is. As it turns out, with so much content being uploaded to the platform day after day, Facebook has to take a more nuanced role in policing it. The tech-first company was forced to make the move after “repeated criticism of Facebook from news organizations, charities and others over important posts being taken down without notice or the chance to appeal. The chorus has become particularly loud in the past two months and was sparked by the removal of an article illustrated with the iconic Vietnam war photo featuring a naked girl after a napalm attack. It was posted to the site by a Norwegian newspaper.”
Making Exceptions
Joel Kaplan and Justin Osofsky, two of Facebook’s vice presidents, released a statement that, while not saying so directly, seemed to acknowledge that adequately policing the network could not be achieved with an algorithm alone: “Observing global standards for our community is complex. Whether an image is newsworthy or historically significant is always highly subjective. Images of nudity or violence that are acceptable in one part of the world may be offensive — or even illegal — in another. Respecting local norms and upholding global practices often come into conflict. And people often disagree about what standards should be in place to ensure a community is both safe and open to expression.”
Adopting a New Role
So what will be the result for users? The new rules ultimately mean that more images will stay on the platform that would previously have been removed, including images that may not be suitable for work or that some will find offensive. By definition, this means human involvement has to play a role in the judgement call on which pictures stay up and which come down.
While this may seem like a granular change in policy, it’s actually a major concession: it indicates Facebook realizes its view of itself as purely a tech company wasn’t tenable. The company also said it would consult with publishers, journalists, members of law enforcement and others as it develops these policies.
Indeed, content policing and curation have been a core issue for Facebook ever since it became the leading social media network. Its ethos of free speech and connectivity has sometimes seemed at odds with its rather draconian and opaque policies on content curation. The choice appears to be between the inconsistent enforcement that comes from relying on an algorithm and relying less on computers and more on human judgement. What will the company decide?