Is Facebook’s Influence On The News You Read Too Much?
From trending topics to the news you see, Facebook handpicks what you read. But should they?
In the internet media world there has generally been an understanding of the difference between an outlet and a platform. An outlet publishes its own content that is reflective of its style, outlook, and inherent biases, whether it’s a site devoted to commentary or to straight news. A platform, on the other hand, is a service that disseminates this kind of content, whether by automatic aggregation or user-generated sharing, with no opinion or influence on what is being shared.
For example, while you can read the news on Twitter thanks to its users sharing links there, Twitter itself does not have a say or opinion on what those links are. This is a key distinction in a world where opinion and bias seem to play an increasingly prominent role in the news. To know that an algorithm can be truly neutral in how it aggregates and presents news—in other words, based on popularity rather than on opinion—is a powerful value proposition for tech platforms to offer.
Except when it’s not.
Recently, news broke that former contractors at Facebook who were in charge of its “trending news” section were repeatedly suppressing certain types of articles when deciding what to place in front of Facebook users. According to Gizmodo, these news curators were mostly “a small group of young journalists, primarily educated at Ivy League or private East Coast universities, who curate the ‘trending’ module on the upper-right-hand corner of the site.”
Facebook Schedules The News
One such former curator, whom Gizmodo kept anonymous but said had politically conservative views, was quoted as saying, “Depending on who was on shift, things would be blacklisted or trending. I’d come on shift and I’d discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn’t be trending because either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz.”
In their original report, Gizmodo went on to say that “Stories covered by conservative outlets (like Breitbart, Washington Examiner, and Newsmax) that were trending enough to be picked up by Facebook’s algorithm were excluded unless mainstream sites like the New York Times, the BBC, and CNN covered the same stories.”
What Does This Mean?
If you don’t immediately see how problems can arise when a platform has too much influence over what people read, consider that Facebook’s trending news section, which launched in 2014, is one of the most influential determinants of what Facebook users read at any given point in time. It would be one thing if Facebook were upfront about the fact that its news curation activities reflect the biases of its contractors, but it isn’t. The network itself states that “Trending shows you topics that have recently become popular on Facebook. The topics you see are based on a number of factors including engagement, timeliness, Pages you’ve liked and your location.”
Facebook has changed publishing on the internet in a number of ways, and it continues to do so for the benefit and disadvantage of various groups, depending on whom you ask. Some of those changes may be good for readers yet come at the expense of publications—for instance, Facebook’s Instant Articles offer a better reading experience but no page view for the publication that created the content.
Information Awareness
One of the things that’s crucial in the new mediascape is understanding where your news comes from and who is influencing it. While this has always been a part of quality journalism, it becomes ever more important at a time when publishing and business interests are inextricably linked. As one of the most influential companies in the world, Facebook should be forthright about where it stands on this spectrum.