Facebook plans to have its users rank news sites’ “reliability”.
This decision comes in reaction to the 18-month-long “fake news” scandal and is being promoted as the most “fair and balanced” solution to the issue. The idea is that the community at large would determine which sites are credible and which aren’t, with the lowest-ranking ones somehow penalized, possibly by appearing less often in users’ newsfeeds. Facebook says that it doesn’t want to make this decision itself for ethical reasons, so it’s instead leaving the matter to others, believing that this is somehow a “democratic” approach to the problem. In reality, however, Facebook is simply encouraging “mob rule”: a partisan split among Mainstream Media outlets will probably produce inconclusive results for its experiment, putting the onus right back on the company to make the executive decision itself.
Another peculiarity of this initiative is that actual “fake news” is fairly easy to spot and not all that popular on Facebook anyway. Any story about alien-possessed individuals, devil-worshipping politicians, or ethno-supremacist conspiracy theories is flat-out “fake news”, no question about it, yet people still click on such stories for their own personal reasons, whether out of curiosity or just for a laugh. That being said, what Facebook likely means by “fake news” in this context probably isn’t any of those examples, but is instead a euphemism for editorial positions and analytical interpretations that the company and some of its users disagree with.
Take, for example, a news item about the US’ relationship with the Syrian Kurds. By-the-book “real news” would simply regurgitate a few facts against an extremely broad backdrop without explaining the story’s overall importance, since doing so would technically venture into the subjective realm, though no story is 100% pure journalism without at least a touch of analysis, however indirect and subtle. One outlet might report the story positively as the US issuing a statement of support for its Kurdish allies, while another might take the critical angle that this is implicit evidence that Washington plans to de-facto partition Syria. Both articles might provoke polarized reactions and accusations of “fake news”, when in reality they’re just different interpretations of the same news event, and there’s nothing wrong or unethical about that.
Facebook’s experiment, however, can’t capture that nuance, since it will only ask users about a site’s “reliability”, which is essentially an opinion survey on whether people agree with the editorial angle the outlet in question most commonly takes. If a critical mass of individuals disagrees with RT’s or Sputnik’s approach, for instance, then the company will list them as “unreliable” and possibly suppress their reach in people’s newsfeeds, all because of the “mob rule” that it initiated. Accordingly, the most probable outcome of this endeavor is that Facebook uses the pseudo-scientific “evidence” that it manipulatively produced in order to “prove” that Alternative Media is “unreliable” and therefore “justifiably” subject to de-facto censorship.
Andrew Korybko is an American Moscow-based political analyst specializing in the relationship between the US strategy in Afro-Eurasia, China’s One Belt One Road global vision of New Silk Road connectivity, and Hybrid Warfare.