Social media companies are facing convulsive changes that will not please their management and may upset their users. In the US, political pressure is forcing Facebook to prevent Russia from using its platform to influence US elections; in the EU, laws require Facebook to remove hate speech and incitements to violence promptly. Facebook can no longer remain a technical platform that encourages users to post their unfiltered thoughts. In a sense, Facebook is being forced to become a media company that accepts some accountability for screening and editing content posted by its users. As a result, Facebook is looking for strategies to implement major changes in the nature of its operations.

At least two strategies are under consideration. One option is direct, hands-on management of news pages and user posts. Facebook already employs 5,000 “content moderators,” who were presumably behind the shuttering of 135 Russian troll-farm accounts and pages, and it has said it could expand that workforce to 10,000 editors by the end of 2018. The other option is to enlist users in the management of news pages through a user-directed ratings system.

Facebook has devised a rating system and trialed it in a few locations. In a US trial, the ratings apply to user posts and comments. In an Australian trial, “Little gray arrows—one up, one down—have appeared beneath comments on posts from select public pages, asking for input. ‘Stop bad comments,’ they implore.” Tapping an arrow confers either an up vote or a down vote on the page or comment. The post’s accumulated vote total is displayed, but those votes do not offset the “likes” or emoticons the page may have earned.
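As described, the mechanics are simple: each post or comment carries its own up/down tally, shown as a running total and held apart from the existing “likes” count. Here is a minimal sketch of such a tally; the names are mine, since Facebook’s actual implementation is not public:

```python
from dataclasses import dataclass

@dataclass
class Post:
    """A post or comment carrying two independent reaction tallies."""
    likes: int = 0       # existing 'like'/emoticon reactions
    up_votes: int = 0    # new gray-arrow up votes
    down_votes: int = 0  # new gray-arrow down votes

    def vote(self, up: bool) -> None:
        """Record one anonymous arrow tap."""
        if up:
            self.up_votes += 1
        else:
            self.down_votes += 1

    def vote_total(self) -> int:
        """Accumulated vote total shown on the post; likes are not offset."""
        return self.up_votes - self.down_votes

post = Post(likes=12)
post.vote(up=False)
post.vote(up=False)
post.vote(up=True)
print(post.vote_total(), post.likes)  # -1 12: votes and likes stay separate
```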

It remains unclear whether any user ratings system is suitable for adoption. Some trial participants point to a few downsides. The up and down votes are anonymous, which could allow a mob to rule without accountability. The mob could be an affinity group of collaborating users (e.g., supporters of a political candidate) or a horde of bots, e.g., thousands of Russian election-campaign influencers hitting the up-vote/down-vote lever in unison.
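The accountability problem follows directly from the design: an anonymous tally records only counts, never identities, so ten thousand bot taps are indistinguishable from ten thousand genuine readers. A hypothetical sketch (all names are illustrative):

```python
# Anonymous tally: only counts are kept, so a bot horde voting "in unison"
# leaves no trace of who voted.
up, down = 0, 0
for _ in range(10_000):
    down += 1        # thousands of bots hit the down lever
print(up - down)     # -10000, with no accountability

# Tying votes to identities would at least cap each account at one vote:
votes: dict[str, bool] = {}  # user_id -> True (up) / False (down)

def vote_as(user_id: str, up_vote: bool) -> None:
    votes.setdefault(user_id, up_vote)  # only the first vote per account counts

vote_as("bot_0001", False)
vote_as("bot_0001", False)  # duplicate tap is ignored
net = sum(+1 if v else -1 for v in votes.values())
print(net)                  # -1: one account, one vote
```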

A post author who has been publicly down-voted will be somewhat reluctant to offer new content. There is also a herd effect: users are more inclined to cast a down vote if one is already in place, and, likewise, the presence of up votes can stimulate copycat up-voting.

Facebook’s 10,000 editors are not fixated on policing the Kremlin. Their mandate is to enforce a number of “community standards” on pages and user posts. It is a public service for Facebook to remove postings from terrorists and criminals, but the question remains: “how much controversial but valuable speech will be flushed out along with the digital bath water?”

Facebook’s editors will presumably have the power to veto content. They might also impose a heavy down vote, choose to promote the content, or simply alter its objectionable parts.
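That amounts to a small toolkit of editorial actions. A hypothetical enumeration of the four powers just listed (the names are mine, not Facebook’s):

```python
from enum import Enum, auto

class EditorAction(Enum):
    """Hypothetical moderation toolkit implied by the text."""
    VETO = auto()            # remove the post entirely
    HEAVY_DOWNVOTE = auto()  # push the post down the ranking
    PROMOTE = auto()         # boost the post's visibility
    REDACT = auto()          # alter only the objectionable parts

def apply(action: EditorAction, post_id: str) -> str:
    """Dispatch an editor's decision (illustrative only)."""
    if action is EditorAction.VETO:
        return f"post {post_id}: removed"
    if action is EditorAction.HEAVY_DOWNVOTE:
        return f"post {post_id}: rank penalized"
    if action is EditorAction.PROMOTE:
        return f"post {post_id}: boosted"
    return f"post {post_id}: objectionable text redacted"

print(apply(EditorAction.REDACT, "p42"))
```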

Of course, editorial powers can be used corruptly. Facebook’s current news section operates like a traditional newsroom, reflecting the biases of its workers and the institutional imperatives of the corporation. Gizmodo reported that “Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential ‘trending’ news section” and that “Workers prevented stories about the right-wing CPAC gathering, Mitt Romney, Rand Paul, and other conservative topics from appearing in the highly-influential section.” That’s shadow banning, a sleazy variant of down voting. Shadow banning has also been practiced (perhaps temporarily) against conservative voices at Twitter.
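Mechanically, shadow banning differs from outright deletion in that the suppressed content stays visible to its author while being silently filtered from everyone else’s view. A hypothetical sketch:

```python
# Shadow banning in miniature: the author still sees the post, but it is
# silently filtered out of everyone else's feed (illustrative names only).
shadow_banned: set[str] = {"post_cpac", "post_romney"}

def visible_feed(viewer_id: str, author_id: str, post_ids: list[str]) -> list[str]:
    """Return the posts this viewer may see; authors always see their own."""
    if viewer_id == author_id:
        return post_ids  # the author notices nothing
    return [p for p in post_ids if p not in shadow_banned]

posts = ["post_cpac", "post_kittens"]
print(visible_feed("author_1", "author_1", posts))  # both posts
print(visible_feed("reader_9", "author_1", posts))  # only 'post_kittens'
```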

Social media sites used to suggest that users were exercising “free speech” on neutral technical platforms. That claim is now less credible. Triggered by Russian trolls, “fake news,” instances of hate speech, and the mishandling of users’ private information, Facebook and Twitter are being forced to become media companies. As such, they edit users’ content to follow “community standards” shaped by politicians and by the sites’ own self-interest. That’s a radical change, but we can expect even deeper changes as soon as Congress clarifies what privacy users can expect.
