In what some see as a major reversal, Mark Zuckerberg announced in a widely distributed video that Meta is scaling back its aggressive approach to reviewing and regulating user-generated content. Instead, Meta will focus its resources on countering illegal content and will replace its independent fact-checking system, which relied on external experts, with a Community Notes system similar to X’s, allowing more free expression.
As Zuckerberg explained, Meta’s fact-checking had been overly aggressive and politically biased, and its enforcement made too many mistakes. The result, he said, was that innocuous content was removed.
This change comes amid a controversy, dating back to the COVID-19 pandemic, over the responsibility of social media platforms: specifically, whether it is their duty to combat misinformation shared on their platforms or to allow all voices to debate these topics openly.
This pressure produced two contradictory trends. New laws in red states, specifically Texas and Florida, attempted to limit how platforms could moderate political speech. Meanwhile, the Biden administration applied informal pressure by jawboning companies to remove misinformation about COVID-19 and vaccines. In other words, companies were sandwiched between contradictory views: some believed platforms were over-moderating their content, while others thought they were not moderating it enough. A true lose-lose situation for platforms and the consumers who use their products.
Now, as the Biden administration leaves office and with the many state laws challenging platforms’ ability to moderate their content sent back to the lower courts, these two sources of government pressure are easing—creating an environment ripe for companies to revisit their content moderation choices. Meta seems to be jumping at the opportunity.
However, not everyone is happy with this turn of events.
Critics argue that such an approach is untested and will put people in the United States and abroad at risk as dangerous misinformation spreads unchecked. But the good news is that marketplace competition allows users to pick the social media platform that best fits their preferences. Meta is not the only game in town: Bluesky, Mastodon, NextDoor, and others compete with Meta and X for users.
Importantly, this vibrant and competitive marketplace would dissipate if governments imposed rigid policies that forced uniform content moderation decisions on everyone. The reality is that taking away or informally limiting the ability of platforms to self-moderate leaves fewer meaningful choices for consumers—and it subjects their wishes to the whims of whoever holds the levers of power at the moment.
Read the full article here.
Trey Price is a policy analyst with the American Consumer Institute, a nonprofit education and research organization. For more information about the Institute, visit www.TheAmericanConsumer.Org or follow on X @ConsumerPal.