In 1996, conflicting court rulings on platform liability for user content prompted Congress to create Section 230, which protected online platforms’ ability to moderate content without incurring liability for that content.
Nearly three decades later, Congress is presented with a similar opportunity, this time regarding the level of restrictions that platforms place on third parties, known as platform openness. Apple and Google take different approaches to openness: Apple famously operates a “walled garden,” while Google runs a much more open platform. For example, Google allows users to install apps without using its app store. Just as lawmakers used Section 230 to protect a platform’s ability to experiment with different content moderation regimes, they should also act to protect a platform’s ability to experiment with different levels of openness.
Questions involving online speech, and specifically what level of liability platforms should bear for that speech, prompted the debates that ultimately produced the framework governing speech liability today.
In 1991, Cubby v. CompuServe established that platforms were not liable for content they hosted and that “first-hand knowledge” of that content was required before such liability could be imposed. However, in 1995, a different court ruled in Stratton Oakmont v. Prodigy Services that the platform Prodigy could be held liable for user comments precisely because the company monitored posted content.
Combined, these court cases created what one congressman characterized as a “perverse incentive” against any form of content moderation. Congress stepped in and created Section 230 to establish protections for online platforms that engaged in moderation, even if those efforts were imperfect. Section 230 offers clarity as to who is responsible for specific online speech by stating that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The result protected companies’ efforts to exercise control over their platforms while strengthening individual liberty and personal responsibility.
For modern users of social media and digital content, this means that the users of a platform, not the platform itself, are liable for what they post online. In other words, Elon Musk is legally responsible for what he posts on X not because he owns X, but because he is the one engaging in speech.
Read the full article here.
Tirzah Duren is the Vice President of Policy and Research at the American Consumer Institute, a nonprofit educational and research organization. You can follow her on X @ConsumerPal.