Antitrust reform isn’t the only movement targeting Big Tech that’s gaining political momentum. There is a concurrent desire to reform intermediary liability by enacting changes to Section 230 of the Communications Decency Act. The question at the heart of this debate is whether tech companies — like Twitter — should be held responsible for what users post on their platforms. If pursuing reform, lawmakers should learn from Europe’s mistakes and steer clear of vague legislation that opens individuals up to risk or assigns liability as a way to punish successful companies.
Intermediary liability reform may be coming to America, as shown in the September White House briefing on Principles for Enhancing Competition and Tech Platform Accountability, which called for reforms to Section 230 to rein in the dissemination of illegal or violent conduct or materials. Specifically, reform efforts target paragraph (c)(1), which states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Currently, this provision shields social media platforms from liability for user-generated content.
Before taking action, U.S. lawmakers should learn from the mistakes of Europe, which has made reforms through the Digital Services Act (DSA) as well as national legislation. In July 2022, the DSA passed the European Parliament, establishing regulatory burdens on platforms across Europe. The DSA maintains liability exemptions in principle, but in practice it imposes broad obligations on companies to act against categories of speech such as racism and hate speech.
A primary goal in any reform should be to maintain the principle of legal certainty — where an actor can reasonably know the legal consequences of their actions. France offers an extreme example of how failing to adequately define responsibility creates uncertainty: in some cases, it has expanded liability for content to individual users with a large online following.
The case Sanchez v. France concerns the government fining a politician under national law for failing to delete comments, deemed hateful, that other social media users had posted on his Facebook wall. The case awaits review by the European Court of Human Rights (ECtHR), but the fine has already been upheld by a lower court, illustrating the dangers of expanding responsibility for behavior beyond those who engaged in it themselves. It also demonstrates the need for legislation to clearly define intermediaries when seeking to reform liability. It’s one thing to say that companies hosting platforms hold some degree of responsibility. It’s another entirely to make individuals with large followings responsible for monitoring the content their followers post.
A comparison in the U.S. would be drafting intermediary liability legislation that opens the door for Kim Kardashian to be held responsible for comments made by her 331 million Instagram followers. While it’s hard to fathom any serious attempts to legislate celebrity responsibility, the idea that celebrities should attempt to moderate fan behavior is gaining popularity and making its way into pop culture discussions, and therefore shouldn’t be dismissed as an impossibility.
An additional pitfall is one that has already made its way into U.S. legislation: regulating platforms based on size. The DSA assesses different regulations based on a platform’s size, with the most stringent requirements falling on the largest companies.
Senator Amy Klobuchar (D-MN) has already introduced antitrust proposals that target large companies with special regulations by singling out tech companies over a size threshold. Regulating based on size is not only unfair but also makes it apparent that proposed changes are more about punishing success than about protecting consumers and information flows. If intermediary liability needs reforming, then the same rules should apply across the board.
In the U.S., intermediary liability is wrapped up in conversations about free speech, Big Tech and fairness. While it’s difficult to predict where reform efforts will lead, we can at the very least learn from European mistakes. Europe’s reforms have been broad changes that leave enforcement open to interpretation and weaponize liability to punish large companies. Reasonable people can disagree about the need for reform, but what should be universal is the notion that modernizing liability reform should not come at the expense of established principles such as freedom of speech, equality before the law and personal responsibility.