As socialization goes digital, so does social harm. Lately, Section 230 has found itself in the crosshairs of efforts to move real-world enforcement online. Instead of applying the strengths of Section 230 to legislative attempts to address social harms, many lawmakers are quick to shift the blame away from the perpetrators of problematic behaviors and toward the technologies where they occur. Rather than changing Section 230, would-be reformers should use it as a road map for future regulations involving technology and content.

Section 230 of the Communications Decency Act states that providers of an "interactive computer service" will not be treated as the publisher or speaker of information provided by their users. In effect, this provision protects companies like Twitter or Instagram from being held responsible for the content users post on their platforms.

The intersection of Section 230 and social harm is relevant to two cases currently before the Supreme Court, Gonzalez v. Google LLC and Twitter, Inc. v. Taamneh, both of which address Section 230 protections and terrorist activity. Terrorism clearly predates both platforms in question, but as more preexisting social harms move online, some who seek to alleviate those harms see Section 230 as a barrier that must be repealed or reformed. However, history shows that existing laws already address many of these harms, making the repeal of Section 230 unnecessary for enforcement.

Backpage.com is a website that critics of Section 230 frequently use as an example of the law’s protections going too far. Some users took advantage of the site to solicit and advertise adult services and, tragically, to facilitate the exploitation and trafficking of minors.

While horrific events resulted from bad actors' use of the site, this outcome doesn't mean Section 230 shielded the responsible parties from consequences. In Jane Doe No. 1 v. Backpage.com, LLC, three anonymous victims of sex trafficking sued Backpage.com for the role the website allegedly played in their victimization. The complaint rested on three claims, only one of which invoked Section 230. Under the Trafficking Victims Protection Reauthorization Act (TVPRA), victims can bring a civil suit against "whoever knowingly benefits, financially or by receiving anything of value from participation in a venture which that person knew or should have known has engaged in an act in violation of this chapter." This portion of the complaint failed because Section 230 shielded Backpage.com from liability.

However, the complaint also failed on claims that did not involve Section 230 at all; those claims failed on their merits, not because of an excessive liability shield. The victims sought a judgment against Backpage.com under a Massachusetts general statute that provides a private right of action to those injured by an "unfair or deceptive act or practice." The court rejected this claim as well because the plaintiffs failed to establish that the platform caused their victimization. Ultimately, just because a horrific act happens to be facilitated through a platform doesn't mean the platform caused it.

The Stop Advertising Victims of Exploitation (SAVE) Act targeted Backpage.com by updating the criminal code to prohibit advertising, or benefiting from advertising, sexual exploitation. The bill specifically targeted sites like Backpage.com, and the bill's sponsor named the website directly in his publications. This new legislation was ultimately unnecessary, as the FBI relied on preexisting legal violations when it seized the website in 2018. Likewise, repealing or altering Section 230 isn't necessary to enforce existing laws.

The lawsuit Glennon v. Rosenblum further demonstrates that existing law already provides recourse against online harms. In this case, an unknown poster submitted a fabricated story to the website "She's a Homewrecker" claiming that Monika Glennon had an affair with the poster's husband during a house showing. While Glennon could not compel the website to remove the content or sue it for defamation, her lawyer used existing laws as a workaround. Under existing libel and copyright law, Glennon's lawyer was able to use the subpoena process to identify the original poster and then sue the actual person responsible. Once again, this result was possible because the legal strategy focused on existing laws that address the problematic behavior occurring in digital spaces.

A key difference between the Section 230 approach and legislative proposals seeking to target platforms is who bears the burden. Section 230 protects platforms, but it does not protect the creators of problematic content from legal action or interference. Proposals solely focused on technology companies shift the blame away from users creating problematic content or misusing technological tools.

One of the accomplishments of Section 230 is that it avoids the trap of shifting the burden away from the perpetrator and toward the easiest target. Jane Doe No. 1 v. Backpage.com, LLC failed to establish a causal connection between the platform and the plaintiffs' victimization. That may not have been a failure on the plaintiffs' part but rather a genuine reflection of the lack of causality between the platform and the crime. Alex Levy, an adjunct professor at Notre Dame Law School who teaches Human Trafficking and Human Markets, asserts that "Section 230 doesn't cause lawlessness. Rather, it creates a space in which many things — including lawless behavior — come to light." In a separate piece, she goes further in defense of platforms, stating that "there is neither an empirical foundation for the assumption that the platforms cause trafficking, nor any evidence that shuttering them would reduce trafficking."

Burden-shifting occurs not only in a moral sense but also through compliance and enforcement costs. In Montana, a bill recently passed the state House that would require technological devices to be sold with preinstalled content filters. Enforcement of the filter requirement is expected to cost over $2 million in the first four years, costs that will ultimately be paid by taxpayers or passed along to consumers through fines on tech companies. Pushing enforcement onto parties other than those responsible shifts the focus away from preventing harmful content and shielding users from it and toward finding a visible scapegoat. Individuals, families and governments can find ways to protect against certain forms of objectionable content that don't require casting a wide net of liability.

New technologies change the way people work and interact, and not all changes are for the best. However, lawmakers should recognize that technology did not create these problems. Section 230 incentivizes businesses to act against socially unacceptable content and behavior without shifting the burden away from those who engage in it. Current reformers should follow suit.
