Communities in Disarray:

Social Media Requires More Than Section 230 Reform

Social media platforms, including Facebook and Twitter, are virtual communities. What were once sparsely populated online villages scattered across the Internet have transformed into complex societies serving billions of users. The speed at which these online societies have evolved and grown has long outpaced the laws and regulations designed to govern them. With few exceptions, the laws governing social media entities have remained largely unchanged since the late 1990s.

The most prominent and controversial of these laws is Section 230 of the Communications Decency Act. Passed in 1996 to protect websites that sought to moderate user-generated content, Section 230 has been interpreted broadly, largely immunizing website owners from liability for harm emanating from their users. Section 230 also allows each social media platform to establish its own community guidelines and to determine how to enforce those rules. Whether a platform pursues robust moderation or hardly any at all, Section 230 immunity applies the same.

With the rise of online extremism, cyber harassment, and the proliferation of disinformation, Section 230 has seen an explosion of interest. Seemingly, it is the law that both sides of the political spectrum love to hate. In Congress, multiple bills have recently been introduced to amend Section 230, including the SAFE TECH Act, the PACT Act, the EARN IT Act, and the PADAA. These legislative proposals largely focus on carving out liability immunity for certain categories of content and generally limiting the scope of the law’s broad immunity protections. Crafting legislation that encourages platforms to remove harmful content without inhibiting social media’s place as a marketplace of ideas is no small feat, and voices from all perspectives, including academics, practitioners, policymakers, law enforcement officials, technology companies, and victims of online abuse, should be heard and included in any reform effort. At this point, it is unclear whether any of the current legislative proposals can muster sufficient support.

Improving the social media landscape, however, requires more than Section 230 reform. Websites and platforms are currently subject to only a handful of laws and regulations, most notably the Children’s Online Privacy Protection Act (COPPA) and the Digital Millennium Copyright Act (DMCA). Without governing regulations, social media companies have largely been left to their own devices. Unsurprisingly, the inconsistent application of self-governing rules has exacerbated the problems created by Section 230.

Traditionally, America has responded to new technologies with legislation and regulation designed to protect the public. The Federal Aviation Administration, the Federal Highway Administration, and the Federal Communications Commission were born out of the need to develop standards and protocols to protect the skies, the highways, and the radio waves, respectively. Social media requires its own regulatory body to address multiple online harms, from impersonation to harassment to algorithmic discrimination. At the very least, the FTC, which currently enforces COPPA and polices unfair trade practices, must have its mission and budget expanded to include greater oversight of social media companies.

The current piecemeal approach to social media regulation has left Internet users too vulnerable to abuse. Under social media’s largely self-regulatory system, for example, platforms have adopted widely varying methods for reporting abusive content. Some platforms provide an email contact, some provide a reporting form, and others provide no reporting mechanism at all. Under Section 230, platforms have no obligation even to respond to legitimate user inquiries, let alone to do so within a reasonable period of time. Standardization and regulation would improve the user experience and increase user safety.

The legal system must also evolve to meet the challenges of social media. Without assistance from law enforcement, a person seeking judicial relief must navigate a gauntlet of impediments. Simply identifying an anonymous poster of content is a complicated and expensive endeavor, requiring a lawsuit as a prerequisite to the issuance of a subpoena. If a subpoena is authorized, it must often be served in foreign jurisdictions, compounding the complexity and cost. Plaintiffs who are able to identify the poster of defamatory content and obtain a court order for its removal are still subject to the whims of platforms, which may or may not honor such decisions. In the end, a victim may achieve only a Pyrrhic victory.

Under the current legal framework, victims of online abuse frequently face a system seemingly stacked against them. This needs to change. Whether through a new administrative court or a newly created lower court, a specialized legal system would develop the rules, protocols, and precedents specifically designed for social media’s unique challenges.

Whether Section 230 remains intact, is reformed, or is eliminated, additional legislation is still needed. In particular, Congress should protect personal information by regulating data brokers and empowering individuals to maintain the confidentiality of their home addresses. Congress should also criminalize certain forms of online behavior, including doxing and swatting. Finally, it is critical that state and local law enforcement have the training and resources to investigate and prosecute online crimes. The Online Safety Modernization Act of 2017, introduced by Representative Katherine Clark, provided an excellent framework. Among its provisions, the bill proposed increased funding for local and state law enforcement, called for the Department of Justice to maintain and report statistics on the prevalence of cybercrimes, and proposed a National Resource Center for victims of online abuse. Congress should revisit this particular piece of legislation.

While reform of Section 230 is likely on the legislative horizon, which proposal will ultimately prevail remains uncertain. Any proposal to reform Section 230 should include provisions requiring platforms to honor court orders and to adhere to protocols designed to enhance user safety and platform accountability. That said, Section 230 reform should be considered a legislative start, not the goal unto itself. To achieve safer and sustainable social media communities, a multi-pronged approach will be needed. Through legislation, regulation, and innovation, we can still capture the potential of social media while empowering and protecting users. Let’s get started.

New Jersey Attorney/Privacy Advocate. Fighting to safeguard privacy rights, to reform Sec. 230 of the CDA and to protect American democracy in the age of Trump.