Jess Miers
6 min read · Nov 18, 2020


Section 230 Key Developments Talk: 2020 Cloud Conference

It’s always an honor when I get to share the stage with my intellectual hero — Professor Eric Goldman! Yesterday we had the privilege of co-presenting on one of our all-time favorite topics: Section 230. Huge thanks to Davis Wright Tremaine LLP for inviting us to speak at their virtual 2020 Cloud Conference!

You can view our slide deck here. The following are additional resources we submitted:

I covered about 17 current proposals to amend Section 230 in roughly 10 minutes. Prof. Goldman spoke about some key developments in recent case law. My (rough) notes from the talk are below. Opinions expressed are my own and do not reflect those of my previous/current employers.

It’s no surprise that there’s plenty of bipartisan dislike of Section 230. Democrats tend to approach amending 230 from an online consumer protection angle, while Republicans seem focused on curbing bias in content moderation. With that, we’ve seen several proposals to amend the law just this year. I’ve been tracking each proposal since the beginning of the year, and you can follow along with my Section 230 bill tracker, included in the handout materials. Currently there are 10 Senate bills and 7 House bills to amend 230, along with a handful of executive agency proposals, for a total of 17 or so proposals to amend Section 230 this year.

[REGULATORY THEME: 1A] Of the 17 proposals to amend Section 230, five have to do with eradicating social media bias and/or forcing websites to comply with the First Amendment. For example, Senator Loeffler’s bill, the Stopping Big Tech’s Censorship Act, would add a must-carry provision for First Amendment-protected speech. Essentially, these proposals would amend Section 230 to require websites to be viewpoint-neutral in their content moderation practices. We saw the same sentiment in Trump’s Executive Order on Preventing Online Censorship back in May.

Of course, as we all know, a long line of case law (which Prof. Goldman will talk about) establishes that websites like Twitter and Facebook are not state actors and therefore are not bound by the First Amendment when moderating their users’ content. So these proposals don’t really keep me up at night: ironically, they would place an unconstitutional restriction on the editorial discretion of private actors.

[REGULATORY THEME: C2A] Another major regulatory theme involves resolving content moderation claims under Section 230(c)(2)(A) rather than under 230(c)(1). While 230(c)(2)(A) is usually known as the content removal or blocking provision, we’re seeing the majority of cases, especially those involving account suspension and content removal, turn on the more efficient 230(c)(1) provision. Critics of Section 230 often claim that the broad interpretation of 230(c)(1) renders 230(c)(2) superfluous, even though plenty of existing case law, such as the precedent set in Barnes v. Yahoo and the Fyk v. Facebook decision from earlier this year, has highlighted the important gap-filling role of 230(c)(2)(A) for services that cannot otherwise take advantage of the 230(c)(1) immunity. The same sentiment about 230(c)(2)(A) was the key point of the NTIA’s recent petition calling on the FCC to reinterpret Section 230, and specifically 230(c)(2)(A).

Regardless, several proposals, like the Online Freedom and Viewpoint Diversity Act, would amend Section 230 such that account suspension, fact-checking, and content removal are all housed under 230(c)(2)(A) instead of 230(c)(1). Of course, the beauty of 230(c)(1) is its lack of a scienter requirement, as opposed to 230(c)(2)(A)’s “good faith” requirement. So proposals like these, and the NTIA’s petition favoring decisions under 230(c)(2)(A), really just ensure that defendants spend a lot more time and money in litigation to reach the same result they would reach under the First Amendment or 230(c)(1).

In addition to routing content moderation claims around 230(c)(1), several proposals also aim to amend 230(c)(2)(A) itself by stripping out “otherwise objectionable” and substituting in its place an enumerated list of acceptable grounds for removal, such as content that promotes self-harm or terrorism.

On that note, proposals like the Protect Speech Act out of the House also attempt to define “good faith” and permit content removals only so long as they are “objectively reasonable.” The Protect Speech Act would actually house the good faith definition under 230(c)(1), effectively stripping out 230(c)(1)’s immunity.

[REGULATORY THEME: Tradeoffs] Some proposals, like the EARN IT Act, would condition Section 230 immunity on services making it easier for law enforcement and the DOJ to investigate the facilitation of child sexual abuse material on those services. Privacy and cybersecurity experts have read EARN IT to suggest that services might be required to trade their end-to-end encryption practices for Section 230 immunity. Given the current momentum and bipartisan support for EARN IT, this is definitely a bill worth keeping an eye on, especially as we transition to the Biden Administration. (Senators Blumenthal and Graham just promoted it during this morning’s Senate Judiciary hearing.)

[REGULATORY THEME: Transparency] We’re also seeing some regulatory trends toward mandating transparency, as with the PACT Act. PACT would essentially require websites to explain their content moderation policies and practices upfront to their users. Websites would also be required to explain in detail the reasons behind their decisions to keep or remove complained-about content.

[REGULATORY THEME: 230(e)] Additionally, seven of the current proposals aim to route around Section 230 entirely by adding exceptions under 230(e). For example, EARN IT creates a carve-out for CSAM similar to the SESTA/FOSTA carve-out for sex trafficking law. The See Something Say Something Act would carve out an exception for websites that fail to submit a suspicious transmission activity report. And some recent proposals, including the DOJ’s proposed regulations, would create carve-outs for state criminal law and federal civil enforcement of any federal criminal statute. The carve-out method seems to be the most popular and easiest way to reform 230 for specific issue types and grievances against websites.

[CASE ACT+] The last bill I’ll call attention to is the Online Content Policy Modernization Act, which combines all of the 230(c)(2)(A) amendments from the previously mentioned Online Freedom and Viewpoint Diversity Act with the CASE Act. Among other things, the CASE Act would create a copyright small claims court, essentially routing around due process, imposing damages as large as $30,000 (larger than in any state small claims court), and making decisions non-appealable. This worst-of-both-worlds combination of 230 reform and copyright reform also appears to be gaining traction and is worth watching. (Markup is scheduled for Thursday.)

[CONCLUDE] With all of that said, what does this mean for Internet companies, especially those that take in a lot of user-generated content? I include some implications below, as these are things Internet companies might consider getting ahead of now rather than later, especially as countries outside the US continue to adopt legislation with similar requirements.

The first is a need for faster turnaround times when it comes to content removal. (For example, the PACT Act requires websites to respond to complainants within 14 days for potentially policy-violating content and demands that websites act on illegal content within 24 hours of notice.)

Demands for more transparency are a common theme across almost all of these proposals, so I expect to see new transparency requirements around how Internet companies respond to and interact with users and complainants regarding their content decisions.

Last, I wouldn’t be surprised if we’re heading toward a notice-and-takedown-style regime for non-IP issues, such as violent extremism (as we see with Australia’s new VE law), self-harm, nonconsensual pornography, or even child abuse materials that would not necessarily qualify as child sexual abuse imagery.

With that, I’ll turn it over to Prof. Eric Goldman to discuss some key developments in the case law.

