First Amendment Rights, Section 230, and Content Moderation: How Gonzalez v. Google and Twitter v. Taamneh Could Shape the Legal Landscape for Years to Come


Do you have a First Amendment right to say what you want, when you want, and where you want?

If so, does that prevent social media platforms and other online businesses from moderating the content you publish on their platforms? If you have a First Amendment right to say what you want, even on a non-government platform, can the platform then be sued for the things you say?

Two cases pending before the U.S. Supreme Court, Gonzalez v. Google and Twitter v. Taamneh, may answer these questions in decisions that promise to be far-reaching, profoundly affecting how social media companies, other online platforms, internet service providers, bloggers, influencers, content creators, and the broader tech industry do business in the future.

It is generally accepted that 1) the First Amendment applies only to government actions, and 2) Section 230 of the Communications Decency Act shields online platforms and internet providers from liability based on content published by users, but 3) online platforms and internet providers have some degree of responsibility for ensuring that they do not provide platforms for harmful activities like recruitment and propaganda by terrorist organizations.

These assumptions could change dramatically based on the U.S. Supreme Court’s opinions in these two cases.

The Cases: Gonzalez v. Google and Twitter v. Taamneh

Gonzalez v. Google and Twitter v. Taamneh highlight the challenges of content moderation by online platforms, and the Court’s decisions in these cases may have serious consequences for any business involved in online speech and content moderation.

Prager University v. Google, decided by the Ninth Circuit Court of Appeals in 2020, held that Google (YouTube) faces no First Amendment liability for moderating its content and restricting PragerU’s videos. That decision sets the stage for the Supreme Court’s analysis of Gonzalez v. Google and Twitter v. Taamneh.

PragerU v. Google

In 2017, Prager University (PragerU), a conservative educational organization, filed a lawsuit against Google, alleging that the company’s YouTube platform was discriminating against its content by placing its videos in “restricted mode,” which limited their visibility to users. Although Google is not a governmental organization, PragerU claimed that the restriction violated its First Amendment rights because Google was discriminating against its content based on its political viewpoint.

In 2018, a federal district court dismissed the case, finding that Google’s restriction of PragerU’s content did not violate the First Amendment because Google is a private company and therefore not bound by the Constitution’s free speech protections.

PragerU appealed the decision, and in February 2020, the Ninth Circuit Court of Appeals affirmed the lower court’s ruling, finding that YouTube is a private forum and that its restrictions on content are not subject to First Amendment scrutiny. The court also noted that Google’s decision to restrict PragerU’s content was based on its own content moderation policies and was not motivated by political bias.

Twitter v. Taamneh

In PragerU v. Google, PragerU attempted to sue an online forum for restricting its content. In what could become an impossible catch-22, depending on how the Supreme Court decides these cases, Twitter v. Taamneh involves a U.S. citizen attempting to sue an online forum for not restricting its content…

In 2017, Mohamad Taamneh, a U.S. citizen living in Jordan, filed a lawsuit against Twitter, alleging that the company was liable for the death of a family member killed in an ISIS terrorist attack on an Istanbul nightclub. Taamneh claimed that Twitter had provided material support to ISIS by allowing the terrorist group to use its platform to spread propaganda and recruit new members.

In 2018, a federal district court dismissed the case, finding that the family had failed to plausibly allege that Twitter aided and abetted an act of international terrorism under the Anti-Terrorism Act.

Taamneh appealed the decision, and in 2021, the Ninth Circuit Court of Appeals reversed, holding that the family had adequately alleged that Twitter aided and abetted international terrorism by knowingly providing a platform that ISIS used for propaganda and recruitment. Twitter then asked the Supreme Court to review that ruling.

Gonzalez v. Google

In Gonzalez v. Google, a companion case to Twitter v. Taamneh, the family of Nohemi Gonzalez, an American student killed in the November 2015 ISIS attacks in Paris, filed a lawsuit against Google, alleging that Google is liable for the attack because it did not remove ISIS’ recruiting and propaganda videos from YouTube and because its algorithms recommended those videos to users. A federal district court dismissed the case under Section 230, the Ninth Circuit affirmed, and the Supreme Court will now decide whether Section 230 shields Google from the family’s claims.

The Potential Impact of Gonzalez v. Google and Twitter v. Taamneh

These two cases raise significant issues concerning the responsibility of online platforms for content posted by their users: freedom of expression, censorship, the importance of free public discourse, and the viability of tech companies and internet-based businesses if they 1) can be sued for moderating their online content, and 2) can also be sued for… not moderating their online content.

What are the potential impacts of these cases?

The Court’s decisions in Twitter v. Taamneh and Gonzalez v. Google could have contradictory impacts, such as:

  • Companies leaving harmful content on their sites to avoid liability for moderating it;
  • Online platforms gaining greater latitude to remove content that they deem objectionable;
  • Increased pressure on online platforms to remove content that certain groups or individuals deem objectionable, which could lead to a reduction in free speech and public discourse; and
  • A flood of lawsuits, both over a company’s self-policing of its online content and from users who feel their content was removed unfairly, which could result in increased costs and decreased innovation.

Corporate law firms and their clients may need to adapt their practices and strategies to address the legal implications of these cases, particularly firms that represent clients operating online platforms. Additionally, the outcomes of these cases could set legal precedents with implications beyond the online platform industry, potentially affecting the legal landscape in other areas.

Impact on State-Law Claims

How will the outcome of these cases affect the interpretation of state-law claims for negligence and other torts? If the Court expands the scope of state-law claims against online platforms, it could lead to an increase in similar claims against other businesses that rely on user-generated content, such as online marketplaces.

Impact on First Amendment Free Speech Rights

These cases may also have implications for the interpretation of the First Amendment and free speech rights in the context of online platforms. If the Court finds that online platforms have a greater responsibility to moderate content than previously thought, that holding could affect how free speech rights are balanced against the interests of online platforms in other contexts, such as cases involving government regulation of speech on social media platforms.

Impact on International Businesses

The outcome of these cases might also have implications for international law and the ability of online platforms to operate globally. If the Court’s decisions are seen as overly restrictive or inconsistent with international norms, they could lead to conflicts with other countries or international organizations, further complicating the legal landscape for online platforms and related industries.

Impact on Section 230 Immunity for Online Platforms

If the Court narrows the scope of Section 230 immunity, which shields online platforms from liability for third-party content, the decision could reshape the legal landscape for industries that rely on similar protections, including web hosts, internet service providers, search engines, and even e-commerce platforms. A narrower Section 230 could also set a precedent for other types of intermediaries to face increased liability for content posted by users, affecting a wide range of industries and services built on user-generated or third-party content.

Finally, these cases have the potential to affect not only the role of private companies in regulating content but also the role of government in regulating online speech and content.

Narrowed Section 230 immunity could also increase pressure on governments to regulate online platforms and enforce content moderation rules directly. That shift could have implications for free speech, privacy, and other fundamental rights, and could shape the legal landscape for years to come.

Please feel free to contact one of our Murray Lobb attorneys to obtain our advice regarding your business’s potential liability for regulating – or not regulating – online content. We also remain available to help you with all your general corporate, employment, construction law, business, and estate planning needs.