Protecting Free Speech Is About More Than Letting Content Run Wild | Opinion


Meta CEO Mark Zuckerberg recently announced that the company will discontinue its current third-party fact checking system. Meta will transition to a Community Notes program, adopting a similar approach to the one used by Elon Musk's platform, X.

In other words, Meta will no longer rely on external organizations to review and verify the accuracy of content shared on its platforms, which include Threads, Instagram, and Facebook. Instead, users themselves are asked to collaboratively add any context, corrections, or clarifications to posts directly through a Community Notes program.

Zuckerberg suggests that Meta's current fact checking approach has frustrated users. He argues that harmless content is being censored, that too many users are wrongly banned or suspended, that the company's responses are often too slow, and that the current approach is creating a cycle that limits free speech.

But experts have already warned that this approach could accelerate the spread of misinformation and disinformation, and that hate speech has the potential to proliferate, as it already has on X. The change also raises questions about the reliability of verification on social media platforms.

[Photo: This photo illustration created on Jan. 7, 2025, in Washington, D.C., shows an image of Mark Zuckerberg, CEO of Meta, and the Meta logo. DREW ANGERER/AFP via Getty Images]

What's impossible to deny is that this shift exemplifies a growing move toward crowdsourced moderation and news-sharing on social media, one that focuses on user participation in producing and managing content.

According to Pew Research Center, more than half of U.S. adults now report getting at least some of their news from social media platforms, marking a slight increase compared to previous years. Social media has grown increasingly important as a news source for younger generations. Reports indicate that convenience, speed, and the social interaction offered by social media platforms are key factors attracting younger people to consume news online. A 2024 study reveals that Facebook (33 percent), YouTube (32 percent), Instagram (20 percent), and TikTok (17 percent) are among the platforms that Americans routinely use to access news.

Unlike traditional news outlets, social media platforms allow everyday individuals to share firsthand accounts, break down stories in real time, and amplify underreported issues that might otherwise go unnoticed. That accessibility and immediacy are part of what drives people to social media for their news; they value diverse perspectives and real-time updates. While social media platforms face challenges like misinformation and political bias, much as traditional media do, citizen journalism brings unique value to public discourse, offering diverse perspectives and enriching the narratives presented by legacy outlets.

That said, those who speak up with the most sensitive information may face the most risk, including defamation lawsuits, or sanctions if they are breaking a non-disclosure agreement or confidentiality provisions. Section 230 shields platforms like Instagram, TikTok, and YouTube from liability for defamation in content their users post. But the users themselves can be targeted with cease-and-desist letters, subpoenas, and actual lawsuits that force them to spend money on legal fees. Already, President-elect Donald Trump and his cabinet have threatened legacy media with defamation lawsuits over unfavorable coverage.

The new oligarchy and its leadership purport to be about free speech, and social media platforms are now the central channels through which we share news and information. But free speech is about more than allowing anyone to become an influencer or a citizen journalist, even if sharing content that resonates emotionally with users, regardless of its truth, is good for business.

The reality is that people who speak up about powerful interests, oftentimes with few resources themselves, risk frivolous legal action, bankruptcy, blackballing, or worse. We need to create defamation and whistleblower funds to help people who are frivolously sued or who bring information to light. Perhaps the platforms can allocate some of their massive budgets to protecting content creators who drive traffic and eyeballs but are facing lawsuits for revealing human rights abuses. And anti-SLAPP (strategic lawsuit against public participation) legislation needs to be passed at the state level to strengthen the defenses available to people reporting on matters in the public interest.

We also need diversification of platform usage, and support for independent journalists across all platforms, so that there is no single point of failure. Investing in privacy-protecting technology, such as Hush Line and Signal, is also important for sharing information anonymously in ways that cannot be subpoenaed. And lastly, nonprofits such as Whistleblowers of America and The Signals Network provide critical and holistic mental health, media, and legal resources to people who are exercising free speech while facing daunting adversaries.

It's about to be the Wild West for the media. Alex Jones is back on X tweeting about the Jan. 6 insurrectionists, and crypto scams from influencers like Hawk Tuah abound. We have fewer resources to rein in the falsehoods now, but maybe we can protect those who tell the truth.

Ariella Steinhorn, a writer whose work focuses on relationship dynamics and power imbalances, is the founder of Superposition and Lioness and the co-founder of Nonlinear Love.

The views expressed in this article are the writer's own.
