Why I Won’t Be Following Mark Zuckerberg and Elon Musk on Platform Moderation: Safety First for Women in STEM

In the evolving world of social media, the decisions of tech leaders like Mark Zuckerberg and Elon Musk carry enormous influence over how billions of people interact, communicate, and share information. Yet as platforms like Meta (formerly Facebook) and X (formerly Twitter) adopt new approaches to moderation under the guise of promoting “free speech,” they often sacrifice user safety, accountability, and inclusivity along the way.

As the founder of the Women in STEM Network, a community currently in development, I believe these shifts reveal a critical flaw in their approach: prioritizing engagement and profits at the expense of creating a safe environment. For communities that center on empowerment—particularly for marginalized groups like women in STEM—the consequences of these decisions cannot be ignored.

Instead of following their lead, I am taking a different path. At the Women in STEM Network, safety will always come first. This is non-negotiable. The stakes are simply too high to emulate the increasingly hands-off moderation strategies of platforms like Meta and X.

Meta’s New Direction: A Step Backward for Accountability

On January 7, 2025, Meta announced that it would be ending its third-party fact-checking program in favor of a “community notes” system. Under this new approach, content flagged as misleading or needing context will no longer be reviewed by independent fact-checkers. Instead, users themselves will provide “context” through a system reminiscent of the Community Notes feature on X.

Mark Zuckerberg framed this decision as a move to prioritize free expression and reduce unnecessary censorship. According to Meta, third-party fact-checking was too error-prone, biased, and restrictive, often leading to frustration among users. This change, they argue, reflects a cultural shift where platforms must move away from strict moderation toward user-driven oversight.

However, this pivot toward “community-driven” moderation introduces a slew of new problems:

1. Erosion of Accountability

Relying on users to moderate content removes a crucial layer of accountability. Fact-checking may not be perfect, but it ensures that content is reviewed by professionals with relevant expertise. Without this safeguard, platforms risk creating echo chambers where popular opinions drown out accurate information.

2. Rewarding Popularity Over Safety

Community-driven systems tend to reward content based on popularity rather than accuracy. For women in STEM, who already face significant bias online, this creates a dangerous environment where misinformation, discrimination, and harassment can thrive unchecked.

3. Leadership Choices Signal Platform Priorities

Meta’s appointment of UFC CEO Dana White to its board of directors is emblematic of its shifting priorities. White is known for his association with divisive figures and a history of inflammatory rhetoric, and his appointment sends a clear message: Meta is prioritizing controversy and spectacle over inclusivity and collaboration.

Zuckerberg himself justified this shift in moderation by referencing Donald Trump’s recent election victory, calling it a “cultural tipping point” for free speech. However, for communities like Women in STEM, this “tipping point” feels more like a retreat from safety and accountability, leaving vulnerable groups to fend for themselves in increasingly hostile digital spaces.

Why Safety Must Be the Cornerstone of the Women in STEM Network

The Women in STEM Network is founded on the belief that safety isn’t just important—it’s foundational. Our mission is to create an online space where women in science, technology, engineering, and mathematics can thrive. Unlike Meta or X, we will not compromise this vision to chase trends or emulate laissez-faire moderation policies.

Here’s why safety must always come first:

Harassment and Discrimination Are Widespread

For women in STEM, the challenges don’t stop at the workplace. Gender bias, microaggressions, and harassment are all too common in online spaces. Without strong moderation, these behaviors can proliferate, silencing voices and discouraging participation.

Freedom of Expression Requires Boundaries

True freedom of expression isn’t the unchecked ability to say anything—it’s the freedom to share ideas without fear of harassment, ridicule, or harm. Platforms that fail to set boundaries inadvertently privilege harmful voices while silencing those who need protection the most.

Moderation Builds Trust

Platforms that prioritize heavy moderation aren’t stifling free speech; they’re building trust. When users know that harmful content will be addressed swiftly and fairly, they are more likely to engage openly and authentically.

Why I Won’t Follow Zuckerberg or Musk

The moderation strategies embraced by Zuckerberg and Musk may work for their platforms’ goals of scale and profit, but they are fundamentally misaligned with the mission of the Women in STEM Network. Here’s why:

  • Zuckerberg’s “Community Notes” System Ignores Expertise: Delegating fact-checking to users removes the critical role of subject matter experts, leaving misinformation and bias to spread unchecked.
  • Musk’s Laissez-Faire Approach Prioritizes Controversy: Musk’s hands-off moderation style on X has made it a haven for harassment and disinformation, particularly against women and other vulnerable groups.

Instead of following in their footsteps, the Women in STEM Network will uphold the following principles:

1. Heavily Moderated Content

Every post on our platform will go through strict moderation to ensure discussions remain respectful, inclusive, and empowering.

2. Expert-Led Fact-Checking

Unlike Meta, we value the role of experts in identifying and addressing misinformation. Posts that share questionable claims will be reviewed by professionals to ensure accuracy.

3. Zero Tolerance for Harassment

Hate speech, harassment, and other harmful behaviors will not be tolerated under any circumstances. Clear and swift consequences will be enforced to maintain a safe environment.

The Risks of “Community Moderation”

Platforms that rely on user-driven systems to police content risk amplifying biases, silencing vulnerable groups, and normalizing harmful behavior. Here’s why this approach is particularly unsuitable for communities like Women in STEM:

  • Biases Are Amplified: Popular opinions often dominate community-driven systems, marginalizing diverse perspectives and dismissing minority voices.
  • Harassment Is Normalized: Without strong oversight, community moderation can minimize the impact of harassment, leaving targets without recourse.

A Vision for the Future

In an era where the loudest voices often prevail, the Women in STEM Network will be committed to amplifying the voices that matter. Our platform will serve as a sanctuary for women in STEM—a space where safety, support, and empowerment take precedence over profit or popularity.

If you share this vision, I invite you to join us. Together, we can build a platform that proves online communities can thrive without compromising on safety, integrity, or accountability.


Dr Shara Cohen FRCPath, FIBMS, FRSB, BEM

CEO | Non-Executive Director | Business Leader | Trustee

To become a Founding Member of the Women in STEM Network, you can sign up here: https://womeninstemnetwork.com/join-the-women-in-stem-network-become-a-founding-member-today/
