Is Stakeholder Engagement the Key to Successful Community Standards on Social Media?

Building stakeholder trust has become a core goal for corporate executives. With some of the biggest investors publicly challenging corporations to think beyond short-term financial goals, companies are working to map, anticipate, and respond to concerns across societal interest groups.

This is a daunting task for most companies, and not least for global social media platforms such as Facebook, YouTube, and Twitter. These platforms seek to calibrate and reflect societal views, but in the process they have become powerful actors that dramatically affect the trajectory and impact of popular expression. How they set and implement content policies on issues such as terrorism, hate speech, and political and religious extremism directly affects the lives of billions of people in hundreds of countries. Today, there is a consensus that technology platforms should no longer make high-stakes decisions without granting the public structured visibility and input.

Ongoing debates about the leadership, governance, and regulation of social media are highly relevant to any discussion of stakeholder engagement. The public will not separate its view of an organization’s approach to content standards from its view of the organization’s overall behavior. But even a transformation of the regulatory and competitive environment for social media platforms will leave open the question of how best to set and implement content policies in the best interest of society—and what that interest is. There are no easy answers. How, for example, is freedom of expression to be protected without undermining privacy?

BSR has provided independent advice to Facebook on stakeholder engagement relating to its content policies (named “Community Standards”). Our work is based on our five-step methodology and our belief in the importance of proactive stakeholder engagement strategy. It suggests some principles for engagement by social media platforms that could help set direction for more long-term solutions, such as Social Media Councils.

Why engage?

Engagement is needed to proactively identify areas in which content policies, newsfeed prioritization, and algorithms driving ads need to evolve to meet user expectations, social norms, and international standards—both to identify new issues that haven’t been explored and to revise policies on known issues as they evolve. Further engagement must then balance and resolve diverse perspectives on contentious topics which, given the near-universal reach of social media, could conceivably cover every issue of interest to any stakeholder anywhere in the world. To align with human rights and sustainability frameworks, engagement principles must be transparent, comprehensible, and consistent, even as issues play out in radically disparate ways across different geographic contexts.

Who is a stakeholder?

For most companies, stakeholder mapping involves categorizing stakeholder groups—typically investors, regulators, customers, suppliers, civil society organizations, and relevant communities. For social media platforms, however, the stakeholder landscape poses unprecedented challenges of scale, diversity, and complexity. Consider impact and representation: Beyond contemplating billions of users (itself a task of gigantic scale), social media platforms also need to consider “rightsholders”—employees, contractors, customers, and individuals whose images or words are shared even when they are not platform users. Given the sheer number and diversity of rightsholders, social media platforms need to locate organizations capable of speaking on their behalf. Depending on circumstances, rightsholders might be represented by civil society organizations, activist groups, or policymakers. How credibly any given stakeholder can represent a specific interest or opinion always requires deep analysis.

How should stakeholders be prioritized?

Unconscious biases, external pressures, and commercial incentives can easily foster approaches that fail to reflect the full range of effects on rightsholders. Given the human rights impacts of social media platforms, it is important to prioritize the voices of such vulnerable groups as rights defenders, political dissidents, women, young people, minorities, and indigenous communities.

Stakeholder mapping exercises need to begin with the landscape of contentious issues upon which to engage stakeholders. Terrorism, hate speech, sexual harassment, bullying, and disinformation are obvious examples, but new issues emerge constantly. For each issue, perspectives can be mapped across linguistic, geographical, and social identities, supplementing user data with academic expertise. This enables identification of representative organizations, should they exist. As a result of this mapping exercise, platforms will be able to evaluate gaps, seek expertise to fill them, and at least avoid uninformed attempts to balance a spectrum of views.

What is the best way to gather perspectives?

Social media platforms face strong incentives to transparently disclose their consultation processes—and the stakeholder perspectives they yield—but some vulnerable people and identity groups will prefer anonymity, for good reasons. While candid, one-on-one conversations with stakeholders can build trust, they are narrow in focus, extremely resource-intensive, and invite questions about overall balance and focus. Convening groups by geography or issue expertise is more efficient and can boost a platform’s analytical capacity, but groups are challenging to analyze and can develop blind spots. Given all this, a mix of approaches and formats is most appropriate.

Social media companies are beginning to experiment with advisory councils, which typically comprise stakeholders who have a mature understanding of company policies and processes and can credibly represent interest groups or perspectives. This necessarily limits diversity and inclusivity, inviting allegations of elitism. Paying honoraria to council members can be viewed as compromising their independence, but a lack of compensation raises questions about the exploitation of stakeholders and about practical constraints on their ability to contribute. Setting standard industry practices and “arm’s-length” mechanisms would help to address this dilemma.

Facebook has proposed creating an independent oversight board to review the company’s most difficult decisions about content, and it is considering the board’s role with respect to content policy advice, too. This provides an additional channel for input, and Facebook will benefit from explaining how the board affects policy over time.

How should decisions be made and disclosed?

Intent on retaining accountability for their content decisions, social media platforms are unable and unwilling to outsource policy control. A key challenge they face will be to explain how—and why—collecting stakeholder views can and will inform their internal decision-making.

The platforms must also incorporate local political and social nuances without undermining their own global consistency; because content originates in one place and has impacts in others, setting national boundaries around content and opinion raises more questions than it solves. On highly contentious issues such as terrorism, hate speech, and incitement to violence, social media platforms need to credibly draw on existing academic expertise and then capture the range of values and opinion without defaulting to the median (or most commercially friendly) option. Human rights frameworks offer an appropriate reference point in that they take the scale and severity of impact as a starting point and then consider immediate, cumulative, and longer-term developments in navigating difficult trade-offs. For example, a decision to limit hate speech on a platform may protect vulnerable groups while also restricting expression in ways that have a detrimental long-term effect on democratic participation.

Facebook is already posting minutes from its Content Standards Forum. As it works toward broader disclosure, the company should embrace full transparency of its positions, its policy determinations, and the relevant internal user data that informs them.

BSR’s work with Facebook raises questions, answers, and then more questions. What is clear is that in this rapidly evolving area, cross-industry and multi-stakeholder collaboration should become a priority. The prospect of effective external oversight from society remains nascent and contested. For now, the power to determine, promote, and limit content rests with a handful of companies that are both overwhelmingly powerful and keenly exposed to public anger. Rather than focusing on molding friendly legislation or reacting arbitrarily to the latest incident, social media platforms should continue to develop institutional approaches while steeling themselves to invite broader public participation.

This blog originally appeared on BSR here: https://www.bsr.org/en/our-insights/blog-view/stakeholder-engagement-key-to-successful-community-standards-social-media

Alison Taylor

Clinical Professor at NYU Stern School of Business, lots of other hats, even more opinions. Author of Higher Ground: How Business Can Do the Right Thing in a Turbulent World, Harvard Business Review Press, February 2024.
