Users should have a say in shaping policies that will impact their digital lives.
This belief has been a guiding principle for our ongoing collaboration with Meta to explore how best to incorporate informed user input in developing the ‘rules’ of the platform.
We’re pleased to have completed our second Community Forum this past December, which focused on governance of the Metaverse. This Community Forum, conducted in partnership with Stanford’s Deliberative Democracy Lab, was the largest ever run: it involved more than 6,300 people from 32 countries, deliberating in 23 languages. As such, it provided a unique opportunity to learn how best to promote engagement, informed debate and dialogue on a global scale.
To this end, BIT led a process evaluation, which yielded important insights regarding how to design and implement deliberative democratic forums.
Below, we share several valuable lessons for organizations to consider as they approach similar efforts to engage people in meaningful deliberation on complex and often controversial issues.
Finding the right challenges
In the world of technology, there is no shortage of complex challenges. There are ongoing debates involving disinformation and freedom of speech, potentially harmful behavior on social media and the use of emerging technologies. In our experience, we’ve found that a good deliberation topic involves difficult trade-offs, whereby people with different values and priorities may come to different conclusions.
Often, it centers on an emerging issue, on which opinions are still forming and participants may be open to (and influenced by) new information and perspectives. As such, these are typically issues where there is a lack of consensus and a sense that no single individual or team should make decisions alone. At the same time, the situation does require definitive action, perhaps in the form of norms, rules, regulations or governance procedures.
These criteria led us to the topics of the two Community Forums we recently carried out: how Meta should handle problematic climate change content on Facebook, and how best to monitor and moderate bullying and harassment in the Metaverse.
Engaging and preparing participants
Deliberative democracy efforts depend on engaged participants. For people to be willing to participate in these extended conversations, they need to be prepared, to feel welcomed – and to know that their time and effort will matter.
In our experience, this rests largely on three factors. The first is baseline knowledge, as people will inevitably come with varying levels of knowledge and experience regarding the issue at hand. In our most recent Community Forum, we found that a key challenge was providing the right level and format of baseline information to ensure that all participants approached the session on common ground and were confident enough to express themselves.
A second factor is diversity. In addition to seeing people like themselves in the conversation, participants also need to feel that an appropriate range of voices are included.
Often, organizations define diversity primarily in terms of demographic attributes (e.g. gender, race). But to drive legitimacy—both within and outside the discussion—we’ve found that inviting participants with a diversity of viewpoints is also critical.
Finally, and perhaps most importantly, people want to know that their collective input will ultimately help shape decisions. This requires a high degree of transparency before and after deliberations occur. For example, Meta did this by pre-committing to publish the results of our pilot Community Forum on climate change misinformation.
Managing the discussions
Another important piece of the puzzle is making the discussions themselves focused and constructive.
We did this by purposely keeping each small group discussion focused on a single topic with 3-4 supporting questions to inform policy decisions.
For example, in our forum on user behavior in the metaverse, one session was dedicated to discussing “What tools should be used to identify bullying and harassment in the metaverse, and where should they exist?” In a later session, participants discussed “What should be done in response to bullying and harassment?”
Tied to this, we structured each discussion around a series of specific proposals, each of which explicitly stated both sides of a debate and thus helped participants to consider the trade-offs associated with a given decision.
Just as importantly, our team facilitated each session of the climate content moderation forum to maintain a productive flow of conversation. This required a careful balance: encouraging input from everyone while ensuring that no single perspective dominated the conversation.
In the metaverse Community Forum, small group discussions took place on Stanford’s Online Deliberation Platform, which harnesses AI to manage this process.
Making the results actionable
Our ultimate goal was not only to listen and make participants feel heard, but also to come away from these deliberations with clear direction to inform Meta’s decisions. This began with engaging leadership early, to ensure that the organization’s goals and constraints fed into the design and content of the Community Forum.
A second important challenge was translating participants’ input into usable outputs. In our most recent Community Forum, we conducted a Deliberative Poll® measuring participants’ opinions on each proposal, both before and after the discussions. While this survey data captured shifts in people’s attitudes, we needed to translate it into a format more directly applicable to guiding governance decisions.
Finally, it was also important to share both the process and the results in engaging and accessible ways that would resonate with many different constituencies across Meta. For example, we created a video highlight reel to illustrate the Community Forum process and convey why it is so much more valuable than simply asking people their opinion (in a typical survey or focus group).
Embedding deliberative democracy
It’s been inspiring being a part of the Community Forums effort and seeing people from all around the world come together in sincere discussion about how best to govern Meta’s platforms. It has provided a glimpse into the power and potential of deliberative mechanisms as a tool to engage users, facilitate meaningful deliberation and gather informed input on complex questions. It’s also provided evidence that this kind of engagement is feasible online and on a global scale.
We know that many other institutions in the tech world—and beyond—are navigating similarly complex questions that would benefit from informed input and deliberation. We hope that they will consider deliberative democracy as a way of bringing in diverse perspectives and promoting acceptance of difficult decisions.
With these experiences under our belt, we’re looking forward to our next Meta Community Forum, which will focus on generative AI. This complex topic has significant implications for society, and it will be exciting to engage people in deliberation regarding the many product and policy questions related to AI.
We are grateful to Meta for their partnership in exploring new ways to make Community Forums as efficient, effective and scalable as possible, and for giving users a real voice in decisions that affect their lives and communities.