Elisabeth Costa
Chief of Innovation and Partnerships
Last month, BIT announced our collaboration with Meta to explore how social media platforms can democratize and decentralize decision-making. It’s an agenda we’ve long been passionate about, and one we’ve been keen to apply in the real world: we believe that deliberative democracy approaches can enable social media users to discuss, negotiate and reach collective views on difficult decisions that shape their own experiences online.
To this end, building on our initial work, BIT and Meta recently piloted three online assemblies to investigate the promise and feasibility of deliberative approaches as a mechanism for platform governance. The assemblies deliberated on the challenging topic of misleading climate change content:
“What approach should Meta take to climate content that may be misleading or confusing but does not contain a false claim that can be debunked by fact-checkers?”
These are user posts, such as the fictitious example below, that express uncertainty about the impacts of climate change, distrust in scientific expertise, or skepticism about climate solutions, without explicitly putting forth false claims.
Clearly, misleading information of this nature is a complex and nuanced topic: it’s a ‘grey zone’ with no clear or obvious answers. It is also a real and pressing issue for Meta, for other social media platforms and, indeed, for governments and institutions around the world. It was therefore an ideal subject on which to engage Facebook users in a genuine and democratic deliberation.
Across our three assemblies, we brought together over 250 Facebook users from five countries, ensuring that we heard from people of different nationalities, ethnicities, socio-economic backgrounds and political ideologies.
Over the course of three days, participants engaged in structured assemblies designed to give them a consistent and balanced knowledge base from which to discuss and deliberate on the topic of climate change. This included briefings from subject matter experts on Meta’s current policies, climate change misinformation, and freedom of expression.
Participants then shared and discussed their views across a series of plenary sessions and small group activities, which culminated in a vote on a bounded set of options:
Overwhelmingly, participants felt that Meta should take action on this type of climate content. Indeed, across all three assemblies and five countries, there was alignment on the top recommendation: Meta should educate and inform users, and thus encourage dialogue on important topics such as climate change. Conversely, participants were generally opposed to “downranking” or reducing the ability to share certain posts, because they believed such actions effectively reduce discussion of the topic.
In addition to voting, assembly participants also ‘built’ on the recommendations, making thoughtful suggestions for how certain policies could be operationalised. For example, some suggested that Meta’s actions should be more assertive when users claimed expertise or authority on a certain subject (“I’m a PhD…”), without providing credible detail or support for that credential.
The level of consensus within and across the assemblies was both surprising and encouraging, given that incoming perspectives on climate change varied widely. In the U.S., for example, participants diverged strongly on their least preferred options: some were strongly opposed to Meta taking no action, while others were opposed to prohibiting extreme content. Interestingly, though, people across the political spectrum gravitated to, and found common ground on, “lighter-touch” options such as flagging misleading content and linking to more information. Just as importantly, despite quite different starting assumptions, participants were pleased to encounter a high level of mutual respect:
“I expressed my opinion openly… we didn’t agree all of the time, but the purpose was not to agree at the onset, it was to come to a consensus at the end, and I think that was achieved for the most part.” — U.S. participant, interview
Perhaps because of this, most participants had a clearly positive experience and, importantly, felt that the deliberative assembly process was legitimate. In fact, a clear majority said they would like to see Meta use similar processes for a range of decisions facing the platform. Unsurprisingly, however, this support was contingent on Meta acting on users’ input and recommendations.
We asked Meta to reflect on how the learnings from these pilot forums will inform its next steps. Meta told us that these learnings are being integrated into several of its ongoing efforts to educate and inform people on its platforms, including:
Overall, BIT and Meta are encouraged by the outcomes of these early pilots and by their potential to provide robust and effective forms of ‘self-governance’ that incorporate users’ voices to complement the efforts, and influence the decisions, of technology executives, oversight boards and government regulators.
Through these pilots, we’ve also learned a great deal about how this deliberative democracy model can be refined, enhanced and potentially scaled. For example, we are interested in exploring:
As our daily lives become ever more entwined with online spaces, it is vital that users can shape the platforms that in turn influence their experiences, behaviour and relationships. With this goal in mind, we applaud Meta for its pioneering work in democratizing and decentralizing platform decision-making. These pilots have set out a promising path forward: at BIT, we look forward to continuing this journey, in the hope of inspiring further governance innovation in companies, organizations and institutions around the world.