
Technologies of Trust: How should we approach moderation and censorship in the golden age of internet misinformation?

Kelsie Nabben, June 2020
United Nations COVID-19 Response (via Unsplash)

Outline:
Decentralised technology protocols offer viable alternatives to incumbent social media platforms and a better lens through which to consider how we approach content moderation, censorship and misinformation. This article is positioned in the context of two events with important implications for people and the internet: the COVID-19 crisis and Trump's Executive Order. It outlines the current context of crisis and information systems, some of the shortcomings of big tech social media platforms, and the promises and cautions of decentralised alternatives as better social networks.

--------------

“Like Twitter, but decentralised” are the words of Brewster Kahle, founder of the Internet Archive, in his 2018 Ethereum Developers Conference address, ‘The Web We Want’. But how do decentralised social media platforms handle content moderation, are they better at combatting misinformation, and is this really what we want?

The raging debate between US President Donald Trump and the dominant big tech social media platforms is all about information online. Twitter has become the dominant interface between journalists and the public. Outcries have continued over concerns with the politicisation and moderation of groups on Facebook. This (latest) crisis could have far-reaching effects on the internet as we know it.

Trump argues that if platforms such as Twitter and Facebook ‘edit’ social media content by applying warning labels or removing user content, they become legally liable as publishers, rather than distributors, of information. When Twitter repeatedly added labels to Trump’s Tweets to caution against misinformation, Trump signed an Executive Order on Preventing Online Censorship, which targets Section 230 of the Communications Decency Act. The Order calls out Twitter, Facebook, Instagram and YouTube for wielding “immense, if not unprecedented, power to shape the interpretation of public events; to censor, delete, or disappear information; and to control what people see or do not see”.
“I Never Made It With Moderation”: how the platforms approach it

Twitter invented a new class of content moderation just for Trump. The ‘public-interest notice’ label states:
“This Tweet violated the Twitter Rules about glorifying violence. However, Twitter has determined that it may be in the public’s interest for the Tweet to remain accessible”.

‘Public-interest exceptions’ on Twitter apply to content that violates Twitter’s rules but originates from elected and government officials, and may therefore be in the public interest to see and discuss. Notably, the recommendation algorithms are turned off on these Tweets to limit their reach whilst maintaining the ability to view them.

Despite Facebook’s policy against statements advocating high-severity violence, which was updated in May, identical posts appeared on Facebook and the company chose to do nothing. In response to the Executive Order, Facebook CEO Mark Zuckerberg echoed the sentiments he expressed after the 2016 elections, stating:
“Facebook shouldn’t be the arbiter of truth of everything that people say online…private companies probably shouldn’t be, especially these platform companies”.

Yet private platform companies do take on this role, especially in a crisis. In the context of the global COVID-19 pandemic, Facebook has actively guided its users to information about the pandemic and tried to flag, hide or remove misinformation about the virus. Facebook has also agreed to pay $52 million to current and former content moderators who developed PTSD on the job. In a clear juxtaposition, the algorithms central to the business model of centralised platforms are designed to drive content to users and to create echo chambers through which third parties can effectively spread their content via paid advertising; the politicisation and manipulation of people through these platforms is both powerful and dangerous.
The Misinformation Age

Misinformation and digital trust are emerging themes of the COVID-19 crisis: from bleach as a cure and virus transmission via 5G, to anti-vaxxing and Bill Gates conspiracy theories. With the sharp acceleration of digital adoption during isolation, how should information be treated online in order to foster the web we want?
Meanwhile, in the Metaverse… Decentralised Alternatives

Decentralised, blockchain-based social networks have been proposed during the current debate as an alternative to the social media incumbents. Independent, decentralised, open-source protocols aim to offer ‘technologies of trust’ through privacy-by-design, peer-to-peer, cryptographically secure, online information systems. Social network solutions built on decentralised, peer-to-peer protocols include Peepeth (built on the Ethereum blockchain), CoBox (built on the Dat protocol), Orbit Chat (built on IPFS) and Matrix (built on the Matrix protocol). Decentralised, federated social networks include Mastodon and Secure Scuttlebutt.

Although these solutions are still early-stage, they demand further investigation. It is also prime time to discuss the ways in which information flows across the topology of decentralised networks. Public, decentralised blockchains are not necessarily censorship resistant, nor is online anonymity necessarily good at mitigating misinformation. Yet they are built and owned by participants in the network. This operational transparency and democratisation of data ownership offers the opportunity to re-think the technical, legal, economic and social norms we want to govern information sharing online. Decentralised, ‘crowd-sourced’, local-first data or locally governed digital solutions may provide a more trusted approach to governing information online.

Community education, participation and critical thinking are key ingredients for the success of efforts to manage both information online and the COVID-19 virus. Although it is quite possible that nothing will change between Trump and the social media giants in the lead-up to the 2020 elections on November 3, social network participants could help to build, in the words of Brewster Kahle, “the web we want, together”.

--- END. ---


Kelsie Nabben

Social scientist researching decentralised technologies and infrastructures. RMIT University Digital Ethnography Research Centre / Blockchain Innovation Hub