Safer spaces by design? Federated socio-technical architectures in content moderation

Bibliographic Details
Main Authors: Ksenia Ermoshina, Francesca Musiani
Format: Article
Language: English
Published: Alexander von Humboldt Institute for Internet and Society, 2025-03-01
Series: Internet Policy Review
Online Access: https://policyreview.info/node/1827
Description
Summary: Users of secure messaging tools, especially in communities attuned to the risks of state-based and other forms of censorship, increasingly hesitate to delegate their data to centralised platforms, which are endowed with substantial power to filter content and block user profiles. This article analyses the role that informational architectures and infrastructures in federated social media platforms play in content moderation processes. Alongside privacy by design, the article asks whether it is possible to speak of online “safe(r) spaces by design”, and what specific role human moderators play in federated environments. The article argues that federation can pave the way for novel practices in content moderation governance, merging community organising, information distribution and alternative techno-social instruments to deal with online harassment, hate speech or disinformation; however, this alternative also presents a number of pitfalls and difficulties that we examine to provide a complete picture of the potential of federated models.
ISSN: 2197-6775