In the early days of the Internet, users curated their own Internet through address books and link lists. With the commercialization of the Internet and the rise of social media, intermediaries emerged to provide mediation services and bring supply and demand together. Content moderation on social media shapes the perceived reality of users and has led to a structural change in the public sphere. Content curation responds to different social needs: for platform operators, it underpins a new, advertising-based business model; for users, it is a means of organizing information and protecting their rights and interests. Problematic consequences of curation, such as fake news, hate speech, and content harmful to minors, have long been discussed and criticized; the storming of the Capitol and radicalization trends during the pandemic have added urgency to these issues. This article discusses current legislative proposals and outlines new approaches to the problems of content curation. It argues for a closer examination of curation, its consequences, and its legal regulation. Content moderation on social media is a key democratic question, and regulatory responses are needed to protect human rights and democratic values. Central to this is the debate on the redesign of social media and the role of user participation in that redesign.
The chapter "Social Media and Content Moderation: Regulatory Responses to a Key Democratic Question" by Christian Djeffal examines the socio-technical workings of content moderation in social media and discusses possible regulatory responses. The chapter shows how public debate is structured by artificial intelligence and how dynamics in social media can threaten both human rights and democratic values.