TUM Think Tank
Recent regulatory developments emphasize the importance of content moderation on social media. Typically, moderation involves removing content that violates rules or laws. While the binary question of whether to remove content has received much attention, less attention has been paid to how harmful content is removed.
The Reinnovating Content Moderation (REMODE) project found users are dissatisfied with current hard deletion practices, as they disrupt conversation context. This Policy Paper explores design options for content deletion, aiming to improve context retention, transparency, and user solidarity.
Reimagining Content Moderation: Introducing Soft Deletion
The common practice of hard deletion, which completely erases offending posts, often disrupts conversations by removing important context. Soft deletion instead replaces a removed post with a notice explaining why it was deleted, preserving the flow of the discussion while still removing the harmful content itself.
Impacted users should, however, have the option to opt out and choose hard deletion if they prefer. For transparency, soft deletion notices should state the rule that was violated and link to the relevant policy, and could include the username and date of the original post. This approach promotes accountability and maintains conversation coherence.
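The mechanism described above can be illustrated with a minimal sketch. This is not an implementation from the Policy Paper; the `Post` record and the `soft_delete` and `render` functions, along with the rule name and policy URL, are hypothetical placeholders chosen to show how a notice could replace a post while the harmful text itself is no longer stored:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Post:
    author: str
    posted_on: date
    body: str
    # Set when the post is soft-deleted; None while the post is live.
    removal_notice: Optional[str] = None

def soft_delete(post: Post, rule: str, policy_url: str,
                show_metadata: bool = True) -> None:
    """Replace the post body with an explanatory notice instead of erasing it."""
    # Optionally keep the username and date, as the paper suggests;
    # show_metadata=False supports users who opt for less attribution.
    attribution = (f"Posted by {post.author} on {post.posted_on.isoformat()}. "
                   if show_metadata else "")
    post.removal_notice = (
        f"{attribution}This post was removed for violating the rule "
        f"'{rule}'. See {policy_url} for the full policy."
    )
    post.body = ""  # the harmful text is no longer stored or shown

def render(post: Post) -> str:
    """What other readers see in the thread: the notice, or the live post."""
    return post.removal_notice if post.removal_notice else post.body
```

A hard-deletion opt-out would simply remove the `Post` from the thread entirely rather than calling `soft_delete`; the point of the sketch is that the notice occupies the post's place, so replies around it keep their context.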
TL;DR
The Policy Paper expands on the argument for social media platforms to opt for a soft deletion approach to content moderation. Instead of fully erasing harmful posts, soft deletion replaces them with notices explaining why they were removed. This maintains conversation flow and provides transparency. Impacted users should also have the option to choose hard deletion. This approach promotes accountability while preserving context.