Online hate: we need to improve content moderation to effectively tackle hate speech
The EU Agency for Fundamental Rights' (FRA) report on online content moderation examines the challenges of detecting and removing hate speech from social media. It highlights that there is no commonly agreed definition of online hate speech, and that platforms' content moderation systems are not open to scrutiny by researchers. This makes it difficult to get a full picture of the extent of online hate and hampers efforts to tackle it.
Abusive comments, harassment and incitement to violence easily slip through online platforms' content moderation tools, the new FRA report finds. It shows that most online hate targets women, but that people of African descent, Roma and Jews are also affected. Limited access to platforms' data and the lack of a shared understanding of what constitutes hate speech hamper efforts to tackle online hate. FRA calls for more transparency and guidance to ensure a safer online space for all.