Content Moderation Case Study: Suppressing Content To Try To Stop Bullying (2019)


https://www.techdirt.com/articles/20201007/15211345463/content-moderation-case-study-suppressing-content-to-try-to-stop-bullying-2019.shtml

Summary: TikTok, like many social apps used mainly by a younger generation, has long faced issues around bullying on its platform. According to internal documents leaked to the German site Netzpolitik, one way the company chose to deal with the problem was content suppression -- specifically, suppressing the content of users the company judged more likely to be victims of bullying.

The internal documents showed different ways in which the short videos TikTok is famous for would be rated for visibility. Content could be chosen to be "featured" (i.e., shown to more people), or flagged "Auto R," a form of suppression. Videos rated Auto R were excluded from TikTok's "for you" feed after reaching a certain number of views. Since the "for you" feed is how most people watch TikTok videos, this rating effectively put a cap on views. The end result was that the reach of content categorized as Auto R was significantly limited, preventing it from going viral and amassing a large audience or following.
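The Auto R mechanism described above amounts to a simple feed-eligibility rule: a flagged video stays in the recommendation pool only until it crosses a view threshold. The following is a minimal, hypothetical sketch of such a rule; the function names, data layout, and the specific threshold are all illustrative assumptions, not details from the leaked documents.

```python
# Hypothetical sketch of an "Auto R"-style view-cap rule.
# The threshold and field names are assumptions for illustration only;
# the leaked documents do not specify TikTok's actual values.

AUTO_R_VIEW_CAP = 5_000  # assumed cutoff after which suppression kicks in


def eligible_for_for_you_feed(video: dict) -> bool:
    """Return True if the video may still appear in the 'for you' feed."""
    if video.get("rating") == "featured":
        # Featured content gets boosted visibility regardless of views.
        return True
    if video.get("rating") == "auto_r" and video.get("views", 0) >= AUTO_R_VIEW_CAP:
        # Suppressed: once an Auto R video hits the cap, it is dropped
        # from the feed, effectively capping its total reach.
        return False
    return True


# An Auto R video is served normally until it reaches the cap,
# after which it silently disappears from the main discovery surface.
print(eligible_for_for_you_feed({"rating": "auto_r", "views": 100}))    # still shown
print(eligible_for_for_you_feed({"rating": "auto_r", "views": 5_000}))  # suppressed
```

Because most viewing happens through the "for you" feed, this kind of silent cap limits reach without notifying the creator that anything has changed.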
