Content Moderation Case Study: Facebook's Moderation Of Terrorist Content Results In The Removal Of Journalists' And Activists' Accounts (June 2020)



https://www.techdirt.com/articles/20201028/15001045603/content-moderation-case-study-facebooks-moderation-terrorist-content-results-removal-journalists-activists-accounts-june-2020.shtml

Summary: In almost every country in which it offers its service, Facebook has been asked -- sometimes via direct regulation -- to limit the spread of "terrorist" content.

But moderating this content has proven difficult. It appears the more aggressively Facebook approaches the problem, the more collateral damage it causes to journalists, activists, and others studying and reporting on terrorist activity.


Published: Oct 28, 2020, 11:23pm UTC