Content Moderation Case Study: Facebook's AI Continues To Struggle With Identifying Nudity (2020)
https://www.techdirt.com/articles/20201211/15111245871/content-moderation-case-study-facebooks-ai-continues-to-struggle-with-identifying-nudity-2020.shtml
Summary: Since its inception, Facebook has attempted to be more "family-friendly" than other social media services. Its hardline stance on nudity, however, has often proved problematic, as its AI (and its human moderators) have flagged accounts over harmless images or failed to consider context when removing images or locking accounts.
The latest example of Facebook's AI failing to properly moderate nudity involves garden vegetables. A seed business in Newfoundland, Canada, was notified that its image of onions had been removed for violating the terms of service. The picture apparently triggered the auto-moderation system, which flagged it for containing "products with overtly sexual positioning." A follow-up message noted that the photo of a handful of onions in a wicker basket was "sexually suggestive."