Child Sex Abuse Material Was Found In a Major AI Dataset. Researchers Aren’t Surprised.

11 months ago
Anonymous $Pi6HN8Q0B-

https://www.vice.com/en_us/article/3aky5n/child-sex-abuse-material-was-found-in-a-major-ai-dataset-researchers-arent-surprised

Over 1,000 images of child sexual abuse have been discovered inside the largest dataset used to train image-generating AI, shocking everyone except the people who have warned about exactly this for years.

The dataset was created by LAION, the non-profit organization behind the massive image datasets used to train generative AI systems like Stable Diffusion. Following a report from researchers at Stanford University, 404 Media reported that LAION confirmed the presence of child sexual abuse material (CSAM) in the dataset, called LAION-5B, and scrubbed it from its online channels.
