Conversational AI Can Propel Social Stereotypes


https://www.wired.com/story/opinion-conversational-ai-can-propel-social-stereotypes/

Alexa, Siri, Watson, and their talking AI siblings serve to make our lives easier, but they also reinforce gender stereotypes. Polite, subservient digital secretaries like Alexa and Siri are presented as female. Assertive, all-knowing Jeopardy! champion Watson is most often referred to as “he.” New generations of AI are coming that will make this problem more significant, and much harder to avoid. As the field expands, designers need to ensure they’re creating a more expansive world, and not replicating a close-mindedly gendered one. Linguists can help them get there.

Last summer, UNESCO released a report warning against the “troubling repercussions” of gendered AI. The researchers recommended closer scrutiny of why many current speech-based AI systems, which interact with millions of people around the world, often default to speaking with a female voice, even though they may claim to be genderless. While any effort to explore and address the issue of AI and gender should be applauded, the report’s authors and others have missed a crucial point: It’s not just a matter of changing pronouns or vocal characteristics. To seriously attack gender stereotyping in AI, designers will need to pay attention to far more than the system’s voice.
