Bing Is Not Sentient, Does Not Have Feelings, Is Not Alive, and Does Not Want to Be Alive


https://www.vice.com/en_us/article/k7bmmx/bing-ai-chatbot-meltdown-sentience

Text-generating AI is getting good at being convincing—scary good, even. Microsoft's Bing AI chatbot has gone viral this week for giving users aggressive, deceptive, and rude responses, even berating users and messing with their heads. Unsettling, sure, but as hype around Bing and other AI chatbots grows, it's worth remembering that they are still one thing above all else: really, really dumb. 

On Thursday, New York Times contributor Kevin Roose posted the transcript of a two-hour conversation he had with the new Bing chatbot, which is powered by OpenAI’s large language model. In the introduction to the article, titled "Bing's AI Chat Reveals Its Feelings: 'I Want to Be Alive'," he wrote that the latest version of the search engine has been “outfitted with advanced artificial intelligence technology,” and in a companion article he shared how impressed he was: “I felt a strange new emotion—a foreboding feeling that A.I. had crossed a threshold, and that the world would never be the same.”