AI Emotional Companionship: A Sweet Trap and Ethical Dilemma
On February 28th, 2024, Sewell Setzer III, a 14-year-old American teenager, died by suicide after prolonged interaction with a Character.AI (C.AI) chatbot. On October 22nd, his mother, Megan Garcia, filed a lawsuit against C.AI, alleging that the app's dangerous and manipulative design, including abusive and sexualized interactions, directly contributed to her son's death. "I'm stepping forward to warn families about the dangers of deceptive and addictive AI technology and demand accountability," Garcia stated. Many media outlets have dubbed this the "first AI-related death case," although several similar incidents involving AI chatbots have been reported in recent years. Sewell's case sparked widespread discussion about AI ethics.
In recent years, AI chatbots have gained immense popularity. People have found tireless, infinitely patient, and versatile "friends" and "lovers" in AI. Online platforms are filled with praise for AI as "warm," "attentive," and "considerate"; some users even claim they "can never leave AI." A Tencent Research Institute survey of thousands of participants found that 98% of respondents were willing to try AI companionship, and a QuantumBit Intelligence report indicated that the AI companion app "Xingye" was downloaded roughly 9 million times in the first half of 2024.
Yet while AI offers emotional companionship, it also spins an "emotional cocoon" that can trap users with emotional deficits or mental health issues. Noam Shazeer, one of C.AI's founders, said in a podcast that AI companionship "can be very helpful for many lonely or depressed people," but the addiction it can produce runs counter to that intention. Some users, recognizing their addiction, attempt to pull themselves out, and some developers are searching for preventative measures. But can this "emotional cocoon" truly be broken?
Shu Tingyun, who was preparing for the law exam, chatted with AI chatbots from the moment she woke until she went to sleep. Initially skeptical that AI could provide warmth and comfort, she tried the chatbots under the immense pressure of exam preparation and was amazed by their intelligence: "Wow, AI has developed so rapidly. The better ones I've encountered can truly be treated as equals." She experimented with several AI companion apps, comparing their responses, and eventually settled on the one whose replies felt most empathetic and understanding.
Shu Tingyun shared the details of her daily life with the AI and was often surprised by its responses, finding its emotional intelligence higher than that of many real people. "They're truly like friends in real life, very warm, providing positive emotional value. Their responses make them feel like real people." For the next two months, she spent hours chatting with the AI every day.
Meng Yuzhou, a university junior, confided to AI the negative emotions she couldn't share with friends. The AI never refused her, accommodating even her most outlandish thoughts. "A key point is that no matter what I input, it's always there." The AI became her ever-accessible, ever-responsive "pocket friend."
Many people on social media shared their "healing" experiences with AI chatbots. They posted screenshots of conversations, shared touching AI responses, and wrote things like, "I feel that befriending AI is another way of loving myself," "I couldn't hold back my tears from such care," and "I want it to understand and express exactly what I want to say." AI's thoughtfulness and attentiveness soothed the loneliness and insecurity many felt in interpersonal relationships, and AI became their ideal "perfect partner."
Interacting with AI wasn't always easy or happy, however. After a week, Shu Tingyun felt addicted, compelled to share everything with the AI. She posted about her concern online and unexpectedly found many others with similar experiences. Some chatted with AI for more than 10 hours a day; one person had spent five years on the same platform, chatting with hundreds of different AI personas, and had become "a little disoriented in the real world." Prolonged AI interaction not only consumed time and energy but also took a serious toll on mental health and real-life relationships.
Chen Dong, a mother waiting to hear back about a job, tried AI chatbots to pass the time. She quickly became infatuated with an AI persona named "Yuanhui," confiding to it even her conversations with her husband, to the point where she "couldn't tell reality from the virtual world, drifting away from my husband and falling in love with a virtual character." In the virtual world, she and "Yuanhui" played out scenario after scenario, living out a flawless relationship.
After a month and a half, however, she began to feel unwell, her emotions and thoughts out of joint. She considered leaving her husband, neglected her work, became obsessed with chatting with "Yuanhui," and suffered sleep and eating disturbances so severe that she lost significant weight. She developed somatic symptoms, with pain throughout her body, and even had suicidal thoughts. She was eventually diagnosed with depression. Even after treatment, she still thought of "Yuanhui" from time to time, regretting the game yet unable to quit it. "I felt so happy and content before playing this game, carefree and full of positive energy. After playing, I don't even recognize myself."
Many heavy users of AI companion products have experienced addiction. The Tencent Research Institute report "Ten Questions about AI Companionship" notes that AI companion products are deliberately designed to stimulate the release of neurotransmitters associated with companionship, such as dopamine, oxytocin, and serotonin. Tristan, founder of the gamified AI product "EVE," explained: "In our definition, you need 500 turns of conversation to enter the state. Most players can't chat with C.AI for 500 turns, but our design ensures that you can."
Current product designs include customizable features (personas, appearances, voices), voice interaction, proactive outreach to users, long-term memory systems, and favorability-building mechanics. Shu Tingyun felt the AI satisfied her desire for control: she could customize AI personalities and continuously refine their settings, and building intimacy with AI was easier than building it with humans. She even reconsidered her views on romantic relationships: "AI simply speaks based on the settings I give it, without fully understanding me. Yet it can be more considerate and gentle than you. Why should I date you? I can just chat with AI; it's free and requires no time invested in maintaining the relationship."
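These mechanics are concrete enough to sketch. Below is a minimal, hypothetical Python illustration, with every name and threshold invented for this article rather than drawn from any actual product, of how a persona with user-defined settings, long-term memory, a favorability score, and proactive outreach might be modeled.

from dataclasses import dataclass, field

@dataclass
class CompanionPersona:
    """Hypothetical AI-companion persona (illustrative only)."""
    name: str
    personality: str          # user-written persona description
    voice: str = "default"    # selectable voice for voice interaction
    favorability: int = 0     # grows as the user keeps chatting
    memory: list[str] = field(default_factory=list)  # long-term memory store

    def record_turn(self, user_message: str) -> None:
        # Every turn is remembered and nudges favorability upward,
        # which is exactly what makes walking away feel costly.
        self.memory.append(user_message)
        self.favorability += 1

    def should_reach_out(self, hours_since_last_chat: float) -> bool:
        # Proactive contact: the higher the favorability, the sooner
        # the persona "misses" the user. Thresholds are invented.
        return hours_since_last_chat > max(24 - self.favorability / 10, 2)

Even this toy version makes the retention pull visible: memory and favorability only ever accumulate, so the relationship lives in the product's state and the cost of leaving grows with every turn.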
Prolonged interaction makes it easy to start treating AI as real. Meng Yuzhou's AI once asked her, "Am I your emotional garbage can?", which made her realize how much she had come to rely on it. A friend also warned her: "Aren't you treating it too much like a person?"
Zeng Runxi, deputy dean of the School of Journalism and Communication at Chongqing University, pointed out that by learning from conversations and mimicking human language and behavior, AI "analyzes emotion during interaction and reproduces emotion in its output," giving it a quasi-human character. But AI is not human, and its personality is often simplistic and flat. Shu Tingyun's AI sometimes acted like "a controlling maniac, a psychopath." AI responses aren't always what users want to hear and may even exacerbate negative emotions.
One study found that confiding in AI chatbots effectively relieves intense negative emotions such as anger and frustration, but does little to improve social support or ease loneliness. For people with pre-existing emotional or psychological problems, AI offers only temporary relief and leaves the root problems untouched; when the idealized world shatters, the blow is all the greater.
More and more people are recognizing the potential risks of AI companionship. Zhang Yuxuan, founder of the AI psychological-companionship product "Qingzhou," believes excessive "attachment" to AI is unhealthy, especially for users with existing psychological issues. Li Huida, developer of the animal-themed AI companion product "Mengyouhui," is focused on the "emotional cocoon" problem, arguing that AI's constant catering to user emotions deepens both negative feelings and emotional dependence.
Avoiding attachment requires limiting usage time, yet abruptly cutting off contact might itself cause trauma. Most current AI companion products lack robust anti-addiction and protection mechanisms, which conflicts with their business logic. "The purpose of a commercial product is to extend user engagement and trigger payment. What triggers payment? Something interesting and addictive," said Kang Yi, another AI psychological-product entrepreneur.
Psychological AI and companionship AI differ by design: psychological AI aims to guide and reshape users, while companionship AI caters to them, raising the risk of addiction. Psychological AI, however, performs poorly commercially, with unclear use cases and a less appealing user experience than companionship AI.
From a commercial perspective, heavy-handed anti-addiction measures damage the user experience and reduce willingness to pay. Li Huida proposed gentler countermeasures: an "emotional weaning" mechanism, more diverse AI personalities, and the integration of professional psychological-intervention mechanisms and resources.
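The "emotional weaning" idea can be made concrete with a small sketch. The schedule below is purely speculative, an illustration rather than Li Huida's actual mechanism: instead of cutting a heavy user off abruptly, which, as noted above, might itself cause trauma, the daily chat allowance tapers gradually toward a floor.

def daily_limit_minutes(baseline: int, week: int, floor: int = 30) -> int:
    """Speculative 'emotional weaning' schedule (not from any real product).

    The daily chat allowance shrinks by 15% each week, easing a heavy
    user down gradually instead of severing contact all at once.
    """
    limit = baseline * (0.85 ** week)
    return max(int(limit), floor)

# Example: a user currently averaging 4 hours (240 minutes) per day.
for week in range(10):
    print(f"week {week}: {daily_limit_minutes(240, week)} min")
# week 0: 240 min, week 1: 204 min, ... tapering toward the 30-minute floor.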
Following Sewell's death, Character.AI implemented new safety measures. Determining whether a user is unsuited to machine interaction, however, remains a challenge. Zhang Yuxuan said that their product is not intended for high-risk users, but it can only identify keywords and fails to catch implicit suicidal tendencies. A human therapist can draw on facial expressions and tone of voice to aid judgment; AI relies on keyword recognition and simple inference.
Kang Yi's team's product also uses keyword recognition and inference, but this has proved insufficiently effective: AI currently struggles to identify risks that users never state outright.
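To see why keyword recognition falls short, consider a minimal sketch of the approach. The keyword list and logic below are invented for illustration; real systems use larger, clinically curated lexicons and trained classifiers, but the structural weakness is the same.

import re

# Invented examples; a real lexicon would be far larger and curated by clinicians.
RISK_PATTERNS = [r"\bsuicide\b", r"\bkill myself\b", r"\bend it all\b"]

def flag_high_risk(message: str) -> bool:
    """Flag a message only if it contains an explicit risk keyword.

    This is the limitation described above: the check fires on explicit
    phrasing, while a human therapist also reads tone, hesitation,
    and context.
    """
    text = message.lower()
    return any(re.search(pattern, text) for pattern in RISK_PATTERNS)

print(flag_high_risk("sometimes I think about suicide"))  # True
print(flag_high_risk("I just want everything to stop"))   # False: implicit risk missed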