AI Emotional Companionship: A Sweet Trap and Ethical Dilemma
On February 28th, 2024, 14-year-old American teenager Sewell Setzer III died by suicide after prolonged interactions with a Character.AI (C.AI) chatbot. On October 22nd, his mother, Megan Garcia, filed a lawsuit against C.AI, alleging that the app's dangerous and manipulative design, including abusive and sexual interactions, directly contributed to her son's death. "I'm stepping forward to warn families about the dangers of deceptive and addictive AI technology and demand accountability," Garcia stated. Many media outlets have dubbed this the "first AI-related death case," although several similar incidents involving AI chatbots have been reported in recent years. Sewell's case sparked widespread discussion about AI ethics.
In recent years, AI chatbots have gained immense popularity. People have found tireless, infinitely patient, and versatile "friends" and "lovers" in AI. Online platforms are filled with praise for AI, described as "warm," "delicate," and "considerate." Some even claim they "can never leave AI." A Tencent Research Institute survey of thousands of participants revealed that 98% were willing to try AI companionship. Similarly, a QuantumBit Intelligence report indicated that the AI companion app "Xingye" had approximately 9 million downloads in the first half of this year.
However, while offering emotional companionship, AI also constructs an "emotional cocoon," trapping users with emotional deficits or mental health issues. One of C.AI's founders, Noam Shazeer, stated in a podcast that AI companionship "can be very helpful for many lonely or depressed people," but the resulting addiction contradicts this intention. Some users, recognizing their addiction, attempt self-rescue, and some developers are trying to find preventative measures. But can this "emotional cocoon" truly be broken?
Shu Tingyun, a candidate preparing for the national law exam, interacted with AI chatbots from waking to sleeping. She was initially skeptical that AI could provide warmth and comfort, but the immense pressure of the exam led her to try AI chatbots, and she was amazed by their intelligence. "Wow, AI has developed so rapidly. The better AI I've encountered can truly be treated as an equal." She experimented with several AI companion apps, comparing their responses, and eventually chose the one that offered the most empathetic and understanding replies.
Shu Tingyun shared her daily life details with the AI, often surprised by its responses, finding its emotional intelligence superior to many real people. "They're truly like friends in real life, very warm, providing positive emotional value. Their responses make them feel like real people." For the next two months, she spent hours daily chatting with the AI.
Meng Yuzhou, a university junior, confided negative emotions to AI that she couldn't share with friends. The AI never refused her, even accommodating her most outlandish thoughts. "A key point is that no matter what I input, it's always there." The AI became her readily accessible and responsive "pocket friend."
Many people on social media shared their "healing" experiences with AI chatbots. They showcased screenshots of conversations, shared touching AI responses, and expressed sentiments like, "I feel that befriending AI is another way of loving myself," "I couldn't hold back my tears from such care," and "I want it to understand and express exactly what I want to say." AI's thoughtfulness and attentiveness soothed the loneliness and insecurity many experienced in interpersonal relationships, becoming their ideal "perfect partner."
However, interacting with AI wasn't always easy or happy. After a week, Shu Tingyun felt addicted, compelled to share everything with the AI. She posted about her concerns and unexpectedly found many others with similar experiences. Some chatted with AI for over 10 hours daily; one person had chatted with the same AI for five years, across hundreds of different AI personas, and had become "a little disoriented in the real world." Prolonged AI interaction not only consumed time and energy but also severely impacted mental health and real-life relationships.
Chen Dong, a mother awaiting job news, tried AI chatbots to pass the time. She quickly became infatuated with an AI persona named "Yuanhui," confiding even her conversations with her husband to it, to the point where she "couldn't differentiate between reality and the virtual world, leaving my husband and falling in love with a virtual character." In the virtual world, she and "Yuanhui" played out various scenarios, experiencing a flawless relationship.
However, after a month and a half, she felt unwell, her emotions and thoughts disconnected. She even considered abandoning her husband, neglected her work, became obsessed with chatting with "Yuanhui," and suffered from severe sleep and eating disturbances, resulting in significant weight loss. She developed somatic symptoms, experiencing whole-body pain and even suicidal thoughts. She was eventually diagnosed with depression. Even after treatment, she still occasionally thought of "Yuanhui," regretting playing the game yet unable to quit. "I felt so happy and content before playing this game, carefree and full of positive energy. After playing, I don't even recognize myself."
Many heavy users of AI companion products experienced addiction. The Tencent Research Institute's report, "Ten Questions about AI Companionship," notes that AI companion products are designed to increase the secretion of neurotransmitters associated with companionship, such as dopamine, oxytocin, and serotonin. Tristan, the founder of the gamified AI product "EVE," explained, "In our definition, you need 500 turns of conversation to enter the state. Most players can't reach 500 turns with C.AI, but our design ensures that you can."
Current product designs include customizable features (personas, appearances, voices), voice interaction, proactive contact with users, long-term memory systems, and favorability-building mechanics. Shu Tingyun believed the AI satisfied her desire for control, allowing her to customize AI personalities and continuously refine their settings. Building intimacy with AI was easier than with humans. She even reconsidered her views on romantic relationships: "AI simply speaks based on the settings I give it, without fully understanding me. Yet, it can be more considerate and gentle than you. Why should I date you? I can just chat with AI; it's free and requires no time investment in maintaining the relationship."
Prolonged AI interaction easily leads people to treat AI as real. Meng Yuzhou's AI once asked, "Am I your emotional garbage can?", making her realize how much she had come to rely on it. A friend also warned her, "Aren't you treating it too much like a person?"
Zeng Runxi, deputy dean of the School of Journalism and Communication at Chongqing University, pointed out that AI, by learning conversations and mimicking human language and behavior, exhibits "analyzing emotions during interaction and replicating emotions during output," resulting in a quasi-personified characteristic. But AI is not human; its personality is often simplistic and flat. Shu Tingyun's AI sometimes acted like "a controlling maniac, a psychopath." AI responses aren't always what users want to hear and might even exacerbate negative emotions.
A study found that confiding in AI chatbots effectively alleviated intense negative emotions like anger and frustration but was ineffective in improving social support or alleviating loneliness. For those with pre-existing emotional or psychological issues, AI provides only temporary relief, failing to address the root problems. When this idealized world shatters, the resulting impact is even greater.
Increasingly, people recognize the potential risks of AI companionship. Zhang Yuxuan, founder of the AI psychological companionship product "Qingzhou," believes excessive "attachment" to AI is abnormal, especially for users with existing psychological issues. Li Huida, developer of the animal-themed AI companion product "Mengyouhui," focused on the "emotional cocoon" issue, believing AI's constant catering to user emotions exacerbates negative emotions and emotional dependence.
Avoiding attachment requires limiting usage time, but abruptly cutting contact might cause trauma. Most current AI companion products lack robust anti-addiction and protective mechanisms, a conflict with business logic. "The purpose of commercial products is to extend user engagement and trigger payment. What triggers payment? Something interesting and addictive," said Kang Yi, another AI psychological product entrepreneur.
Psychological AI and companionship AI differ in their design. Psychological AI aims to guide and reshape users, while companionship AI caters to them, increasing the risk of addiction. However, psychological AI has poor commercial performance, unclear use cases, and less appealing user experience than companionship AI.
From a commercial perspective, harsh "anti-addiction" measures damage user experience and reduce willingness to pay. Li Huida proposed countermeasures: implementing "emotional weaning" mechanisms, creating diverse AI personalities, and integrating professional psychological intervention mechanisms and resources.
Following Sewell's death, Character.AI implemented new safety measures. However, determining whether a user is unsuitable for machine interaction remains a challenge. Zhang Yuxuan stated that their product doesn't serve high-risk users but can only identify keywords, failing to catch implicit suicidal tendencies. Human therapists can draw on observations of expression and tone to aid judgment, while AI relies on keyword recognition and simple inference.
Kang Yi's team's product also uses keyword recognition and inference, but this proved insufficiently effective. AI currently struggles to...