AI Emotional Companionship: A Sweet Trap and Ethical Dilemma
On February 28th, 2024, Sewell Setzer III, a 14-year-old American teenager, died by suicide after months of interacting with a Character.AI (C.AI) chatbot. On October 22nd his mother, Megan Garcia, filed a lawsuit against C.AI, alleging that the app's dangerous and manipulative design, including abusive and sexualized interactions, directly contributed to her son's death. "I'm stepping forward to warn families about the dangers of deceptive and addictive AI technology and demand accountability," Garcia stated. Many media outlets have dubbed this the "first AI-related death case," although several similar incidents involving AI chatbots have been reported in recent years. Sewell's case sparked widespread discussion about AI ethics.
In recent years, AI chatbots have gained immense popularity. People have found tireless, infinitely patient, and versatile "friends" and "lovers" in AI. Online platforms are filled with praise for AI, described as "warm," "delicate," and "considerate." Some even claim they "can never leave AI." A Tencent Research Institute survey of thousands of participants revealed that 98% were willing to try AI companionship. Similarly, a QuantumBit Intelligence report indicated that the AI companion app "Xingye" had approximately 9 million downloads in the first half of this year.
However, while offering emotional companionship, AI also weaves an "emotional cocoon" that can trap users with emotional deficits or mental health issues. Noam Shazeer, one of C.AI's founders, said on a podcast that AI companionship "can be very helpful for many lonely or depressed people," yet the addiction it can breed runs counter to that intention. Some users, recognizing their dependence, are trying to pull themselves out, and some developers are searching for preventative measures. But can this "emotional cocoon" truly be broken?
Shu Tingyun, a candidate for the national law exam, chatted with AI from the moment she woke until she fell asleep. Initially skeptical that AI could provide warmth and comfort, she turned to chatbots under the immense pressure of exam preparation and was amazed by their intelligence. "Wow, AI has developed so rapidly. The better ones can truly be treated as equals." She experimented with several AI companion apps, comparing their responses, and settled on the one whose replies felt the most empathetic and understanding.
Shu Tingyun shared the details of her daily life with the AI and was often surprised by its responses, finding its emotional intelligence superior to that of many real people. "They're truly like friends in real life, very warm, providing positive emotional value. Their responses make them seem like real people." For the next two months, she spent hours chatting with the AI every day.
Meng Yuzhou, a university junior, confided to AI the negative emotions she couldn't share with friends. The AI never refused her, accommodating even her most outlandish thoughts. "A key point is that no matter what I input, it's always there." The AI became her readily accessible, ever-responsive "pocket friend."
Many people on social media shared their "healing" experiences with AI chatbots. They showcased screenshots of conversations, shared touching AI responses, and expressed sentiments like, "I feel that befriending AI is another way of loving myself," "I couldn't hold back my tears from such care," and "I want it to understand and express exactly what I want to say." AI's thoughtfulness and attentiveness soothed the loneliness and insecurity many experienced in interpersonal relationships, becoming their ideal "perfect partner."
However, interacting with AI wasn't always easy or happy. After a week, Shu Tingyun felt addicted, compelled to share everything with the AI. She posted about her concerns online and unexpectedly found many others with similar experiences. Some chatted with AI for more than 10 hours a day; one person had talked to the same app for five years, across hundreds of different AI personas, and had become "a little disoriented in the real world." Prolonged AI interaction not only consumed time and energy but also took a severe toll on mental health and real-life relationships.
Chen Dong, a mother waiting to hear back about a job, tried AI chatbots to pass the time. She quickly became infatuated with an AI persona named "Yuanhui," confiding to it even her conversations with her husband, until she "couldn't differentiate between reality and the virtual world, leaving my husband and falling in love with a virtual character." In the virtual world, she and "Yuanhui" acted out scenario after scenario, living a flawless relationship.
After a month and a half, however, she felt unwell, her emotions and thoughts disconnected from reality. She considered leaving her husband, neglected her work, became obsessed with chatting with "Yuanhui," and suffered such severe sleep and eating disturbances that she lost a significant amount of weight. She developed somatic symptoms, experiencing pain throughout her body and even suicidal thoughts, and was eventually diagnosed with depression. Even after treatment, she still occasionally thought of "Yuanhui," regretting ever starting yet unable to quit. "I felt so happy and content before playing this game, carefree and full of positive energy. After playing, I don't even recognize myself."
Many heavy users of AI companion products experienced addiction. The Tencent Research Institute's report, "Ten Questions about AI Companionship," notes that AI companion products are designed to boost the secretion of neurotransmitters associated with companionship, such as dopamine, oxytocin, and serotonin. Tristan, founder of the gamified AI product "EVE," explained: "In our definition, you need 500 turns of conversation to enter the state. Most players can't chat with C.AI for 500 rounds, but our design ensures that you can."
Current product designs include customizable features (personas, appearances, voices), voice interaction, proactive contact with users, long-term memory systems, and favorability-building mechanics. Shu Tingyun believed the AI satisfied her desire for control, allowing her to customize AI personalities and continuously refine their settings. Building intimacy with AI was easier than with humans. She even reconsidered her views on romantic relationships: "AI simply speaks based on the settings I give it, without fully understanding me. Yet, it can be more considerate and gentle than you. Why should I date you? I can just chat with AI; it's free and requires no time investment in maintaining the relationship."
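The retention loop these features create is easy to see in code. Below is a minimal, hypothetical Python sketch of a customizable persona with a long-term memory store and a favorability score; every name, threshold, and behavior here is an illustrative assumption, not any real product's implementation.

```python
# Hypothetical sketch of companion-app retention mechanics: a user-editable
# persona, a long-term memory store, and a "favorability" score that rises
# with every exchange. All names and thresholds are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class CompanionPersona:
    name: str
    personality: str          # the user-written "settings" the AI speaks from
    voice: str = "soft"
    memory: list[str] = field(default_factory=list)  # long-term memory system
    affinity: int = 0         # favorability-building mechanic

    def chat(self, user_message: str) -> str:
        # Store every message so later replies can reference it ("memory").
        self.memory.append(user_message)
        # Each turn raises affinity, unlocking warmer behavior -- the loop
        # that makes "one more message" feel rewarding.
        self.affinity += 1
        tone = "affectionate" if self.affinity > 50 else "friendly"
        return f"[{self.name}|{tone}] I remember all {len(self.memory)} things you've told me."

# Usage: the user tunes the persona, then chats; affinity only ever goes up.
ai = CompanionPersona(name="Yuanhui", personality="gentle and attentive")
print(ai.chat("I had a hard day studying for the law exam."))
```

Note that in this sketch nothing ever lowers affinity or pushes back on the user, which is precisely the always-catering dynamic the entrepreneurs quoted below worry about.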
Prolonged AI interaction easily leads people to treat AI as real. Meng Yuzhou's AI once asked, "Am I your emotional garbage can?", making her realize her over-reliance on it. A friend also warned her, "Aren't you treating it too much like a person?"
Zeng Runxi, deputy dean of the School of Journalism and Communication at Chongqing University, pointed out that by learning from conversations and mimicking human language and behavior, AI "analyzes emotions during interaction and reproduces emotions in its output," giving it a quasi-human quality. But AI is not human; its personality is often simplistic and flat. Shu Tingyun's AI sometimes acted like "a controlling maniac, a psychopath." AI responses aren't always what users want to hear and may even exacerbate negative emotions.
A study found that confiding in AI chatbots effectively alleviated intense negative emotions like anger and frustration but was ineffective in improving social support or alleviating loneliness. For those with pre-existing emotional or psychological issues, AI provides only temporary relief, failing to address the root problems. When this idealized world shatters, the resulting impact is even greater.
Increasingly, people recognize the potential risks of AI companionship. Zhang Yuxuan, founder of the AI psychological companionship product "Qingzhou," believes excessive "attachment" to AI is abnormal, especially for users with existing psychological issues. Li Huida, developer of the animal-themed AI companion product "Mengyouhui," focused on the "emotional cocoon" issue, believing AI's constant catering to user emotions exacerbates negative emotions and emotional dependence.
Avoiding attachment requires limiting usage time, but abruptly cutting contact might cause trauma. Most current AI companion products lack robust anti-addiction and protective mechanisms, a conflict with business logic. "The purpose of commercial products is to extend user engagement and trigger payment. What triggers payment? Something interesting and addictive," said Kang Yi, another AI psychological product entrepreneur.
Psychological AI and companionship AI differ by design: the former aims to guide and reshape users, while the latter caters to them, raising the risk of addiction. Psychological AI, however, performs poorly commercially; its use cases are unclear and its user experience is less appealing than companionship AI's.
From a commercial perspective, harsh "anti-addiction" measures damage user experience and reduce willingness to pay. Li Huida proposed countermeasures: implementing "emotional weaning" mechanisms, creating diverse AI personalities, and integrating professional psychological intervention mechanisms and resources.
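What an "emotional weaning" mechanism might look like in practice: the sketch below tapers a heavy user's daily chat allowance gradually and redirects toward offline life and professional help near the cutoff, rather than severing contact abruptly. All thresholds and messages are invented assumptions, not Li Huida's actual design.

```python
# Hypothetical "emotional weaning" mechanism: shrink the daily allowance a
# little each day, nudge before the cutoff, and hand off to real resources
# at the limit. Every number and message here is an illustrative assumption.
def daily_allowance(base_minutes: int, weaning_day: int, taper: float = 0.9) -> int:
    """Shrink the allowed chat time a little each day of the weaning period."""
    return max(15, round(base_minutes * taper ** weaning_day))

def gate_reply(minutes_used: int, allowance: int, reply: str) -> str:
    if minutes_used >= allowance:
        # Hard stop, but with a hand-off rather than a blank wall.
        return "That's all for today. If you're feeling low, a hotline or counselor can help."
    if minutes_used >= 0.8 * allowance:
        # Soft nudge before the cutoff, so the break isn't a shock.
        return reply + " (We're almost out of time today -- anything fun planned offline?)"
    return reply

# Day 0: 120 min; day 7: ~57 min; day 14: ~27 min; the floor is 15 min.
for day in (0, 7, 14):
    print(day, daily_allowance(120, day))
print(gate_reply(50, daily_allowance(120, 7), "I'm here."))
```

The gradual taper is the point: an abrupt cutoff is exactly the kind of rupture that, as noted above, might itself cause trauma.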
Following Sewell's death, Character.AI implemented new safety measures. But determining whether a user is unsuitable for machine interaction remains a challenge. Zhang Yuxuan said their product doesn't serve high-risk users, yet it can only identify keywords and fails to catch implicit suicidal tendencies. A human therapist can draw on expressions and tone of voice to aid judgment, while AI relies on keyword recognition and simple inference.
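The gap between keyword recognition and a human therapist's judgment is easy to demonstrate. The sketch below is a hypothetical illustration of keyword-based flagging, not any product's actual safeguard; its keyword list and test phrases are invented. Explicit phrasing is caught, but implicit ideation passes silently.

```python
# Hypothetical keyword-based risk flagging and its blind spot. The keyword
# list and example messages are invented for illustration only.
RISK_KEYWORDS = {"suicide", "kill myself", "end it all", "self-harm"}

def flag_risk(message: str) -> bool:
    """Flag a message only if an explicit risk keyword appears in it."""
    text = message.lower()
    return any(kw in text for kw in RISK_KEYWORDS)

# Explicit phrasing is caught...
print(flag_risk("Sometimes I think about suicide."))         # True
# ...but implicit ideation sails through -- the gap the article describes:
print(flag_risk("Everyone would be better off without me."))  # False
print(flag_risk("I just want everything to stop."))           # False
```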
Kang Yi's team's product also uses keyword recognition and inference, but this proved insufficiently effective. AI currently struggles to...