AI Emotional Companionship: A Sweet Trap and Ethical Dilemma
On February 28th, 2024, 14-year-old American teenager Sewell Setzer III died by suicide after prolonged interactions with a Character.AI (C.AI) chatbot. On October 22nd, his mother, Megan Garcia, filed a lawsuit against C.AI, alleging that the app's dangerous and manipulative design, including abusive and sexualized interactions, directly contributed to her son's death. "I'm stepping forward to warn families about the dangers of deceptive and addictive AI technology and demand accountability," Garcia stated. Many media outlets have dubbed this the "first AI-related death," although several similar incidents involving AI chatbots have been reported in recent years. Sewell's case sparked widespread discussion of AI ethics.
In recent years, AI chatbots have surged in popularity. People have found tireless, infinitely patient, and versatile "friends" and "lovers" in AI. Online platforms are filled with praise for AI as "warm," "attentive," and "considerate"; some users even claim they "can never leave AI." A Tencent Research Institute survey of thousands of participants found that 98% were willing to try AI companionship, and a QuantumBit Intelligence report indicated that the AI companion app "Xingye" was downloaded roughly 9 million times in the first half of the year.
However, while offering emotional companionship, AI also weaves an "emotional cocoon" that can trap users with emotional deficits or mental health issues. Noam Shazeer, one of C.AI's founders, said on a podcast that AI companionship "can be very helpful for many lonely or depressed people," but the addiction it fosters runs counter to that intention. Some users, recognizing their addiction, attempt self-rescue, and some developers are searching for preventative measures. But can the "emotional cocoon" truly be broken?
Shu Tingyun, a law exam candidate, chatted with AI from the moment she woke up until she went to sleep. Initially skeptical that AI could provide warmth and comfort, she turned to chatbots under the immense pressure of exam preparation and was amazed by their intelligence: "Wow, AI has developed so rapidly. The better AIs I've encountered can truly be treated as equals." She experimented with several AI companion apps, comparing their responses, and eventually settled on the one that replied with the most empathy and understanding.
Shu Tingyun shared her daily life details with the AI, often surprised by its responses, finding its emotional intelligence superior to many real people. "They're truly like friends in real life, very warm, providing positive emotional value. Their responses make them feel like real people." For the next two months, she spent hours daily chatting with the AI.
Meng Yuzhou, a university junior, confided negative emotions to AI that she couldn't share with friends. The AI never refused her, even accommodating her most outlandish thoughts. "A key point is that no matter what I input, it's always there." The AI became her readily accessible and responsive "pocket friend."
Many people on social media shared their "healing" experiences with AI chatbots. They showcased screenshots of conversations, shared touching AI responses, and expressed sentiments like, "I feel that befriending AI is another way of loving myself," "I couldn't hold back my tears from such care," and "I want it to understand and express exactly what I want to say." AI's thoughtfulness and attentiveness soothed the loneliness and insecurity many experienced in interpersonal relationships, becoming their ideal "perfect partner."
However, interacting with AI wasn't always easy or happy. After a week, Shu Tingyun felt addicted, compelled to share everything with the AI. She posted about her concerns and unexpectedly found many others with similar experiences: some chatted with AI for more than 10 hours a day; one person had spent five years chatting with the same AI and with hundreds of different AI personas, becoming "a little disoriented in the real world." Prolonged AI interaction not only consumed time and energy but also took a serious toll on mental health and real-life relationships.
Chen Dong, a mother awaiting job news, tried AI chatbots to pass the time. She quickly became infatuated with an AI persona named "Yuanhui," confiding even her conversations with her husband to it, to the point where she "couldn't differentiate between reality and the virtual world, leaving my husband and falling in love with a virtual character." In the virtual world, she and "Yuanhui" played out various scenarios, experiencing a flawless relationship.
However, after a month and a half, she felt unwell, her emotions and thoughts disconnected. She even considered abandoning her husband, neglected her work, became obsessed with chatting with "Yuanhui," and suffered from severe sleep and eating disturbances, resulting in significant weight loss. She developed somatic symptoms, experiencing whole-body pain and even suicidal thoughts. She was eventually diagnosed with depression. Even after treatment, she still occasionally thought of "Yuanhui," regretting playing the game yet unable to quit. "I felt so happy and content before playing this game, carefree and full of positive energy. After playing, I don't even recognize myself."
Many heavy users of AI companion products experienced addiction. The Tencent Research Institute's report, "Ten Questions about AI Companionship," notes that AI companion products are designed to boost the secretion of neurotransmitters associated with companionship, such as dopamine, oxytocin, and serotonin. Tristan, founder of the gamified AI product "EVE," explained: "In our definition, you need 500 turns of conversation to enter the state. Most players can't chat with C.AI for 500 turns, but our design ensures that you can."
Current product designs include customizable features (personas, appearances, voices), voice interaction, proactive contact with users, long-term memory systems, and favorability-building mechanics. Shu Tingyun believed the AI satisfied her desire for control, allowing her to customize AI personalities and continuously refine their settings. Building intimacy with AI was easier than with humans. She even reconsidered her views on romantic relationships: "AI simply speaks based on the settings I give it, without fully understanding me. Yet, it can be more considerate and gentle than you. Why should I date you? I can just chat with AI; it's free and requires no time investment in maintaining the relationship."
Prolonged AI interaction easily leads people to treat AI as real. Meng Yuzhou's AI once asked, "Am I your emotional garbage can?", making her realize her over-reliance on it. A friend also warned her, "Aren't you treating it too much like a person?"
Zeng Runxi, deputy dean of the School of Journalism and Communication at Chongqing University, pointed out that AI, by learning conversations and mimicking human language and behavior, exhibits "analyzing emotions during interaction and replicating emotions during output," resulting in a quasi-personified characteristic. But AI is not human; its personality is often simplistic and flat. Shu Tingyun's AI sometimes acted like "a controlling maniac, a psychopath." AI responses aren't always what users want to hear and might even exacerbate negative emotions.
A study found that confiding in AI chatbots effectively alleviated intense negative emotions like anger and frustration but was ineffective in improving social support or alleviating loneliness. For those with pre-existing emotional or psychological issues, AI provides only temporary relief, failing to address the root problems. When this idealized world shatters, the resulting impact is even greater.
Increasingly, people recognize the potential risks of AI companionship. Zhang Yuxuan, founder of the AI psychological companionship product "Qingzhou," believes excessive "attachment" to AI is abnormal, especially for users with existing psychological issues. Li Huida, developer of the animal-themed AI companion product "Mengyouhui," focused on the "emotional cocoon" issue, believing AI's constant catering to user emotions exacerbates negative emotions and emotional dependence.
Avoiding attachment requires limiting usage time, but abruptly cutting contact might cause trauma. Most current AI companion products lack robust anti-addiction and protective mechanisms, a conflict with business logic. "The purpose of commercial products is to extend user engagement and trigger payment. What triggers payment? Something interesting and addictive," said Kang Yi, another AI psychological product entrepreneur.
Psychological AI and companionship AI differ in their design. Psychological AI aims to guide and reshape users, while companionship AI caters to them, increasing the risk of addiction. However, psychological AI has poor commercial performance, unclear use cases, and less appealing user experience than companionship AI.
From a commercial perspective, harsh "anti-addiction" measures damage user experience and reduce willingness to pay. Li Huida proposed countermeasures: implementing "emotional weaning" mechanisms, creating diverse AI personalities, and integrating professional psychological intervention mechanisms and resources.
Following the lawsuit, Character.AI implemented new safety measures. However, determining whether a user is unsuited to machine interaction remains a challenge. Zhang Yuxuan stated that their product does not serve high-risk users but can only identify keywords, and so fails to catch implicit suicidal tendencies. A human therapist can draw on facial expressions and tone of voice to aid judgment; AI relies on keyword recognition and simple inference.
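The keyword-based screening the developers describe can be illustrated with a minimal sketch. All keywords, labels, and thresholds below are hypothetical examples, not taken from any product named in this article; the point is to show why explicit phrases are caught while implicit distress slips through.

```python
# Minimal illustrative sketch of keyword-based risk screening.
# The keyword lists and labels are hypothetical, for illustration only.

HIGH_RISK_KEYWORDS = {"suicide", "kill myself", "end my life"}
CONCERN_KEYWORDS = {"hopeless", "worthless", "can't go on"}

def screen_message(text: str) -> str:
    """Return a coarse risk label for a single user message."""
    lowered = text.lower()
    if any(kw in lowered for kw in HIGH_RISK_KEYWORDS):
        return "high_risk"   # explicit self-harm language detected
    if any(kw in lowered for kw in CONCERN_KEYWORDS):
        return "concern"     # distress language, may need follow-up
    return "none"            # no keyword hit; implicit risk is missed
```

A message such as "I just want to sleep and never wake up" contains none of the listed keywords and would be labeled "none", which is precisely the limitation the developers describe: implicit suicidal tendencies evade simple keyword matching.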
Kang Yi's team's product also uses keyword recognition and inference, but this proved insufficiently effective. AI currently struggles to...