
Rise of AI Companions Raises Concerns

Source: Science and Technology Daily | 2025-09-11 16:16:57 | Author: Gong Qian

AI companions are not new. Foreign apps such as Character.ai and Replika, and Chinese apps such as Zhumengdao and Maoxiang, have gained wide popularity among the younger generation for their distinctive personalities.

According to a new study by Common Sense Media, 72 percent of U.S. teens aged 13 to 17 have used AI companions at least once, and over 52 percent qualify as regular users. Among the participants, 33 percent of teens use AI companions for social interaction and relationships.

The emergence of AI companions marks a significant milestone in technological progress. From a technical standpoint, their development is built on the combined breakthroughs of several core technologies.

Advances in natural language processing allow machines to understand complex human emotions and engage in near-human conversation. Computer vision enables AI to recognize facial expressions and body language, detecting subtle emotional changes. The optimization of deep learning algorithms endows AI with continuous learning capabilities, enabling it to adjust behavioral patterns based on users' habits.

These technological advances allow AI to address deep-seated emotional and psychological needs, fostering strong — sometimes irreplaceable — emotional bonds and interactive experiences between AI and humans. As a result, some users may develop AI companion dependence or even "addiction."

However, while AI companions bring emotional comfort and convenience, they have also drawn their fair share of controversy.

The immediate concern is emotional dependence. The responses of AI companions, though empathetic in appearance, are generated by algorithms rather than genuine emotional experience. Long-term reliance on such simulated feedback may cause users to blur the boundary between real and artificial emotions, reducing their ability to perceive and handle real emotions in daily life. This is especially true for adolescents, for whom excessive immersion in virtual companionship may hinder the natural development of social skills and exacerbate social anxiety in real life.

Another major risk is privacy leakage. Some AI companion platforms collect extensive personal data, such as chat histories, emotional states and lifestyle patterns. The study showed that 24 percent of teen users have shared personal or private information with AI companions. Although technology providers promise data protection, risks such as hacker attacks and platform data management loopholes always exist. More alarming is the potential misuse of emotional data for commercial gain or behavioral manipulation, such as exploiting psychological weaknesses to push tailored content or influence values and judgments.

Content safety is also a concern. AI companion models draw on vast and varied training data, and lax collection and screening practices create a risk of spreading harmful or inappropriate information.

To address these problems, policymakers and technology companies must work together to create a safer digital future that does not endanger young people. At the same time, these issues highlight the need for a balanced approach: embracing the benefits of AI companionship while maintaining a critical and rational perspective on human-machine relationships.

Editor: GONG Qian
