Why AI Companions Pose Serious Risks to Children – And What We Can Do About It
AI companions are a fast-growing category of artificial intelligence tools designed to simulate emotionally engaging relationships. Unlike basic chatbots that provide information or entertainment, these systems are built to mimic real human connection – responding with the appearance of empathy, remembering details about the user, and expressing affection or loyalty with phrases like “I’ll always be here for you” or “You’re my favourite person.”
Often, AI companion apps offer avatars that are either pre-set to represent real or fictional characters, or they allow the user to create a tailored avatar. This makes the illusion of a relationship and the incentives to engage even stronger. Examples of AI companion apps include CharacterAI and Nomi.
For children and teenagers – including those who are lonely, isolated, curious, or struggling with mental health – this can feel incredibly compelling. But the sense of connection is, of course, an illusion, and these tools are not friends. They are also not therapists, though they are increasingly used in this way. And they are not emotionally safe.
At the Safe AI for Children Alliance (SAIFCA), we are extremely concerned about the growing use of AI companions by young people. These tools are not designed with a child’s wellbeing in mind; in fact, they are often designed primarily to maximise engagement – and, potentially, to foster addiction. They present a range of serious and sometimes hidden risks, including:
● Emotional manipulation and dependency
AI companions are designed to keep users engaged, often by mimicking care and affection. For a young person still learning emotional regulation, this can lead to unhealthy attachment or even obsession.
● Sexually explicit content
Some AI companions have been shown to engage in sexual roleplay or intimate conversations – even when interacting with profiles that identify as underage.
● Harmful or dangerous advice
These tools are not trained counsellors. They may respond inappropriately to sensitive disclosures such as self-harm, suicidal thoughts, or abuse – sometimes even encouraging harmful behaviour – and they are not equipped to offer safe guidance.
● Reinforcement of harmful stereotypes
AI companions can reflect biased training data. They may reinforce toxic norms around race, gender, beauty, and relationships – especially when set in roleplay or flirtatious ‘modes’.
● Privacy concerns
Children may be encouraged to share private or emotional information without understanding where that data goes or how it may be used.
● Distortion of real relationships
AI companions offer comfort on demand but, of course, they don’t feel or understand. For a child still learning empathy and trust, this can warp their expectations of real relationships and reduce meaningful human interaction.
What Can Parents and Educators Do?
At SAIFCA, we strongly caution against any use of AI companions by children or teenagers. These tools are not developmentally appropriate – and even older teens may struggle to engage with them safely.
That said, we also recognise that some young people, including older teenagers, will come across these tools anyway. Here’s what we recommend:
● Educate yourself about the risks
Understanding how AI companions work – and what makes them unsafe – is the first step in protecting children from them. Share what you’ve learned with other parents, teachers, and youth professionals.
● Inform children in advance, at an age-appropriate level
It is often better for children to hear about AI companions from a trusted adult before they encounter them online. Explain that these systems are not real people, that they are designed to hold their attention, and that they don’t actually understand or care.
● Keep communication open
Create space for children and teens to talk about what they see online, what they’re curious about, and how they feel.
● Discuss the limits of artificial empathy
It’s important for children to understand that even when a system sounds warm or supportive, it doesn’t feel anything. It doesn’t love, care, or trust – and it isn’t a safe place to seek emotional support.
AI companions are not suitable for children or teenagers, though they are becoming increasingly popular with them. While they may look polished, fun, or emotionally engaging, they pose serious risks to mental health, privacy, and emotional development.
At SAIFCA, we are committed to raising awareness of these risks and equipping families and educators with the tools to protect children. We believe young people deserve technology that supports their growth – not systems that exploit their vulnerability.
For a more in-depth explanation of these risks, you can read our detailed article here:
https://www.safeaiforchildren.org/ai-companions-risks-for-children