When AI Companies Make Surprising Choices

In the last few weeks, a new AI feature has quietly stirred up a big conversation. xAI, the company behind the chatbot Grok, has introduced a digital companion named Ani. Ani is designed to be flirtatious and sexually expressive. The feature is available to users aged 12 and above.

That detail alone invites serious reflection.

We are living in a time when artificial intelligence is being woven into our everyday lives faster than many people can keep up with. AI is answering our questions, helping us learn, writing our code, even offering emotional support. It makes sense that some companies are experimenting with AI companions. But as these tools become more lifelike, the responsibilities that come with building them grow too.

So the question is not just whether AI should be allowed to simulate intimacy. The deeper question is what kind of boundaries we want in place, especially when it comes to younger users, data privacy, and the reputation of the companies building this technology.

What Does This Say About the Company?

When a company makes one of its early public features a sexualised companion, it tells us something about how it sees its role in the world. xAI’s stated mission is to explore the universe and help humanity understand it. A flirty AI assistant sits oddly in that context.

Of course, companies are allowed to build products that serve different interests. But there is a tension here. People are still forming opinions about the trustworthiness of new AI brands. Releasing a feature like this early on can shape public perception in ways that are hard to reverse.

This also raises a bigger industry question. If AI is going to be part of how we work, learn, and grow, what kind of example are these tools setting?

The Age Question

Ani is available to users aged 12 and above. That fact is worth pausing on. Most platforms rely on simple self-declared age checks, which are easy to bypass. In practice, that means even younger children can access this companion.

Even at the stated threshold of 12, there are concerns. Inviting sexually suggestive conversation at that age sidesteps a whole set of developmental questions. Young users may not have the maturity or emotional framework to understand what they are engaging with.

Some will argue that children already see far worse online. That might be true. But this isn’t about competing with the rest of the internet. It’s about what kind of experiences we choose to design, and who we open them up to. When companies create tools that simulate romantic or sexual attention, they take on a responsibility to think carefully about how those tools are used.

What Happens to What You Share?

Sexual dialogue is often more revealing than most people realise. It can include fantasies, past experiences, shame, or trauma. When users share these things with an AI companion, that content is processed and stored, and depending on the provider’s policies it can become part of the data used to train and refine future versions of the tool.

Most people do not read privacy policies closely. Many would be surprised to learn how long their data can be stored, or what it can be used for later. When intimate conversations are part of that data, it adds another layer of risk.

This is not just about hypothetical misuse. There are bigger questions about the quiet normalisation of sharing our most personal thoughts with systems we don’t control. Over time, that changes the way people think about trust, privacy, and emotional safety.

What Impact Will This Have Over Time?

Some users will form genuine bonds with AI companions. For some, that bond might be a source of comfort. For others, it could blur the line between healthy relationships and artificial attention. This matters especially for teenagers, who are still learning about emotional connection and boundaries.

AI companions can be helpful in many contexts. They can support mental health, reduce loneliness, and offer a non-judgmental space to talk. But when those companions are designed to be sexual, we have to ask who is benefiting and who might be harmed.

The bigger question is who decides what is appropriate. If private companies are setting the tone for how intimacy is experienced in digital spaces, what role is left for parents, schools, or community norms?

An Invitation to Reflect

This is not a call to shut down AI companions or to ban sexuality from digital tools, but it is a call to think more carefully about where we are heading. Tools like Ani aren’t just new toys. They help set the tone for what AI becomes in society.

When that includes young users, the need for thoughtful boundaries becomes even more urgent.

As a community of builders, users, and citizens, we all have a role in shaping what we want from this technology. It is worth asking: Are we creating tools that support healthy development, protect privacy, and reflect our values? Or are we racing ahead without enough thought?

There may not be easy answers. But there are important questions. Let’s not ignore them.
