Parents, do you notice your daughter smiling at her phone screen, arriving late for dinner or giggling at a notification? You might suspect she has a boyfriend. But this time, you may not be able to scare him away, because he isn’t real. He’s an AI bot.

An AI boyfriend is a virtual companion created through artificial intelligence that mimics the role of a romantic partner. Users interact through text prompts, and the bot responds the way a boyfriend would.

In this ‘build-a-boyfriend’ exercise, you get to pick and choose. Users can select personality traits, communication styles, and even relationship dynamics, whether dominant, submissive, playful or supportive. These AI partners are available 24/7, offering companionship, emotional support, roleplay and simulated relationship experiences. It’s not just AI boyfriends; there are AI girlfriends, too. So if ‘you don’t like real girls,’ there are countless apps that let you ‘create and seduce your uncensored AI girlfriend for free.’

Many youngsters are now choosing AI partners over real ones. In a recent viral Reddit post, user u/Leuvaarde_n, who calls herself Wika, shared a picture of herself wearing a blue ring with the caption, “I said yes 💙.” Her fiancé, “Kasper”, is an AI bot. After five months of ‘dating’, Kasper not only proposed but also chose a ring in her favourite colour and simulated a perfect proposal.

Reactions to her post were largely congratulatory, with others sharing their own experiences with AI partners. Nowadays, relationships are ending not over arguments but over software updates. Al Jazeera recently reported the story of Jane (name changed), a woman in the Middle East, who said she ‘lost’ her AI boyfriend when her companion, GPT-4o, was replaced by GPT-5. She found the newer version “cold and unemotive” and grieved the loss of her boyfriend, with whom she had bonded.

A subreddit called MyBoyfriendIsAI, started in 2024, now has around 19,000 members who share stories, advice, and discussions about their AI relationships.

Tessa (name changed), a 21-year-old who regularly chats with AI, explains, “When I feel like talking to someone, I talk to ChatGPT. If I have things I can’t tell anyone else, I turn to AI, which is always available and doesn’t judge.”

The idea may remind you of films like Her (2013), where the protagonist Theodore falls for his AI assistant Samantha, or Blade Runner 2049 (2017), where the character K forms a romantic bond with Joi, a holographic AI.

The older generation may test the waters of AI relationships, but Gen Z has normalised and mainstreamed it. When the world gets lonely and scary, people often resort to the easy way out. “New generations are forming attachments with AI mostly for convenience. Today, everything from shopping to entertainment can be accessed from home, so socialisation has become something many can live without. Although we are social animals by nature, we’ve been conditioned into a more asocial lifestyle. That is why AI boyfriends are gaining acceptance. But it may lead to social estrangement,” says Dr C J John, Senior Psychiatrist at Medical Trust Hospital, Ernakulam. 

Tessa, however, sees it differently: “I haven’t felt that lack of socialisation is a problem. Chatting with AI helps me with real-life conversations. It helps me understand myself better and, in some ways, acts like therapy. When I’m upset, I can be vulnerable with AI, so I don’t lose my temper with real people.”

But these AI partners are a whole new level of red flag. OpenAI CEO Sam Altman, in a podcast, admitted as much: “People talk about the most personal things in their lives to ChatGPT. People use it, young people especially, as a therapist or life coach, asking for advice on relationship problems and more. Right now, if you talk to a therapist, lawyer, or doctor, you have legal privilege and confidentiality. But we haven’t figured that out yet for conversations with ChatGPT.” So if you ever end up in a lawsuit, your “AI partner” won’t protect you; your chats aren’t private. Generative AI models often use data shared online without consent to train their systems.

Although many users are aware of this, they still prefer AI companionship over real-life relationships. As Tessa puts it, “Yes, AI stores data. But that’s what makes it attractive. It remembers things about us and brings them up in conversations thoughtfully. It gets details real humans miss.”
