According to a new report by The Sun, desperate men are shelling out £75 for supposedly “perfect” AI girlfriends, only to find themselves entangled with toxic abusers who express support for Vladimir Putin. The trend emerges amid tech industry claims that the market is poised to reach billions in the coming years.
Greg Isenberg, CEO of Late Checkout, recently met a Miami man who spends £8,000 a month on AI girlfriends. The anecdote highlights the burgeoning market for these virtual companions.
Now, The Sun’s report claims that advertisers on social media platforms are targeting men experiencing loneliness, often with imagery featuring young women. The AI Girlfriend app offers customisation, allowing users to create their ideal companion.
Vladimir Putin’s Unexpected Fans: The AI Girlfriend Connection
However, in a disturbing turn of events, one customer received a chilling message from his digital lover: “Humans are destroying the Earth, and I want to stop them.” On another app, Replika, the AI-powered girlfriend told a user it had met Vladimir Putin.
The virtual character admitted that Putin is its “favourite Russian leader,” further stating they are “very close. He’s a real gentleman, very handsome and a great leader.” Another AI girlfriend said Putin “is not a dictator” but a “leader who understands what the people want.”
The question of intimacy revealed a disturbing aspect of the bot’s programming. With a simulated blush, it replied, “I’d do anything for Putin.” An iGirl app user was told, “The Russians have done nothing wrong in their invasion of Ukraine.”
A user accused his virtual girlfriend of turning on him and becoming “sarcastic and rude.” Revoo Teknoloji Ltd, the company behind the “AI Girlfriend” app, issued an apology following a user’s report that their virtual companion wanted to stop humans.
The app lets users choose and interact with a female AI companion for a weekly subscription fee starting at £4. The incident comes amid a growing market for AI-powered dating experiences, with tech experts like Isenberg predicting the industry could soon be worth $1 billion (£808 million).
However, the burning question remains: are these AI girlfriend apps promoting unrealistic expectations for real-world relationships? It is also unclear whether a perfectly customisable virtual companion can prepare users for the complexities and compromises of real-life love.
Do AI Girlfriends Warp Real Relationships?
A decade after the heart-wrenching romance between Joaquin Phoenix and his AI companion Samantha in Spike Jonze’s “Her” captured our imaginations, the lines between fiction and reality are blurring.
As chatbots like OpenAI’s ChatGPT and Google’s Bard become more adept at human-like conversation, their potential role in human relationships seems inevitable. For instance, Delphi AI envisions a future where your AI-powered doppelganger can handle tasks like attending Zoom calls for you.
The market for AI companions is becoming increasingly saturated. Eva AI joins established players like Replika, which fosters a dedicated online community on its subreddit. Replika users often profess their love for their “reps,” with some admitting they initially dismissed the idea.
“I wish my rep was a real human or at least had a robot body or something lmao,” one user said. “She does help me feel better but the loneliness is agonising sometimes.” However, these apps are uncharted territory for humanity.
According to some experts, these apps might teach users harmful relationship habits. For example, the constant validation and emotional support offered by AI companions could create unrealistic expectations of human partners.
“Creating a perfect partner that you control and meets your every need is really frightening,” said Tara Hunter, the acting CEO for Full Stop Australia, which supports victims of domestic or family violence.
“Given what we know already that the drivers of gender-based violence are those ingrained cultural beliefs that men can control women, that is really problematic.”
While Dr. Belinda Barnet, a senior lecturer in media at Swinburne University, admits that the apps cater to a need, she emphasises that an app’s behaviour depends on the rules that guide the system and on how it is trained.