When I was a Little April, an AI girlfriend was a graphic novel concept, and it’d usually end in things going horribly, horribly wrong for the lonely fellow whose eye “she” caught.
The key problem portrayed back then, though, was "this unfortunate man has withdrawn from society," and you could count on the themes to center on isolation, outside pressures, bullying, and the like. The biggest issues came from a combination of how crappy and competitive it can be to gain satisfactory social status, and the protagonists' inability to cope with the injustices. After all, if one could raise a healthy, well-rounded family and live somewhere halfway decent on a barista salary, the savagery of clawing to the top would cool off for most of us.
In 2024, though, the issue with digital girlfriends is a somewhat different one. While intentionally hooking a lonely person's wallet with a digital script and draining it via in-app purchases isn't totally new either, the way it's being done with an AI girlfriend is.
Generative AI companionship is looser with its lips: instead of a pre-programmed response, these programs use the big brother of predictive text to string lonely hearts along. The Mozilla Foundation found the following about these lovebots:
“90% failed to meet our Minimum Security Standards
All but one app (90%) may share or sell your personal data
About half of the apps (54%) won’t let you delete your personal data”
Companies are cropping up so quickly to take advantage of the ease of implementing an uncoordinated AI Eros that there’s hardly any background on them or their creators. As such, any quality training on ethics, etiquette, or even erotica is highly unlikely.
Business Insider noted one striking result of this lack of training:
“The technology can also go awry: In 2021, when Replika user Jaswant Singh Chail told his rep that he planned to murder Queen Elizabeth II, the rep was characteristically encouraging, assuring him he was ‘wise’ and ‘very well trained.’ Chail was later arrested while attempting to break into Windsor Castle with a crossbow and sentenced to prison.”
Remember when the hot new thing popping up in your ads was some company with a cutesy name and a slightly different take on a ‘Match 3’ game? We’re at that point with these, only worse.
Then again, worse is relative. Pretty much every website for anything ever insists you allow cookies, and only lets you put off its nagging with a “Not Now” instead of a “No.” Reading or shopping, you’d better get ready for a ‘GIVE US YOUR EMAIL’ pop-up to flash in your face every time, no matter how interested you already were.
Can we really blame people who’ve already been trained to accept that their data is not their own online for being cavalier about their new cyber sweethearts? Double that sentiment for younger users who have never known the joys of a Web 2.0 that wasn’t trying to cram endless sign-up prompts down their throats no matter where they clicked, AI girlfriend included.
Being lonesome, just like being hungry, is never going to go away so long as nothing is meeting the need. Ultimately, the best thing we can do is encourage AI minglers to act the same as they would around any new fling in real life and use protection.




