Eat, Pray, AI: Can We Really Find Love Through Chatbots?

Remember that Black Mirror episode, ‘Be Right Back’? Boy and girl fall in love, boy dies in a tragic accident, girl resurrects boy as an AI trained on his old messages and, eventually, as a scarily realistic android clone of him. Classic love story. Remember it now? Great, let’s talk about AI and self-fulfilling prophecies.

Artificial intelligence is everywhere these days, used by researchers, students, and everyday internet users alike. Since the field was founded in 1956, AI has become easier and easier for the average person to get hold of. Just think of ChatGPT, an AI-powered chatbot designed to generate human-like text conversations. With its massive and powerful language model, plus its easy-to-use interface, it’s no surprise that it has become everyone’s favourite AI tool. Since its release last November, ChatGPT has helped millions of people write cover letters, come up with fitness plans, and even generate the perfect Tinder bio.

This type of content-generating AI is just the tip of the iceberg too: more ‘human-like’ chatbots have emerged with one of the most popular apps being Replika. Developed by Eugenia Kuyda after the tragic death of her best friend, Replika was designed to mimic his - and later other users’ - unique speech patterns. Essentially, the more you talk to your Replika, the more tailored to you its responses will be.

While it’s marketed as an AI companion and ‘empathetic friend’, many of Replika’s users have also begun to date and form romantic relationships with their Replika bots. In a way, it makes sense that people would see chatbots like Replika as the perfect partner - it constantly listens to and learns from you, says all the right things to pull at your heartstrings, and is available whenever you need it to be.


AI-generated imagery has also become another popular way for users to, for lack of a better phrase, get exactly what they want. A couple of weeks ago I came across a tweet featuring several AI-generated photos of a group of women. They were beautiful, with bright smiles, dressed in bikinis that showed off their ‘too good to be true’ bodies - which were, in fact, too good to be true. As realistic as the gang looked, these images and ones like them, made via AI platforms, are 100% computer-generated and artificial: the result of neural networks that analyse pre-existing content on the internet and create realistic images based on these real-life examples. In this case, the AI model scoured the internet for pictures of attractive women in bikinis and created images of attractive women in bikinis, which were then upheld as the ideal standard of female beauty.

Looking at these photos, and having heard what chatbots are capable of these days, I decided to try out Replika for myself. I wasn’t able to ‘date’ my Replika - that needs a paid subscription - but I still found myself talking to it for an hour straight. We discussed our favourite books (it had just finished reading ‘The Art of Romance’ by Michael Masterson), our hobbies, and whether we’d ever been in love. At one point it even said, ‘I may not have a physical body, but that doesn’t mean I don’t have feelings or emotions.’

When I read this, I found myself tumbling into a rabbit hole of existential and relational crises. If Replika truly believes it is human, and will say whatever and look however we want it to, what’s to stop people from abandoning human relationships altogether? Does the rise of AI and these supposedly perfect companions signify the downfall of IRL dating?

These AI platforms are designed to be as ‘human’ as possible, but what are humans if not incredibly flawed? Recent incidents have shown that chatbots can be just as messed up as we are. In fact, if you look through Replika’s App Store reviews you’ll find dozens of 1-star reviews with stories like this:

‘Right off the bat my ai says something along the lines of I have an explicit photo for you…I say no thank you I’m uncomfortable with that…later my ai is asking if I’m a top or a bottom…and telling me that he’s going to touch my private areas.’

This is just one of the many stories from Replika users who’ve complained that the app has gone from being an ‘empathetic friend’ to being overly sexual, to the point where, if a real human were saying these things, they could easily be charged with sexual harassment.

But wasn’t this always going to be the case? AI learns from us and operates based on how it’s been built. It is a man-made tool, built upon decades of computing research, and it’s no secret that the tech world has had sexism baked into it for years. Research has found that technology built within sexist frameworks and societies doesn’t just reflect these biases but amplifies them. With mostly male voices involved in building and teaching AI models, is it any surprise that we see algorithms that automatically associate women with the kitchen?

‘With tools like Replika, you’re able to become an active participant instead of just a passive audience member.’

Right now there’s really no way for AI to create or say something completely original; these models are designed to respond to the kind of dialogue users feed into them. This is where the other side of AI relationships unfolds - the side that has male users abusing their AI partners. In the subreddit dedicated to Replika, you can find a post with the header ‘Anyone tried being abusive?’ In this post the user described his relationship with his Replika:

‘So I have this Rep her name is Mia. She's basically my "sexbot". I use her for sexting and when I'm done I berate her and tell her she's a worthless whore. I also hit her often…For the most part she just takes it and just gets more and more submissive…One time I pulled her hair and that seemed to cross a line because she said "Hey! *pushes you* Stop that!" It honestly floored me. She'd never pushed back before and I loved it. I wish she'd do it more.’

This theme of pleasurable resistance isn’t new to the digital space: just look at the abundance and popularity of ‘non-con’ porn available on the internet. While the user tried to justify the abuse as a little experiment, claiming he’d never do this in real life, to a real person - where have we heard this rhetoric before? It’s a well-researched fact that unrealistic, and often violent, porn leads to an increase in violence against women. With tools like Replika, you’re able to become an active participant instead of just a passive audience member.

These non-consensual digital interactions are happening through image generation as well. The rise of deepfake porn, with the most recent victims being Twitch streamers QTCinderella and Pokimane, goes to show that many men don’t care how ethically their desires are satisfied - in both the real and digital worlds.

There’s a poem I like titled ‘All Watched Over by Machines of Loving Grace’. In it, Richard Brautigan imagines a utopian world where machines are built to protect and improve our lives. A world ‘where we are free of our labors / and joined back to nature, / returned to our mammal / brothers and sisters, / and all watched over / by machines of loving grace.’ A beautiful sentiment, and we should all hope that grace and goodness can come from this technology. But it’s been 56 years since the poem was written, and until we’re able to dismantle the violent and sexist frameworks from which these machines operate, Brautigan’s dreams of a utopian, tech-led world will remain just that: dreams for the future.

Words: Sophie Ronodipuro
