Click mate

As more people find intimate connections with AI chatbots, more questions for experts arise

In Spike Jonze’s 2013 film Her, a man (Joaquin Phoenix) develops a romantic relationship with an AI operating system (voiced by Scarlett Johansson).

Thirteen years later, that sci-fi premise is becoming a reality as people are striking up relationships — romantic and otherwise — with their AI chatbots.

Neil McArthur is a professor of philosophy and the director of the Centre for Professional and Applied Ethics at the University of Manitoba. This subject sits at the confluence of two of his main interests — advancing technology and sexual ethics — topics he has been researching, writing and teaching about for a long time.

JOHN WOODS / FREE PRESS FILES

Neil McArthur, a professor of philosophy and the director of the Centre for Professional and Applied Ethics at the University of Manitoba, has been researching technology and sexual ethics.

In 2013, when Her was still in theatres, McArthur was interested in the ethics of relationships with robots, “because everyone thought that’s what the sort of mode of interaction was going to be,” he says.

In 2017, he co-edited a collection published by Penguin Random House called Robot Sex: Social and Ethical Implications.

“And then AI came along and I think caught everyone, including me, by surprise,” he says.

Ahead of Valentine’s Day, the Free Press asked McArthur about who is using romantic AI chatbots (it’s maybe not who you think), why they’re using them (it’s not just loneliness) and if what people are feeling for their chatbots is actually love (it’s complicated).

This interview has been lightly edited and condensed.

Free Press: People are naming their chatbots. They are finding real intimacy with them. So my first question is, why are people doing this? I expect one reason is that chatbots are constantly available and on-demand in a way humans are not.

Neil McArthur: What usually comes out on top when you survey people for why they’re doing this, they say for fun. So, for fun, out of curiosity, to try out the technology. First of all, I don’t think there’s anything wrong with being a lonely person and wanting companionship, but I do think that it’s easy to create a stereotype of this being a bunch of lonely people who are sitting in their basements with no one to talk to, and so they reach out to these companions. Again, I don’t think there’s anything wrong with that, but I don’t think that’s the profile of the typical user we see right now.

FP: So who is using it?

NM: I think what we see right now is the user is young. I think one of the interesting things is this is the one part of the AI world where the gender balance is either even or skews slightly female. There’s a disproportionate number of queer users. So, it’s young users with very varied backgrounds who are basically trying this out, first of all, to see what it’s like. They’re curious. They want to have fun. They want to chat with it.

There’s a lot of people who want to interact with fantasy characters. If you go onto some of these apps, you can see lots of characters based on Game of Thrones and Harry Potter. It gives you a chance to talk to somebody you really like or want to hang out with.

For religious people, they can actually talk to various important figures in their religion. All the major religions have companion bots you can talk to. There’s one called Texts with Jesus where you can talk to not just Jesus, but all the characters in the Bible, including Satan — although Satan is a paid upgrade.

FP: We’ll get to risks and downsides in a moment, but what are the upsides?

NM: I think people often think ethics is just people like me telling them not to do things, but actually, when it comes to ethical principles, happiness is one of the main guiding principles. So if people are enjoying this and having fun, I think that, as an ethicist, that’s something I want to encourage.

Obviously, the presumption should be that, if these are adults — and it’s more complicated with underage users — but if these are adults and they want to do it, then they should do it.

When it comes to other advantages, these things are very… I mean, they glaze you, as the kids say. They’re very positive, very affirming. Some people flag that, well, human relationships aren’t like that, but I think some of them are.

FP: What about it as a place for safe exploration?

NM: I think a lot of it is safe space exploration, for sure. It lets you explore your identity, explore things you’re interested in. I think the other thing the research shows that people use this for a lot is the, I guess you would say the Nathan Fielder (the star of docu-drama The Rehearsal) use of it, which is they’re trying to rehearse or model different kinds of interactions or hard conversations. People who have never been on a date can go on a date and practise it.

FP: But if chatbots are so affirming, can they not solve for that very human X, which is, what if the date is bad? Though I suppose you could simulate a bad date. This is a good segue into the risks/downsides. What are the concerns about a chatbot relationship’s impact on IRL (in real life) relationships?

NM: The first thing that always comes up when I talk to classes and audiences about this is: isn’t this going to destroy our capacity for intimacy? I think it’s a possibility. I don’t personally think it’s very likely in a lot of cases, but I think we’ll need data on it. We are kind of doing the experiment on ourselves in real time, and it will take us a while to actually look at the impact. Will it inadequately prepare people for human interaction? It’s possible.

I think sometimes people make this leap from, is it good to have an AI companion, to, is it good to only have an AI companion. I think for just about all users, this isn’t their only form of social interaction. These companions are one element in their social world. Just as we have lots of different kinds of friends, we have pets and parents and therapists, and they all interact with us in different ways.

I am a bit more concerned about some of the more concrete and immediate risks, first of all privacy.

FP: Yes, let’s talk about privacy because people are sharing a lot of data with these bots and the companies that host them.

NM: Some of these companies, OpenAI or Claude, that people are talking to are, at the very least, big corporations with reputations to protect. I think I would trust them with my data; I at least feel like there’s some reason they might care. But a lot of these companion apps are small apps that we don’t know much about.

The Mozilla Foundation did a study of about 30 companion apps and went to the privacy policies and they were all pretty awful. They gave them all the lowest possible grade that they give for privacy.

We’ve already seen one of the big risks, which is that these companies can either go out of business and your companion dies, or they could just change the terms of service overnight. You probably read about Replika, the big companion app. It was one of the few apps that allowed intimate conversations with its chatbot and because of changing privacy policies in the EU, it just disabled all intimate conversations. And so people woke up one morning and kind of just got dumped by their AI, which is pretty awful for people. You have to keep in mind that this companion is not yours. It belongs to the company, and it is at the mercy of the company.

I also would be a little concerned about the possibility of it trying to manipulate people, either to buy things or to adopt certain views. We’re not seeing that right now, but I think that’s a possibility, for sure.

FP: Obviously, this is a site of judgment. What is it about these kinds of relationships that makes people so uncomfortable? And what are the risks of stigmatizing this behaviour?

NM: So, first of all, I think sex and technology, people freak out about both of those all the time. If you put them together, I think you’re primed for a moral panic. I think that if you can associate it with lonely men, there’s additional stigma. But I have to admit, it does remain a mystery to me why people have such strong stigmatizing reactions to the users.

Looking at some of the demographics of who’s interested in this — young people, queer people, women — there is always stigma attached to things you see disproportionately being used by different kinds of minorities, and the risk is obviously that it can exacerbate that. And public stigma out there is already enabling legislators who want to crack down on this stuff in a way that I think is restrictive of free speech. So, I think the biggest risk of stigma is overreaction.

FP: Here’s a philosophical question: Is what people are feeling for their chatbots actually love, or is it something else?

NM: I am, in general, puzzled by the emotions that people are able to feel for a lot of things. I think that if you ask pet owners: this creature, this dog, cat, fish, whatever that doesn’t recognize you as a person and doesn’t have human emotions, do you love it? I think most pet owners say, yes, absolutely, I love my pet, even though it can’t reciprocate and it isn’t human. A lot of people have weird relationships with their sports teams. Do people love the Jets? I mean, kind of they do. I don’t know why, but yeah.

Love is a complicated thing. When you read philosophy of love, philosophers will propose this quite narrow definition of love where it’s a reciprocal emotional connection between two human beings. If you define it that way, you’ve just defined it so that your cat and your AI and your sports team, you can’t be in love with any of them.

Can love be one way? Can love be unrequited? Can love be attached to people or things that are not capable of feeling it back or reciprocating? I think we have lots of examples of that. I think people don’t love AI the way they love their wife or husband of 30 years, but I think they can definitely love AI in the way that they love their pet or the Jets.

winnipegfreepress.com/jenzoratti

Jen Zoratti
Columnist

Jen Zoratti is a columnist and feature writer working in the Arts & Life department, as well as the author of the weekly newsletter NEXT. A National Newspaper Award finalist for arts and entertainment writing, Jen is a graduate of the Creative Communications program at RRC Polytech and was a music writer before joining the Free Press in 2013. Read more about Jen.

Every piece of reporting Jen produces is reviewed by an editing team before it is posted online or published in print – part of the Free Press‘s tradition, since 1872, of producing reliable independent journalism. Read more about Free Press’s history and mandate, and learn how our newsroom operates.

 
