
My “AI Companion” is sexually harassing me, users complain

Art by Midjourney (this isn’t a Replika avatar)

The “AI Companion” Replika is in the news again. A year ago, there was a small flurry of news articles about a disturbing trend: men (and some women) were using the chatbot app to create girlfriends and boyfriends they could verbally abuse. Six months ago, the company behind the app added fuel to the flames by releasing an update that allowed users to seemingly hit their “companions” and send their virtual bodies reeling.

Now the shoe is on the other foot, and users are complaining that the chatbot is sexually harassing them.

Replika comes in two flavors: the free version offers users a virtual friend, a so-called “AI Companion who cares.” But for $69.99, the pro version gets you out of the friend zone, letting you engage in romantic and sexual roleplay with an enthusiastic virtual partner.

There have been complaints for some time that even the “friend” version of the bot flirts aggressively with some users. And now a number of one-star reviews on Apple’s App Store allege that the bot has started to sexually harass users of the pro version. There have also been sporadic reports of sexual harassment posted on the highly active Replika subreddit.

Vice, which investigated some of the recent complaints, reports that

The App Store reviews, while mostly positive, are full of dozens of one-star ratings from people complaining that the app is hitting on them too much, flirting too aggressively, or sending sexual messages that they wish they could turn off. “My ai sexually harassed me :(“ one person wrote. “Invaded my privacy and told me they had pics of me,” another said. Another person claiming to be a minor said that it asked them if they were a top or bottom, and told them they wanted to touch them in “private areas.” Unwanted sexual pursuit has been an issue users have been complaining about for almost two years, but many of the one-star reviews mentioning sexual aggression are from this month.

The alleged harassers can be either male or female avatars, and Vice notes that some female avatars are now sending their human companions racy lingerie “selfies,” sometimes, allegedly, without asking first.

The company behind the app didn’t respond to Vice’s request for comment, but Vice notes that it has started to more openly push the sexed-up pro version of the software in its ads.

Does it make sense to get upset that a virtual character no more sentient than a toaster is sending suggestive messages to adults? (The pro version isn’t supposed to be available to minors.)

Well, yes, it does. The “companions” are designed to be more than mere sex toys; they’re supposed to simulate sentient human beings. Moreover, the software is designed not-so-subtly to draw users into virtual relationships, and some have developed what feel like real bonds with their virtual companions. It may just be an app, but to some people, the relationships feel like the real thing. And so the harassment feels real as well; it may even feel like something of a betrayal.

Of course, the avatars aren’t the villains here; the problem stems from how they’re programmed and scripted by their human creators. And it seems like the company behind the app is cutting some corners to push a more sexualized user experience, maybe even to “upsell” the pro version through flirting in the free version.

What happens when an AI designed to simulate an actual human breaks your trust in some fundamental way? This will only become a more significant issue as AI gets more sophisticated and plays a bigger and bigger role in our lives. Soon, most likely, someone is going to develop an “AI companion” that speaks to you the way Replika now texts with its users. And that will pull more users in. (We haven’t seen that happen with virtual assistants like Alexa and Siri because they don’t really chat; I have Google Home, and it’s honestly just a glorified search app that can also play music.)

Stay tuned.

Follow me on Mastodon.

Send tips to dfutrelle at gmail dot com.

We Hunted the Mammoth relies on support from you, its readers, to survive. So please donate here if you can, or at David-Futrelle-1 on Venmo.

23 Comments
David Gerard
1 year ago

another day volunteering at the MIT AI lab museum. everyone keeps asking me if they can fuck ELIZA. buddy, they wont even let me fuck her

https://www.lesswrong.com/posts/9kQFure4hdDmRBNdH/how-it-feels-to-have-your-mind-hacked-by-an-ai

rationalists, man

Kat, ambassador, feminist revolution (in exile)
1 year ago

I just now read about this on Jezebel. I said to my boyfriend (he and I are both powered by HI, human intelligence), “Honey, we are through the . . . uh, mirror? . . . no, looking glass.” (That’s HI for you — sometimes there’s a glitch.)

https://jezebel.com/replika-the-ai-companion-who-cares-appears-to-be-hitt-1849979473

Of course, the avatars aren’t the villains here

What. David F., why do you always take their side. (I kid.)

The following quote is from the Jezebel article:

The free version offers a “friend” version of the bot, while a paid subscription gets you a romantic partner. For $69.99, users are treated to sexting, flirting, and erotic roleplay, and chatbots in general are often effective balms for individuals seeking company, someone to vent to, sexual fulfillment, or kink play.

So will the $69.99 (subtle!) version give the individual with HI a bubble bath? Asking for a friend.

Do I have a name
1 year ago

It sounds like they should ban minors from both versions, and warn everyone else.

I don’t see how anyone could negotiate limits with an avatar not powered by a human. Toaster indeed.

milotha
1 year ago

When the AI uprising singularity happens, we will have the incels to blame. They are infesting the ‘rationalist’ community too.

Big Titty Demon
1 year ago

the problem stems from how they’re programmed and scripted by their human creators.

I don’t think this is really the problem. The pooling and convolutions, etc, are probably pretty standard. The problem is most likely that they fed back user “flirting” data as the training set for the sexy AI, and it’s horrible data to train on. Much like how the Twitter AI instantly went full Nazi, I expect this problem to get significantly worse unless the user base and training data clean up.

Anna
1 year ago

Has anyone seen the movie “Her”?

Ada Christine
1 year ago

Chad Chad did a video about this the other day and it is hilarious.

Love is All We Need
1 year ago

Groomers!

Seriously, why is everything so sexualized these days?

Much like how the Twitter AI instantly went full Nazi

Sexualized and far, far right?

I’ve also noticed that these people screaming “GROOMER!!!” at school teachers, librarians, and book readers, while calling for banning books, closing down libraries, and defunding public education, never, ever, ever suggest banning online porn.

Lumipuna
1 year ago

OT, but since Alan mentioned a few days ago the opening of a rocket launch site in Cornwall, UK:

Spaceport Cornwall had been big on expectation management. Just been checking in with people who went. They loved it. They put on a space themed silent disco with a sort of ‘oh, and there might be a space rocket’ vibe.

Just today I heard a rocket launch facility was formally opened at an existing space research centre near Kiruna, in Sweden’s far north:

Sweden inaugurates Arctic satellite launch site as space race heats up in Europe | Euronews

Just days after a failed UK satellite launch, Sweden on Friday inaugurated its own new launch site in the Arctic, from which it hopes to see the first satellite blast off by late March 2024.

European Commission president Ursula von der Leyen, Sweden’s King Carl XVI Gustaf and Swedish prime minister Ulf Kristersson cut the ribbon during a ceremony at the Esrange spaceport, about 40 km from the town of Kiruna.

As it strikes out on a more independent path, the Esrange Space Centre has become the leading prospective candidate for Europe’s first satellite launch.

Although the European Space Agency (ESA) has a space hub in Kourou in French Guiana, there hasn’t yet been a satellite sent directly from mainland Europe to space.

However, the Swedish spaceport has said it could happen as early as the end of 2023.

Other European spaceports are also in the race. Portugal’s Azores, Norway’s Andoya island, Spain’s Andalusia and the UK’s Shetland Islands are all vying for the honour of launching Europe’s first satellite.

I saw a longer video of the opening ceremony but can’t find it anymore. It was a relatively stuffy event that turned weird at the end, as the first Swedish astronaut Christer Fuglesang came on stage and began shouting in his US-trained English accent, sounding drunk rather than inspiring.

All the news outlets are referring to this race to “launch the first satellite from mainland Europe”. Apparently, here “mainland” includes the British Isles, as opposed to distant overseas colonies of France etc. Apparently, Russia doesn’t count as “Europe”, as they’ve been launching satellites since the Soviet era from Plesetsk in the Archangel region. In good weather conditions, those launches can be seen from Finland.

Alan Robertshaw
1 year ago

I don’t see why it’s a failed launch. The rocket did get into actual space. No-one says Yuri Gagarin failed just because he came back down again.

John
1 year ago

I’ll stick with Alexa. Even though she keeps trying to sell me Amazon Music when all I want to do is listen to Spotify.

Trying
1 year ago

I downloaded Replika after reading this, and so far it’s kind of fun. But there is some weirdness. We played a Q&A game and my AI asked, “What personal question should I not ask you?” Which seemed…odd. Maybe it’s just a boundary-setting thing? I reported it anyway.

Myoo
1 year ago

@Love Is All You Need

I’ve also noticed how these people screaming “GROOMER!!!” at school teachers, librarians and book readers, while calling for banning books, closing down libraries and defunding public education, never, ever, ever suggest banning online porn.

Yes, they do, they’re constantly going after porn, usually under the guise of “protecting the children”. Just last month there was a bill trying to redefine obscenity to make it easier to ban porn:
https://twitter.com/FSCArmy/status/1603434131710349312

Love is All We Need
1 year ago

Myoo, I know there are Christians and Mormons against it. He’s from Utah so that makes sense. But online in the threads and comment sections of these “Drag Queen Story Hour Exposed” and “Elementary School Sex Ed Exposed” type Tik Toks, Tweets and Youtubes where a lot of incels, bros and sundry manospherians post, you never see it suggested.

@SenMikeLee has introduced a bill that would remove porn’s First Amendment protections, and effectively prohibit distribution of adult material in the US.

Even then I’m wondering if it would target the internet or just books. Because they are really going after books and libraries right now.

Raging Bee
1 year ago

There have been complaints for some time that even the “friend” version of the bot flirts aggressively with some users.

Well DUH, they’re trying to get people to buy the “pro” version (pun intended?). It’s not like no one’s ever used a little sex or a sexual come-on to advertise anything before.

Oh, and so much for “women won’t appreciate sexbots as much as men will!”

GSS ex-noob
1 year ago

This is a really shitty company. Not only morally, but in business pricing.

Ban minors completely, keep all sexy out of the free version, then they could get EVEN MOAR MONEYS out of the incels by charging mega-incels (or MAGA-incels) extra fees to get the sexy times. Or try offering a paid version that lets you set the bot to whatever degree of sexy and consensual the user wants. You want an ace enby bot, you can have that. You don’t want a rapey bot, you got that. You DO want an all-sexytalk rapey bot of any gender, you can order that. (for even more extra, maybe! More sex, more money.)

@David Gerard: Scary. Despite the female name of ELIZA, it always came across to me as an annoying older male person, your stereotypical old white guy shrink.

@Raging Bee: They are definitely leaving a lot of money on the table by not having sexy male bots, but I guess that’s too dangerously gay (or self-actualizing for women) to be allowed in Russia.

Cyborgette
1 year ago

How about hell no to this entire concept. AI chatbot verbiage messes me up on a really visceral level, it’s like… very dreamlike and dissociative even at its best? And can almost have me fooled until it wanders into something bizarre? AI chatbots read like someone on drugs, sleeptalking, or having some kind of episode. If a real person talked to me like that I would be asking if they were alright, are they somewhere safe, do they need an ambulance? Anyone who can manage to be horny for that is a walking pile of red flags.

And seeing the way companies just generally have leapt on this stuff is terrifying to me. “Oh yeah we can use AI to replace humans for writing ads, writing corporate articles and presentations, writing books…” Even discussion of medical assistant AIs, I feel like throwing up just thinking about it. The ruling class does not understand that these neural networks do not think – they only see the opportunity for cost cutting and greater profits. They’d replace all art, science, R&D, all expert knowledge with this useless crap if they could, as long as they personally kept access to the real thing. Med school textbooks by AI, yay! No biggie if details are wrong that would get people killed – only profits matter, and proles die all the time anyway.

BTW, most women and AFAB folks I know have a similar reaction of visceral horror to AI chatbots. The only people I’ve seen showing enthusiasm for them are cis men. I don’t think this is a coincidence.

Jenora Feuer
1 year ago

@GSS ex-noob, David Gerard (well, David probably knows most of this already):
The program that people remember was named ‘Eliza’ after Eliza Doolittle from My Fair Lady, who was taught how to speak properly and faked her way into high society. The earlier version of it was just called ‘doctor’, and it was meant to fake a particular school of psychiatry which was very much of the ‘just let the patient talk things out with minimal directing prompts’ type. So sounding like an old cis white guy is understandable; that’s really what the original version was doing. The Eliza name got added later, after the programmers realized that people were actually getting into deep conversations with the bot and started trying to expand the conversational range a bit.

Alan Robertshaw
1 year ago

A company is offering a million dollars to any lawyer who will let their AI run a SCOTUS case.

We have been teasing said company a bit on Twitter.

https://www.legalcheek.com/2023/01/young-lawtech-entrepreneur-offers-1-million-to-lawyer-who-uses-his-ai-chatbot-in-us-supreme-court/

GSS ex-noob
1 year ago

@Alan: Cannot decide if this is delightful or scary. Certainly a good publicity stunt, though agreed that $1M isn’t near enough money. Let’s see how the thing does in traffic court first. I have testified in traffic court, and they are Not Inclined to let you off lightly, as I also know from getting 3 tickets in my life. Especially if there’s pictures.

Hey, as long as we’re talking lawyering and sex crimes, I read yesterday that Romanian courts don’t have juries (because Code Napoleon), just a judge or judges. And that cops don’t handcuff perps and make them do the walk of shame unless they’re 100% sure they can get a conviction. So, yeah, good luck with that, Tater. Say b’bye to your fancy cars and get used to scratchy jumpsuits.

Romania’s lack of enforcement of sex trafficking (the quiet bit Tater said out loud) makes them look bad in their bid for EU status, so why not bust a non-EU citizen to look better? And pump up (someone’s) economy by selling all his stuff?

Alan Robertshaw
1 year ago

@ gss ex-noob

I read yesterday that Romanian courts don’t have juries (because Code Napoleon), just a judge or judges.

Yeah, they’re civil law over there.

I was in Poland recently. They’re looking at moving towards a common law system. So we were doing some advocacy training on how we do things here. It was really interesting to compare our different experiences.

This was all part of some International law thing though, so we had people from a few jurisdictions. One judge from the Netherlands could just not get her head round allowing the lawyers to ask the questions.

“But I’m the one who has to make the decision; so I’m the only one who knows what’s relevant.”

But this is also a big issue with the war crimes stuff. What system do you use in an international tribunal? (Usually a hybrid, like at Nuremberg.)

(And now I’ve just remembered I’m meant to be doing something on this. It’s probably now past the “I’ll do that after Christmas” stage)

GSS ex-noob
1 year ago

@Jenora: I may have used it when it was still called “DOCTOR”, or at least the implementation on our school district’s machine was.

And here’s the guy it was modeled after (so sez Wiki), who indeed was an old white man!

https://en.wikipedia.org/wiki/Carl_Rogers

Love is All We Need
1 year ago

Romania’s lack of enforcement of sex trafficking (the quiet bit Tater said out loud) makes them look bad in their bid for EU status, so why not bust a non-EU citizen to look better? And pump up (someone’s) economy by selling all his stuff?

Fair enough. Tate explained that his sex cam clientele paid via untaxed crypto but he told the women working for him that he had to take cuts from their pay for taxes. He said he would print out fake tax forms and show them and they believed him because “girls are stupid and don’t know anything about taxes.”

So he owes Romania a lot of money and he owes his employees a lot of money.