
AI companions: A threat to love, or its evolution?

point, that people are saying these are real relationships,” he said. “On the other hand, it goes to my point, that they’re threats to our relationships. And the human animal doesn’t tolerate threats to their relationships in the long haul.”

How can you love something you can’t trust?

Garcia says trust is the most important part of any human relationship, and people don’t trust AI.

“According to a recent poll, a third of Americans think that AI will destroy humanity,” Garcia said, noting that a recent YouGov poll found that 65% of Americans have little trust in AI to make ethical decisions.

“A little bit of risk can be exciting for a short-term relationship, a one-night stand, but you generally don’t want to wake up next to someone who you think might kill you or destroy society,” Garcia said. “We cannot thrive with a person or an organism or a bot that we don’t trust.”

Ha countered that people do tend to trust their AI companions in ways similar to human relationships.

“They are trusting it with their lives and most intimate stories and emotions that they are having,” Ha said. “I think on a practical level, AI will not save you right now when there is a fire, but I do think people are trusting AI in the same way.”

Physical touch and sexuality

AI companions can be a great way for people to play out their most intimate, vulnerable sexual fantasies, Ha said, noting that people can use sex toys or robots to see some of those fantasies through. 

But it’s no substitute for human touch, which Garcia says we are biologically programmed to need and want. He noted that, due to the isolated, digital era we’re in, many people have been feeling “touch starvation” — a condition that happens when you don’t get as much physical touch as you need, which can cause stress, anxiety, and depression. This is because engaging in pleasant touch, like a hug, makes your brain release oxytocin, a feel-good hormone.

Ha said she has been testing human touch between couples in virtual reality using other tools, potentially including haptic suits.

“The potential of touch in VR and also connected with AI is huge,” Ha said. “The tactile technologies that are being developed are actually booming.”

The dark side of fantasy

Intimate partner violence is a problem around the globe, and much of AI is trained on that violence. Both Ha and Garcia agreed that AI could be problematic in, for example, amplifying aggressive behaviors — especially if that’s a fantasy that someone is playing out with their AI.

That concern is not unfounded. Multiple studies have shown that men who watch more pornography, which can include violent and aggressive sex, are more likely to be sexually aggressive with real-life partners. 

“Work by one of my Kinsey Institute colleagues, Ellen Kaufman, has looked at this exact issue of consent language and how people can train their chatbots to amplify non-consensual language,” Garcia said.

He noted that people use AI companions to experiment with the good and bad, but the threat is that you can end up training people on how to be aggressive, non-consensual partners.

“We have enough of that in society,” he said. 

Ha thinks these risks can be mitigated with thoughtful regulation, transparent algorithms, and ethical design. 

Of course, she made that comment before the White House released its AI Action Plan, which says nothing about transparency — which many frontier AI companies are against — or ethics. The plan also seeks to eliminate a lot of regulation around AI.
