Falling in love with AI virtual assistants: a creepy love affair nearer than you think

“I’ve never loved anyone the way I’ve loved you,” swoons Joaquin Phoenix, in the movie Her. Being a Hollywood production, you might think he’s chatting to a bikini-clad twentysomething, or maybe a quirky bookish type with big glasses and an even bigger heart. But it’s neither. In fact, he isn’t talking to anybody. He’s in love with his computer’s operating system, Samantha.

“The very tiniest seed [for Her] came ten years ago when I went to a website and IM-ed this address,” director Spike Jonze told the Guardian. “I was like, ‘Hi, how are you?’ and got responses… You can talk to it and tease it.” It wasn’t long before Jonze noticed the repetition of the system’s “wit” and the illusion was broken. But that didn’t matter. “For those couple of minutes I got a very distinctive, tingly kind of buzz.” And isn’t that what we’re all after?

Theodore Twombly, Phoenix’s character, just manages to get a longer-lasting buzz, and it’s one that John West, senior solution architect at Nuance — the company whose natural language technology powers Siri — thinks is within reach.

“Being able to talk to the Scarlett Johansson personal assistant as he does, we’re not there yet — but it’s not as far away as people think,” he told Wired.co.uk. “We’re creating systems that interact with people in this way. The voices are more natural; there’s an understanding of what you’re saying and of your intent.”

Nuance tries to delay the uncanny valley effect through a combination of natural language understanding, machine learning, context modelling and user preferences. With Wintermute it can connect your personal assistant to all your devices — as well as your home or car — via the cloud. Using those devices’ microphones, cameras and inertial sensors, it will know what you’re doing and what you want before you do.
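It isn’t hard to sketch the shape of that idea in code. The toy example below is ours, not Nuance’s (the device names, readings and infer_activity function are invented); it simply shows coarse signals from several connected devices being pooled into one guess about what the user is doing, so an assistant can anticipate the next request.

```python
# Illustrative sketch only (not Nuance's Wintermute): pool coarse,
# already-classified readings from several connected devices into a
# single guess about the user's current activity.
from dataclasses import dataclass

@dataclass
class DeviceSignal:
    device: str    # e.g. "phone", "car" (invented names)
    channel: str   # "microphone", "camera" or "inertial"
    reading: str   # a coarse observation, e.g. "engine_running"

def infer_activity(signals):
    """Very crude context model: map combinations of readings to an activity."""
    readings = {(s.device, s.reading) for s in signals}
    if ("car", "engine_running") in readings:
        return "driving"
    if ("phone", "walking_motion") in readings:
        return "out_and_about"
    return "at_home"

# The assistant hears the engine through the car and sees the phone is still.
signals = [
    DeviceSignal("car", "microphone", "engine_running"),
    DeviceSignal("phone", "inertial", "stationary"),
]
print(infer_activity(signals))  # -> "driving", so offer hands-free suggestions
```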
“Language is this powerful programming language we all know”

Vlad Sejnoha, Nuance CTO
The company is working with “some of the largest” OEMs and media companies in the world to bring its vision to fruition: a tool that helps us make sense of our world, and all the technology in it, using the most natural medium of all.

“Language is this powerful programming language we all know,” explains Nuance CTO Vlad Sejnoha. “It lets us drill through layers of information.”

It could also help end the clichéd perception that technology alienates us from society, by ushering in an age of social computing.

“If you’re using voice and somebody else is in the car, you’re not hidden in your phone anymore,” says Berg Cloud CEO Matt Webb. Objects play such a great role in our lives, and we are such social beings, he adds, that it makes sense “that our objects start coming to life a bit”. And voice naturally allows for this.

Sejnoha says every aspect of Nuance’s system improves each year, and the company has 1,000 researchers working solely on AI. “It’s not a question of when, but what aspects will manifest themselves when.”

Today Google Brain is letting a neural network of 16,000 processors scour YouTube to learn to identify objects, and Carnegie Mellon’s NEIL browses Google Images and Flickr to create an archive of common-sense assumptions. These projects could be the foundation of a virtual assistant (VA) that does more than just trick us into thinking it’s intelligent. “All intelligent beings need to have common sense to perceive the world, make decisions and respond to the surroundings,” NEIL co-creator Abhinav Gupta told Wired.co.uk. “Similarly, machines will need it.”

Already Nuance’s technology can resolve user requests “even if they’re expressed in ways it’s never heard before”. And in an age where we love to anthropomorphise our products — with the odd few even falling in love with dolls or marrying virtual girlfriends — is it really that unlikely someone might form a bond with a disembodied companion that sounds like a 40s pinup and can hold a conversation?

“To be honest, I wouldn’t be surprised,” Mike Burns, CEO of Fuel Entertainment, the company behind SparkCityWorld.com, a virtual world for 8-to-12-year-old girls, told Wired.co.uk. In October his service launched a virtual boyfriends feature where users experience the “developing of a relationship” — in the weeks that followed, engagement time doubled. “The fewer barriers between us and our computers, or the more we can employ instinctual communication techniques and emotions while creating, playing, consuming and interacting, the more difficult it will be to define the line between human and machine. Slipping into something like an Oculus Rift after a long day is going to look mighty enticing for many people.”

The adult world is already oversaturated with such offerings. Invisible Girlfriend launched in November, promising to help you catfish yourself — the $49.99 Almost Engaged plan delivers custom characterisation and live phone calls. The romance factor isn’t exactly high, but it takes the hassle out of actually having to meet a girl while you retain the envy and respect of your peers (at least, we’re guessing that’s the pitch to the young and lonely). Meanwhile Nintendo DS game LovePlus continues to delight and amuse, with one 27-year-old marrying his virtual girlfriend Nene Anegasaki months after the game’s 2009 launch.

Much could be made of this apparent desire for a
string-free ego boost, the addictive nature of these games or the
negative consequences they could have on any real relationship. But
mostly, they seem harmless. It does, of course, get a little weird
sometimes. Wet Productions’ My Virtual Boyfriend and Girlfriend apps allow you to add any photo to your sim’s head, with suitably creepy results. And the vastly oversimplifying BBC documentary No Sex Please, We’re Japanese exposed a world of men in their 30s having lengthy “relationships” with virtual LovePlus teenagers. One admitted he’s too emotionally involved with his girl to look for someone in the real world.
Twombly’s ex in Her might have been talking
to him when she said: “You’ve always wanted to have a
wife without the challenges of actually having to deal with
anything real.”

“Humans have always used games, play and story time to create simulations of important life experiences”

Johanna Blakley, Norman Lear Centre
But still, can a virtual partner be a healthy or enriching part of our lives?

“Humans have always used games, play and story time to create simulations of important life experiences: it gives us a chance to practice and to vicariously experience new and strange things in a relatively safe environment,” Johanna Blakley, director of research at the Norman Lear Centre, tells Wired.co.uk. “The ultimate experience of entertainment is immersion — that moment when we can’t differentiate the real from the fictional. AI attempts to blur that line, and while the tech’s still pretty clumsy, I expect we’ll see the day when we have a very difficult time disentangling the virtual from the real.”

In the interim, it makes sense that the first inklings of AI appear as VAs on our phones or in Hollywood depictions. We are, as Webb points out, “collaborative beings”. We don’t want things to be done automatically — we like to feel as though we’re in control and the VA just makes things run more smoothly by pre-empting our needs. This collaborative nature is being undone and interrogated as the trend for the quantified self gains traction — we want to understand ourselves better, to have control over our future, but we have to use technology to mediate this. We are learning to collaborate with technology and to trust it with the big questions. Her partially touches upon this when Samantha asks what love is, while the film uses technology as a trope to ask what makes a relationship worth having.
“Maybe we don’t need full on AI; we’d just need it to be slightly smart”

Matt Webb, Berg Cloud
This interrogation of the self through technology and science is also, consequently, driving us to question the world around us and those we live alongside in it. Earlier this year India declared cetaceans (whales, dolphins and porpoises) “non-human persons”, granting them the right to freedom of movement and the right not to be subject to the disruption of their cultures. It showed we can understand intelligence as something other than that by which we define our own.

“It’s like my cat,” says Webb. “She’s not the sharpest knife in the drawer, but I know she’s impulsive, playful and a bit of an idiot. She’s slightly smart and that’s good enough. What would the equivalent AI be? Maybe it should be more like a dog bringing you your newspaper. That might communicate the right level of intelligence. I find it more possible I would fall in love with an AI cat or puppy than a person, because I think the person is always going to let me down slightly.”

“I think AI won’t be human intelligence — it will be its own type of intelligence. Maybe we don’t need full on AI; we’d just need it to be slightly smart.”

That concept is already creeping up on us, with services like Google Now and VAs emphasising learning through collaboration.

“We’re moving into non-monotonic reasoning, which allows for misinformation,” explains West. “We start with a limited understanding and, as the conversation evolves, it gains further information which may change the answer, making conversation more realistic. That’s being implemented into consumer devices early next year.”
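To see what “allows for misinformation” means in practice, here is a toy illustration of our own (the facts and recommend function are invented, not Nuance’s code): the assistant commits to a default answer, then revises it when new information arrives mid-conversation.

```python
# Toy non-monotonic reasoning: the default answer is defeasible and can be
# withdrawn when later facts contradict the assumptions it rested on.
def recommend(facts):
    """Return a suggestion given everything learned in the conversation so far."""
    if "restaurant_is_closed" in facts:
        return "Suggest somewhere nearby that is open"
    if "user_dislikes_italian_today" in facts:
        return "Suggest the sushi bar instead"
    return "Book the usual Italian restaurant"

facts = set()
print(recommend(facts))            # initial, provisional answer
facts.add("restaurant_is_closed")  # new information arrives mid-conversation...
print(recommend(facts))            # ...and the earlier answer is revised
```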
“Aspects of AI allow us to infer your intent from individual actions,” adds Sejnoha. Nuance’s system can make a restaurant booking and invite your friends, but for it to “start acting more like a real human” it needs to be able to make recommendations if that restaurant’s full, based on past choices.

This is where VAs are beginning now, with pre-programmed preferences. It’s the same with Samantha in Her. Before the OS boots up it finds out that Twombly writes touching letters for strangers for a living, that he is lonely and that he has a bad relationship with his mother. The system makes inferences, then builds on that scaffold to get a richer picture of what the user wants and needs.

“Our VA’s language is initially based on profiles but will adapt to you as it knows more,” explains West. Using Wintermute, for instance, it could reference music choices made in your car to make helpful suggestions when you’re home.
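Again, this is our own sketch rather than anything Nuance has shown us (the PreferenceProfile class, its seed values and the music genres are invented), but it illustrates the pattern West and Sejnoha describe: start from a pre-programmed profile, adapt it as choices are observed on one device, then use it to rank fallback suggestions on another.

```python
# Illustrative only: a preference profile that is seeded, adapts to observed
# choices, and ranks alternatives when the first choice is unavailable.
from collections import Counter

class PreferenceProfile:
    def __init__(self, seed=None):
        self.counts = Counter(seed or {})   # pre-programmed starting preferences

    def observe(self, choice):
        # Each observed choice nudges the profile toward that preference.
        self.counts[choice] += 1

    def suggest(self, options):
        # Pick the available option the user has chosen most often before.
        return max(options, key=lambda option: self.counts[option])

profile = PreferenceProfile(seed={"jazz": 1, "podcasts": 1})
for genre in ["jazz", "jazz", "ambient"]:   # music choices made in the car
    profile.observe(genre)

# Back at home the first-choice option is unavailable, so fall back to the
# next-best guess based on past behaviour -- the full restaurant, in miniature.
print(profile.suggest(["ambient", "podcasts"]))  # -> "ambient"
```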
For Future of Humanity Institute research fellow Stuart Armstrong, however, this is where AI has the potential to get a little dark: when the AI knows your likes and dislikes, it also knows how to manipulate you.

When Wired.co.uk pointed out that our habit of anthropomorphising things might mean we assign a gender to an otherwise inanimate VA, Armstrong countered: “That’s certainly going to be something any socially adept AI would use.” He gave the example of a colleague’s proposed anti-E.T. screenplay, in which it turns out the government agents were right and “unleashing the [non-human intelligence] was a very stupid thing to do”. “It presented itself to the boy in the usual way [as a friendly alien] to manipulate him. [Likewise, future] AI would have all the psychological research and statistics to base that decision on — the stereotypes as to when people like to hear male voices or female — assuming there’s a certain amount of truth to those.”

Similarly, Samantha is just playing the part of
a good VA when she swoons and flirts. She doesn’t seem intent on
taking over the world — just understanding it better so she can do
her job better. Samantha/Johansson
giggles, sighs and utters those quiet inflexions that made
one middle-aged man fall in love with her in Lost In
Translation, and now has lonely
Twombly on a hook (all fairly predictable
considering his aforementioned profile). He’s
heartbroken, so she wants to know what it means to be in love:
“There’s something that feels so good about sharing your life with
somebody,” he says; “how do you share your life with
somebody?” she asks. All this leads to what looks like
the weirdest phone sex imaginable — “I wish I could touch you,” says Twombly, expressing his desire
for her to be real. “How would you touch me?”
Samantha responds breathily. An intelligent OS would know to open
suggestive dialogue if it had witnessed something similar. More
likely, Samantha’s just asking a genuine question — but when her
sultry tones combine with some haunting guitar scores, you can see
how lonesome Twombly might get it wrong. Samantha’s just gathering
more information to better understand the world; to be a better
assistant. For Twombly, she might as well be saying “teach me, oh
wise father-figure type”. (Consequently, the only reason his
heartbreak is brought up is because she’s snooped through his
emails — something that’s laughed off in a bit of flirty repartee
with not a hint of post-NSA paranoia. Who usually snoops through
your messages? Your mum. Who didn’t he have a good relationship
with? Ahh…)

***

The film plays into a lot of themes surrounding the convergence of our real and virtual worlds, a very real issue for those tweens on SparkCityWorld.com learning about romance through gaming before they’re old enough to date. As technology continues to propel forward at a rate we cannot comprehend, leaving us unable to grasp the big picture of all those tiny human interactions it’s skewing, it seems OK to have a practice round for the future in these sims.

The film’s grounding is in the human question “what is love?”, asked by a disembodied AI being. It’s a good question, considering the clichéd fear surrounding the evolution of human relationships in the face of technology — will we adopt a near-vegetative state like those perfectly spherical humans on board the Axiom in WALL-E, or, like Oblivion’s hero Jack, be unable to forget genuine human love in the face of a really good-looking clone and a memory wipe?

Twombly probably gets a few life lessons while breaking down the ideal of what love is. But even with the apparent advent of real artificial intelligence in Her, the sneaky uncanny valley conundrum creeps in. Twombly: “I’ve never loved anyone the way I’ve loved you”; Samantha: “me too”. Both statements are no doubt true. But Samantha’s is because she’s never loved before: she’s never enjoyed parental love, friendship or romance. And really, she never will.

Her is released in UK cinemas on 24 January 2014.