Learn something about Google’s new Duplex feature today

Google announced all kinds of technological advancements at this year's Google I/O, and one of the most interesting was Google Duplex. You might have missed it while you were busy searching Google to find out when you'll get the awesome new Android P (unlikely, unless you have a Pixel device). But don't worry: we've got you covered.

It’s an add-on to Google Assistant, an artificial intelligence that can make phone calls for you. And we don’t mean dialing the number. We mean it has actual conversations with real-life people.

If you haven’t already seen the demo, please watch it below:

Let's Talk About Google Duplex!

Let me clear one thing up for you: Google Duplex isn't designed to replace humans altogether. It's designed to carry out very specific tasks in what Google calls "closed domains". So, for example, you wouldn't ask Google Duplex to call your mum, but you might ask it to book a table at a restaurant.

Initially, Google Duplex will focus on three kinds of tasks:

  • Making restaurant reservations
  • Scheduling hair appointments and
  • Finding out businesses’ holidays and opening hours.

An obvious question that may come to your mind: "Why does it sound like a human?"
I think it's partly because Google reckons it's the most efficient way to get the information, especially if there are variables and interruptions, and partly because if you got a phone call from "The Terminator" you'd probably hang up.

The full details are over at the Google AI Blog, but here’s the executive summary: Google Duplex enables you to get information that isn’t on the internet.

Google Duplex is the missing link between the Google Assistant and any business because it enables the Assistant to get information that isn’t available digitally. For example, you might want to know a business’s holiday opening hours but they haven’t listed it on their website, or you might want to know if a shop has a particular item in stock and it doesn’t have online stock availability.

So Duplex does what you’d do. It phones up and asks for the information it needs.

From a tech perspective, Google Duplex uses a recurrent neural network (RNN) built using TensorFlow Extended (TFX). There's a really good introduction to RNNs here. What RNNs like the one powering Duplex can do is process sequential, contextual information, and that makes them well suited to tasks like language modeling and speech recognition.
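Duplex's actual model isn't public, but the core idea behind an RNN is simple enough to sketch in a few lines. This toy example (with made-up scalar weights, nothing to do with Google's system) shows the defining trick: a hidden state carries context from earlier inputs forward, so the same input can produce different outputs depending on what came before it.

```python
import math

def rnn_step(x, h, w_x, w_h, b):
    """One timestep: combine the current input x with the previous hidden state h."""
    return math.tanh(w_x * x + w_h * h + b)

def rnn_forward(sequence, w_x=0.5, w_h=0.8, b=0.0):
    """Run a whole input sequence through the recurrence,
    returning the hidden state after every timestep."""
    h = 0.0          # hidden state starts empty
    states = []
    for x in sequence:
        h = rnn_step(x, h, w_x, w_h, b)
        states.append(h)
    return states

# The first and third inputs are identical (1.0), but the hidden
# states differ because the network "remembers" the context.
states = rnn_forward([1.0, 0.0, 1.0])
```

That context-carrying behavior is why RNNs handle speech and conversation well: understanding "yes, that works" depends entirely on what was said before.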

When you make a request, the Google Assistant will hand it over to Google Duplex to carry out; if it’s within Duplex’s abilities it’ll get on with it. If it isn’t, it will either tell the Assistant it can’t do it or refer the job to a human operator.
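The hand-off logic described above can be sketched as a simple dispatch. This is purely illustrative (the domain names, confidence threshold, and return values are my own inventions, not Google's API): Duplex handles tasks inside its supported closed domains, declines tasks outside them, and escalates to a human when it's struggling.

```python
# Hypothetical sketch of the Assistant-to-Duplex hand-off -- not Google's code.
SUPPORTED_DOMAINS = {"restaurant_reservation", "hair_appointment", "opening_hours"}

def handle_request(task, confidence):
    """Route a task: Duplex handles it, declines it, or refers it to a human."""
    if task not in SUPPORTED_DOMAINS:
        return "declined"            # tell the Assistant it can't do it
    if confidence < 0.5:             # struggling mid-call (threshold is made up)
        return "human_operator"      # refer the job to a human operator
    return "duplex_handles_it"

result = handle_request("restaurant_reservation", confidence=0.9)
```

The point of the closed-domain design is exactly this: a short allow-list of tasks keeps the conversation predictable enough for the AI to handle end to end.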

Duplex talks like a normal person, and that makes it a natural – and natural-sounding – extension to the OK Google functionality we already know. Let’s stick with our restaurant example.

With Duplex, we could say “OK Google, find me a table for Friday night” and the Google app would then call restaurants on your behalf. Not only that, but it would have conversations – so if you wanted a table for around 7:30 but there wasn’t one, it could ask what times were available and decide whether those times fit your criteria. If not, the Google app would call another restaurant. Similarly, if you wanted to arrange a meeting with Sarah, the Google app could call Sarah (or Sarah’s AI) to talk through the available time slots and agree which one would be best.
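The "decide whether those times fit your criteria" step is, at its core, a small matching problem. Here's a toy version (the tolerance window and minutes-since-midnight encoding are my simplifications; the real system reasons over natural-language replies): pick the closest offered time within an acceptable window, or give up and try the next restaurant.

```python
# Illustrative only -- not how Duplex actually represents times or preferences.
def pick_slot(preferred, offered, tolerance=30):
    """Times are minutes since midnight. Return the offered time closest to the
    preference if one falls within the tolerance window; otherwise None
    (meaning: call another restaurant)."""
    candidates = [t for t in offered if abs(t - preferred) <= tolerance]
    if not candidates:
        return None
    return min(candidates, key=lambda t: abs(t - preferred))

# Want a table around 19:30 (1170); the restaurant offers 19:00 and 21:00.
slot = pick_slot(1170, [1140, 1260])
```

The hard part Duplex solves isn't this arithmetic, of course; it's extracting "we only have 7 or 9 tonight" from a noisy phone call in the first place.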

The key here is that this is all happening in the background. You tell Google to do something and it goes and does it, only reporting back after the task is complete.

Alright, another interesting question now, “Will it be creepy?”

It is a bit creepy, and it's already prompted some discussion online: should AIs tell us that they're AIs when they phone us up?

What’s the legal situation if your AI makes a deal with my AI without asking either of us first? Will there be a version with a croaky voice we can use to call in sick? More worryingly, what if someone hijacks your account and uses it to impersonate you?

“How can it be beneficial?”

The benefits for people with hearing difficulties are obvious, but it can also overcome language barriers: you might not know the local language, but Google Assistant does – so it can converse in a language you don’t speak.

And it can be asynchronous, so you can make the request and then go offline while Google Duplex gets on with the job: it will report back when you’re online again. That’s useful in areas of patchy connectivity, or if you’re just really, really busy.

And now, as to when you can get it: Google is set to start testing the feature in a public beta via Google Assistant at some point this summer, although neither an exact release date nor the regions it'll be available in has been announced.

I'm pumped about it!