To complete the list of technologies explored by the Imperial College Foresight study that will lead to disruptions in 2030, let's consider the ones in the Human-Machine Interactions area: Avatar companions, Lifelong personal assistants, Humanoid sex robots and Emotionally aware machines. This list clusters a set of technologies into "systems" supporting the interaction, rather than direct interaction technologies as one might have expected. Hence it does not include optogenetics, magnetic wave brain stimulation, eye chip implants and the like, which are being experimented with today and tomorrow may become usable for human (brain) machine interactions.
Let me say from the start that I have always felt a bit uneasy talking to a machine when the machine takes on a human-like behaviour. I hate IVR (Interactive Voice Response) systems, and the idea of a machine replacing a human being in the area of social relations is… well, it is not my cup of tea.
Having said that, the team at the Imperial College foresees a significant evolution in this area and an uptake that will create disruptions in our way of life. Let's see.
Synthetic voice technology has progressed to the point of creating an artificial human voice that is almost indistinguishable from a real one. In parallel, artificial intelligence is able to create "stories" and engage in discussion with a human, providing a credible interaction. By 2030 it is reasonable to expect that both the interaction and the voice will be indistinguishable from those we can have with a human being.
In 2014 Eugene Goostman made the headlines as the first application claimed to have passed the Turing test. People interacting with it, unaware of whether they were dealing with a person or a machine, were not able to identify it as a machine. The interaction was text-based. In 2016 an MIT application passed a Turing test based on sound: the sounds it generated could not be told apart from real ones by human listeners.
Notice that several experts dispute that the Turing test has actually been passed, since so far computers have been able to fool a human only for a limited time and in a limited context. However, there is a general feeling that it is just a matter of time before a machine will be able to impersonate a human. At that point we will have real "avatars".
You do not have to wait till 2030 to feel the thrill of talking to a machine. Take Invisible Girlfriend. On this website you can create your personal "girlfriend" and keep interacting with it (her?), sending messages and receiving messages in response. I cannot understand why one would want to exchange messages with a machine, even one pretending to be one's girlfriend. A future where this girlfriend can fool anybody… is not a nice future to me.
Another way to create an online partner is offered by Replika, a software that was developed by Eugenia Kuyda, an entrepreneur, after she lost a friend. She used the interactions her friend had had on Facebook and other social media to feed an AI engine, morphing it into her friend's personality so that she could keep interacting with him. She has now turned that into a service that anybody can access: you feed it with the content of your activity on social media, and the software generates a virtual partner that, knowing about you, will be able to interact with you in an engaging and pleasing way. Of course, the more you use the software, the more your virtual partner will know about you and the better it will fit your expectations.
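To give a feel for the underlying idea, here is a deliberately minimal sketch of a retrieval-based "persona" bot. This is a hypothetical illustration, not Replika's actual method (which relies on far more sophisticated AI): the toy bot mimics a person simply by replying with whatever that person said after the most similar past message. All names and the sample log are invented for the example.

```python
# Hypothetical sketch of a retrieval-based persona bot: it imitates
# someone by replying with what that person said after the past
# message most similar to the current one. Not Replika's real method.
from collections import Counter

# Toy conversation log: (message received, reply the person gave).
# In a real service the log would come from social-media history.
LOG = [
    ("how was your day", "pretty good, I went for a long walk"),
    ("do you like music", "I love jazz, especially live"),
    ("what are you reading", "a sci-fi novel about digital twins"),
]

def similarity(a: str, b: str) -> float:
    """Bag-of-words overlap between two sentences, in [0, 1]."""
    wa, wb = Counter(a.lower().split()), Counter(b.lower().split())
    shared = sum((wa & wb).values())
    total = max(sum(wa.values()), sum(wb.values()))
    return shared / total if total else 0.0

def reply(message: str) -> str:
    """Answer with the logged reply to the most similar past message."""
    best = max(LOG, key=lambda pair: similarity(message, pair[0]))
    return best[1]

print(reply("did you like the music"))  # picks the jazz reply
```

The point of the sketch is that the "personality" lives entirely in the data: the more conversation history you feed it, the more of the person's voice it can echo back.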
An interesting twist in this evolution can be imagined: an avatar that impersonates you. While you are busy with some activity, you could use that avatar to replicate yourself in space and time.
It is not difficult to see that once you have a digital twin in cyberspace, an avatar may connect to it and impersonate you. It is also not difficult to imagine the kind of problems this could generate, particularly if your digital twin is maliciously hacked: the damage gets much worse when a credible impersonator can leverage those data.
So as not to give the impression that I am completely negative about this technology, I agree that today we use artefacts as companions with no reservation: I find good companionship in books, others may spend a pleasant time with model railways… It may be fun to spend some time talking to an avatar, as long as we keep in mind that it is not a person and cannot be a replacement for a real one. The problem is that it will become more and more difficult to tell the difference!
Lifelong personal assistants
Chatbots are becoming more widespread, both in the consumer market and in industrial applications. At EIT Digital we are seeing a growing interest by industry in applying chatbots to foster the relationship and cooperation between humans and machines in industrial processes, and some companies, like IBM, see this as a step towards Industry 4.0. Some of our PhD students are working on chatbots, with applications and trials ranging from health and well-being to industrial cooperation with robots.
Mitsuku is a chatbot (it was released in 2014, so it is old stuff!), powered by Pandorabots, that you can experience online. One of its applications is in the advertisement domain, where it can engage you in a human-like chat to present and discuss a product. Another foreseen application is to engage lonely, possibly elderly, people, making them feel socially active again.
You can find a long list of available chatbots on Wired.
We may expect chatbots to become ubiquitous by 2030, supporting human-like interaction with machines, powered by an artificial intelligence engine that will go beyond fitting the operational context to include an understanding of the character of the people it is interacting with, and be able to show empathy in the interaction. These latter two capabilities are not well developed today, and they will make the difference tomorrow, leading to a disruption in the way we communicate with, and perhaps "feel", machines.
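The distinction between fitting the operational context and showing empathy can be made concrete with a toy sketch. This is an assumed design for illustration only, not how any product actually works: the bot answers the task as usual, but an extra layer first checks the message for emotional cues and adjusts the tone. The word list and the order-status stub are invented for the example.

```python
# Toy illustration (assumed design, not a real product) of separating
# context fitting from empathy: the task logic answers the request,
# while an empathy layer adjusts the tone to the user's emotional cues.
import re

# Hypothetical list of words signalling a negative emotional state.
NEGATIVE = {"sad", "tired", "lonely", "frustrated", "angry"}

def task_answer(message: str) -> str:
    """Context fitting: a stub standing in for the actual task logic."""
    if "order" in message.lower():
        return "Your order is on its way."
    return "How can I help you today?"

def empathic_reply(message: str) -> str:
    """Empathy layer: acknowledge feelings before answering the task."""
    words = set(re.findall(r"[a-z']+", message.lower()))
    answer = task_answer(message)
    if words & NEGATIVE:
        return "I'm sorry you're feeling that way. " + answer
    return answer

print(empathic_reply("I'm tired, where is my order?"))
```

Even this crude version shows why the two capabilities are separate problems: the task logic can be perfect while the interaction still feels cold, and it is the second layer that today's systems mostly lack.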