Let’s look now at the last area considered by the Imperial College Foresight study with expected impact in 2040: human-machine interactions.
Here they point to three technologies: implantable phones, conversational machine interfaces and thought control machine interfaces. The order reflects my view of their likelihood.
- Implantable phones
There have been several proposals in the past for embedding a cell phone in a person’s body. As a matter of fact, we spend our lives in symbiosis with the phone: if we happen to leave it behind we go back chasing it till it is back in our hands. At the same time electronics keeps shrinking, and we have already seen the cellphone sneaking into a watch, so one might assume that in the future the shrinking will reach a point where embedding won’t be a (technical) problem.
There have also been experiments to embed part of a cellphone, the loudspeaker and the mike, in the body, connecting them wirelessly to the cellphone. In one instance it was proposed to embed a microchip in a tooth. The microchip connected to the cellphone via Bluetooth (of course!) and vibrated in such a way as to reproduce the vibration that sound generates through the jaw, thus letting the inner ear pick up the sound (excuse me, is your tooth ringing?).
A bit scary? Not something to worry about now, but in 20 years’ time technology might have reached the point where a cellphone implant will not only be feasible but will have become normal. The technology hurdles are mostly related to energy harvesting, which so far makes an implant of this type impossible. In my opinion there are also economic hurdles (probably more difficult to overcome than the technical ones). The cellphone industry is about selling phones, making them obsolete and selling new ones. To make a phone obsolete the industry no longer works on performance (when did you last see an ad extolling the better voice quality of a new phone?) but rather on design. A new design is not necessarily better than the previous one, but it will definitely make the previous one “old”. Hence the “need” to get the new one.
If you implant the cellphone you can no longer show your friends that you’ve got the latest model. It becomes a pure function provider. In my case, I can also imagine that the idea of implanting and re-implanting, with its images of scalpels and sutures…, is not at the top of my wish list, but it might be different for other people.
- Conversational machine interfaces
“Alexa, what is the weather today in Rome?” We have started to interact verbally with machines, and in just a couple of years the interactions have become more and more seamless. A new technology, or rather a new way of interacting, the chatbot, is becoming more and more usual. The word is a fusion of chat and robot, deriving from the use of software robots that can interact with us using voice. It all started with voice commands (Connect … Find ….), then it evolved to accommodate normal sentences (Call Laura, but also Hey, I want to talk to Laura, let’s give her a call…). What is amazing to me is that the change happened in these last two years (we even have a Chatbots Magazine!).
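To make the shift from rigid commands to natural sentences concrete, here is a minimal sketch of intent matching. Everything in it (the function name, the patterns, the intent label) is my own illustrative invention, not any real assistant’s API; real systems use statistical language understanding rather than hand-written patterns:

```python
import re

# Hypothetical patterns: the rigid command form and a looser natural
# phrasing, both resolved to the same "call" intent.
INTENT_PATTERNS = [
    (re.compile(r"^call (\w+)$", re.I), "call"),
    (re.compile(r"(?:talk to|give) (\w+)(?: a call)?", re.I), "call"),
]

def parse_intent(utterance: str):
    """Return (intent, contact) or (None, None) if nothing matches."""
    for pattern, intent in INTENT_PATTERNS:
        match = pattern.search(utterance)
        if match:
            return intent, match.group(1)
    return None, None

print(parse_intent("Call Laura"))                    # ('call', 'Laura')
print(parse_intent("Hey, I want to talk to Laura"))  # ('call', 'Laura')
```

The point of the sketch is that “Call Laura” and “Hey, I want to talk to Laura” land on the same action; the hard part, which took until these last years, is doing this robustly for open-ended speech.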
Why, then, does the Foresight team at Imperial College place the generalised use of conversational interfaces twenty years from now? My take is that if you are looking at having a conversation with a (software) robot expecting the same experience you would have talking with a friend, then the twenty-year assumption makes sense. I am pretty confident that in twenty years’ time we will be able to converse with a (software) robot without a second thought: it will be indistinguishable from a real person. We will be able to select the chatbot we would like (an expert in physics, or in medieval literature) and engage in a rewarding conversation.
Of course this raises the issue of losing the human touch. If I can get the same conversational experience with a robot, will I choose the robot or risk a boring interaction with a human?
It is not for me to answer, but I can see a whole new set of issues popping up to entertain sociologists for the decades to come. For a glimpse of one “niche” of conversational potential, take a look at the BBC special “Sex Robots and Us”, broadcast on April 8th, 2018.
- Thought control machine interfaces
Moving on from conversational interfaces the next step would be skipping the conversation and getting in touch directly: brain to machine.
Here again technology is making progress. Although significant, today’s approaches are still based on capturing electrical activity from the brain, having a computer process it and then executing some actions. Are these actions the ones the brain was looking for? Not exactly.
What happens is that the brain watches (through its eyes) what happens when it “thinks” about something (is the pointer moving up or down?) and, based on this feedback, learns what to think in order to have the computer execute what it wants.
The computer, too, is increasingly being trained to better interpret the “thinking” going on in the brain, but in basically all experiments run so far it is the training of the brain that makes the interaction possible.
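The feedback loop described above can be sketched in a toy simulation. Every name and number here is invented purely for illustration: the decoder applies a fixed rule the user does not know, and the “brain” simply keeps whichever mental strategy made the cursor move the right way:

```python
import random

# Toy decoder: a fixed rule, unknown to the user, mapping a scalar
# "brain feature" to cursor movement (+1 = up, -1 = down).
def decoder(feature):
    return +1 if feature > 0.5 else -1

# Two mental strategies the user can try, each producing a noisy feature.
# The values are invented; real EEG features are far messier.
STRATEGIES = {"imagine_left_hand": 0.3, "imagine_right_hand": 0.8}

def trial(strategy):
    feature = STRATEGIES[strategy] + random.gauss(0, 0.05)
    return decoder(feature)

# The feedback loop: the user sees the cursor move and switches
# strategy whenever it went the wrong way -- the brain is what learns.
random.seed(0)
current = "imagine_left_hand"
for _ in range(10):
    if trial(current) < 0:
        current = "imagine_right_hand"
print(current)  # the user settles on the strategy the decoder rewards
```

The asymmetry the article points out is visible here: the decoder never changes; all the adaptation happens on the user’s side of the loop.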
The technology used to pick up the brain’s electrical activity is neither very effective nor practical. Wearing a sort of cap with hundreds of electrodes provides enough information to drive a pointer and click, basically replacing the mouse (which for a paralysed person is clearly a huge advance). For more precise detection of electrical activity researchers use brain implants: a chip with tens or hundreds of electrodes placed on the surface of the brain. Although this provides more precise signals, it requires surgery, is prone to infection and picks up activity only from a very narrow region of the brain.
New chips with radio communications are being investigated (to avoid the infections resulting from an open skull), but there are problems with powering those chips and keeping their dissipation low (radio communication is more energy intensive than wired communication), an essential requirement for keeping the implant safe.
So far there seems to be no silver bullet on the horizon. Progress will surely be made and will help people with various forms of disability (picking up “thoughts” from the brain includes picking up the signals that move the muscles in a leg, so the same technology can be used to recover movement in a paralysed person), and in these cases the cumbersome equipment needed for the interaction may be worthwhile.
For communication with a machine in general, though, conversational interfaces may fit most application areas better.
Clearly the evolution towards symbiotic autonomous systems (SAS) would get a boost from direct brain-to-machine communication. There might be other ways, yet to be discovered, for an implant to become aware of the intentions of the brain and to act accordingly.
As an example, just a month ago, in early April 2018, MIT presented AlterEgo, a system able to interpret the electrical signals that flow, without our being aware of them, to our facial muscles when we think of words (without voicing them!). Once trained, the system has been able to pick up words with 92% accuracy at dictation speed, which is quite amazing.
Another example comes from China where, according to the South China Morning Post, a Chinese company, Hangzhou Zhongheng Electric, has its workers wear helmets with sensors that detect their brain activity and can identify states of stress or anger that might decrease the workers’ efficiency and attention, potentially leading to mistakes and dangerous situations.
In both cases the electrical signals are used as indicators of a state of mind; they are not related to “reading” thoughts. As I said, we are still quite far from that point.