
Brain and computer learning together

Pilot and avatar at Cybathlon, brain and computer working together. Credit: Cybathlon

Brain Computer Interfaces work at two levels, hard and soft. Each one fuels the other, and both are needed to establish effective communication. The hard part captures the electrical activity of neurones; the soft part interprets this activity to derive meaning.

Getting more accurate and selective data on the electrical activity is becoming possible through technology evolution: sensors and probes that in some cases have made it possible to detect electrical activity at the level of a single neurone. To achieve this kind of sensitivity, probes have to be implanted in the brain. The same applies in the other direction, influencing the brain at the single-neurone level, where optogenetics has provided the tools to achieve this specificity. Obviously, invasive procedures, like the ones required to implant probes in the brain, are not on the wish list of most people!

In addition, getting the signals from (or activating) a single neurone is a drop in the ocean, given the 100 billion neurones in our brain. Technology is allowing the simultaneous detection of several neurones, even a thousand of them with the latest advances, but… it is still just a bucket in the ocean.
Hence the need for the soft part. Using software and technologies like machine learning, researchers can extract meaning from the electrical activity generated by thousands, even millions, of neurones. This, in turn, makes it possible to use non-invasive electrical detection, placing arrays of electrodes on the scalp rather than implanting them in the brain.
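The soft part can be illustrated with a toy decoder. The sketch below is illustrative only — the two mental-command classes, the eight hypothetical electrodes, and the band-power patterns are all invented, and real EEG decoding pipelines are far more elaborate — but it shows the principle: no single noisy trial is readable, yet averaging many labelled trials into per-class centroids lets a simple classifier recover the intended command.

```python
import random
import statistics

random.seed(0)

# Illustrative only: two imagined mental commands ("left", "right"), each
# producing a slightly different average band-power pattern across 8
# hypothetical scalp electrodes. Single trials are too noisy to read directly.
PATTERNS = {
    "left":  [1.0] * 4 + [2.0] * 4,
    "right": [2.0] * 4 + [1.0] * 4,
}

def record_trial(label, noise=1.0):
    """Simulate one noisy band-power feature vector for a given intent."""
    return [m + random.gauss(0, noise) for m in PATTERNS[label]]

def train(trials):
    """Average the labelled trials into one centroid per class."""
    return {
        label: [statistics.mean(col)
                for col in zip(*(x for x, y in trials if y == label))]
        for label in PATTERNS
    }

def classify(centroids, x):
    """Pick the class whose centroid is nearest (squared Euclidean distance)."""
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(x, centroids[lab])))

# Train on 200 simulated trials, then test on 100 fresh ones.
training = [(record_trial(lab), lab) for lab in PATTERNS for _ in range(100)]
centroids = train(training)
tests = [(record_trial(lab), lab) for lab in PATTERNS for _ in range(50)]
accuracy = sum(classify(centroids, x) == y for x, y in tests) / len(tests)
print(f"accuracy: {accuracy:.2f}")
```

The point of the averaging is exactly the one made above: meaning emerges statistically from the mass activity, not from any individual measurement.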

In an article appearing in PLOS Biology, a team of EPFL researchers describes a new approach to Brain Computer Interfaces based on a symbiosis between the AI application and the brain, in which each one teaches the other. The research involved two tetraplegic persons who were trained, and who in turn trained the AI application, to control an avatar in a computer game. The communication took place through a soft helmet (see photo) capturing the electrical activity, with the eyes providing the feedback to the brain. The two participated in the Cybathlon competition organised in Zurich in 2016 (the next one will be in 2020), dedicated to demonstrating progress in human-machine cooperation in the area of disabilities, a sort of Olympics where the competing teams are made up of humans and prosthetics. It is the first time I have seen the cooperation between a brain and a computer (an AI application) discussed in terms of symbiosis.
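The mutual-learning idea can be sketched as a toy closed loop. Everything here is my own assumption, not the EPFL team's method: a one-dimensional signal stands in for the brain activity, the decoder re-calibrates a decision threshold from each session's trials, and the simulated user, guided by feedback, produces a progressively cleaner signal. Accuracy climbs only because both sides adapt.

```python
import random

random.seed(1)

# Purely illustrative co-adaptation loop. Each session, the simulated user
# produces a noisy 1-D signal for one of two intents (means 0.0 and 1.0)
# and the decoder thresholds it. After the session the decoder re-calibrates
# its threshold to the midpoint of the observed class means, while the user
# (assumed to improve with feedback) produces a less noisy signal.
def run_session(threshold, noise, n_trials=200):
    trials, correct = [], 0
    for _ in range(n_trials):
        intent = random.choice([0, 1])
        signal = intent + random.gauss(0, noise)
        correct += (1 if signal > threshold else 0) == intent
        trials.append((signal, intent))
    m0 = [s for s, i in trials if i == 0]
    m1 = [s for s, i in trials if i == 1]
    new_threshold = (sum(m0) / len(m0) + sum(m1) / len(m1)) / 2
    return correct / n_trials, new_threshold

threshold, noise = -0.8, 1.5          # badly calibrated decoder, noisy user
history = []
for session in range(5):
    accuracy, threshold = run_session(threshold, noise)   # decoder adapts
    noise = max(0.4, noise * 0.7)                         # user adapts
    history.append(accuracy)
    print(f"session {session}: accuracy {accuracy:.2f}")
```

The design choice mirrors the symbiosis described above: fixing only the threshold, or only the noise, leaves performance capped; the gains come from the two learners improving together.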

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node, and then was head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the New Initiative Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.