Brain Computer Interfaces work at two levels, hard and soft. Each one fuels the other, and both are needed to establish effective communication. The hard part captures the electrical activity of neurones; the soft part interprets this activity to derive meaning.
Getting more accurate and selective data on this electrical activity is becoming possible as technology evolves: sensors and probes have in some cases made it possible to detect the electrical activity of a single neurone. To achieve this kind of sensitivity, however, the probes have to be implanted in the brain. The same applies in the other direction, influencing the brain at the single-neurone level, where optogenetics has provided the tools to achieve this specificity. Obviously, invasive procedures, like the ones required to implant probes in the brain, are not on the wish list of most people!
In addition, reading the signals from (or activating) a single neurone is a drop in the ocean, given the roughly 100 billion neurones in our brain. Technology now allows the simultaneous detection of several neurones, even a thousand of them with the latest advances, but … that is just a bucket in the ocean.
Hence the need for the soft part. Using software, and technologies like machine learning, researchers can extract meaning from the electrical activity generated by thousands, even millions, of neurones. This in turn makes it possible to use non-invasive electrical detection, placing arrays of electrodes on the scalp rather than implanting them in the brain.
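To make the idea concrete, here is a minimal sketch of how software can pull meaning out of aggregate electrical activity. Everything in it is hypothetical: the data is synthetic, and the nearest-centroid classifier stands in for the far richer pipelines real BCI research uses. The principle, however, is the same: learn a statistical mapping from scalp-level signal features to an intended command.

```python
# Illustrative sketch (synthetic data, not a real BCI pipeline):
# decoding a binary "intent" (e.g. steer left / steer right) from
# per-electrode EEG band-power features with a nearest-centroid rule.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training trials: 100 trials x 8 scalp electrodes of
# band power. The two intent classes differ slightly in mean level.
left = rng.normal(loc=0.0, scale=1.0, size=(100, 8))
right = rng.normal(loc=0.8, scale=1.0, size=(100, 8))

# "Training" here is just storing each class's mean feature vector.
centroids = np.stack([left.mean(axis=0), right.mean(axis=0)])

def decode(trial):
    """Classify one trial as 0 (left) or 1 (right) by nearest centroid."""
    dists = np.linalg.norm(centroids - trial, axis=1)
    return int(np.argmin(dists))

# Decode a new, unseen trial drawn from the "right" distribution.
new_trial = rng.normal(loc=0.8, scale=1.0, size=8)
prediction = decode(new_trial)
```

No single electrode (let alone a single neurone) carries the answer; the classifier recovers the intent only from the combined pattern across all channels, which is why non-invasive scalp recordings can work at all.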
In an article appearing in PLOS Biology, a team of EPFL researchers describes a new approach to Brain Computer Interfaces based on a symbiosis between the AI application and the brain, in which each one teaches the other. The research involved two tetraplegic persons who were trained, and who in turn trained the AI application, to control an avatar in a computer game. The communication took place through a soft helmet (see photo) capturing the electrical activity, with the eyes providing feedback to the brain. They participated in the Cybathlon competition organised in Zurich in 2016 (the next one will be in 2020), dedicated to demonstrating progress in human-machine cooperation in the area of disabilities: a sort of Olympics where the competing teams are made up of humans and prosthetics. This is the first time I have seen the cooperation between a brain and a computer (an AI application) discussed in terms of symbiosis.