Interfacing with our senses, skipping the sensors
In the previous post I shared my doubts about the feasibility of establishing a direct connection from a computer to the brain, at least in the foreseeable future, and pointed out that a much more promising approach, one already being pursued, is to take over the nervous pathways connecting our senses to the brain.
Of course, all computer-to-"us" communication today is mediated by our senses: the computer can display information that our eyes pick up, or produce sounds that our ears detect. Nothing new there, and although it is an area of ongoing, intense research, I am not going to discuss it here (I touched upon it in the first posts of this series).
Rather I am going to focus on two aspects of computer to brain interaction mediated by senses:
– sense augmentation
– nervous pathway hijacking
Before diving into these areas, it is important to understand that, from the point of view of the brain, the nervous pathways bringing data to the brain are basically equivalent. This may be surprising both at an intuitive level and at a structural level. At an intuitive level we know very well that hearing is completely different from seeing; taste and smell are two different things (although they are very much connected: if you have a cold, and a runny nose, you lose the sense of smell and along with it a good portion of taste sensations); touch is very different from the other senses, and so on. At the same time, anatomists tell us that the sensory pathways end up in different places in the brain and activate different neural structures.

Actually, this is a macro view of the pathways' terminations. If we take a finer view, as researchers are doing in projects like the Human Connectome (watch the clip showing the connections inside the brain, the white matter), we discover that those terminations actually go almost everywhere.
For the brain, a neuron spike is like any other neuron spike, and the signals flowing, chemically and electrically, along the sensory pathways produce spikes in thousands, even millions, of neurons. It is the whole activity, parallel and sequential, of activation (and repression) of neurons that generates our perception of the world: what we call seeing, hearing, smelling…
We have evidence of this basic equivalence of sensory pathways by looking at what happens when something unusual occurs, such as a malfunction of one pathway. What we see is that, over time, the other pathways (connected to other senses) are recruited by the brain to make up for the malfunctioning one. This cross-modal plasticity is related to the phenomenon of synesthesia: a person can start seeing colours by hearing sounds, or tasting things by touch.
A well-known example is that of Neil Harbisson. He was born colour blind and managed to have an implant fitted in which colours picked up by a digital camera are translated by a computer into auditory stimulation. After a while, Neil started to "see" (probably better to say "perceive") colours with his ears.
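The core idea behind such a device is simple: pick one measurable property of the visual input (the hue of the dominant colour) and map it onto a property the auditory pathway already carries (the frequency of a tone). The sketch below illustrates that mapping in Python; the frequency band and the linear mapping are arbitrary choices made for illustration, not the actual scale used by Harbisson's implant or any real device.

```python
import colorsys

def hue_to_tone(r, g, b, f_min=120.0, f_max=1000.0):
    """Map an RGB colour to an audible frequency in Hz.

    Illustrative only: the hue (0..1) is mapped linearly onto the
    band [f_min, f_max]. The band limits are arbitrary assumptions,
    not values taken from any actual sensory-substitution device.
    """
    # Extract the hue component, ignoring lightness and saturation.
    hue, _, _ = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    return f_min + hue * (f_max - f_min)

# Pure red has hue 0, so it lands at the bottom of the band:
print(hue_to_tone(255, 0, 0))  # 120.0
```

A real device would of course do much more (average over a camera frame, handle brightness and saturation, synthesize the tone), but the essential trick is just this kind of translation from one sensory dimension to another.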
This is interesting for our discussion because it means that we could, at least in principle, use existing sensory pathways to feed the brain data that are not harvested by our senses at all. I'll look at this in the next post.