As technology progresses, the world of bits and the world of atoms fade into a single reality. However, this reality will depend on the mediating devices, both hardware and software, that are used to create it. The rooting of reality in the devices used to capture it is nothing unusual, although we seldom think about it.
Every day we use devices to connect our brain, hence our perception, to the external world: our senses. What I see is slightly different from what you see because my eyes are not like yours (I am actually half blind, so I get most information from just one eye, which limits my perception of depth; I also get different colour information from each of my eyes, and this too affects my overall perception). It is not just about me. Every one of us has slightly different eyesight, resulting in a slightly different perception of the world.
What about the eyes of a bee? They are much better at the higher frequencies of the spectrum (shorter wavelengths), to the point that a bee can see ultraviolet colours that we cannot. A snake is at the opposite end, perceiving longer wavelengths (infrared) and again getting a different view of the world. A shark perceives electric fields (as if it had a cellphone embedded in its head!) and thus gets yet another view of the world.
Is the perception of a bee, a snake, or a shark any less real than ours? No, they are all equally real, and they all work pretty well in terms of interacting with the world "out there".
I am making this point to clarify that the role of devices (in the animal kingdom we call them senses) is crucial in connecting with the world, and that our perception of the world is tied to the capabilities of those devices.
The eyes, per se, just gather data at certain light wavelengths; the retina performs some processing of those data, and much more processing takes place in various parts of the brain (most of it, but not all, in the occipital lobe). You may call this the software processing part. The same is true for the sensors we can use to pick up data from the world out there. A good portion of the "meaning" of the data captured relies on signal processing and software. The evolution we are seeing is both in the hardware (sensors) and in the software.
If I wear glasses that can convert the longer wavelengths of the spectrum into shorter ones, I can perceive the ambient temperature as red/yellow halos around objects. Is this view of the world less real, or is it actually more real than the one provided by my unaided senses?
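To make the idea concrete, here is a toy sketch (entirely my own illustration, not anything from the FTI report) of the kind of false-colour mapping such glasses might apply: an invisible long-wavelength reading, here simplified to a temperature value, is remapped into a visible red/yellow hue. The temperature thresholds and colour curve are arbitrary assumptions chosen just for the example.

```python
def infrared_to_visible(temperature_c, t_min=15.0, t_max=45.0):
    """Map a temperature reading to an (R, G, B) halo colour.

    Cooler readings fade toward dark; warmer ones toward red/yellow.
    The t_min/t_max thresholds are arbitrary choices for this sketch.
    """
    # Normalise the reading to the 0..1 range and clamp it
    t = max(0.0, min(1.0, (temperature_c - t_min) / (t_max - t_min)))
    red = int(255 * t)        # hotter reading -> stronger red
    green = int(180 * t * t)  # yellow tint appears only at the hot end
    return (red, green, 0)

for reading in (18.0, 30.0, 44.0):
    print(reading, infrared_to_visible(reading))
```

The point of the sketch is that the "meaning" of the halo (its colour) is entirely a software decision layered on top of the raw sensor data, just as the text argues.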
What the FTI’s report foresees is that in this decade we are going to experience tremendous progress in the hardware and software connecting us to the world, augmenting our senses and transforming our perception of the world.
I used the “eyes” as an example, but it is also about hearing, touching, “feeling”, smelling… And it is not just about the world out there; it is also about the perception of cyberspace, of bits. This, indeed, is something completely new: we are creating bits; they do not exist in the world (although in some cases they may represent some physical-world characteristics). This second aspect, the perception of bits, creates a brand-new problem.
If, I guess, we can agree that the world perceived by a bee, although perceived in different ways that lead to a different interpretation, is as real as the one we perceive, it is much more debatable whether a world perception resulting from the mixing of bits and atoms is real. Probably many of us would say that such a world would be a distortion of the real world or, to put it bluntly, that it is not a real world at all.
Yet, what the FTI’s report claims, and this is indeed a big claim, is that this decade will see the rise of a New Reality.
More to follow.