
Uhm, is this a photo?

The first “photo” ever of a black hole was released on April 10th, but is it really a “photo”? Image credit: NSF

There was lots of excitement on April 10th as a team of scientists released the first ever photo of a black hole. The news had leaked through the media beforehand, but one thing was still unexpected: it was not the black hole at the centre of our own Galaxy, but one in the galaxy M87, much farther away from us yet bigger and less clouded by the gases that make observation of the inner core of our own galaxy more problematic.

As excited as millions of other people, while reading how this snapshot had been taken I started to wonder: is this really a photo?

The various radio telescope locations that transform planet Earth into a giant sensor. Graph credit: ScienceNews

The image of the black hole, which, being really black (no light emission at all), cannot be seen nor photographed as such, is the result of an amazing feat that created a virtual sensor by aggregating and analysing data coming from 8 sites (each containing a number of radio telescopes), see the graphic above. If you read the article on ScienceNews you may notice that they call it a “picture” of a black hole, and this is a much more accurate way of describing it.

Definitely, it is not a photo but a rendering of a massive amount of data. The data were captured by the Event Horizon Telescope, EHT, and processed by some 200 photographers (researchers, actually!). The “photo” was exposed over a 10-day campaign between the end of March and early April 2017, with the eight sites synchronised with one another using atomic clocks. Exposure at each site generated data at some 64 Gbps, around 350 TB a day per site. The data volume was so big that it was more convenient, and faster, to use snail mail (airplane mail service) to bring the disks to the MIT Haystack Observatory in Boston, US, and to the Max Planck Institute for Radio Astronomy in Bonn, Germany. And if you think that 10 days is a very long exposure, consider the 2 years it took to … develop the image!

As a matter of fact, all the data harvested, using detection at the 1.3 mm wavelength, had to be processed, and decisions had to be taken on how to represent them. (Just for curiosity: a 1.3 mm wavelength corresponds to a frequency of 230 GHz, roughly 3 times the highest frequency of 5G and within the frequency range foreseen for 6G. Maybe we will be able to observe black holes with our smartphones in 20 years’ time; just joking, but an intriguing perspective!)
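Just to check the arithmetic behind those figures, here is a tiny back-of-the-envelope sketch in Python. The “roughly 12 hours of recording per day” at the end is my own inference from the quoted numbers, not a figure reported by the EHT.

```python
# Back-of-the-envelope checks of the figures quoted above.
C = 299_792_458               # speed of light, m/s

wavelength = 1.3e-3           # 1.3 mm, in metres
frequency = C / wavelength
print(f"Observing frequency: {frequency / 1e9:.0f} GHz")      # ~230 GHz

rate_bps = 64e9               # quoted recording rate per site, 64 Gbps
daily_volume_bytes = 350e12   # quoted ~350 TB per site per day
seconds_recording = daily_volume_bytes * 8 / rate_bps
print(f"Implied recording time per day: {seconds_recording / 3600:.1f} h")  # ~12 h
```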

The representation of the black hole itself was a no-brainer. We call it black, so we expect to see it “black”! Actually, the rendered image shows in black the shadow of the black hole, not the black hole itself. According to theory (including Einstein’s general relativity), a black hole distorts space-time in and around it, and the distortion around it should create a specific emission of energy. That was what the EHT was looking for, and that was what the data, once analysed, showed. So the point was how to render the energy levels detected, and the decision was to use a bright yellow for the higher energy levels, smoothly shifting to red and then black as the detected energy decreases. This is what we see in the picture: a rendering created out of the data harvested from the radio telescopes. An equally valid representation could have used any other colours (or hues), so we could have had an image showing a greenish-bluish torus. I guess our colour perception makes a yellowish-reddish halo more in line with our expectations.
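To give an idea of how arbitrary that colour choice is, here is a minimal Python sketch, using a toy ring of intensities rather than the real EHT data: the very same array can be rendered with a yellow-red-black colormap or with an equally valid bluish-greenish one.

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy intensity map: a bright ring around a dark centre, standing in for
# the emission detected around the black hole's shadow (illustrative only).
size = 256
y, x = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
r = np.hypot(x, y)
intensity = np.exp(-((r - 0.5) ** 2) / 0.02)   # ring of emission at r ~ 0.5

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
axes[0].imshow(intensity, cmap="afmhot")       # yellow -> red -> black
axes[0].set_title("hot rendering")
axes[1].imshow(intensity, cmap="winter")       # a bluish-greenish alternative
axes[1].set_title("alternative rendering")
for ax in axes:
    ax.axis("off")
plt.show()
```

Same data, two pictures: the choice between them is a matter of perception and communication, not of physics.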

Of course, it is much easier to communicate the news by saying “this is the photo of a black hole”!

On the other hand, we are basically using the same approach for the photos we take every day with our smartphones. Strictly speaking they are not “photos” either, but rather the result of computational analyses of data gathered by the sensor, sometimes multiple batches of data taken by more than one sensor (which happens if your smartphone has 2 or more lenses) and across several subsequent frames.
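As a minimal sketch of that multi-frame idea (synthetic data, not the pipeline of any specific phone), averaging several noisy exposures of the same scene already gives a cleaner “photo” than any single frame:

```python
import numpy as np

rng = np.random.default_rng(0)

# A synthetic "true" scene and several noisy short exposures of it,
# standing in for the burst of frames a phone camera captures.
scene = np.linspace(0.0, 1.0, 64).reshape(8, 8).repeat(8, axis=0).repeat(8, axis=1)
frames = [scene + rng.normal(scale=0.2, size=scene.shape) for _ in range(8)]

# The "photo" is computed, not captured: here simply the per-pixel mean
# of the aligned frames, which cuts the noise by roughly sqrt(8).
stacked = np.mean(frames, axis=0)

print("noise of a single frame:", np.std(frames[0] - scene).round(3))
print("noise after stacking:   ", np.std(stacked - scene).round(3))
```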

We call a photo what we expect to see, what would make sense to our brain. In this sense, the image released by the EHT is a photo!

This also shows how we are losing the perception of the difference between the real and the virtual. We transform reality into digits, and then we work on those digits, creating a virtual model that is then used to give us back a sense of reality. That is the Digital Transformation. We have come to use the virtual world as the way to perceive and understand reality.

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node and then was head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the Industry Advisory Board within the Future Directions Committee and co-chairs the Digital Reality Initiative. He teaches a Master course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.

