Back in 2015 (an age ago) an article by the World Economic Forum laid out the measuring sticks for data quantity. It started with the kB (1,000 bytes) and moved up through the ExaByte (a billion GB), the ZettaByte (1,000 EB) and the YottaByte (1,000 ZB, or 1 million EB). At that time, EB were nowhere to be seen, not to mention YB, but it was clear that it was just a matter of time. The expectation was that the volume of data produced, and the amount of data that makes up the web, would hit that scale.
Indeed today we have reached that scale:
- the Square Kilometre Array Telescope generates 1 EB of data per day! If you are not familiar with the SKA, take a look at the video clip.
- in 2020, an estimated 50 ZB of data will be created
- the Digital Universe today (which goes far beyond what is accessible through a browser) is estimated to have reached 1 YB.
In that article, probably just for fun, they mentioned even higher orders of magnitude, the BrontoByte and the GeopByte, the former representing 1,000 YB and the latter 1 million YB.
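The ladder of prefixes above can be written down and sanity-checked in a few lines. Note that BrontoByte and GeopByte are informal names rather than official SI prefixes; the abbreviations BB and GeB used below are an assumption for illustration.

```python
# Decimal byte prefixes, each step a factor of 1,000.
# BB (BrontoByte) and GeB (GeopByte) are informal, not SI-standard.
PREFIXES = {
    "kB": 10**3,
    "MB": 10**6,
    "GB": 10**9,
    "TB": 10**12,
    "PB": 10**15,
    "EB": 10**18,
    "ZB": 10**21,
    "YB": 10**24,
    "BB": 10**27,   # BrontoByte: 1,000 YB
    "GeB": 10**30,  # GeopByte: 1,000 BB, i.e. 1 million YB
}

# Sanity checks on the relationships quoted in the text.
assert PREFIXES["EB"] == 10**9 * PREFIXES["GB"]   # EB = a billion GB
assert PREFIXES["ZB"] == 1_000 * PREFIXES["EB"]   # ZB = 1,000 EB
assert PREFIXES["YB"] == 10**6 * PREFIXES["EB"]   # YB = 1 million EB
assert PREFIXES["BB"] == 1_000 * PREFIXES["YB"]   # BB = 1,000 YB
assert PREFIXES["GeB"] == 10**6 * PREFIXES["YB"]  # GeB = 1 million YB
```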
There is nothing in this range today; however, we are starting to see the possibility that by the end of this decade we will be using the BrontoByte to size the Digital Universe. Assuming a doubling in size every year, we could indeed reach 1 BrontoByte by 2030, and some people are actually foreseeing an acceleration (today's web growth is on the order of a doubling every two years, but much more data is produced than ends up on the web). For interesting estimates of the quantity of data, take a look at:
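The 2030 projection follows from a simple doubling argument. A minimal sketch, assuming (as the text does) a starting size of 1 YB and a doubling every year:

```python
import math

# Growth projection: how many annual doublings does it take to go
# from 1 YB to 1 BrontoByte (1,000 YB)? The start size and doubling
# rate are the article's assumptions, not measurements.
START = 10**24          # 1 YB, the estimated Digital Universe today
TARGET = 1_000 * START  # 1 BrontoByte

years = math.ceil(math.log2(TARGET / START))  # log2(1000) ≈ 9.97
print(years)  # → 10, i.e. roughly 2030 if we start counting now
```

With a doubling every two years instead (the web-growth rate mentioned above), the same target would take about twice as long, which is why the projection hinges on that acceleration.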