
60TB seems huge, till you see 330TB

A 3D NAND wafer under the microscope, showing the memory cells of an SSD, like those used by Seagate for their 60TB storage. Credit: Micron

Last year, in 2016, Seagate presented a new SSD with a huge 60TB capacity. As of August 2017 it is not on the market yet, and analysts expect a price tag of around $40,000, not exactly for my purse. That works out to roughly $0.66 per GB. Compare this with the $0.03 per GB you pay for hard drive storage, or the $0.30 per GB for a mass-market SSD, and you see it will be a product for a few very demanding use cases.
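The cost comparison above is simple to verify. A minimal sketch, using the analyst price estimate quoted in the text (not an official list price) and decimal units (60 TB ≈ 60,000 GB):

```python
# Back-of-envelope cost-per-GB comparison from the figures in the text.

def cost_per_gb(price_usd, capacity_gb):
    """Return storage cost in dollars per GB."""
    return price_usd / capacity_gb

ssd_60tb = cost_per_gb(40_000, 60_000)   # 60 TB ~ 60,000 GB (decimal)
print(f"60TB SSD: ${ssd_60tb:.2f}/GB")
print(f"vs HDD at $0.03/GB: {ssd_60tb / 0.03:.0f}x more expensive")
print(f"vs mass-market SSD at $0.30/GB: {ssd_60tb / 0.30:.1f}x more expensive")
```

Depending on rounding and on whether you use decimal or binary terabytes, the figure lands between $0.65 and $0.67 per GB, some 20 times the cost of a hard drive.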

So, why would this sort of capacity be needed? Well, according to Seagate, by 2020 there will be some 1,300 EB (1.3 million Petabytes) of installed storage in Clouds all over the world, 1,800 EB of installed storage in data centres, 10 EB captured by drones (resulting from 560 million photos and videos captured by drones, generating a $127 billion market worldwide) and 393 EB of storage shipped to the mass market. In addition, they expect 600 ZB (600 million Petabytes) to be generated by IoT, not necessarily needing permanent storage…
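With Exabytes and Zettabytes flying around, it is easy to lose track of the magnitudes. A quick sanity check of the unit conversions in the figures above, using decimal SI prefixes (1 EB = 1,000 PB; 1 ZB = 1,000,000 PB):

```python
# Sanity check of the storage-unit conversions quoted in the text.
PB_PER_EB = 1_000
PB_PER_ZB = 1_000_000

cloud_eb = 1_300   # installed Cloud storage forecast, EB
iot_zb = 600       # data generated by IoT forecast, ZB

print(f"{cloud_eb} EB = {cloud_eb * PB_PER_EB / 1e6:.1f} million PB")
print(f"{iot_zb} ZB = {iot_zb * PB_PER_ZB / 1e6:.0f} million PB")
```

Notice the gap: the IoT figure is data *generated*, not stored, and it dwarfs the installed storage by more than two orders of magnitude, which is why most of it cannot be kept permanently.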

The mag tape developed by IBM and Sony increases storage density, leading to cartridges that can store up to 330TB. The feat was achieved by using vapour deposition of magnetic nanoparticles, rather than the liquid deposition used today. Credit: IBM

As I was marvelling at the rapid evolution of SSD storage, I read the news of IBM and Sony developing an advanced mag tape with a storage density of 25GB per square inch. When you do the math, it means that a single tape cartridge can store 330TB of data at a fraction of the cost of the SSD. Obviously there is a trade-off: you gain capacity but you lose speed!
SSD will remain king in applications where data needs to be accessed randomly, like when you need to run data analytics on huge amounts of data, whilst tape cartridges will do perfectly when you just need to preserve data for archival reasons.
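Doing that math explicitly: at 25 GB per square inch, 330 TB implies a certain area of tape, and from there a tape length. The half-inch width below is my assumption (typical of LTO-style tape), not a figure from IBM or Sony:

```python
# Rough estimate of the tape area and length implied by a 330 TB
# cartridge at 25 GB per square inch.
DENSITY_GB_PER_SQ_IN = 25
CARTRIDGE_TB = 330
TAPE_WIDTH_IN = 0.5          # assumed half-inch tape width

area_sq_in = CARTRIDGE_TB * 1_000 / DENSITY_GB_PER_SQ_IN   # TB -> GB, then area
length_m = area_sq_in / TAPE_WIDTH_IN * 0.0254             # inches -> metres
print(f"Tape area:   {area_sq_in:,.0f} sq in")
print(f"Tape length: ~{length_m:,.0f} m at {TAPE_WIDTH_IN} in width")
```

Roughly 13,200 square inches, or several hundred metres of tape, which is in the same ballpark as the tape lengths wound into today's cartridges — the density, not the amount of tape, is what changed.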

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node and then was head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the New Initiative Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.