
Going beyond impossible: 2nm chip technology

2 nm technology as seen using electron microscopy. 2 nm is smaller than the width of a single strand of human DNA. Image credit: IBM.

Today’s most advanced chip technology is based on a 5 nm (5 billionths of a metre) process. For comparison, the first Intel microprocessor, the 4004, used a 10 µm technology, 2,000 times “thicker”! Going beyond that raises so many issues that it has long been assumed to be “close to impossible”. Going below 4 nm brings us into the quantum domain, where probability takes the upper hand (an electron can be here, but can also be there…) and determinism is impacted (you don’t want 2+2 = 4 most of the time; you expect it to be 4 all of the time, don’t you?).
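As a quick back-of-the-envelope check of that scale difference:

$$\frac{10\ \mu\text{m}}{5\ \text{nm}} = \frac{10{,}000\ \text{nm}}{5\ \text{nm}} = 2{,}000$$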

Yet IBM has announced the development of a new technology that pushes the envelope of technical feasibility down to the 2 nm level. The chip architecture provides workarounds for quantum probabilistic effects (we have been using ways to control random errors for decades, including those due to the malfunction of a single transistor).
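IBM has not detailed its error-handling approach, but one decades-old illustration of the general principle is triple modular redundancy (TMR): run an unreliable unit three times and take the majority vote, so a single fault is masked. The sketch below is purely illustrative; the noisy adder stands in for a transistor-level fault and is not IBM’s design.

```python
import random

def noisy_add(a, b, error_rate=0.01):
    """A hypothetical adder that occasionally flips a random bit of its
    result, standing in for a component hit by quantum-scale randomness."""
    result = a + b
    if random.random() < error_rate:
        result ^= 1 << random.randrange(8)  # corrupt one low-order bit
    return result

def tmr_add(a, b):
    """Triple modular redundancy: compute three times, take the majority.
    A single faulty computation is masked by the two correct ones."""
    votes = [noisy_add(a, b) for _ in range(3)]
    return max(set(votes), key=votes.count)

# 2 + 2 now comes out as 4 essentially all of the time, even though
# each individual addition can occasionally be wrong.
print(tmr_add(2, 2))
```

With a 1% per-operation error rate, the voted result is wrong only when at least two of the three copies fail, which happens on the order of 0.03% of the time rather than 1%.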

According to the IBM statement, this kind of chip would increase performance by 45% and decrease power consumption by 75%. That could mean a fourfold improvement in your smartphone’s battery life, a sharp reduction in the CO2 emissions related to data centres (data centres currently consume about 1% of the world’s electrical power) and faster processing, with an impact on image recognition (read: autonomous cars).
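The fourfold battery figure follows directly from the power number: for the same workload, battery life scales roughly with the inverse of power draw, so

$$\text{battery life multiplier} \approx \frac{1}{1 - 0.75} = \frac{1}{0.25} = 4$$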

At this size it would be possible to pack 50 billion transistors onto a fingernail-sized chip. For comparison, the most recent Apple chip, the M1, packs 16 billion transistors.
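Assuming “fingernail size” means a die of roughly 150 mm² (my assumption here, for the sake of the arithmetic), the implied density is

$$\frac{50 \times 10^9\ \text{transistors}}{150\ \text{mm}^2} \approx 333\ \text{million transistors per mm}^2$$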

However, it is not just a technical issue. There is an economic issue as well: as we shrank transistors, we managed to decrease the cost per transistor (this is part of Moore’s law). Since 2015 this has no longer been the case: we kept shrinking the transistor, but the cost per transistor stopped declining; in fact, since 2017 it has started to increase, and that trend continues today. So, yes, we can build more powerful chips, but their cost is no longer decreasing.
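A simplified model shows why cost per transistor can rise even as transistors shrink (a back-of-the-envelope view, not industry accounting):

$$\text{cost per transistor} = \frac{\text{cost per wafer}}{\text{yield} \times \text{transistors per wafer}}$$

Each new node packs more transistors onto a wafer, but it also raises the cost per wafer (more lithography steps, new tooling) and, at least initially, lowers yield; when the denominator stops growing faster than the numerator, the cost per transistor stops falling.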

Hence some of the factors that have been pushing the economy and shaping market expectations are no longer valid.

At the same time, what we are starting to see is that a new kind of Moore’s law is rising: one of increasing intelligence based on data availability, and this is likely to dominate the landscape of the next ten years.

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node, and he was then head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of the IEEE, where he leads the New Initiatives Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master’s course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.