
Designing chip layout with AI

Magnified image of a microchip. Notice the many pathways connecting the various components. As their number keeps growing (it is now in the billions), the connectivity problem exceeds our human capability. Image credit: Tom Narwid, Getty Images

The complexity of designing microchips keeps growing. Today's chips embed more than a billion transistors, and they must be connected with one another in very specific ways to produce the circuits you need. Let's be clear: for a long time now, designing chips without the support of a computer has simply been impossible. Even with "only" a million transistors on a chip, the feat would be beyond our capabilities.

Chips are designed with software packages that provide libraries of circuits and tools that create the connections to be etched onto the wafer.

These software packages are becoming more and more sophisticated as complexity keeps growing. In the last few years, for example, chip design has moved from a 2D structure (laying out components on a plane) to a 3D structure. This has both simplified and complicated the routing of connections: you have to decide where to place the various components while also taking other parameters into account, such as keeping the components that interact most often close to one another, which increases transfer speed and decreases power consumption.
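The placement trade-off described above can be sketched with a toy cost function: weight the distance between each pair of components by how much they communicate, so that "chatty" pairs end up close together. This is only an illustration of the idea, not any vendor's actual EDA algorithm; the component names and traffic numbers are hypothetical.

```python
from itertools import combinations

def manhattan(a, b):
    """Manhattan (wire-grid) distance between two positions (x, y)."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def placement_cost(positions, traffic):
    """Sum of (traffic volume x wire distance) over all component pairs.

    positions: dict component -> (x, y) grid coordinate
    traffic:   dict frozenset({a, b}) -> messages per cycle (assumed data)
    Lower cost means faster transfers and lower power, all else equal.
    """
    cost = 0.0
    for a, b in combinations(positions, 2):
        weight = traffic.get(frozenset({a, b}), 0.0)
        cost += weight * manhattan(positions[a], positions[b])
    return cost

# Two candidate placements of the same three blocks:
traffic = {frozenset({"cpu", "cache"}): 10.0,  # talk constantly
           frozenset({"cpu", "io"}): 1.0}      # talk rarely

near = {"cpu": (0, 0), "cache": (0, 1), "io": (3, 3)}
far  = {"cpu": (0, 0), "cache": (3, 3), "io": (0, 1)}

print(placement_cost(near, traffic))  # 16.0 -- chatty pair adjacent
print(placement_cost(far, traffic))   # 61.0 -- chatty pair far apart
```

A placement tool, in essence, searches the space of positions for a layout that minimizes a (far more elaborate) cost of this kind.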

Balancing all these factors is more an art than engineering. It takes a lot of experience and a "knack" for design. Full automation in chip design has so far remained an elusive goal.

Lately all the big players, Apple, Google, IBM, NVIDIA, …, have started exploring the possibility of using reinforcement learning (an area of AI) to automate chip design.

Interestingly, the AI designs several chip layout alternatives and then tries them out, one against the other, through simulation, to see which one provides the best, most desirable mix of characteristics (higher yield, lower cost, higher performance, lower power consumption…) for a specific task (a chip for image processing needs number-crunching capabilities that may be quite different from those of a chip that searches through huge volumes of data).
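The loop just described, propose layouts, evaluate each in simulation, keep the one with the best weighted mix of characteristics, can be sketched as a simple random search. (The production systems the article refers to use reinforcement learning; everything below is a hypothetical stand-in, including the mock simulator, the metric names, and the weights.)

```python
import random

def simulate(layout):
    """Stand-in for a real simulator: derive mock metrics from a layout.

    In reality these numbers would come from timing and power analysis
    of the candidate layout; here they are invented proxies.
    """
    spread = max(layout) - min(layout)
    return {
        "performance": 1.0 / (1.0 + spread),  # tighter layout -> faster
        "power": 0.1 * sum(layout),           # mock power proxy
    }

def score(metrics, weights):
    """Weighted mix of characteristics (higher is better)."""
    return sum(weights[k] * v for k, v in metrics.items())

def search(num_candidates, weights, rng):
    """Propose layouts, simulate each, return the best-scoring one."""
    best, best_score = None, float("-inf")
    for _ in range(num_candidates):
        layout = [rng.uniform(0, 10) for _ in range(4)]  # mock placement
        s = score(simulate(layout), weights)
        if s > best_score:
            best, best_score = layout, s
    return best, best_score

# Task-specific trade-off: reward performance, penalize power draw.
weights = {"performance": 1.0, "power": -1.0}
best, s = search(200, weights, random.Random(0))
print(round(s, 3))
```

Changing the weights changes which layout wins, which is how the same search can be steered toward an image-processing chip or a data-search chip.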

Some of the chips being designed this way are targeted at supporting AI, so we have a recursive approach: using AI to support AI. We are just starting, but the expectation is that in the next decade AI will be designing chips and, by doing so, will keep learning, to the point that AI chips will be generating better-performing offspring.

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node, and he was then head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the New Initiative Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.