Think about Moore’s Law, the observation made in 1965 by Intel co-founder Gordon Moore that the number of transistors in an integrated circuit doubles roughly every two years. This exponential relationship has long served as a proxy for the growth of processing power, helping computer scientists gauge both the scope and the limits of their ambitions. Moore’s Law has begun to break down, though, and it is breaking down as a result of the very technological innovation it seeks to predict.
With the arrival of AI and alternative processors, transistor count simply isn’t a useful representation of processing power anymore. Chips have become smaller and smaller, have moved from 2D to 3D, and employ increasingly sophisticated and specialized materials in their construction, but traditional central processing units (CPUs) are no longer the front line of technological innovation. A recent study by SiliconANGLE showed that a strict reading of Moore’s Law, which would require transistor counts to grow at an annual rate of roughly 40%, had slowed to below 30% by 2020. And yet processing power, taking into account the combination of traditional CPUs with AI and alternative processors, is growing at more than 100% each year. Everywhere we look, yesterday’s rules are being rewritten by the rapid advance of technology.
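The growth rates above follow from simple compound-growth arithmetic: doubling every two years corresponds to an annual rate of about 41% (close to the ~40% figure cited), and a slower annual rate stretches out the doubling period. A minimal sketch of that calculation, purely illustrative:

```python
import math

# Annual growth rate implied by "transistor count doubles every two years":
# (1 + r)^2 = 2  =>  r = 2^(1/2) - 1
annual_rate = 2 ** (1 / 2) - 1  # ~0.414, i.e. roughly 41% per year

def doubling_period_years(rate):
    """Years needed to double at a given compound annual growth rate."""
    return math.log(2) / math.log(1 + rate)

# The strict Moore's Law pace doubles in about two years;
# the ~30% annual growth observed by 2020 takes noticeably longer.
print(f"{annual_rate:.1%} annual growth doubles in "
      f"{doubling_period_years(annual_rate):.2f} years")
print(f"30.0% annual growth doubles in "
      f"{doubling_period_years(0.30):.2f} years")
```

At 30% per year, doubling takes roughly 2.6 years instead of two, which is why the strict form of the law no longer holds even as aggregate processing power accelerates.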