With the growth of AI, hardware is fashionable again after years in which software was the centre of attraction. The statistics bear out the trend: McKinsey estimates that hardware, such as head nodes, inference accelerators, and training accelerators, will account for 40-50% of the total value captured by AI vendors. Moreover, as the scale of chip components gets closer and closer to that of individual atoms, it has become impossible to keep up the pace of Gordon Moore's prediction for the semiconductor industry: doubling the number of transistors, and thus the processing power, of a given chip every two years is now more expensive and technically difficult. With Moore's law faltering, continued improvements in AI training and inference will have to come from innovations in the hardware itself.
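The doubling described by Moore's law can be illustrated with a minimal sketch; the baseline figures below are hypothetical, chosen only to show the arithmetic of doubling every two years.

```python
def projected_transistors(base_count: int, base_year: int, year: int) -> int:
    """Project a chip's transistor count under Moore's law:
    a doubling roughly every two years from a given baseline."""
    doublings = (year - base_year) / 2
    return int(base_count * 2 ** doublings)

# Example with assumed numbers: a hypothetical 1-billion-transistor
# chip in 2010 would be projected to reach 32 billion by 2020
# (five doublings over ten years).
print(projected_transistors(1_000_000_000, 2010, 2020))
```

Sustaining this curve is exactly what has become so costly as feature sizes approach atomic scale.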
While components such as Central Processing Units (CPUs) and Graphics Processing Units (GPUs) have become part of common parlance, further innovations are under way in the field. Here we survey the most valuable innovations in AI hardware.