Amazon.com on Wednesday launched a microchip aimed at so-called machine learning, entering a market that both Intel Corp and Nvidia Corp are counting on to boost their earnings in the coming years.
Amazon is one of the largest buyers of chips from Intel and Nvidia, whose semiconductors help power Amazon's booming cloud computing unit, Amazon Web Services. But Amazon has started to design its own chips.
Amazon's "Inferentia" chip, announced on Wednesday, is designed for what researchers call inference: the process of putting a trained artificial intelligence algorithm to use, for example by scanning incoming audio and translating it into text-based requests.
The Amazon chip is not a direct threat to Intel's and Nvidia's businesses because Amazon will not sell the chips themselves. Instead, starting next year, it will sell cloud services to customers that run atop the chips. Still, if Amazon relies increasingly on its own silicon, it could deprive both Nvidia and Intel of a major customer.
Intel's processors currently dominate the market for machine learning inference, which analysts at Morningstar believe will be worth $11.8 billion by 2021. In September, Nvidia launched its own inference chip to compete with Intel.
In addition to its machine learning chip, Amazon on Monday announced a processor chip called Graviton for its cloud unit. That chip is based on technology from Arm Holdings, which is controlled by SoftBank Group Corp (9984.T). Arm-based chips currently power mobile phones, but multiple companies are trying to make them suitable for data centres, where wider adoption would pose a major challenge to Intel's dominance in that market.
Amazon is not alone among cloud computing vendors in designing its own chips. Alphabet Inc-owned Google's cloud unit in 2016 unveiled an artificial intelligence chip designed to take on chips from Nvidia.
Custom chips can be expensive to design and produce, and analysts have pointed to such investment driving up research and capital expenses for big tech companies.
Google Cloud executives have said customer demand for Google's custom chip, the Tensor Processing Unit (TPU), has been strong. But the chips can be costly to use and require software customisation.
Google Cloud charges $8 per hour of access to its TPU chips and as much as $2.48 per hour in the United States for access to Nvidia's chips, according to Google's website.