Nov 28 (Reuters) – Amazon.com (AMZN.O) on Tuesday announced a new artificial intelligence chip for its cloud computing service as competition with Microsoft (MSFT.O) to dominate the AI market heats up.
At a conference in Las Vegas, Amazon Web Services (AWS) Chief Executive Adam Selipsky announced Trainium2, the second generation of its chip for training AI systems. Selipsky said the new version is four times as fast as its predecessor while being twice as energy efficient.
The AWS move comes weeks after Microsoft announced its own AI chip called Maia. The Trainium2 chip will also compete against AI chips from Alphabet’s (GOOGL.O) Google, which has offered its Tensor Processing Unit (TPU) to its cloud computing customers since 2018.
Selipsky said that AWS will start offering the new training chips next year. The proliferation of custom chips comes amid a scramble to find the computing power to develop technologies such as large language models that form the basis of services similar to ChatGPT.
The cloud computing firms are offering their chips as a complement to Nvidia (NVDA.O), the market leader in AI chips, whose products have been in short supply for the past year. AWS also said on Tuesday that it will offer Nvidia's newest chips on its cloud service.
Selipsky on Tuesday also announced Graviton4, the cloud firm’s fourth custom central processor chip, which it said is 30% faster than its predecessor. The news comes weeks after Microsoft announced its own custom chip called Cobalt designed to compete with Amazon’s Graviton series.
Both AWS and Microsoft are using technology from Arm Ltd (O9Ty.F) in their chips, part of an ongoing trend away from chips made by Intel (INTC.O) and Advanced Micro Devices (AMD.O) in cloud computing. Oracle (ORCL.N) is using chips from startup Ampere Computing for its cloud service.
Reporting by Yuvraj Malik in Bangalore and Stephen Nellis in San Francisco; Editing by Lisa Shumaker