Microsoft's Own AI Chip "Athena" Surfaces, Aiming to Break Nvidia's Monopoly

Technology Author: Yunfeng Zhang Apr 20, 2023 12:27 AM (GMT+8)

On April 18, The Information reported that Microsoft plans to launch its own artificial intelligence chips.


The chips are reportedly designed to train software such as large language models, support inference, and power all of the AI software behind ChatGPT. According to a source familiar with the matter, Microsoft's plan includes multiple future generations of Athena chips; the initial Athena chips will be built on a 5nm process, and the series is likely to enter mass production next year.

Equal Ocean has learned from sources familiar with the matter that Microsoft began internal development of the chip, codenamed "Athena", as early as 2019, and that the chips have already been made available to a small group of Microsoft and OpenAI employees for testing. Microsoft hopes the chip will outperform the chips it currently buys from other vendors, on which it has spent hundreds of millions of dollars, saving money on its costly artificial intelligence efforts.

In recent years, Microsoft's efforts to develop custom chips for its servers, intended to support its Azure cloud computing services, have become widely known. At least 300 people are working on Athena at Microsoft, according to The Information, citing people familiar with the matter. At the company level, the Athena effort has been driven by key executives, including CEO Satya Nadella. As early as 2019, Microsoft realized it was lagging behind Google and Amazon in custom silicon; since then it has been developing its own chips for internal use and for cloud customers, according to a person familiar with its thinking.

Tracy Woo, a senior cloud analyst at Forrester Research, a leading research firm, offered an explanation: "The AI boom is putting more pressure on cloud providers to develop their own chips." She added, "You can buy from Nvidia, but when you look at giants like Google and Amazon, you find that they have the capital to design their own chips."

Since 2019, to support OpenAI in training ChatGPT, Microsoft has spent hundreds of millions of dollars connecting tens of thousands of Nvidia A100 GPUs on the Azure cloud computing platform. Scott Guthrie, Microsoft's executive vice president for cloud computing and artificial intelligence, recently said that Microsoft may have spent hundreds of millions of dollars building the supercomputers that support the OpenAI project. Nvidia is clearly the biggest beneficiary of the ChatGPT wave, with its powerful GPUs used to train ChatGPT, GPT-4, and other systems. As ChatGPT's user base soars, the price of Nvidia's highest-end AI chips is skyrocketing, with some selling for more than $40,000 on eBay.

As more of Nvidia's major customers develop their own AI chips, with Microsoft the latest addition, Nvidia will undoubtedly face stiffer competition. "Microsoft does not believe its own AI chips can broadly replace Nvidia's products, but if its internal efforts succeed, they could give it more leverage in future negotiations with Nvidia," a person with insight into the project said.