Technology | Author: Yunfeng Zhang | Apr 18, 2023 01:53 AM (GMT+8)

Sources close to the matter revealed that production of Nvidia's A800 and H800 chips is limited, and the chips are currently being snapped up by domestic companies, including server makers and Internet vendors.


As the computing-power foundation of the entire AI wave, NVIDIA's A100 and H100 GPUs (graphics processing units) have become hot commodities, and overseas prices for the new flagship H100 in particular are being driven up by speculation.

Keen Technologies CEO and renowned game developer John Carmack was among the first to notice the phenomenon, tweeting that Nvidia's flagship AI chip, the H100, was selling for more than $40,000 at multiple stores. As of Monday, at least six sellers on the overseas e-commerce platform eBay were offering the H100. Typically, NVIDIA's A100, H100 and other AI chips are shipped mainly to enterprises and are rarely retailed on the open market, which leaves some room for price speculation.

In the AI boom driven by ChatGPT, GPUs are critical for training AI models. Microsoft has invested hundreds of millions of dollars to buy tens of thousands of A100 chips for the development of ChatGPT. In addition, Nvidia has its own GPU-equipped DGX supercomputer, configured with eight GPUs. At this year's GTC (Nvidia's developer conference), Nvidia launched the AI cloud service DGX Cloud, which, through cloud vendors' platforms, offers users the computing power to train large generative AI models on a monthly rental basis.

For the Chinese market, Nvidia CEO Jensen Huang (Huang Renxun) told Interface News and other media during GTC that the company will continue to work with cloud providers to launch the service. "By providing landing capabilities through cloud service partners such as Alibaba, Tencent and Baidu, they are fully capable of delivering top-notch AI computing service systems." Huang said Nvidia can now ship the A800 and H800, custom versions of its chips that comply with U.S. export controls, as Chinese "special editions."

On April 14, Tencent, the fourth-largest cloud vendor in China, also announced a new generation of high-performance computing cluster, which uses Tencent's own cloud servers equipped with Nvidia's latest China-specific H800 GPU, with inter-server connection bandwidth of up to 3.2 Tbps. Tencent said the cluster's computing performance has tripled compared with its predecessor, shortening the training time of Tencent's self-developed Hunyuan NLP large model from 11 days to four days.

Regarding domestic shipments of the H800, sources at Nvidia responded that "customers are already placing orders."

The A800 and H800 chips are currently being snapped up by domestic companies, including server makers and Internet vendors. "Domestic shipments of the A800 have begun, but original-factory supply is limited," the source said. "H800 shipments are even scarcer; new orders from remaining customers may not be delivered until December at the earliest."