Leading US chipmaker Nvidia appears to be in a tricky spot with US regulations and could suffer a significant blow to its business in China.
The company is currently barred from exporting its most advanced AI chips to China under US export rules. The ban applies to Nvidia's top-of-the-line chips, which are prized for their high processing power and are particularly valuable for applications like advanced AI research, data analytics, and autonomous vehicles.
Nvidia Caught Between Satisfying Chinese Clients and Complying with US Rules
The situation began in 2022, when the US government imposed new export control measures, citing concerns about potential "military end use" of the technology by China. US officials worry that the advanced AI chips could be employed in weapons development, surveillance, or other military applications.
While the ban may hinder China's progress in AI technology, Nvidia also stands to lose significantly: most of its major Chinese clients, including Alibaba, Tencent, and ByteDance, are beginning to turn to local manufacturers for alternative chips.
In November, Reuters reported that leading Chinese AI company Baidu ordered 1,600 of Huawei's Ascend 910B AI chips. The Huawei chip is considered an alternative to Nvidia's A100, which is banned from export to China.
According to the WSJ, Huawei received orders for at least 5,000 Ascend 910B chips in 2023 from major Chinese internet companies.
Will the US Rules Impact Nvidia's Revenue?
Nvidia can still sell less powerful chips to Chinese customers, but demand for those is lower than for the high-performance models. Consequently, the loss of revenue from these high-end chips could be substantial and may weigh on future earnings reports.
Nvidia reported revenue of $18.12 billion for its fiscal third quarter ended in October, up 34% from the previous quarter and 206% from a year ago, growth that CEO Jensen Huang attributed to "the broad industry platform transition from general-purpose to accelerated computing and generative AI."