Sanctions imposed by the United States last year to slow the development of China’s artificial intelligence (AI) industry are having very little impact on China’s military and technology sectors.
“The AI companies that we talk to seem to see the handicap as relatively small and manageable,” said Charlie Chai, a Shanghai-based analyst with 86Research, referring to export curbs intended to throttle China’s development of supercomputers used to develop nuclear weapons and artificial intelligence systems like ChatGPT.
Rules imposed by the Biden Administration restricted shipments of Nvidia and Advanced Micro Devices (AMD) chips that have become the global technology industry’s standard for developing chatbots and other AI systems.
But Nvidia has created variants of its chips for the Chinese market that are slowed down to meet US rules.
Industry experts say the newest one – the Nvidia H800, announced in March – will likely take 10% to 30% longer to carry out some AI tasks and could double some costs compared with Nvidia’s fastest US chips.
But even the slowed Nvidia chips represent an improvement for Chinese firms.
Tencent, one of China’s largest tech companies, in April estimated that systems using Nvidia’s H800 will cut the time it takes to train its largest AI system by more than half, from 11 days to four days.
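That estimate implies a speedup of close to three times. A back-of-the-envelope check, using only the two publicly cited durations, is sketched below; the calculation is illustrative, not Tencent’s own.

```python
# Illustrative arithmetic only, based on Tencent's publicly cited estimate:
# cutting a training run from 11 days to 4 days.
old_days, new_days = 11, 4

speedup = old_days / new_days          # effective throughput gain
time_saved = 1 - new_days / old_days   # fraction of training time removed

print(f"~{speedup:.1f}x effective speedup, ~{time_saved:.0%} less training time")
# -> ~2.8x effective speedup, ~64% less training time ("more than half")
```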
Protecting US companies
Part of the US strategy in setting the rules was to avoid delivering so severe a shock that Chinese firms would ditch US chips altogether and redouble their own chip-development efforts.
“They had to draw the line somewhere, and wherever they drew it, they were going to run into the challenge of how to not be immediately disruptive, but how to also over time degrade China’s capability,” said one chip industry executive who requested anonymity to talk about private discussions with regulators.
The export restrictions imposed by the US have two parts. The first puts a ceiling on a chip’s ability to calculate extremely precise numbers, a measure designed to limit supercomputers that can be used in military research. Chip industry sources said that was an effective action.
But calculating extremely precise numbers is less relevant in AI work like large language models, where the amount of data the chip can chew through is more important.
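In practice, mainstream large-language-model training runs in 16-bit “mixed” precision rather than the 64-bit arithmetic the precision ceiling targets. The sketch below is a minimal, hypothetical PyTorch-style example of that convention; the layer, data and sizes are stand-ins, not any company’s actual training code.

```python
# Minimal illustration (hypothetical model and data): LLM training typically uses
# 16-bit mixed precision, so a cap on very high-precision (64-bit) throughput
# matters less than raw data throughput for this kind of work.
import torch

model = torch.nn.Linear(4096, 4096).cuda()        # stand-in for one transformer layer
optimizer = torch.optim.AdamW(model.parameters())
x = torch.randn(8, 4096, device="cuda")

with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
    loss = model(x).float().pow(2).mean()          # forward pass runs in bfloat16

loss.backward()                                    # gradients computed outside autocast
optimizer.step()
```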
Nvidia is selling the H800 to China’s largest technology firms, including Tencent, Alibaba and Baidu for use in such work, though it has not yet started shipping the chips in high volumes.
“The government isn’t seeking to harm competition or US industry, and allows US firms to supply products for commercial activities, such as providing cloud services for consumers,” Nvidia said in a statement last week.
China is an important customer for US technology, it added.
“The October export controls require that we create products with an expanding gap between the two markets,” Nvidia said last week. “We comply with the regulation while offering as-competitive-as-possible products in each market.”
Bill Dally, Nvidia’s chief scientist, said in a separate statement this week that “this gap will grow quickly over time as training requirements continue to double every six to 12 months.”
A spokesperson for the Bureau of Industry and Security, the arm of the US Commerce Department that oversees the rules, did not respond to a request for comment.
Slower chips not limiting
The second US limit is on chip-to-chip transfer speeds, which does affect AI. The models behind technologies such as ChatGPT are too large to fit onto a single chip. Instead, they must be spread over many chips – often thousands at a time – which all need to communicate with one another.
Nvidia has not disclosed the China-only H800 chip’s performance details, but documents show it has a chip-to-chip speed of 400 gigabytes per second. That is less than half the peak speed of 900 gigabytes per second for Nvidia’s flagship H100 chip available outside China.
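How much that bandwidth gap costs in practice depends on how often the chips must synchronise. The sketch below is a rough, hypothetical estimate rather than Nvidia’s or any lab’s actual figures: it applies the standard ring all-reduce formula for the data each chip moves when gradients are synchronised, and compares the 900 GB/s and 400 GB/s figures cited above.

```python
# Rough, illustrative estimate of how chip-to-chip bandwidth affects the
# communication step of distributed training. In data-parallel training each chip
# exchanges its gradients with the others every step; a ring all-reduce moves
# roughly 2*(N-1)/N times the gradient volume per chip.

def allreduce_seconds(param_bytes: float, num_chips: int, bandwidth_gbps: float) -> float:
    """Time to synchronise gradients once, given per-link bandwidth in GB/s."""
    volume = 2 * (num_chips - 1) / num_chips * param_bytes   # bytes moved per chip
    return volume / (bandwidth_gbps * 1e9)

# Hypothetical 70-billion-parameter model, gradients stored in 2-byte (bf16) format.
grad_bytes = 70e9 * 2

for bw in (900, 400):   # flagship H100 figure vs the China-only H800 figure cited above
    t = allreduce_seconds(grad_bytes, num_chips=1024, bandwidth_gbps=bw)
    print(f"{bw} GB/s link: ~{t:.2f} s of communication per synchronisation")
```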
Some in the AI industry believe that is still plenty of speed. Naveen Rao, chief executive of a startup called MosaicML that specialises in helping AI models to run better on limited hardware, estimated a 10-30% system slowdown.
“There are ways to get around all this algorithmically,” he said. “I don’t see this being a boundary for a very long time – like 10 years.”
Changing dynamics of AI
Money helps. A chip in China that takes twice as long as a faster US chip to finish an AI training task can still get the work done.
“At that point, you’ve got to spend $20 million instead of $10 million to train it,” said one industry source who requested anonymity because of agreements with partners. “Does that suck? Yes it does. But does that mean this is impossible for Alibaba or Baidu? No, that’s not a problem.”
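The arithmetic behind that remark is straightforward: halving a chip’s effective speed roughly doubles the chip-hours, and therefore the bill, for the same training run. The sketch below is hypothetical; the fleet size, duration and hourly rate are invented for illustration and are not any company’s actual costs.

```python
# Hypothetical figures, chosen only to land near the $10M-vs-$20M example above.
def training_cost(num_chips: int, days: float, dollars_per_chip_hour: float) -> float:
    """Total bill = chips x hours x hourly rate."""
    return num_chips * days * 24 * dollars_per_chip_hour

fast_chips = training_cost(num_chips=4096, days=25, dollars_per_chip_hour=4.0)
slow_chips = training_cost(num_chips=4096, days=50, dollars_per_chip_hour=4.0)  # twice as long

print(f"fast chips: ${fast_chips/1e6:.1f}M, slowed chips: ${slow_chips/1e6:.1f}M")
# -> fast chips: $9.8M, slowed chips: $19.7M
```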
Moreover, AI researchers are trying to slim down the massive systems they have built to cut the cost of training products similar to ChatGPT and other processes. Slimmer models will require fewer chips, reducing chip-to-chip communication and lessening the impact of the US speed limits.
Two years ago the industry was thinking AI models would get bigger and bigger, said Cade Daniel, a software engineer at Anyscale, a San Francisco startup that provides software to help companies perform AI work.
“If that were still true today, this export restriction would have a lot more impact,” Daniel said. “This export restriction is noticeable, but it’s not quite as devastating as it could have been.”
- Reuters, with additional editing by Vishakha Saxena