“China is somewhat behind, obviously,” Tsai said, citing how ChatGPT creator OpenAI has leapfrogged the rest of the tech industry in AI innovation, in a podcast interview published on Wednesday with Nicolai Tangen, chief executive of Norges Bank Investment Management – the branch of Norway’s central bank that is responsible for managing the world’s largest sovereign wealth fund.
Tsai pointed out that China’s tech companies are “possibly two years behind” the top AI firms in the US. He said US export restrictions that bar Chinese companies’ access to advanced semiconductors, such as the highly sought-after graphics processing units (GPUs) from Nvidia, have “definitely affected” tech firms on the mainland, including Alibaba. Hangzhou-based Alibaba owns the South China Morning Post.
“We’ve actually publicly communicated [that] it did affect our cloud business and our ability to offer high-end computing services to our customers,” he said. “So it is an issue in the short run, and probably the medium run.”
How does China’s AI stack up against ChatGPT?
The Biden administration last week updated the sweeping export controls it implemented in October, making it harder for the mainland to access advanced AI processors, semiconductor-manufacturing equipment and even laptop computers built with those chips, according to a Reuters report. The revised rules took effect on April 4.
Tsai’s candid assessment in the interview reflects broader concerns across China’s technology industry that the tightened export controls are dampening local AI innovation and making the country less competitive in this important field.
Tsai indicated that Chinese tech firms are continuing to look for ways to mitigate the impact of these restrictions, including sourcing advanced processors from other suppliers and stockpiling chips still available on the market.
“I think in the next year or 18 months, the training on large language models (LLMs) can still go ahead, given the inventory that people have,” Tsai said. LLMs are the technology underpinning ChatGPT and similar generative AI systems.
“There’s more high [performance] computing that’s required for training, as opposed to the applications, what people call inference,” he said. “So on the inference side, there are multiple options. You don’t need to have as high-power and high-end chips as the latest model from Nvidia.”
He predicted that “China will develop its own ability to make these high-end GPUs” over the long term.
“AI is essential,” Tsai said. “Having a good large language model developed in-house is very, very important because it helps our cloud business.”
Cloud computing technology enables companies to deliver a range of software and other digital resources over the internet as an on-demand service, much like electricity from a power grid. These resources are stored and managed inside data centres.