China Telecom says AI model with 1 trillion parameters trained on Chinese chips

A Chinese state-owned carrier said it has developed two large language models (LLMs) trained entirely on domestically produced chips, illustrating the progress China has made in its effort to achieve chip autonomy in artificial intelligence (AI).
The Institute of AI at China Telecom, one of the country’s large state-backed telecoms operators, said in a statement on Saturday that its open-source TeleChat2-115B model and a second, unnamed model were trained on tens of thousands of domestically produced chips. The feat marks a milestone amid tightening US restrictions on China’s access to advanced semiconductors, including Nvidia’s latest AI chips.

The achievement “indicates that China has truly realised total self-sufficiency in domestic LLM training” and marks the start of a new phase for China’s innovation and self-reliance in LLMs, the technology behind OpenAI’s ChatGPT, the AI institute said in a statement posted on WeChat.

China Telecom said the unnamed model has 1 trillion parameters. Parameters are the internal variables a model learns during training, and the sophistication and effectiveness of an AI model depend largely on how many of them it has. TeleChat2-115B has more than 100 billion parameters, the company said.
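For readers unfamiliar with the term, a model’s parameter count can be estimated from its basic architecture. The Python sketch below is purely illustrative: the layer counts, hidden sizes and vocabulary size are hypothetical placeholders, not China Telecom’s published configurations, and the formula covers only the dominant attention, feed-forward and embedding weights of a standard decoder-only transformer.

```python
# Rough back-of-the-envelope parameter count for a decoder-only transformer.
# The configurations below are illustrative guesses chosen to land near the
# reported scales; they are NOT TeleChat2's actual architecture.

def transformer_params(n_layers: int, d_model: int, vocab_size: int, ffn_mult: int = 4) -> int:
    """Approximate parameter count: attention + feed-forward blocks plus embeddings."""
    attention = 4 * d_model * d_model                    # Q, K, V and output projections per layer
    feed_forward = 2 * d_model * (ffn_mult * d_model)    # up- and down-projection per layer
    embeddings = vocab_size * d_model                    # token embedding table
    return n_layers * (attention + feed_forward) + embeddings


if __name__ == "__main__":
    # Hypothetical configurations only, to show how counts reach the 100B and 1T range.
    for name, layers, width, vocab in [
        ("~100B-class model", 96, 10_240, 100_000),
        ("~1T-class model", 128, 25_600, 100_000),
    ]:
        total = transformer_params(layers, width, vocab)
        print(f"{name}: ~{total / 1e9:.0f} billion parameters")
```

Running the sketch prints roughly 122 billion and 1,009 billion parameters for the two hypothetical configurations, which is how widths and depths of that order translate into the “100 billion” and “1 trillion” figures cited in the statement.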

China Telecom is believed to be using chips from Huawei to train AI models. Photo: AFP
