China Trained a 1-Trillion-Parameter LLM Using Only Domestic Chips (theregister.com)

"China Telecom, one of the largest wireless carriers in mainland China, says that it has developed two large language models (LLMs) relying solely on domestically manufactured AI chips..." reports Tom's Hardware. "If the information is accurate, this is a crucial milestone in China's attempt at becoming independent of other countries for its semiconductor needs, especially as the U.S. is increasingly tightening and banning the supply of the latest, highest-end chips for Beijing in the U.S.-China chip war." Huawei, which has mostly been banned from the U.S. and other allied countries, is one of the leaders in China's local chip industry... If China Telecom's LLMs were indeed fully trained using Huawei chips alone, then this would be a massive success for Huawei and the Chinese government.
The project's GitHub page "contains a hint about how China Telecom may have trained the model," reports the Register, "in a mention of compatibility with the 'Ascend Atlas 800T A2 training server' — a Huawei product listed as supporting the Kunpeng 920 7265 or Kunpeng 920 5250 processors, respectively running 64 cores at 3.0GHz and 48 cores at 2.6GHz. Huawei builds those processors using the Arm 8.2 architecture and bills them as produced with a 7nm process."

The South China Morning Post says the unnamed model has 1 trillion parameters, according to China Telecom, while the TeleChat2-115B model has over 100 billion parameters.

Thanks to long-time Slashdot reader hackingbear for sharing the news.

Comments Filter:
  • I'm training an indefinite parameter LLM on my laptop. I just write "00000000" onto a cloud storage drive for forever. I'm running out of cloud storage though, please invest or you're a dirty commie that wants China to win.
  • Almost always works like that against an enemy with significant resources. I guess it was more important to pretend to fight China than to actually do it ...

  • China trains 100-billion-parameter AI model on home grown infrastructure

    All you had to do was copy and paste the headline from TFA. [theregister.com]

    • by Westley ( 99238 )

      There are multiple articles linked in the post, and explicit mentions of multiple models. Yes, the Register one only refers to a 100 billion parameter model; the Tom's Hardware and South China Morning Post articles specifically talk about a 1 trillion parameter model. So does the post itself:

      > The South China Morning Post says the unnamed model has 1 trillion parameters, according to China Telecom, while the TeleChat2-115B model has over 100 billion parameters.

      So, two models: one with 100 billion parameters and one with 1 trillion.

    • All you had to do was copy and paste the headline from TFA. [theregister.com]

      Yeah, but then you'd have to read the Register. Why not paste one of the other headlines: "State-owned China Telecom has trained domestic AI LLMs using homegrown chips — one model reportedly uses 1 trillion parameters"? There you go, copy and paste.

  • I bet it was hungry again in an hour /s
