China Trained a 1-Trillion-Parameter LLM Using Only Domestic Chips (theregister.com)

"China Telecom, one of the largest wireless carriers in mainland China, says that it has developed two large language models (LLMs) relying solely on domestically manufactured AI chips..." reports Tom's Hardware. "If the information is accurate, this is a crucial milestone in China's attempt at becoming independent of other countries for its semiconductor needs, especially as the U.S. is increasingly tightening and banning the supply of the latest, highest-end chips for Beijing in the U.S.-China chip war." Huawei, which has mostly been banned from the U.S. and other allied countries, is one of the leaders in China's local chip industry... If China Telecom's LLMs were indeed fully trained using Huawei chips alone, then this would be a massive success for Huawei and the Chinese government.
The project's GitHub page "contains a hint about how China Telecom may have trained the model," reports the Register, "in a mention of compatibility with the 'Ascend Atlas 800T A2 training server' — a Huawei product listed as supporting the Kunpeng 920 7265 or Kunpeng 920 5250 processors, respectively running 64 cores at 3.0GHz and 48 cores at 2.6GHz. Huawei builds those processors using the Arm 8.2 architecture and bills them as produced with a 7nm process."

The South China Morning Post says the unnamed model has 1 trillion parameters, according to China Telecom, while the TeleChat2t-115B model has over 100 billion parameters.

Thanks to long-time Slashdot reader hackingbear for sharing the news.

Comments Filter:
  • China trains 100-billion-parameter AI model on home grown infrastructure

    All you had to do was copy and paste the headline from TFA. [theregister.com]

    • by Westley ( 99238 )

      There are multiple articles linked in the post, and explicit mentions of multiple models. Yes, the Register one only refers to a 100 billion parameter model; the Tom's Hardware and South China Morning Post articles specifically talk about a 1 trillion parameter model. So does the post itself:

      > The South China Morning Post says the unnamed model has 1 trillion parameters, according to China Telecom, while the TeleChat2t-115B model has over 100 billion parameters.

      So, two models: one with 100 billion parameters and one with 1 trillion.

    • All you had to do was copy and paste the headline from TFA. [theregister.com]

      Yeah, but then you'd have to read The Register. Why not paste one of the other headlines instead: "State-owned China Telecom has trained domestic AI LLMs using homegrown chips — one model reportedly uses 1 trillion parameters"? There you go: copy and paste.

  • by NoWayNoShapeNoForm ( 7060585 ) on Sunday October 06, 2024 @04:39AM (#64843453)
    I bet it was hungry again in an hour /s
  • Surely they didn't think their sanctions would stop them? A delay is the best they could hope for... oh, and a big hit to US companies from the loss of the Chinese market, more competition around the world, and an increase in the trade deficit. USA: "Buy more stuff from us... no! Not that! Only things you don't want... buy more things you don't want!" So China buys more of what it can get but doesn't need, and stockpiles it... and the US complains about that too. Sheesh.

  • It seems curious that a country of a billion people would be interested in developing AI capabilities that could decimate middle-class jobs.
