The implications of Chinese for AI development, part 2
From the previous post in this series, we are already acquainted with Inspur's Yuan 1.0, "one of the most advanced deep learning language models that can generate coherent Chinese texts." In the present article, we will delve more deeply into the potentials and pitfalls of Inspur's deep learning language model:
"Inspur unveils GPT-3 equivalent for Chinese language", by Wei Sheng, TechNode (1026/21)
The model is trained with 245.7 billion parameters—the number of weights in an artificial neural network, according to the company. This is more than the Elon Musk-backed GPT-3 language model for English, which has 175 billion parameters. Inspur said the Yuan model was trained with 5 terabytes of datasets.
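For readers unfamiliar with the jargon: a model's "parameters" are simply its trainable weights (and biases), and the headline figures count them. A minimal Python sketch, using arbitrary layer widths that bear no relation to Yuan 1.0's or GPT-3's actual architectures, shows how such counts accumulate:

```python
# Illustrative only: count the weights ("parameters") in a toy
# fully connected network, to make concrete what a figure like
# "245.7 billion parameters" is counting. The layer widths below
# are arbitrary and unrelated to Yuan 1.0 or GPT-3.

layer_sizes = [1024, 4096, 4096, 1024]  # hypothetical layer widths

total_params = 0
for fan_in, fan_out in zip(layer_sizes, layer_sizes[1:]):
    weights = fan_in * fan_out   # one weight per input-output connection
    biases = fan_out             # one bias per output unit
    total_params += weights + biases

print(f"Toy network parameter count: {total_params:,}")
```

Real models of this class are transformers rather than plain fully connected stacks, but the counting principle is the same: every trainable number contributes to the total, and by the figures quoted above Yuan 1.0 has roughly 1.4 times as many of them as GPT-3 (245.7 billion versus 175 billion).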
…



