Yandex has announced YaLM 100B, a neural network designed to generate and process text in Russian and English. The company claims it is the world's largest openly released GPT-like (Generative Pre-trained Transformer) model to date.

YaLM 100B contains 100 billion parameters, more than any other published model for the Russian language. This allows the neural network to be applied to a wide range of natural language processing tasks.
Language models of the YaLM family learn the principles of text construction and generate new texts based on the rules of the language and their knowledge of the world. They can produce text of almost any kind: answers to questions, poems, greetings, and so on. The algorithms can also come up with ideas for advertising campaigns, write descriptions for products and videos, and classify texts.

The network was trained on Yandex supercomputers, recognized as the most powerful in Eastern Europe. During training, YaLM 100B processed about 2 TB of English and Russian text from open sources on the Internet.
The neural network is now available to developers and researchers worldwide. The model is released under the Apache 2.0 open-source license and can be downloaded from GitHub.
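For readers unfamiliar with how a GPT-like model is typically used, below is a minimal sketch of autoregressive text generation with the Hugging Face transformers library. The model name is a placeholder, not the actual YaLM 100B checkpoint: the official repository ships its own Megatron-based weights and inference scripts, and a 100-billion-parameter model requires multiple GPUs, so this is purely an illustration of the general workflow.

```python
# Illustrative sketch of GPT-style generation via the transformers API.
# MODEL_NAME is a hypothetical placeholder, not the official YaLM 100B release.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "your-org/your-gpt-like-model"  # placeholder identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

# Encode a prompt and let the model continue it token by token.
prompt = "Write a short birthday greeting for a colleague:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,   # length of the generated continuation
    do_sample=True,      # sample tokens instead of greedy decoding
    top_p=0.9,           # nucleus sampling cutoff
    temperature=0.8,     # sharpen or flatten the token distribution
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same generate-from-a-prompt pattern underlies the use cases mentioned above, from answering questions to drafting advertising copy; only the prompt and sampling settings change.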