GPT, ChatGPT, text-generation program, OpenAI Inc., San Francisco, California, USA


Guided tour: GPT-2 – The text writing AI (IT)

Jul 3, 2020

“GPT-2” is the name of a machine learning model with which the American research group OpenAI has built a remarkably powerful language system. “GPT-2” is transformer-based: the transformer is a newer approach to NLP (Natural Language Processing) in which the system also learns on its own which words and parts of a text deserve more attention. The model was trained “unsupervised”, with the simple goal of predicting the next word of a text from all the words that precede it. To do this, it was fed text from eight million web pages; with its 1.5 billion parameters it can produce impressively “real”-looking texts.
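The training objective described above can be illustrated with a deliberately tiny sketch. GPT-2 itself uses a transformer with 1.5 billion parameters; the toy model below replaces all of that with simple word-pair counts, purely to show the idea of "predict the next word from what came before" (the corpus and function names are invented for illustration):

```python
from collections import Counter, defaultdict

# Toy training text standing in for GPT-2's eight million web pages.
corpus = "the cat sat on the mat and the cat slept".split()

# Count which word follows each word in the training text.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the continuation seen most often after `word` in training."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" most often in the toy corpus
```

Where this sketch only looks one word back, GPT-2's attention mechanism weighs all preceding words of the text when predicting the next one.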

In this Home-Delivery edition we tell you more about this in Italian, and also why the developers initially decided to release the trained model to the public only in a very limited form.