GPT-3: Generative Pre-trained Transformer 3
GPT-3 looks for patterns in data. The program has been trained on a huge corpus of text, which it has mined for statistical regularities.
These regularities are unknown to humans, but they’re stored as billions of weighted connections between the different nodes in GPT-3’s neural network. Importantly, there’s no human input involved in this process: the program looks for and finds patterns without any guidance, and it then uses those patterns to complete text prompts.

If you input the word “fire” into GPT-3, the program knows, based on the weights in its network, that the words “truck” and “alarm” are much more likely to follow than “lucid” or “elvish.” So far, so simple.
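To make the “fire” example concrete, here is a minimal Python sketch of the basic idea: a set of learned scores gets converted into probabilities over possible next words. The words and numbers below are invented purely for illustration; GPT-3’s real computation runs over billions of parameters, not a four-entry table.

```python
import math

# Hypothetical "learned" scores for words that might follow "fire".
# In GPT-3 these would emerge from billions of weighted connections;
# here they are made up to illustrate the idea.
next_word_scores = {
    "truck": 6.2,
    "alarm": 5.8,
    "lucid": 0.4,
    "elvish": 0.1,
}

def softmax(scores):
    """Turn raw scores into probabilities that sum to 1."""
    exps = {word: math.exp(s) for word, s in scores.items()}
    total = sum(exps.values())
    return {word: e / total for word, e in exps.items()}

probs = softmax(next_word_scores)
for word, p in sorted(probs.items(), key=lambda kv: -kv[1]):
    print(f"fire {word}: {p:.3f}")
# "truck" and "alarm" come out far more probable than "lucid" or "elvish".
```

The point of the sketch is only that prediction falls out of numbers the model has already stored: given a prompt, the highest-scoring continuations are the ones the training data made most likely.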