OpenAI is an organization focused on designing artificial general intelligence and building it safely for humans. No Terminator-style dystopia, no misaligned machines that turn humans into paperclips. Just computers with human-like intelligence that help solve large-scale computational problems. This time it has come up with what looks like merely an autocomplete program, like the one in the Google search bar: you start typing and it predicts what comes next. But while this sounds simple, it's an invention that could end up defining the decade to come.
The program itself is called GPT-3 (Generative Pre-trained Transformer 3), the third in a series of autocomplete tools by OpenAI. The previous version, GPT-2, had 1.5 billion parameters, but this state-of-the-art language model was built with 175 billion. At its core, it is an extremely sophisticated text predictor: a human gives it a chunk of text as input, and the model generates its best guess as to what the next chunk of text should be. It can then repeat the process, taking the original input together with the newly generated chunk, treating that as the new input, and generating a subsequent chunk from it, until it reaches a length limit.
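The repeat-and-feed-back process described above can be sketched in a few lines. This is a toy illustration only: `predict_next_chunk` is a hypothetical stand-in for the model, returning a canned continuation so the loop is runnable.

```python
def predict_next_chunk(text: str) -> str:
    # A real model would score candidate continuations against its
    # learned weights; this toy stand-in returns a fixed chunk.
    return " and so on"

def generate(prompt: str, max_length: int = 60) -> str:
    text = prompt
    while len(text) < max_length:
        chunk = predict_next_chunk(text)  # best guess at the next chunk
        text += chunk                     # feed it back as the new input
    return text

print(generate("GPT-3 predicts text"))
```

The essential point is the loop: output is appended to the input and the prediction step runs again, which is how a simple next-chunk predictor produces long passages of text.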
How does GPT-3 go about generating these predictions?
GPT-3 has ingested effectively all of the text available on the internet; the output it generates is language that it calculates to be a statistically plausible response to the input it is given, based on everything humans have published online. Like all deep learning systems, GPT-3 looks for patterns in data. To simplify things, the program has been trained on a huge corpus of text that it has mined for statistical regularities. These regularities are unknown to humans, but they are stored as billions of weighted connections between the different nodes in GPT-3's neural network. Importantly, there is no human input involved in this process: the program finds patterns without any guidance, which it then uses to complete text prompts.
If you input the word “fire” into GPT-3, the program knows, based on the weights in its network, that the words “truck” and “alarm” are much more likely to follow than “lucid” or “Elvish.” So far, so simple. Amazingly rich and nuanced insights can be extracted from the patterns latent in massive datasets far beyond what the human mind can recognize on its own. This is the first premise of modern machine learning.
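The "fire" example can be made concrete with a toy lookup. The weights below are made up for illustration; in GPT-3 they are learned from data and spread across billions of connections, not hand-written.

```python
# Hypothetical, hand-written weights standing in for learned statistics.
weights = {
    "fire": {"truck": 0.45, "alarm": 0.40, "lucid": 0.01, "Elvish": 0.001},
}

def most_likely_next(word: str) -> str:
    candidates = weights.get(word, {})
    # Pick the continuation with the highest weight.
    return max(candidates, key=candidates.get)

print(most_likely_next("fire"))  # → truck
```

The real model does this over every word in its vocabulary at once, conditioned on the entire preceding text rather than a single word.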
What can it do?
- Code software programs:
GPT-3 has been used to design and code applications from text input. All developers have to do is enter text that captures the essence of the product, and the model dishes out the code.
- Write Creative Fiction:
When properly primed by a human, it can write creative fiction. Researchers said that GPT-3 samples are not just close to human-level but witty, meta, and beautiful.
- Write emails and business proposals:
GPT-3 can also write business memos. Researchers fed GPT-3 the first half of an article on how to run a board meeting, and in a few minutes it wrote up a three-step process for recruiting board members.
- Qualitative text search: it can power search and serve as a sophisticated text predictor.
What are the concerns?
- The potential loss of jobs:
Jobs such as software developer, journalist, content writer, creative professional, lawyer, and accountant are threatened by GPT-3, as it can code in many languages without additional training and can handle legalistic language and content writing.
- Can produce offensive or silly responses:
It lacks the ability to reason abstractly; in essence, it lacks true common sense. It can sometimes produce offensive responses, including racist, antisemitic, and sexist ones.
- Can propagate fake news:
Its fluent predictions can be used to spread fake news and propaganda.
These concerns do not mean that GPT-3 is not a useful tool or that it will not underpin many valuable applications. OpenAI CEO Sam Altman put it this way:
"The GPT-3 hype is way too much. It's impressive, but it still has many weaknesses and sometimes makes silly mistakes. AI is going to change the world, but GPT-3 is a very early glimpse."