GPT-3 took Twitter by storm last month, as the advanced AI language model opened the door to new possibilities. Let's talk about what GPT-3 is and why it's generating so much buzz.
- GPT-3 is the latest AI language model from OpenAI, an artificial intelligence research lab. With 175 billion parameters, it's considered the most powerful language model ever built, far larger than its predecessor, 2019's GPT-2, which had 1.5 billion parameters.
- OpenAI is planning to release this as a commercial model, with customers including Reddit and search-as-a-service provider Algolia.
- Given some input text, the model predicts the text most likely to follow it. GPT-3 was trained on a dataset of roughly half a trillion words, much of it from the Common Crawl, an archive of the web, so it has absorbed linguistic patterns from a huge swath of what humans have published online, allowing it to produce nuanced, human-like text.
- Vercel CEO Guillermo Rauch says it represents “general intelligence” and “a building block for a new category of apps,” adding that it feels like “Google as an API.”
- OpenAI is backed by investors including Microsoft, which invested $1 billion. Other backers include Infosys, Peter Thiel, and Reid Hoffman. OpenAI's founders include Elon Musk, Sam Altman, Ilya Sutskever, and Greg Brockman.
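The "predict what follows" mechanic described above is, at heart, next-word prediction learned from a corpus. GPT-3 does this with 175 billion learned parameters; a toy version — a bigram counter over a made-up corpus (the text below is purely illustrative, not GPT-3's training data) — gives the flavor:

```python
from collections import Counter, defaultdict

# Tiny stand-in for training data (illustrative only).
corpus = "the model reads text and the model predicts the next word".split()

# For each word, count which words follow it (a bigram table).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus, or None."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "model" — it follows "the" most often above
```

GPT-3 differs in scale, not in spirit: instead of counting word pairs, it learns statistical regularities across half a trillion words, which is why its completions read so fluently.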
What can it do?
Last month, OpenAI gave select users access to GPT-3 through an API, after releasing its research on the model in May. That access inspired many intriguing creations, including tools that can generate code.
In just the first week of the API's release, people used GPT-3 to:
- Produce a basic React app – a simple to-do list.
- Produce code for a recreation of Google's home page.
- Build a React dice component.
- Build a fully functioning search engine that returns the exact answer and the corresponding URL for any query.
- Build a Figma plugin that designs for you.
- 'Auto-complete' an empty image by suggesting what pixels should be in it.
- Generate data to autocomplete spreadsheets. Given how many no-code tools use spreadsheets as APIs, this makes it even faster to create an app without code.
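Most of these demos share one mechanic: show GPT-3 a few input→output examples in a plain-text prompt, then let it complete the next one ("few-shot" prompting). A minimal sketch of assembling such a prompt — the format and examples here are invented for illustration, not OpenAI's:

```python
def build_prompt(examples, query):
    """Build a few-shot prompt: worked examples followed by an open-ended query.

    The model is expected to continue the text after the final "Code:" label,
    mimicking the pattern established by the examples.
    """
    lines = [f"Description: {desc}\nCode: {code}\n" for desc, code in examples]
    lines.append(f"Description: {query}\nCode:")
    return "\n".join(lines)

examples = [
    ("a red button", '<button style="color:red">Click</button>'),
    ("a large heading", "<h1>Title</h1>"),
]
prompt = build_prompt(examples, "an image of a cat")
print(prompt)
```

The React, search-engine, and spreadsheet demos all work this way: the creator's ingenuity is mostly in crafting the examples, while GPT-3 supplies the completion.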
While impressive, GPT-3 might be a little too hyped up, as even OpenAI CEO Sam Altman tweeted. "It still has serious weaknesses and sometimes makes very silly mistakes. AI is going to change the world, but GPT-3 is just a very early glimpse. We have a lot still to figure out," he said.
- Because GPT-3 learns statistical patterns in text rather than meaning, it has a hard time actually understanding what words refer to. Its lack of common sense has also produced erroneous results when it has had to process text the internet has not prepared it for.
- GPT-3 also struggles to stay coherent beyond a few paragraphs, as the OpenAI researchers themselves noted: “GPT-3 samples [can] lose coherence over sufficiently long passages, contradict themselves, and occasionally contain non-sequitur sentences or paragraphs.” This is because the model generates its output word by word, based only on the nearby text.
- While GPT-3 can create unique text, it has also reproduced whole passages from texts it was trained on, raising plagiarism concerns.
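The coherence problem falls out of how word-by-word (autoregressive) generation works: each new word is chosen from the words just before it, so nothing anchors the output to where the passage started. A toy sketch using a bigram table over an invented corpus — real models condition on far more context, but the loop has the same shape:

```python
from collections import Counter, defaultdict

# Invented corpus (illustrative only); real models learn from billions of words.
corpus = "a model writes one word at a time and a model drifts over time".split()

bigrams = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    bigrams[cur][nxt] += 1

def generate(start, length):
    """Autoregressive generation: each word depends only on the previous one,
    so long outputs can wander away from the opening topic."""
    out = [start]
    for _ in range(length - 1):
        counts = bigrams[out[-1]]
        if not counts:
            break
        out.append(counts.most_common(1)[0][0])
    return " ".join(out)

print(generate("a", 6))
```

GPT-3 looks at a window of recent text rather than a single word, which pushes the drift out to "sufficiently long passages" — but the underlying limitation is the same.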