How will GPT-3 affect different industries?
Web design: Below is an example of GPT-3 being used to develop web pages.
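The original demo isn't reproduced here, but a minimal sketch of the underlying pattern, few-shot prompting GPT-3 to turn a plain-English description into markup, might look like the following. The engine name, prompt, and sampling settings are illustrative assumptions, not the demo's actual code.

```python
# Sketch of the "description -> web page" pattern behind the GPT-3 demos.
# Assumes the GPT-3-era `openai` Python client; the engine, prompt, and
# settings are illustrative, not the original demo's implementation.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

def describe_to_html(description: str) -> str:
    # One few-shot example anchors the output format for the model.
    prompt = (
        "Convert the description into a simple HTML snippet.\n\n"
        "Description: a button that says 'Subscribe' and an email input\n"
        "HTML: <form><input type=\"email\" placeholder=\"Email\">"
        "<button>Subscribe</button></form>\n\n"
        f"Description: {description}\n"
        "HTML:"
    )
    response = openai.Completion.create(
        engine="davinci",   # GPT-3 base engine at launch
        prompt=prompt,
        max_tokens=150,
        temperature=0.2,    # low temperature for predictable markup
        stop=["\n\n"],      # stop at the end of the generated snippet
    )
    return response.choices[0].text.strip()

print(describe_to_html("a red heading that says 'Welcome' above three pricing cards"))
```

The same pattern scales to richer component libraries by swapping in more few-shot examples; this is essentially what made the early "GPT-3 writes JSX" demos possible.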
As with low/no-code platforms, this lets users create the layout of web applications without much programming knowledge. Beyond building web pages, GPT-3 is also likely to have an impact on the design industry, including companies like Webflow, Canva, and Figma. The impact could be felt both in adoption (increases or decreases) and in usage (new kinds of applications).
Web design in the U.S. was a $40b industry as of 2020. The low-code industry (platforms that need minimal coding to build applications, e.g., Salesforce), including no-code (e.g., Bubble), is expected to reach $21b by 2022 and $45b by 2025. Gartner also estimates that, by 2024, 65% of all application development will be done on low-code (and no-code) platforms. And since that estimate does not factor in GPT-3 at all, the release of its next version, or of applications and platforms built on its commercial API, could accelerate the change further.
Legal tech: Below is an app that was built using GPT-3 to convert legal terms into plain English.
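The same completion pattern can be sketched for legalese. The few-shot example and parameters below are assumptions for illustration, not the actual app's implementation.

```python
# Sketch of a "legalese -> plain English" prompt for GPT-3; the few-shot
# example and engine are assumptions, not the actual app's code.
# Assumes openai.api_key is set as in the earlier sketch.
import openai

def simplify_legalese(clause: str) -> str:
    prompt = (
        "Rewrite the legal clause in plain English.\n\n"
        "Legal: The party of the first part shall indemnify and hold "
        "harmless the party of the second part.\n"
        "Plain: The first party will cover the second party's losses.\n\n"
        f"Legal: {clause}\n"
        "Plain:"
    )
    response = openai.Completion.create(
        engine="davinci", prompt=prompt, max_tokens=100,
        temperature=0.3, stop=["\n"],  # stop after one plain-English line
    )
    return response.choices[0].text.strip()
```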
Since a substantial part of legal work draws on a well-defined repository of knowledge, the industry could find many use cases for GPT-3. The corporate legal department market alone is estimated at ~$16b. Possible areas of AI application in legal tech include document management systems, contract management, legal research, legal analytics, cybersecurity, and predictive technology. GPT-3 could draft a document or a contract from plain-English input, or return more intelligent search results for precedents and legal research.
Here is an example of a simple search engine built using GPT-3.
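One hedged way such a search engine could work is to place candidate documents in the prompt and let GPT-3 answer the query and cite the best source. The documents, engine, and prompt format below are illustrative assumptions, not the actual project's code.

```python
# Sketch of a GPT-3-backed "search" over a small document set; the
# documents, engine, and prompt format are illustrative assumptions.
# Assumes openai.api_key is set as in the earlier sketches.
import openai

DOCS = {
    "doc1": "GPT-3 is a 175-billion-parameter language model from OpenAI.",
    "doc2": "Low-code platforms let non-programmers build applications.",
    "doc3": "AlphaGo Zero learned Go through self-play reinforcement learning.",
}

def search(query: str) -> str:
    # Inline every candidate document, then ask for an answer with a citation.
    numbered = "\n".join(f"[{k}] {v}" for k, v in DOCS.items())
    prompt = (
        "Given the documents, answer the query and cite the best document.\n\n"
        f"{numbered}\n\nQuery: {query}\nAnswer:"
    )
    response = openai.Completion.create(
        engine="davinci", prompt=prompt, max_tokens=80,
        temperature=0.0,  # near-deterministic answering
        stop=["\n\n"],
    )
    return response.choices[0].text.strip()

print(search("How many parameters does GPT-3 have?"))
```

Note that unlike a traditional engine, which ranks against a precomputed index, this reads the documents at query time, which is part of why the ranking question below is interesting.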
Google, Yahoo, Baidu, Yandex, and DuckDuckGo are among the most widely used search engines worldwide, and the global search engine market is expected to reach $200b in 2024. But search engines typically use a complex, adaptive system to rank results, so it will be interesting to see how a search engine built on GPT-3 performs. Take a look at this if you want to know when such a search engine is released.
Who would benefit from this?
Investor Gavin Baker has pointed out that progress in AI likely depends on innovation in semiconductor architectures. With computations of this magnitude demanding ever more processing power, the semiconductor industry is expected to see significant gains. (We wrote about the current landscape of the U.S. semiconductor industry; you can read it here.) Nvidia in particular may benefit, as these models were trained on Nvidia hardware.
Alongside the technological gains, there are adverse effects: these compute-intensive AI training runs have a material carbon footprint. The 40-day training run for AlphaGo Zero is estimated to have produced 96 tonnes of CO2, equivalent to 1,000 hours of air travel or the annual carbon footprint of 23 American homes.
Estimates suggest that, in 2019, one hour of training on a single Nvidia GPU in California produced about 0.25 lbs of CO2. The GPU-accelerated supercomputer built for OpenAI has 285,000 CPU cores and 10,000 GPUs, so at that rate one hour of training across all its GPUs could produce roughly 2,500 lbs of CO2.
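That 2,500 lbs figure is just the per-GPU estimate scaled linearly across the machine, a back-of-envelope calculation:

```python
# Back-of-envelope check of the figures above. Assumption: emissions scale
# linearly with GPU count; the 285,000 CPU cores are ignored.
lbs_per_gpu_hour = 0.25   # 2019 estimate, one Nvidia GPU in California
gpus = 10_000             # GPUs in the supercomputer built for OpenAI
print(lbs_per_gpu_hour * gpus)  # 2500.0 lbs of CO2 per training hour
```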
And it's not just the training hours: there is also inference once the models are deployed. Nvidia estimates that 80-90% of a model's cost is incurred at inference. The data centers involved are another factor, collectively consuming on the order of 200 TWh of electricity a year. Though we don't know GPT-3's exact carbon footprint, there seems to be a consensus that it will be very energy-intensive.
The electricity analogy: In 2003, Jeff Bezos gave a famous TED Talk arguing that the internet is not like the gold rush of the 1800s but like the early days of electricity, with plenty of scope for improvement. AI and GPT-3 are at a similar stage: the millions of innovations built on electricity became possible only once the cost of electricity dropped.
As Sam Altman has said, GPT-3 indeed has various limitations. But the limiting factor in applying such AI models lies not in the technology so much as in its commodification. As the cost of prediction drops, more applications can be expected.