Daily Post 1407


GPT-3, "Generative Pre-trained Transformer 3", is the third in a series of autocomplete tools researched, designed, trained and created by OpenAI. Autocomplete we all know, but this AI variant of it can leave you in bewilderment. On its face, it is strikingly simple: merely an autocomplete program, like the ones we play with day in and day out on every gadget. As you type, it predicts what the data / answer / location / code should be. Yet it has the capability to define AI in the days to come. OpenAI is an artificial intelligence research laboratory, founded in 2015 with Elon Musk and Sam Altman among its founders. In 2019, OpenAI received a US$1 billion investment from Microsoft.
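The predict-the-next-word idea can be illustrated with a deliberately tiny sketch. This is a simple bigram frequency model, nothing like GPT-3's neural network, but it performs the same basic task: given the word just typed, suggest the word most likely to follow it.

```python
from collections import Counter, defaultdict

# Toy training text; a real model learns from hundreds of billions of words.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows each word in the corpus.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def autocomplete(word):
    """Suggest the most frequent next word seen after `word`."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(autocomplete("the"))  # "cat" follows "the" most often in this corpus
```

GPT-3 replaces these raw counts with 175 billion learned weights and conditions on long stretches of preceding text rather than a single word, but the interface is the same: context in, likely continuation out.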

The San Francisco-based AI lab OpenAI aims to create Artificial General Intelligence: "computer programs that possess all the depth, variety and flexibility of the human mind." While GPT-3 certainly cannot be termed an AGI tool, it can be treated as a first step in that direction. "What is human speech if not an incredibly complex autocomplete program running on the black box of our brains?" These path-breaking developments in natural language processing are similar to the developments in computer vision from 2012 onwards, i.e., the leap in AI image processing. That leap brought revolutionary computer-vision-enabled technologies to the world, from self-driving cars to ubiquitous facial recognition to drones. GPT-3 promises a similarly transformed world in natural language processing.

Unsupervised learning is to be celebrated. GPT-3 has a scale unparalleled so far, and also an unparalleled range of autocomplete functions. The first GPT had 117 million parameters, GPT-2 had 1.5 billion parameters, but GPT-3 by comparison has 175 billion parameters, more than 100 times its predecessor. The entirety of English Wikipedia constitutes just 0.6% of GPT-3's training data. Like all deep learning systems, GPT-3 looks for patterns, mining for statistical regularities. "These regularities are unknown to humans, but they're stored as billions of weighted connections between the nodes in GPT-3's neural network." Demonstrated GPT-3 applications include: a question-based search engine, language and syntax puzzles, code generation from text descriptions, answering medical queries, composing guitar tabs, writing creative fiction, autocompleting images, and more.
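The scale comparison above is easy to check with a few lines of arithmetic (using the approximate parameter counts quoted in this post):

```python
# Parameter counts for the GPT series, as reported in the post.
params = {
    "GPT-1": 117_000_000,        # 117 million
    "GPT-2": 1_500_000_000,      # 1.5 billion
    "GPT-3": 175_000_000_000,    # 175 billion
}

# GPT-3 relative to its immediate predecessor, GPT-2.
ratio = params["GPT-3"] / params["GPT-2"]
print(f"GPT-3 has roughly {ratio:.0f}x the parameters of GPT-2")  # ~117x
```

So "more than 100 times its predecessor" checks out: the jump from GPT-2 to GPT-3 is roughly a 117-fold increase, on top of the roughly 13-fold jump from GPT-1 to GPT-2.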

For some, it is a major milestone in the long and arduous journey toward AGI; once that is achieved, human life would be more like a fairy tale. For the critics, every mistake is enough to declare it decades away from AGI, and undeniably it commits any number of mistakes. That GPT-3 is error-prone is not debated even by its creators. "Its true value lies in its capacity to learn different tasks without supervision and in improvements it's delivered purely by leveraging greater scale." The idea GPT has followed so far is that quantity has a quality all its own; the billion-dollar question is how much further this path can take us.

