GPT-3 Explained in Under 2 Minutes


So, you’ve seen some amazing GPT-3 demos on Twitter (machine-made Op-Eds, poems, articles, even working code). But what’s going on under the hood of this incredible model? Here’s a (brief!) look inside.

GPT-3 is a neural-network-powered language model. A language model is a model that predicts the likelihood of a sentence existing in the world. For example, a language model can label the sentence “I take my dog for a walk” as more probable to exist (i.e. on the Internet) than the sentence “I take my banana for a walk.” This is true for sentences as well as phrases and, more generally, any sequence of characters.
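To make that concrete, here's a minimal sketch of "scoring" those two sentences with a language model. GPT-3 itself sits behind a private API, so the sketch uses the small open GPT-2 model via the Hugging Face transformers library; the idea is the same.

```python
# A minimal sketch: score two sentences with a small open language model (GPT-2).
# A lower loss means the model finds the text more probable.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def avg_log_likelihood(sentence):
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        # Passing labels=ids makes the model return the average
        # next-token cross-entropy loss over the sentence.
        loss = model(ids, labels=ids).loss
    return -loss.item()  # higher = more probable

print(avg_log_likelihood("I take my dog for a walk"))
print(avg_log_likelihood("I take my banana for a walk"))
```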

Like most language models, GPT-3 is elegantly trained on an unlabeled text dataset (in this case, Common Crawl). The model is repeatedly shown a stretch of text with the next word hidden, and must learn to predict that word using only the preceding words as context. It’s a simple training task that results in a powerful and generalizable model.
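Assuming a PyTorch-style model that outputs a vocabulary-sized logit vector at every position, that training task boils down to a single cross-entropy loss between the model's predictions and the text shifted by one token:

```python
# A minimal sketch of the next-word-prediction objective, with random tensors
# standing in for real token ids and real model outputs.
import torch
import torch.nn.functional as F

vocab_size = 50257                                         # GPT-style BPE vocabulary size
tokens = torch.randint(0, vocab_size, (1, 16))             # a toy "sentence" of 16 token ids
logits = torch.randn(1, 16, vocab_size, requires_grad=True)  # stand-in for the model's output

# At every position, the target is simply the *next* token in the text,
# so predictions are scored against the inputs shifted by one position.
loss = F.cross_entropy(
    logits[:, :-1, :].reshape(-1, vocab_size),   # predictions for positions 0..14
    tokens[:, 1:].reshape(-1),                   # the true tokens at positions 1..15
)
loss.backward()  # in real training, this gradient updates the model's parameters
```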

The GPT-3 model architecture itself is a transformer-based neural network. This architecture became popular about 2–3 years ago, and is the basis for the popular NLP model BERT. From an architecture perspective, GPT-3 is not actually very novel! So what makes it so special and magical?
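For the curious, here is a minimal NumPy sketch of the scaled dot-product self-attention at the core of every transformer layer. The real model adds a causal mask (so each token only attends to earlier tokens), multiple attention heads, and feed-forward layers, all omitted here:

```python
# A bare-bones sketch of self-attention, the core operation of a transformer layer.
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """x: (seq_len, d_model) token embeddings; Wq/Wk/Wv: learned projection matrices."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv            # queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])      # how strongly each token attends to each other token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the sequence
    return weights @ v                           # each output is a weighted mix of the values

d = 64
x = np.random.randn(10, d)                       # 10 tokens, 64-dim embeddings
out = self_attention(x, *(np.random.randn(d, d) for _ in range(3)))
print(out.shape)                                 # (10, 64)
```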

IT’S REALLY BIG. I mean really big. With 175 billion parameters, it’s the largest language model ever created (GPT-2 had only 1.5 billion parameters!), and it was trained on the largest dataset of any language model. This, it appears, is the main reason GPT-3 is so impressive.
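Some back-of-the-envelope arithmetic makes "really big" concrete (the 2-bytes-per-parameter figure assumes 16-bit weights, a common storage format):

```python
# Rough arithmetic on model size; assumes 16-bit (2-byte) weights.
gpt2_params = 1.5e9
gpt3_params = 175e9

print(gpt3_params / gpt2_params)         # ~117x more parameters than GPT-2
print(gpt3_params * 2 / 1e9, "GB")       # ~350 GB just to store the weights
```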

And here’s the magical part. As a result, GPT-3 can do what no other model can do (well): perform *specific* tasks without any special tuning. You can ask GPT-3 to be a translator, a programmer, a poet, or a famous author, and it can do it with fewer than 10 training examples. Damn.

Most other models (like BERT) require an elaborate fine-tuning step, where you gather thousands of examples of (say) French-English sentence pairs to teach it how to do translation. With GPT-3, you don’t need to do that fine-tuning step. This is the heart of it. This is what gets people excited about GPT-3: custom language tasks without training data.
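Concretely, those "training examples" are just pasted into the prompt itself; no gradient updates happen and no new model is produced. Here's a hypothetical few-shot translation prompt in the style described in the GPT-3 paper; the model is simply asked to continue the text after the final arrow:

```python
# A few-shot prompt: the "examples" live in the input text, not in the model's weights.
prompt = """Translate English to French:

sea otter => loutre de mer
peppermint => menthe poivrée
cheese => fromage
I take my dog for a walk =>"""

# Sending this prompt to the hosted model and reading its completion is all the
# "training" there is; a good model continues with the French translation.
```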

Today, GPT-3 is in private beta, but boy can I not wait to get my hands on it.

