GPT-3: Creative Potential of NLP



Photo: Merzmensch

It was in February of last year that OpenAI published the results of training its unsupervised language model GPT-2. Trained on 40 GB of text (about 8 million web pages), it was able to predict the next words in context. GPT-2, a transformer-based language model built on self-attention, allowed us to generate very convincing and coherent texts. The quality was so good that the main model, with 1.5 billion parameters, wasn't initially made publicly accessible, to prevent uncontrolled production of fake news. Luckily, the complete model was later published and could even be used with Colab Notebooks.

This year, OpenAI strikes back with its new language model GPT-3. With 175 billion parameters.

Unnecessary spoiler: it’s incredibly good.

There are already some profound articles on TDS examining the features and the paper behind GPT-3:

But what does it look like in action?

OpenAI is building an API, currently accessible via a waiting list:

Fortunately, I got access and could experiment with GPT-3 directly. Here are some of my first results.

Interface, Settings, Presets.


Screenshot: beta.openai.com // by: Merzmensch

The AI Playground interface looks simple, but it bears great power within. First of all, there is a settings dialog that lets you configure the output length, the temperature (from low/boring through standard to chaotic/creative), and other features.


Screenshot: beta.openai.com // by: Merzmensch

You can also define where the generated text should start and stop; these control functions have a direct impact on the textual results.
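These Playground settings map directly onto parameters of the underlying API call. Here is a minimal sketch, assuming the beta-era Python bindings and the generic davinci engine; all values are illustrative and not taken from my experiments:

```python
# Minimal sketch: Playground settings expressed as API parameters.
# Assumes the beta openai Python package (openai.Completion.create);
# engine name and all values are illustrative.
import openai

openai.api_key = "YOUR_API_KEY"  # issued with the beta invite

response = openai.Completion.create(
    engine="davinci",            # largest GPT-3 engine in the beta
    prompt="Once upon a time",   # the text the generation starts from
    max_tokens=64,               # response length
    temperature=0.7,             # low = boring/deterministic, high = chaotic/creative
    top_p=1.0,
    stop=["\n\n"],               # stop sequence: where the generated text has to end
)

print(response["choices"][0]["text"])
```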

The simple interface also provides some GPT-3 presets. One of the amazing things about transformer-driven GPT models is their ability to recognize a specific style, text character, or structure. If you begin with a list, GPT-3 keeps generating list items. If your prompt has a Q&A structure, it is continued coherently. If you ask for a poem, it writes a poem.

You can create your own presets or use the existing ones, which are:

Chat.


Screenshot: beta.openai.com // by: Merzmensch

A typical chatbot setting: you ask, the AI answers. It is also possible to change the "characters" or the setting. As you can see, the chat situation was handled perfectly (even if my third question, as the Human, was kind of unfair).


Screenshot: beta.openai.com // by: Merzmensch

To demonstrate the impact of context, let's change the AI character from "helpful" and "very friendly" to "brutal, stupid and very unfriendly". You will see how the whole dialogue is influenced:


Screenshot: beta.openai.com // by: Merzmensch

I think we re-invented Marvin.
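Under the hood, the Chat preset is nothing more than a prompt with a character description followed by alternating Human/AI turns. Here is a minimal sketch of how it could be driven through the API; the character line is illustrative, and the stop sequences keep the model from writing the Human's replies itself:

```python
# Sketch of the Chat preset as a raw prompt (illustrative wording).
import openai  # api_key assumed to be set as in the first sketch

chat_prompt = (
    "The following is a conversation with an AI assistant. "
    "The assistant is helpful, creative, clever, and very friendly.\n\n"
    "Human: Hello, who are you?\n"
    "AI: I am an AI created by OpenAI. How can I help you today?\n"
    "Human: What is the meaning of life?\n"
    "AI:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=chat_prompt,
    max_tokens=60,
    temperature=0.9,
    stop=["\n", "Human:", "AI:"],   # don't let the model answer for the Human
)
print(response["choices"][0]["text"].strip())

# Replacing "helpful, creative, clever, and very friendly" with
# "brutal, stupid and very unfriendly" is all it takes to get Marvin back.
```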

Q&A


Screenshot: beta.openai.com // by: Merzmensch

This preset consists of a clear dual structure: Question and Answer. The model needs a few example pairs before it starts answering questions (and picks up the rules), but then it works perfectly. I asked some random questions from various areas, and here you go:

Screenshot: beta.openai.com // by: Merzmensch

I’d say, perfect!
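The "few example pairs" mentioned above are simple few-shot conditioning: some question-answer examples placed in the prompt before your real question. A rough sketch, with invented examples and a low temperature for factual answers:

```python
# Sketch of the Q&A preset as a few-shot prompt (example pairs invented).
import openai  # api_key assumed to be set as in the first sketch

qa_prompt = (
    "Q: What is the capital of France?\n"
    "A: Paris.\n\n"
    "Q: Who wrote 'Faust'?\n"
    "A: Johann Wolfgang von Goethe.\n\n"
    "Q: How many bones does an adult human have?\n"
    "A:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=qa_prompt,
    max_tokens=20,
    temperature=0.0,   # factual answers work best with low temperature
    stop=["\n"],
)
print(response["choices"][0]["text"].strip())
```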

Parsing unstructured data


Screenshot: beta.openai.com // by: Merzmensch

This one is fascinating and shows good comprehension of unstructured text: extracting structured data from free-form prose.
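The pattern can be sketched as a prompt that ends exactly where the structured output should begin; the passage and the table header below are invented for illustration:

```python
# Sketch of structured extraction from free text (passage and columns invented).
import openai  # api_key assumed to be set as in the first sketch

extract_prompt = (
    "Extract the people and their professions from the text below as a table.\n\n"
    "Text: Anna is a surgeon from Hamburg, her brother Marek repairs violins, "
    "and their neighbour Ms. Sato teaches mathematics.\n\n"
    "| Name | Profession |\n"
    "|"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=extract_prompt,
    max_tokens=80,
    temperature=0.0,
    stop=["\n\n"],
)
print("|" + response["choices"][0]["text"])
```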

Summarizing for a 2nd grader

This preset shows another level of comprehension, including the rephrasing of difficult concepts and sentences in clear words.

I tried Wittgenstein:


Screenshot: beta.openai.com // by: Merzmensch

A simple proverb can also be paraphrased convincingly:

Screenshot: beta.openai.com // by: Merzmensch

Or look at this clear and well-done rendering of Sigmund Freud's time-distancing concept:


Screenshot: beta.openai.com // by: Merzmensch

As you can see, compressing a text and "translating" it coherently is one of GPT-3's strengths.
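As a rough sketch, such a summarization prompt simply frames the passage with an instruction to rephrase it for a child. The wording below is illustrative, not the exact preset text, and the well-known Wittgenstein line stands in for the passages from the screenshots:

```python
# Sketch of the "explain it to a 2nd grader" pattern (framing text illustrative).
import openai  # api_key assumed to be set as in the first sketch

passage = "Whereof one cannot speak, thereof one must be silent."

summary_prompt = (
    f'My second grader asked me what this passage means:\n\n"""{passage}"""\n\n'
    'I rephrased it for him, in plain language a second grader can understand:\n\n"""'
)

response = openai.Completion.create(
    engine="davinci",
    prompt=summary_prompt,
    max_tokens=60,
    temperature=0.3,
    stop=['"""'],
)
print(response["choices"][0]["text"].strip())
```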

What about languages?

GPT-2 was already a great language model when it came to English. You could generate amazing texts, especially with the 1.5-billion-parameter model. I used GPT-2 for the screenplay of this short movie, and its absurdity can rather be understood as standing in the good tradition of David Lynch and Beckett:

The dialogues were logical, even if spontaneous. But that was English. If you tried inputs in other languages, you would hit a barrier of understanding. GPT-2 tried to imitate other languages, but you needed to fine-tune it on a text corpus in the specific language to get good results.
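For comparison, such language-specific fine-tuning of GPT-2 could be done, for example, with the gpt-2-simple library. This is a generic sketch rather than my exact workflow, and the corpus file name is hypothetical:

```python
# Sketch of fine-tuning GPT-2 on a corpus in another language with gpt-2-simple.
# "russian_corpus.txt" is a hypothetical plain-text file in the target language.
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="124M")        # small, English-pretrained base model
sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset="russian_corpus.txt",
              model_name="124M",
              steps=1000)                    # more steps = better adaptation

print(gpt2.generate(sess, return_as_list=True)[0])
```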

GPT-3 is different.

Its processing of other languages is phenomenal.

I tried German, Russian, and Japanese.

German.

It was actually my daughter who tried to get GPT-3 to write a fairy tale. She began with "Eine Katze mit Flügeln ging im Park spazieren" ("A cat with wings took a walk in a park").


Here is the full text.

The story that emerged was astonishingly well written, with irony, vivid characters, and some leitmotifs. This is not just a collection of topoi or loosely connected sentences. This is… a story!

Russian.


The full text is here.

I once trained GPT-2 on Pushkin's poetry and got some interesting neologisms, but it was a grammatical mess. Here I entered a few lines of a Pushkin poem, and the result I got was… interesting. It had no rhymes, but a stylistically intense power. It was not Pushkin's style, though. Yet it came almost without any mistakes or weird grammar. And… it works as poetry (especially if you are ready to interpret it).

Japanese.


Full text here.

This was something special. I entered just a random sentence:

今日は楽しい一日になりますように!と言いました。// I said: "May today be a fun day!"

And the result was a small story about prayer, happiness, wisdom, and financial investment, in well-written Japanese (neutral politeness form, like the input).

This means: GPT-3 is ready for multilingual text processing.

Various experiments (and warning signals).

ShakespAIre and writing poems

My first try was, of course, to have it write a Shakespearean sonnet. So the prompt was just:

here is a poem by Shakespeare

The result was this:


Screenshot: beta.openai.com // by: Merzmensch

Perfect iambic verse, great style, nice rhymes… if it weren't for one thing:

The first two lines are actually from Alexander Pope, The Rape of the Lock. And here we have a reason to be cautious: GPT-3 produces unique and unrepeatable texts, but it can also reuse whole quotes from existing texts it was trained on.

Re-examining the results is unavoidable if you want to guarantee that a text is unique.

I wonder if there could be something like the "Projection" feature of StyleGAN2, just the other way around: where StyleGAN2 compares an image against its latent space, GPT-3 would compare a generated text against the dataset it was trained on, to prevent accidental plagiarism.
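As far as I know, no such feature exists in the API. As a crude stand-in, one can at least scan a generated text for long verbatim n-gram overlaps with a reference corpus; the file names below are hypothetical:

```python
# Crude verbatim-overlap check: flags word-for-word n-grams that a generated
# text shares with a reference corpus. File names are hypothetical.

def ngrams(words, n):
    """All consecutive n-word windows of a token list."""
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def verbatim_overlaps(generated: str, reference: str, n: int = 8) -> set:
    """N-grams that appear word-for-word in both texts."""
    return ngrams(generated.lower().split(), n) & ngrams(reference.lower().split(), n)

generated_poem = open("gpt3_sonnet.txt").read()   # hypothetical GPT-3 output
reference = open("known_poems.txt").read()        # hypothetical corpus of sources

for phrase in sorted(verbatim_overlaps(generated_poem, reference)):
    print("possible quote:", phrase)
```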

But the thing is: GPT-3 can write poems on demand, in particular styles.

Here is another example:

Essays

As I didn't have access yet at that point, I asked a friend to have GPT-3 write an essay on Kurt Schwitters, a German artist and Dadaist:

The outcome: GPT-3 already has rich knowledge that can be recalled. It is not always reliable (you have to fine-tune it to get a perfect factual match), but it is still very close to the discourse.

Coding with GPT-3

Another mind-blowing possibility is using GPT-3 in quite different cases than just text generation:

You can get support with CSS:
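The pattern behind such demos is, again, just a descriptive prompt. Here is a sketch of what a natural-language-to-CSS call might look like; the prompt format is my own illustration, not the one used in the linked demos:

```python
# Sketch of natural-language-to-CSS generation (prompt format illustrative).
import openai  # api_key assumed to be set as in the first sketch

css_prompt = (
    "Description: a centered button with rounded corners, "
    "dark blue background and white text.\n"
    "CSS:\n"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=css_prompt,
    max_tokens=120,
    temperature=0.2,
    stop=["Description:"],
)
print(response["choices"][0]["text"])
```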

And calling it General Intelligence is already a thing:

Summary.

We are still at the beginning, but the experiments the AI community has made with GPT-3 show its power, potential, and impact. We just have to use it with reason and good intentions. But that's the human factor, which is not always the best one.

For more wonderful text experiments, I highly recommend reading Gwern:

Let the journey continue!

