AI-driven Creative Apps for You and Your Kids: Tell a story.

GPT-2, a model released by OpenAI last year, had a significant impact on Natural Language Processing. Meanwhile, the full version with 1.5B parameters is out. Trained on 40 GB of text, it convinces with its narrative quality. (Of course, there is Google's BERT, but GPT-2 is, IMHO, way more interesting if you're looking for chaotic but consistent fiction.)

An interesting and important aspect of GPT-2 is its transformer-driven self-attention mechanism: the generated text is aligned with the initial human prompt and then stays coherent within itself. So you can control the initial tendency or focus of the text (which becomes gradually more sophisticated).
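To make the prompt-conditioning idea concrete, here is a toy sketch. GPT-2 itself uses transformer self-attention over a 50k-token vocabulary; in this illustration, a tiny hand-written bigram table stands in for the model, only to show the principle that each new token is chosen based on everything generated so far, so the prompt steers the whole continuation.

```python
# Toy stand-in for an autoregressive language model: a bigram table.
# (Purely illustrative; GPT-2's actual "table" is learned and
# conditioned on the full context via self-attention.)
bigrams = {
    "once": "upon", "upon": "a", "a": "time", "time": "there",
    "there": "lived", "lived": "the", "the": "king",
}

def generate(prompt, steps):
    """Extend the prompt one token at a time, each step conditioned
    on the text produced so far."""
    tokens = prompt.split()
    for _ in range(steps):
        nxt = bigrams.get(tokens[-1])
        if nxt is None:  # model has nothing to say anymore
            break
        tokens.append(nxt)
    return " ".join(tokens)

print(generate("once upon a", 10))
# → once upon a time there lived the king
```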

There are many ways to use GPT-2; I recommend two of them. In both cases, it's the original model, not fine-tuned on additional sources.


01. Talk to Transformer

The most intuitive implementation is TalkToTransformer.com, provided by Adam King. All you have to do is enter the first sentence or a few words, and the model continues the text.

Thanks to self-attention, you can, for example, evoke a fairy-tale generation by writing, "Once upon a time there was a king".


As you can see, the results can be hilarious, surreal, and absurd, but there is also the danger of profanity. If that is not an issue for you, you can generate texts together with your kids. And of course, you can "curate" the texts for your collaborative writing.

PROs:
✓ easy to use
✓ some presets are already available
✓ the system implements the full 1.5B model
✓ quick and available even via mobile

CONTRAs:
✗ (actually only one:) it generates only short text pieces

02. GPT-2 Colab Notebook

If you want to give your kid a first impression of Python, and also of how GPT-2 works behind the scenes, use Colab Notebooks (my general advice anyway, with so many wonderful Colab Notebooks out there).

There are also various GPT-2 Colab Notebooks out there, but I suggest trying this implementation: GPT-2 with JavaScript Interface.


The reasons I like this implementation are:

  • you can modify the text length and other parameters
  • if you are lucky with the Google GPU assignment, you even get the full model with 1.5B parameters (as in the screenshots above)
  • the JavaScript interface is intuitive compared to other implementations (in case your kid wants to play around with GPT-2)

On the parameters (according to the definitions by the GitHub folks):

temperature: Float value controlling randomness in the Boltzmann distribution. Lower temperature results in less random completions. As the temperature approaches zero, the model becomes deterministic and repetitive. Higher temperature results in more random completions.

top_k: Integer value controlling diversity. 1 means only one word is considered at each step (token), resulting in deterministic completions, while 40 means 40 words are considered at each step. 0 is a special setting meaning no restriction. 40 is generally a good value.

Source: https://github.com/openai/gpt-2/issues/27
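The two knobs are easier to grasp in code. Here is a minimal pure-Python sketch of temperature scaling plus top-k filtering over a handful of logits; it is not OpenAI's actual sampler (which works on the full 50k-token vocabulary in TensorFlow), just the same idea in miniature.

```python
import math
import random

def sample_next(logits, temperature=0.9, top_k=40):
    """Pick one token id from raw logits using the two knobs above."""
    # temperature rescales the logits: low -> sharper (more
    # deterministic), high -> flatter (more random)
    scaled = [l / max(temperature, 1e-8) for l in logits]
    # softmax with max-subtraction for numerical stability
    m = max(scaled)
    probs = [math.exp(s - m) for s in scaled]
    total = sum(probs)
    probs = [p / total for p in probs]
    # top-k filter: keep only the k most likely tokens (0 = keep all)
    if top_k > 0:
        kth = sorted(probs, reverse=True)[min(top_k, len(probs)) - 1]
        probs = [p if p >= kth else 0.0 for p in probs]
        total = sum(probs)
        probs = [p / total for p in probs]
    # draw one token id according to the filtered distribution
    return random.choices(range(len(logits)), weights=probs, k=1)[0]

# top_k=1 is fully deterministic: always the most likely token
print(sample_next([1.0, 5.0, 2.0], top_k=1))  # → 1
```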

My favorite settings:

temperature = 0.9 / top_k = 80 / generate how much = 1000

PROs:
✓ easy to use
✓ longer coherent texts are possible (self-attention mechanism!)
✓ the system implements various models, including the 1.5B-parameter one
✓ more possibilities to modify parameters

CONTRAs:
✗ initialization and text generation take time
✗ the bigger model may generate sudden profanity and other weird stuff, probably not suitable for your kids; you have no control over the results

Language barriers? Assuming your kids aren't native English speakers, that isn't a problem: you can use the DL-driven translation service DeepL.com to translate the text into your language.
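Besides the web interface, DeepL also offers a REST API, so the translation step can be scripted. The sketch below only assembles a request body following DeepL's public v2 API; the auth key is a placeholder, and the actual network call is left commented out so you can plug in your own (free-tier) key.

```python
# Sketch: preparing a DeepL v2 translation request for a generated story.
import urllib.parse

DEEPL_AUTH_KEY = "your-key-here"  # placeholder, replace with your key
API_URL = "https://api-free.deepl.com/v2/translate"

def build_request(story_text, target_lang="DE"):
    """Assemble the form-encoded body for one translation call."""
    params = {
        "auth_key": DEEPL_AUTH_KEY,
        "text": story_text,
        "target_lang": target_lang,  # e.g. DE, FR, RU, JA ...
    }
    return API_URL, urllib.parse.urlencode(params)

url, body = build_request("Once upon a time there was a king.")
# urllib.request.urlopen(url, data=body.encode()) would send it
# and return JSON with the translated text.
```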

And of course, you can even let the AI read the story aloud in many languages using the Amazon AWS Polly service (with around 30 languages available at the moment).
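The read-aloud step can also be scripted with boto3, AWS's Python SDK. A minimal sketch, assuming you have AWS credentials configured; the voice names below are real Polly voices, but treat the small mapping as an illustrative selection, not the full list of supported languages.

```python
# Sketch: reading the (translated) story aloud with AWS Polly.
VOICE_FOR_LANG = {
    "en": "Joanna",   # US English
    "de": "Marlene",  # German
    "fr": "Celine",   # French
    "ru": "Tatyana",  # Russian
}

def pick_voice(lang_code):
    """Map a two-letter language code to a Polly voice id."""
    return VOICE_FOR_LANG.get(lang_code, "Joanna")

def read_aloud(story_text, lang_code="en"):
    """Synthesize the story to story.mp3 (needs AWS credentials)."""
    import boto3  # imported lazily so the helper above works offline
    polly = boto3.client("polly")
    response = polly.synthesize_speech(
        Text=story_text,
        OutputFormat="mp3",
        VoiceId=pick_voice(lang_code),
    )
    with open("story.mp3", "wb") as f:
        f.write(response["AudioStream"].read())
```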

Index: AI-driven Creative Apps for You and Your Kids.

This is an edited version of my Essay at Towards Data Science.
