Adam Johnson


Fun with GPT-3

2020-07-18 Deep, deep learning

Here’s a conversation I had earlier with the late Ernest Hemingway:

Me: Please help me word this better: “Django is made with Python, a free and open source programming language that was made by Guido van Rossum in the 1990’s. Python’s name is a reference to the Monty Python comedy series and Django’s name is a reference to the Belgian jazz guitarist. Django can help you make any website.”

Ernest Hemingway: Django is a tool that helps developers build web sites and web applications. It is made with Python, which is a programming language that allows you to write code to do things for you. Django can help you make any website.

Naturally, this wasn’t the real Hemingway, but a simulation of him using GPT-3, a transformer-based language model. The model receives the first paragraph as input, plus the “Ernest Hemingway:” prompt, and predicts the words that follow.
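To make “predicts the next words” concrete, here’s a minimal sketch of the same autoregressive loop using a toy bigram model. This is obviously not GPT-3’s architecture — a 175-billion-parameter transformer replaces the frequency table — but the predict-one-word-and-append step is the shared idea. The tiny corpus is made up for illustration.

```python
from collections import Counter, defaultdict

# Toy corpus, purely for illustration.
corpus = "django is made with python . python is a programming language .".split()

# Count which word follows which: a bigram frequency table.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    return bigrams[word].most_common(1)[0][0]

def complete(prompt_word, n_words=4):
    """Greedily extend a prompt, one predicted word at a time."""
    words = [prompt_word]
    for _ in range(n_words):
        words.append(predict_next(words[-1]))
    return " ".join(words)

print(complete("django"))  # → django is made with python
```

GPT-3 does the same loop over tokens rather than whole words, and predicts from the entire preceding context rather than just the last word — which is why it can stay in character as “Ernest Hemingway” across a whole conversation.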

I’m pretty impressed with the result, which is from the first attempt (no cherry-picking). GPT-3 Hemingway dropped the whimsical facts and got straight to the point about Django. It also decided Python should be expanded on a little more.

GPT-3 was created by OpenAI as the successor to last year’s GPT-2. It’s new enough that it doesn’t yet feature on the OpenAI blog, but the paper and GitHub repository are available. That has been enough to spawn some small projects using it.

The project I used above to get GPT-3 Hemingway to suggest copy improvements was Learn from Anyone by McKay Wrigley. It’s a chatbot you give a name and then ask questions; the idea is that GPT-3 completes the conversation as the person you named.
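Under the hood, a bot like this only needs to assemble a transcript that ends with the persona’s name, so the model’s most natural continuation is that person’s reply. I haven’t seen Learn from Anyone’s actual source, so the prompt wording below is an assumption — just a sketch of the general technique:

```python
def build_prompt(persona, turns):
    """Assemble a chat-style completion prompt.

    `persona` is the name the user supplied; `turns` is the conversation so
    far as (speaker, text) pairs. The exact format is a guess at how a bot
    like Learn from Anyone might do it, not that project's real code.
    """
    lines = [f"{speaker}: {text}" for speaker, text in turns]
    # End with the persona's name so the model completes their reply.
    lines.append(f"{persona}:")
    return "\n".join(lines)

prompt = build_prompt(
    "Ernest Hemingway",
    [("Me", "Please help me word this better: ...")],
)
print(prompt)
# Me: Please help me word this better: ...
# Ernest Hemingway:
```

Whatever GPT-3 generates after that trailing “Ernest Hemingway:” line becomes the bot’s answer, which is why the same mechanism works for any name you type in.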

The amount of embedded knowledge is pretty amazing, although it can imagine a lot of things too. For example, I tried to see if GPT-3 is self-aware. When I asked it to describe itself, it went a bit off course and described a bot for playing the game of Go:

Me: What is GPT-3? How does it compare to GPT-2?

Open AI: Great question. I can break it down. GPT is the Go Playing the game and the number the model is numbered with. So GPT-3 is the third model to play Go, but not the third model of GPT…

Whilst Go is a game that AIs seem to have mastered, it’s not quite what I was looking for. This Go bot seems made up, but the description it gave was very self-consistent (I cut several sentences).

One other experiment I tried with Learn from Anyone was asking it to reply as James Bennett on the Django ORM. Whilst far from accurate, it picked up that we were talking about the Python web framework and not Django the musician or Django Unchained the movie. It then proceeded to generate a bunch of Python-ish code with interleaved descriptions. I posted this on Twitter.

I’ve had some fun looking at other GPT-3 resources this week.

Gwern’s page on GPT-3 collects many experiments and observations on trying to get GPT-3 to complete certain tasks. It’s long, but the first sections are really worth reading and the remainder covers many different tasks. I especially enjoyed the Harry Potter literary parodies.

Gwern’s main observation is that GPT-3 can do well on many different tasks given the right prompt. I suspect the Learn from Anyone bot could be improved with more text in its underlying prompt.
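“More text in the underlying prompt” typically means priming the model with a short preamble and a worked example exchange before the user’s real question, so the model has a pattern to imitate. All the wording in this sketch is hypothetical — my own guess at the technique, not Gwern’s text or the bot’s:

```python
# Hypothetical priming text -- a description of the persona's style...
PREAMBLE = (
    "The following is a conversation with Ernest Hemingway. "
    "He answers plainly, in short declarative sentences.\n"
)

# ...plus one example exchange for the model to imitate.
EXAMPLE = (
    "Me: How should I open a story?\n"
    "Ernest Hemingway: Start with one true sentence.\n"
)

def primed_prompt(question):
    """Few-shot style prompt: preamble + example exchange + real question."""
    return PREAMBLE + EXAMPLE + f"Me: {question}\nErnest Hemingway:"

print(primed_prompt("How do I cut needless words?"))
```

Compared with sending the bare question, the preamble and example steer both the content and the voice of the completion — which matches Gwern’s observation that prompt design does much of the work.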

GPT-3 reminds me that we have barely seen the start of “the deep learning revolution”. I have been using TabNine, a GPT-2-based programming autocomplete tool, for about a year, and at this point I cannot imagine life without it. It can complete whole lines of code and learns from any local repetition to suggest the next steps. I cannot wait to use the GPT-3-based version.

One project has already started applying GPT-3 to code generation: debuild.co, a React website code generator built by Sharif Shameem. The only prompt it needs is a plain English description of the desired application, and it writes functional code. See this video for the latest demonstration. If such code generation works too well, I might be out of a job soon.

GPT-3 and its descendants will have many applications. Programmer tools have appeared first, but so much of what humans do uses the medium of text. Everyone will be affected by GPT-3 in time.

Fin

May you find yourself on the right side of the AI revolution,

—Adam


Working on a Django project? Check out my book Speed Up Your Django Tests which covers loads of best practices so you can write faster, more accurate tests.

