9 Comments
Radu Floricica:

To put this in context, there's probably not enough media in existence to train a GPT-6. There may not even be enough for GPT-5. At some point, GPT-x tech will turn from an exponential into a logistic curve, just like everything else, and will have to fall back on something more... inductive.

Andy Crouch:

This post has a vivid, specific conceit (or a couple of them – the one I'm primarily thinking of is the question "what GPT-n could do this?") paired with vivid, specific opinions, and together they make it punchy and memorable. Which is not so different from comedy itself.

And I don't see ANY pathway from the GPT outputs I am currently seeing or generating to something that could write this post.

I guess I have no doubt that some future GPT-n will be a good cover band, and thus perhaps it could be a good 68-year-old Jerry Seinfeld. I don't see the evidence that it is in any way progressing toward becoming a 30-year-old Jerry Seinfeld — which I guess is your "6"-level standup comedy. What are your reasons for thinking it will go in that direction rather than become an ever more high-fidelity echo of what has already been done and said?

Kenny:

My own intuition is that comedy, and creativity more generally, is the output of something GPT-like for prompts basically of the form 'X but Y'. The most recent GPT seems impressively capable of that already. I also believe/suspect that the GPTs are being deliberately 'handicapped' in terms of the style/form of the output they're trained to generate and, without those constraints, they might be even 'scarier' than they already are.

Radu Floricica:

I didn't say it will - if anything, I gave an argument that it might not be able to. GPT is not so much "smart" as it is doing a massive amount of pattern-matching. If it had a 100-billion-person fully digital civilization's worth of training data to rely on, then it could (conceivably) breeze through Seinfeld routines, because it would have enough "feel" for what a good comedy routine is.

But GPT is not the end of AI. It'll definitely be a component of the final product, in the same way we use pattern matching for a good chunk of our daily lives (see the kiki/bouba effect). But we also have a System 2, which AI doesn't have yet. It's not a risky guess that the real deal will arrive when somebody manages to make a Bayesian engine work with GPT.

EBS:

GPT-3 Seinfeld is already pretty impressive: https://www.youtube.com/watch?v=1onxri0duN0

James Cropcho:

> I ran the same prompt about 5 or 6 times and this was the best.

[https://www.youtube.com/watch?v=1onxri0duN0&lc=UgxW0BlU6Y-9WVkHYX94AaABAg.9cVkdeZmmgF9cVm8MxQgzD]

I expected something like this. The results are less impressive once one learns the routine was not a GPT solo project but rather an AI/human collaboration.

James Cropcho:

A useful working definition of "boring person," for subjective use relative to a given Listener: a Person is "boring" if the Listener's internal model of Person's speech is good enough that Person says nothing which further improves Listener's model.
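That definition has a natural information-theoretic reading: a person is boring to a listener when their utterances carry almost no surprisal under the listener's predictive model. Here is a minimal sketch of that reading; the threshold, the toy probabilities, and the dictionary-as-model representation are all illustrative assumptions, not anything from the comment itself.

```python
import math

def surprisal(model, utterance):
    # Bits of information the utterance carries under the listener's model.
    # Unmodeled utterances get a tiny probability, hence huge surprisal.
    return -math.log2(model.get(utterance, 1e-9))

def is_boring(model, utterances, threshold=1.0):
    # "Boring": the listener's model predicts the person so well that
    # their speech carries almost no new information on average.
    avg_bits = sum(surprisal(model, u) for u in utterances) / len(utterances)
    return avg_bits < threshold

# Hypothetical listener model: predicted probabilities of Person's utterances.
listener_model = {"nice weather": 0.6, "how about them sports": 0.5}

# Well-predicted small talk: boring.
print(is_boring(listener_model, ["nice weather", "how about them sports"]))
# An utterance the model never anticipated: not boring.
print(is_boring(listener_model, ["I just proved a new theorem"]))
```

The asymmetry in the original definition is preserved: boringness is relative to a particular listener's model, so the same person can be boring to one listener and surprising to another.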

Bill Benzon:

For what it's worth, I had indirect access to GPT-3 through a friend, and had him ask GPT-3 to interpret a Seinfeld routine, the first one he'd ever performed on TV. It was an interesting conversation, but GPT-3 didn't quite get it. I ran the same routine by ChatGPT, and he (it, they – what pronouns should I use? Should I ask? Surely someone's done that) got it right off the bat: "Screaming on the flat part of the roller coaster ride: From GPT-3 to ChatGPT" – https://new-savanna.blogspot.com/2022/12/screaming-on-flat-part-of-roller.html

Harvey Bungus:

Consider watching Jerrod Carmichael's 8, which is at least an 8.
