9 Comments

I still don’t really understand how GPTs are different from regular ChatGPT.

Are GPTs just an ongoing chat within ChatGPT that you can share with other people, plus embeddings?

Or is there more to it?

author

It involves a bunch of custom instructions, provided context, scripted actions and so on, I think? But I haven't gotten to play with them yet. It does seem like a big deal.

I believe it's ChatGPT + embeddings + a custom system prompt + a custom set of plugins (you can specify whether your custom GPT is able to use web browsing, Code Interpreter, etc., including third-party plugins or your own custom plugins).

I believe it is *not* a shared ongoing chat history.

Think of it as a simple, no-code way of creating a chat app that is fine-tuned for a particular use case. Not literally fine-tuned in the LLM training sense, but using embeddings + system prompt to get partway there for extremely minimal effort.
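
To make the "embeddings + system prompt gets you partway to fine-tuning" idea concrete, here's a minimal Python sketch of the general pattern. It's purely illustrative and assumes nothing about how OpenAI actually implements GPTs: embed() is a stand-in bag-of-words function rather than a real embedding model, and the documents, retrieval, and system prompt are all made up.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Placeholder "embedding": bag-of-words counts. A real system would use
    # a learned embedding model; this is just enough to show the shape.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# A tiny made-up "knowledge base" the custom chat app should answer from.
documents = [
    "Refund policy: purchases can be returned within 30 days of delivery.",
    "Shipping: orders are dispatched within 2 business days.",
]
index = [(doc, embed(doc)) for doc in documents]

def build_messages(question: str, k: int = 1) -> list[dict]:
    # Retrieve the k most relevant chunks and prepend them to a custom
    # system prompt: the "partway to fine-tuning" trick described above.
    q = embed(question)
    top = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)[:k]
    context = "\n".join(doc for doc, _ in top)
    return [
        {"role": "system",
         "content": "You are SupportGPT. Answer only from this context:\n" + context},
        {"role": "user", "content": question},
    ]

print(build_messages("How many days do I have to return purchases?"))
```

The model itself never changes; only the prompt it sees does, which is why this is so much cheaper than actual fine-tuning.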

The system prompt is pretty easy to customize already, so it seems like the biggest benefit is an easy way to add embeddings to your GPT. All of the other stuff already seems available in ChatGPT?

From a technical perspective, yeah, it doesn't seem to be a huge deal. In addition to embeddings, there's a better plugin model, and the ability to specify precisely which plugins are available in a given custom GPT.

The potentially big deal is the ease of use – both to create a custom GPT, and to use one. Sure, it was already possible to customize the system prompt, but now (I gather, haven't tried it) you can literally just give someone a URL that leads them to a new chat session with a specific system prompt. Your company can have an internal web page with links to 20 handy system prompts. And you can bundle embedding data + custom plugins as well, all behind the same URL (see the sketch after this comment for roughly what that bundle amounts to). Don't underestimate the impact of making something very very very easy to use.

Also on the ease-of-use front, I gather you can create custom GPTs by having a chat with an OpenAI bot, which will interview you about your goals and then go so far as to write the system prompt for you. This could be huge for a wide category of nontechnical users.
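
To show what that bundle amounts to, here's a hedged sketch using the plain Chat Completions endpoint from the openai Python package (v1.x): one set of custom instructions plus an explicit, fixed list of tools. This is not OpenAI's implementation of GPTs, and the GPT product adds things this leaves out (file retrieval, browsing, the shareable URL itself); the tool, model name, and prompt below are all made-up placeholders.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# One bundle: custom instructions plus an explicit, fixed list of tools.
# Everything here is a made-up placeholder, not a real plugin or product name.
RECIPE_GPT = {
    "instructions": "You are RecipeGPT. Suggest recipes and always list allergens.",
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "search_recipes",  # hypothetical tool
                "description": "Search a recipe database by ingredient.",
                "parameters": {
                    "type": "object",
                    "properties": {"ingredient": {"type": "string"}},
                    "required": ["ingredient"],
                },
            },
        }
    ],
}

def ask(question: str):
    # Every new conversation starts from the same bundled configuration,
    # which is roughly what sharing a custom GPT hands to the next user.
    return client.chat.completions.create(
        model="gpt-4-turbo",  # assumed model name; substitute whatever you use
        messages=[
            {"role": "system", "content": RECIPE_GPT["instructions"]},
            {"role": "user", "content": question},
        ],
        tools=RECIPE_GPT["tools"],
    )

print(ask("What can I make with chickpeas?").choices[0].message)
```

Sharing a GPT is, in effect, handing someone a fixed configuration like this so every new chat starts from it.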

Oh yes, agreed that ease of use is a big deal, and I don't mean to downplay the feature; I'm just trying to figure out what the *is* is.

So how close are we to being able to do that possibly fake thing from a few posts back where we showed it a prescription and it guessed the medication so I can finally realize my super villain origin story I mean nuke my entire profession I mean finally get my colleagues to stop being so smug and complacent about technology I mean catch up on sleep on the job I mean Free Up Time To Provide More Patient Centered Pharmacy Care?

author

We're talking price, I'd assume? So: not far from being able to do a non-zero amount of it, but far from doing it reliably, and false positives in particular are a problem.

(Less jokey here) I was thinking more in the realm of whether it can be done at any kind of scale at all (yet). Assuming the earliest iterations are still going to be supervised by an actual person, the false positive rates just need to be not atrocious.

I guess if an actual organization wanted to do it, price would be an issue, and you're going to need an army of lawyers to put it into real use, but I was also thinking more building-a-robot-pharmacist-in-my-garage here.
