item 39203789

mattew | 2 years ago

This right here is actually the coolest part about developing with LLMs. You just changed the functionality with a sentence rather than a config file, or writing code. It’s great to be able to break functionality out into things that can be easily handled in English (or your human language of choice) versus what should be done in code.
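The contrast mattew describes can be sketched in a few lines. This is a hypothetical illustration (the function and variable names are invented, and the actual LLM call is omitted): the same behavior change expressed once as structured config values and once as a plain-English instruction.

```python
# Hypothetical sketch of the two styles; no real LLM API is called here.

# Config-driven: structured and checkable, but every new knob
# means touching a schema and the code that reads it.
config = {"tone": "formal", "max_words": 50}

def build_request_from_config(text: str) -> str:
    """Assemble a prompt from structured config values."""
    return (f"Summarize in a {config['tone']} tone, "
            f"under {config['max_words']} words:\n{text}")

# Prompt-driven: the "changed the functionality with a sentence" case.
# Editing this one English string alters behavior with no schema change.
instruction = "Summarize this in a formal tone, in under 50 words."

def build_request_from_instruction(text: str) -> str:
    """Prepend a plain-English instruction to the user's text."""
    return f"{instruction}\n{text}"
```

The trade-off is that the config version can be validated before it ever reaches the model, while the sentence can only be judged by what the model does with it.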

digging | 2 years ago

I think it's the worst part, because it's completely inscrutable. Ask for the same thing in different wording and get a different response. Ask for a similar thing and get stonewalled. A config file has structure which you can (in theory) learn perfectly from documentation or even from your IDE while writing the file. None of that is true of asking in plain English.

I feel in some ways current LLMs are making technology more arcane. Which is why people who have the time are having a blast figuring out all the secret incantations that get the LLM to give you what you want.

Terr_ | 2 years ago

> I feel in some ways current LLMs are making technology more arcane. Which is why people who have the time are having a blast figuring out all the secret incantations

Yeah, there's an important gap between engaging visions of casting cool magic versus (boring) practical streamlining and abstracting-away.

To illustrate the difference, I'm going to recycle a rant I've often given about VR aficionados:

Today, I don't virtually fold a virtual paper to put it in a virtual envelope to virtually lick a virtual stamp with my virtual tongue before virtually walking down the virtual block to the virtual post office... I simply click "Send" in my email client!

Similarly, it's engaging to think of a future of AI-Pokemon trainers--"PikaGPT, I choose you! Assume we can win because of friendship!"--but I don't think things will actually succeed in that direction because most of the cool stuff is also cruft.

chankstein38 | 2 years ago

Yeah, until the notoriously unreliable ChatGPT forgets that it's supposed to follow that and starts giving you some CYOA text.

behnamoh | 2 years ago

Until the LLM starts hallucinating its own instructions and "filling in the blanks".