

11217mackem | 2 years ago

GPT-2 was the first model to generate long-form, coherent text.

The research led to a technology with a simple use case: input / output.

The "product" is a chatbot. input / output.

There is no innovation in interface or product design.

All the magic – all the "product innovation" – is in its ability to create coherent text output.

(OpenAI's UI is not really that great).

All attempts at "product" (as in serving a specific use case tailored to a specific business or commercial workflow) that OpenAI have led to date – i.e. plugins – have been a huge flop.

I am uncertain who is behind the Assistants interface design that makes RAG so easy. Its usability is underpinned by the larger context window of GPT-4T.
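For readers unfamiliar with the pattern: RAG (retrieval-augmented generation) just means retrieving the most relevant document chunks and stuffing them into the model's context window alongside the question – which is why a larger context window makes it so much easier. A minimal sketch below; the scoring and function names are illustrative toys, not any real OpenAI API (real systems use embeddings, not word overlap):

```python
# Toy RAG pipeline: score chunks for relevance, keep the top-k,
# pack them into a prompt for the model. Illustrative only.

def score(chunk: str, query: str) -> int:
    # Toy relevance: count overlapping words between chunk and query.
    q = set(query.lower().split())
    return len(q & set(chunk.lower().split()))

def retrieve(chunks: list[str], query: str, k: int = 2) -> list[str]:
    # Keep the k most relevant chunks (the "retrieval" step).
    return sorted(chunks, key=lambda c: score(c, query), reverse=True)[:k]

def build_prompt(chunks: list[str], query: str) -> str:
    # Stuff the retrieved chunks into the context window (the "augmentation" step).
    context = "\n".join(retrieve(chunks, query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

docs = [
    "GPT-2 generates long-form coherent text.",
    "The transformer was introduced by Google researchers.",
    "RAG stuffs retrieved documents into the context window.",
]
print(build_prompt(docs, "How does RAG use the context window?"))
```

The whole trick is in `build_prompt`: a bigger context window means more (and larger) retrieved chunks fit, so retrieval quality matters less.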

All of the magic is in the underlying technology. And I think Ilya's at the core of it. Yes, "Attention Is All You Need" is the Google paper that introduced the transformer.

Ilya also worked at Google and was a key part of that ecosystem, and I imagine he made indirect (and most likely very direct) contributions.

He is one of the most cited computer scientists of all time.

Without Ilya, I don't think we'd have the hype right now.

Someone else would have reached the same conclusions eventually, I'm sure.
