aik | 2 years ago
Given gpt-4 is already ridiculously useful and we’ve barely scratched the surface, it makes complete sense to me. More capacity + faster gpt responses unlocks massive amounts of more potential/use cases.
azinman2 | 2 years ago
No one is even talking about these kinds of figures being used for climate change, which is a far more pressing problem.
HeatrayEnjoyer | 2 years ago
Geometric growth doesn't feel like much until you slam into the wall.
__loam | 2 years ago
Where's the killer app? The only one I can think of offhand is Copilot, and the reception I've seen is that it's pretty mid. Most of the proposed applications require human checking to get right, which is a huge limitation to the adoption of these systems unless you accept a 3-5% error rate, which is terrible. I've not met anyone who is interested in something like a book written using this thing, and the main use case I've seen basically amounts to denial-of-service attacks with believable bullshit.
Frankly, the only people I've seen who are super excited about this stuff are people in the field or the uninformed.
aik | 2 years ago
In terms of error rate: GPT-3.5 had a high hallucination rate that made its use cases fairly narrow. It then got faster, which opened up some more use cases. Then GPT-4 came out with a significantly lower hallucination rate, which opened up a gigantic number of additional possibilities, and it had a larger context window and output size that made it significantly more useful. Then it got faster with an even larger context size… each of these iterative improvements keeps adding more and more possibilities across a gigantic range of cases that have literally never existed before.