As if anyone is good at predicting the future. Please can we stop acting like expertise equates to fortune telling capabilities?! Nobody has any clue what a 1000x sized GPT model could do, and anybody who makes strong claims is a charlatan. In this age of paranoid AI risk cultists we need to cultivate humility and calm, a willingness to follow data rather than beliefs and predictions.
p-e-w|2 years ago
There is broad consensus among experts that a hypothetical strong AI would be a threat, and potentially an existential threat, to humanity. While not everyone agrees on details like timeline and alignment issues, the idea that AI is dangerous is not a cult, it's the mainstream view.
Climate scientists cannot "predict the future" with certainty either. That doesn't mean their warnings are hot air, and neither are the warnings from AI safety experts. It seems like the educated masses are currently in denial about AI in much the same way as the uneducated masses have been in denial about climate change for a while.
Risk assessment doesn't require understanding. I don't have to understand how a venomous snake senses prey in order to know that the snake is a potential threat to me. In fact, the less I know about the snake, the higher the assessed risk should be, since the uncertainty is higher as well.
goldenshale|2 years ago
We can understand the physics of greenhouse gases and take measurements of earth systems to build evidence for models and theories. (Many of which are nonetheless very inaccurate beyond short time horizons.) Can you show me any evidence for AI risk today beyond people's theories and beliefs?
The best predictor of the future is the past, not people's wild ideas about what the future could be. I'm not about to sit here feeling scared just because there is more uncertainty about whether our matrix multiplications are about to go rogue. There are no AGI experts or AI risk experts, because we don't have any of these systems to study and analyze. What we have is people forming beliefs about their own predictions about systems that are unknowable.
civilized|2 years ago
There are two definitions of expertise:
1. Knowing more than most people about a topic. This is the type of expertise that wins the Quiz Bowl.
2. Actual mastery of a field, such that predictions and analyses generated by a person possessing such mastery are reliable. This is the type of expertise that fixes your home or car.
The first definition is easily verifiable, and due to the availability heuristic, it is often presented as a legitimate proxy for the second. But it isn't really, not in general.
If I know more about horoscopes than most people, I am a horoscope expert. But it doesn't mean I can be relied on to predict any of the things horoscopes supposedly predict. It's the same with AI risk. Expertise in AI risk is not a basis for credibility because AI risk is not a real scientific field.
Climate change is a real field of science. AI risk is Nostradamic prognostication by people who know more than you.
antifa|2 years ago
I wish a cool scifi robot woke up one day and violently optimized all of humanity into paperclips, instead I live in the real world where the jobs are going to evaporate like water in a newly installed desert and the "let them eat cake" will get increasingly louder and blue-check-markier.
ilaksh|2 years ago
I love GPT, and my whole life and plans are based on AI tools like it. But that doesn't mean that if you make it, say, 50% smarter and 50 times faster, it can't cause problems for people. Because all it takes is systems with superior reasoning capability being given an overly broad goal.
In less than five years, these models may be thinking dozens of times faster than any human. Human input or activities will appear to be mostly frozen to them. The only way to keep up will be deploying your own models.
So to effectively lose control you don't need the models to "wake up" and become living simulations of people or anything. You just need them to get somewhat smarter and much faster.
We have to expect them to get much, much faster. The models, software, and hardware for this specific application all have room for improvement. And there will be new paradigms/approaches that are even more efficient for this application.
For hyperspeed AI to not come about would be a total break from computing history.
goldenshale|2 years ago
This pseudo-intellectual belief structure is very cult-like. It's an end-of-the-world scenario that only an elite few can really understand, and they, our saviors, our band of reluctant nerd heroes, are screaming from the pulpit to warn us of utter destruction. The actual end of days. These "black box" (er, I mean, we engineered them that way after decades of research, but no, nobody really understands them, right?) shoggoths will be so incredibly brilliant that they will be able to dominate all of humanity. They will understand humans so well as to manipulate us out of existence, yet they will be so utterly stupid as to pursue paper clips at all cost.
Maybe instead these models will just be really useful software tools to compress knowledge and make it available to humanity in myriad forms, for us to develop a next level of civilization on top of? People will become more educated and wise, the cost of goods and services will drop dramatically, thereby enriching all of humanity, and life will go on. There are straighter paths from where we are today to this set of predictions than there are to many of the doomsday scenarios, yet it has become hip among the intelligentsia to be concerned about everything. Being optimistic is somehow not seen as serious (although the progress of civilization serves as great evidence that optimism is indeed rational), while being a loud-mouthed scaremonger, or a quiet, very serious and concerned intellectual, is seen as respectable. Forget that. All the doomers can go rot in their depressive caves while the rest of us build a badass future for all of humanity. Once Hale-Bopp has passed over, I hope everyone feels welcome to come back to the party.
ignoramous|2 years ago
Whoever predicts the right direction and, when the time is right, puts money where their mouth is, stands a shot at unseating... the alt man.
paulgraham.com/ambitious.html
JimtheCoder|2 years ago