The best part of this is that, a couple of years ago, I watched Sam Altman respond to a question about energy consumption by saying he really thinks fusion is only a short time away. That was the moment I knew he's a quack.
Not to be anti-YC on their forum, but the VC business model is all about splashing cash on a wide variety of junk that will mostly be worthless, hyping it to the max, and hoping one or two turn out to be like Amazon or Facebook. He's not an engineer; he's like Steve Jobs without the good parts.
Altman recently said, in response to a question about the prospect of half of entry-level white-collar jobs being replaced by "AI" and college graduates being put out of work by it:
> “I mean in 2035, that, like, graduating college student, if they still go to college at all, could very well be, like, leaving on a mission to explore the solar system on a spaceship in some completely new, exciting, super well-paid, super interesting job, and feeling so bad for you and I that, like, we had to do this kind of, like, really boring old kind of work and everything is just better."
Which should be reassuring to anyone having trouble finding an entry-level job as an illustrator or copywriter or programmer or whatever.
Fusion is 8 light-minutes away. The connection gets blocked often, so methods to buffer power for those periods are critical, but they're getting better so it's gotten a lot more practical to use remote fusion power at large scales. It seems likely that the power buffering problem is easier to solve than the local fusion problem, so more development goes to improving remote fusion power than local.
Sam is an investor in a fusion startup. In any case, how long it takes us to get to working fusion is proportional to the amount of funding it receives. I'm hopeful that increased energy needs will spur more investment into it.
People saying that usually mean it as "AI is here and going to change everything overnight now" yet, if you take it literally, it's "we're actually over 50 years into AI, things will likely continue to advance slowly over decades".
The common thread between those who take things as "AI is anything that doesn't work yet" and "what we have is still not yet AI" is "this current technology could probably have used a less distracting marketing name choice, where we talk about what it delivers rather than what it's supposed to be delivering".
Machine learning as a descriptive phrase has stopped being relevant. It implies the discovery of information in a training set. The pre-training of an LLM is most definitely machine learning. But what people are excited and interested in is the use of this learned data in generative AI. “Machine learning” doesn’t capture that aspect.
But the things we try to make LLMs do after pre-training are primarily achieved via reinforcement learning. Isn't reinforcement learning machine learning? Correct me if I'm misconstruing what you're trying to say here.
That was an impressive takeaway from the first machine learning course I took: many things previously under the umbrella of artificial intelligence have since been demystified and demoted to implementations we now just take for granted. Some examples were real-world map route planning for transport, locating faces in images, and Bayesian spam filters.
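To make one of those demystified examples concrete, here is a toy sketch of a naive Bayes spam filter in the spirit of those classic implementations. The training messages and word lists are entirely hypothetical, and Laplace (add-one) smoothing is assumed to avoid zero probabilities:

```python
# Toy naive Bayes spam filter. Training data is invented for illustration.
from collections import Counter
import math

spam_docs = ["win cash now", "cheap pills now"]
ham_docs = ["meeting at noon", "lunch at noon"]

def word_counts(docs):
    """Count word occurrences across a list of messages."""
    counts = Counter()
    for doc in docs:
        counts.update(doc.split())
    return counts

spam_counts = word_counts(spam_docs)
ham_counts = word_counts(ham_docs)
vocab = set(spam_counts) | set(ham_counts)

def log_prob(message, counts, prior):
    """Log P(class) + sum of log P(word | class) with add-one smoothing."""
    total = sum(counts.values())
    lp = math.log(prior)
    for word in message.split():
        lp += math.log((counts[word] + 1) / (total + len(vocab)))
    return lp

def classify(message):
    # Equal priors assumed for the two classes.
    spam_lp = log_prob(message, spam_counts, 0.5)
    ham_lp = log_prob(message, ham_counts, 0.5)
    return "spam" if spam_lp > ham_lp else "ham"

print(classify("win pills now"))    # spam-leaning words dominate
print(classify("meeting at noon"))  # ham-leaning words dominate
```

Real filters of that era worked the same way, just with much larger vocabularies and per-user training.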
Andrew Ng has a nice quote: “Instead of doing AI, we ended up spending our lives doing curve fitting.”
Ten years ago you'd be ashamed to call anything "AI," and would say machine learning if you wanted to be taken seriously, but neural networks have really brought back the term--and for good reason, given the results.
Except AI already had a clear definition well before it started being used as a way to inflate valuations and push marketing narratives.
If nothing else it's been a sci-fi topic for more than a century. There's connotations, cultural baggage, and expectations from the general population about what AI is and what it's capable of, most of which isn't possible or applicable to the current crop of "AI" tools.
You can't just change the meaning of a word overnight and toss all that history away, which is why it comes across as an intentionally dishonest choice in the name of profits.
CharlesW|4 months ago
You and also everyone since the beginning of AI. https://quoteinvestigator.com/2024/06/20/not-ai/
bcrosby95|4 months ago
I took an AI class in 2001. We learned all sorts of algorithms classified as AI, including various ML techniques, among them perceptrons.
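For anyone who never saw them, the perceptron from those old AI courses is small enough to sketch in a few lines. This is a hypothetical example (not taken from any particular course) that learns the logical AND function with the classic error-driven update rule:

```python
# Minimal perceptron: learn logical AND with the classic update rule.
# Hypothetical illustration; learning rate and epoch count chosen arbitrarily.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # weights for the two inputs
    b = 0.0         # bias term
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Step activation: fire if the weighted sum exceeds zero.
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = y - pred
            # Nudge weights toward reducing the error.
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]  # AND truth table

w, b = train_perceptron(samples, labels)
preds = [1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0 for x in samples]
print(preds)  # [0, 0, 0, 1]
```

AND is linearly separable, so the perceptron convergence theorem guarantees this finds a solution; famously, XOR is not, which is part of why single perceptrons fell out of favor before multi-layer networks revived the idea.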