item 47198939

cedws | 1 day ago

We should be very concerned for the next generation. When you have the constant temptation of digging yourself out of a problem just by asking an LLM, how will you ever learn anything?

My biggest lessons were from hours of pain and toil, scouring the internet. When I finally found the solution, the dopamine hit ensured that lesson was burned into my neurons. There is no such dopamine hit with LLMs. You vaguely try to understand what it’s been doing for the last five minutes and try to steer it back on course. There is no strife.

I’m only 24 and I think my career would be on a very different path if the LLMs of today were available just five years ago.

andoando|1 day ago

OK, imagine you went back 30 years with a swarm of experts around you whom you could ask anything you wanted, and who would even do the work for you if you wanted.

Does this mean you'd be incapable of learning anything? Or could you possibly learn far more, because you had the innate desire to learn and understand, along with the best tool possible to do it?

It's the same thing here. How you use LLMs is all up to your mindset. Thoroughly review what it did and ask questions about why, or ask whether it could have been done some other way instead. Hell, ask it just the questions you need and do the rest yourself, or don't use it at all. I was working in C++, for example, with heavy use of mutexes and shared and weak pointers, which I hadn't done before. The LLM fixed a race condition, and I got to ask it precisely what the issue was, and to draw a diagram showing what was happening in that exact scenario before and after.

I feel like I'm learning more because I am doing way more high-level things now and spending way less time on the stuff I already know or don't care to know (non-fundamentals, like syntax and even libraries/frameworks). For example, I don't really give a fuck about being an expert in Spring Security. I care about how authentication works as a principle, which methods are best for what, etc. But do I want to spend 3 hours debugging the nuances of configuring the Spring Security library for a small project I don't care about?

demorro|1 day ago

> Does this mean you'd be incapable of learning anything?

Yes. This strikes me as obvious. People don't have the sort of impulse control you're implying by default; it has to be learned just like anything else. This sort of environment would make you an idiot if it's all you've ever known.

You might as well be saying that you can just explain to children why they should eat their vegetables and rely on them to be rational actors.

cedws|1 day ago

It may very well have stunted my learning. What’s the point of absorbing information when you have a consortium of experts available 24/7?

Saying it's all down to how you use the LLM comes from a privileged position. You likely already know how to code. You likely know how to troubleshoot. Would you develop those same skill sets today, starting from zero?

0x00cl|1 day ago

> We should be very concerned for the next generation. When you have the constant temptation of digging yourself out of a problem just by asking an LLM how will you ever learn anything?

This is the same concern that arises whenever a new technology appears.

* Socrates argued that writing would weaken memory, that it would create only superficial knowledge without real understanding. But it didn't destroy memory; it allowed us to store information and share it with many others far away.

* The internet and web indexers made information instantly accessible, letting you search for exactly the information you need. The fear was that people would just copy from the internet, yet researching became far faster, and anyone with internet access could find this information and learn on their own. Just look at the number of educational websites offering courses.

Each time a new technology arrived, people feared it would degrade knowledge, yet the tools only helped us increase it.

Just like with books and the internet, people could simply copy and not learn anything; it's not exclusive to LLMs. The issue isn't the tool itself, but how we use it. Instead of learning how to search, the new generation will probably need to learn how to prompt, ask questions, and evaluate whether the LLM is hallucinating.

cafebabbe|20 hours ago

Socrates was proven dead wrong by neurobiology.

LLMs making you dumber is far from being "disproven" by science. Quite the opposite: https://arxiv.org/abs/2506.08872

0xbadcafebee|1 day ago

As an older person, I'm not worried. The world changes all the time. People are put in difficult situations, and they have to adapt. "Oh no, how will people learn things?" is not that big of a struggle in the grand scheme. We're not burning books or giving people lobotomies. People can still learn if they want to, more easily than ever before. Businesses will adapt, people will adapt, by necessity. Things will be very different, sure. But then we get used to the difference, and it becomes normal.

Kids today couldn't imagine how people lived just 100 years ago, as if it were the dark ages. People from that age would probably look at the kids of 10 years ago and think: these poor children! They don't know how to work hard! They don't know anything about life! They're glued to these bizarre light machines! Every age is different.

rwyinuse|22 hours ago

Yeah, IMO people shouldn't make their jobs or professions too big a part of their identity. At some point human programming may be largely gone, but there will probably be increased demand for something else.

It should be the government's job to make it as easy as possible for people to retrain, switch jobs, and start new careers. Obviously taxation should be reworked too, if AI and robots replace lots of jobs in some sectors. Profits from efficiency gains shouldn't be concentrated among a few billionaires.

aryehof|23 hours ago

My concern is also, how will programming and software design ever improve?

MeanEYE|16 hours ago

In my eyes, it will be the same as the introduction of garbage collectors. It will help to a degree, make people lazier along the way, and cause some additional, brand-new issues. But overall very little will change: for serious implementations, human intellect is still going to be the primary actor and AI will be disallowed.

eastbound|1 day ago

At the beginning of the internet, I used to save all the webpages where I'd find info, just in case I was stuck without a connection or the website removed it. I had parts of MDN.

The internet never fell. I bet it’ll be the same with AI. You will never not have AI.

The big difference is the internet was a liberation movement: Everything became open. And free. AI is the opposite: By design, everything is closed.

MeanEYE|16 hours ago

Not only that. AI will see increasingly diminishing returns, since it relies on good-quality human-written code. As that becomes less and less available, the quality of generated code will also suffer, because at some point AI will be training on AI-generated content.