I can know literally nothing about a programming language, ask an LLM to write me functions and a small program to do something, then read the documentation and start building off that base immediately. That accelerates my learning and lets me find new passions for new languages and new perspectives on systems. Whatever's going on in the AI world, assisting with learning curves and learning disabilities is something it's proving strong in. It's given me a way forward with trying new tech. If it can do that for me, it can do that for others. Diminishing returns for investors, maybe, but not for humans like me.
lambdaone|1 year ago
None of which is to discount the future potential of LLMs, or the amazing abilities they have right now - I've solved other, simpler problems almost entirely with LLMs. But they are not a panacea.
Yet.
ern|1 year ago
My current feeling is that LLMs are great at dealing with known unknowns: you know what you want, but don't know how to do it, or it's too tedious to do yourself.
throw101010|1 year ago
A 20% time improvement sounds like a big win to me. That time can now be spent learning/improving skills.
Obviously, learning when to use a specific tool to solve a problem is important... just like you wouldn't use a hammer to clean your windows, using an LLM on problems that have never really been tackled before will often yield subpar or non-functional results. But even in these cases the answers can be a source of inspiration for me, even if I end up having to solve the problem "manually".
One question I've been thinking about lately is how this will work for people who have always had the LLM "crutch" available from the moment they started learning how to solve problems. Will they skip a lot of the steps that currently help me know when to use an LLM and when it's rather pointless?
And I've started thinking of LLMs for coding as a form of abstraction, just like we have had the "crutch" of high-level programming languages for years, many people never learned or even needed to learn any low-level programming and still became proficient developers.
Obviously it isn't a perfect form of abstraction and they can have major issues with hallucinations, so the parallel isn't great... I'm still wondering how these models will integrate with the ways humans learn.
swatcoder|1 year ago
Learning new programming languages wasn't a hurdle or a mystery for anyone already experienced in programming, and learning programming (well) in the first place ultimately needs a real mentor to intervene sooner rather than later anyway.
AI can replace following rote tutorials and engaging with real people on SO/forums/IRC, and it can deceive one into thinking they don't need a mentor - but all those alternatives are already there, easily available, and provide very significant benefits for the actual quality of learning.
Learning to code or to code in new languages with the help of AI is a thing now. But it's no revolution yet, and the diminishing returns problem suggests it probably won't become one.
poink|1 year ago
The diminishing returns for humans like you are in the training cost vs. the value you get out of it compared to simply reading a blog post or code sample (which is basically what the LLM is doing) and implementing yourself.
Sure, you might be happy at the current price point, but the current price point is lighting investor money on fire. How much are you willing to pay?