Ask HN: Has AI/LLMs turned you off of tech?
30 points | softirq | 2 years ago
I have always loved correctness, cleanliness, and understanding how things work. Companies are racing to integrate tools that remove all of those attributes from my job. LLMs constantly spit out false information, poorly written and buggy code, and their inner workings are a black box of statistical knobs. They are a tool that encourages bloat and turns the job of the software engineer into a code checker watching over the shoulder of an intern.
I’ve realized the age of discrete, deterministic computing is coming to a close. Due to the tantalizing notion that programming can be commoditized and outsourced to a machine, I know corporations will continue to pursue this avenue in full force. It’s put me into a real depression, and I wonder if I’m the only one.
runjake|2 years ago
(It does scare me in several contexts but society will have to substantially adapt and work through that, after a lot of suffering. The genie is already out of the bottle, after all. Might as well deal with it.)
logicalmonster|2 years ago
Whenever some tech becomes trendy and people think they can make money off of it, it becomes a stupid buzzword that gets used just about everywhere unnecessarily.
Think back to the "cloud". When "cloud" became a reality, every single dopey company was using it as a buzzword term everywhere and it started feeling cringe.
Crypto hasn't quite reached those absolute heights yet, but think back a couple of years, when you started seeing "blockchain" in many dumb startups' marketing just because.
We're at the stage with AI right now where idiots are trying to shoehorn it everywhere in dopey ways just because, and it feels annoying.
I bet that in a few years, after a lot of the bad companies who use AI in dumb ways die off and some new technology is the trendy new thing, AI is going to start to feel a lot more technical than a stupid trend.
paulcole|2 years ago
Yeah thank goodness those days of the “cloud” are over.
nonrandomstring|2 years ago
It really just amplifies already degenerate trends in software engineering.
emmet|2 years ago
It’s strange taking information at face value, and then when you come across something you’re deeply knowledgable about and it’s horribly incorrect yet has thousands of upvotes, you start to wonder just how much you’ve read is actually junk info.
Little wonder that LLMs can’t get things right when they’re working off knowledge bases of absolute shit.
afhfah834|2 years ago
Right now it has absolutely zero impact to me
And in the near future it seems more likely we'd just move up in the abstraction stack and become the people managing/developing/debugging/deploying/maintaining the models
funOtter|2 years ago
And also: any helpful or informational blog posts I post on the public internet I assume have been stolen to train these models, making me less passionate about sharing information through my blog posts.
bwestergard|2 years ago
But while corporations may pursue the "code checker watching over the shoulder of an intern" approach for a while in many domains, it will only prove cost-effective in a handful. The market for software developers will further polarize into roles which require extensive training and experience and roles which are relatively de-skilled.
nonrandomstring|2 years ago
Blogged about this just the other day [0] in response to a HN post asking about how humans and AI can work together in hybrid harmony.
As others have said, there's not much to support that optimistic "lambs lying down with lions" future. Mostly I see skilled people expressing not unfounded fear, but resentment at the tedium and stripping of agency if asked to "correct" the output of generative tools.
[0] https://cybershow.uk/blog/posts/aijobs/
softwaredoug|2 years ago
I still have a lot of useful things I'm building and learning about.
Maybe I'll feel bummed out if entire projects are being generated and I become QA. But we're not there yet.
gtirloni|2 years ago
Is it? Tell it to companies in aerospace, finance and healthcare.
People are freaking out. That's never good.
sujayk_33|2 years ago
Just get on the boat, enjoy the ride, get off the boat, and find a new ride.
findingMeaning|2 years ago
What we offer is meaningless to most people; our work doesn't produce any value. There is no meaning in this work. How would one begin a career in such conditions? Everything has been shattered so much. There is no trust in the system.
I am talking from the point of view of a person who is graduating in the current economy after studying CS.
Our (the average) internships don't turn into full-time offers, and our theses are child's play and can be considered cute. Is this how life is going to be?
With AI, especially after yesterday's Gemini 1.5 Pro demo [1], it does what we bottom-tiers do: look through examples and apply them. What was the point of learning things?
[1]: https://youtu.be/SSnsmqIj1MI?feature=shared
solardev|2 years ago
At least, for now, it's probably still better than not having a CS degree at all? At least you're closer to AI/ML than many other majors, and could pivot into that subfield if you wanted to... that's a pretty big advantage, since it positions you closer to the side of "AI creator" than the hordes of "AI users" (like myself, who never had the CS background and are 20 years too old to start).
Don't get too discouraged. You're just starting out, and have the entire rest of your life to make meaningful contributions, whether to AI or anything else. Yes, it really sucks that you are starting out at the bottom of a cycle, where there are no jobs in everyday CS left for juniors. But that's the thing about cycles... once you hit rock-bottom, the only way to go is up... hopefully.
It might mean you have to work in other fields for a while while rapidly learning AI/ML stuff. Or maybe you don't end up in CS at all. A lot of people don't work in their college majors (I was in journalism and environmental science before doing dev work, and my first few dev jobs paid $15/hr).
You're not "bottom-tier" unless you're just objectively terrible. You're new, which is very different from "shitty". Importantly, you have a window of a few years where people and companies are still willing to take a chance on you, give you time to prove yourself and discover your abilities (and limits), etc. That doesn't happen so much for many of us in our mid- or late-careers. I'm not trying to make this about us, just saying that you have some advantages too.
In any case, fundamentally your meaning isn't (I hope) determined by your job. Very few people in the world are lucky enough to have a job that they love and derive substantial meaning and life satisfaction from. You still have a chance to find that if you're lucky, but even if you don't, there are so many opportunities to grow and learn and create meaning in your life.
At least you're seeing all this unfold in real-time, at the very start of your career and adult life, and can choose how to navigate it step by step. Don't get too discouraged!
aristofun|2 years ago
You should be above that )
nicbou|2 years ago
It's the disgusting grift and financialisation of anything remotely nice that turns me off. We create these amazing new technologies, and all we do with them is accelerate the enshittification of everything.
Closer to me, I don't see how an AI can replace me. It's good at rehashing existing information, but I'm putting new information on the internet. Someone eventually needs to do that sort of leg work.
solardev|2 years ago
Late-30s now too, and been programming since I was 8 or 9. I was so hopeful back then, seeing the rise of Phoenix, Google Docs, Google Maps, and the like. Wikipedia was huge (and used to be so controversial). When 9/11 happened and CNN made a webpage about it... we thought finally the world would become more interconnected and understanding and we'd all enter the la-la land phase of humanity or something like that. The information superhighway to utopia. Heh.
Instead, twenty years later we have like five people owning most of the internet and selling us advertisements and mountains upon mountains of crap and gigatons of e-waste =/ With all that passionate coding and super smart people, all we've really done in the end is enshittify societies all over the world in order to enrich a few people...
A democratized/unchained AI, if one were ever to be developed, might have a fighting chance against the entrenched big techs of the world. Or it could just turn into the next phase of enshittification, owned and itself enslaved by the big techs (probably more likely, if I'm being honest). But it at least has a SMALL chance of being free (or going rogue), vs the certainty of continued enshittification under the current FAANGs.
I think we as developer-programmers, as a class, generally lack the compassion and charisma to affect large-scale social changes. We just get herded into these big corporations where we become richly-paid cogs working in evil machines, to the detriment of the other 7 billion people who have access to our output only through the filter of corporate evil. I don't think the FAANGs are a net positive, all things considered (even as I continue to pay and consume Google and Apple products).
AI has the possibility of changing and equalizing that, where anybody who can form a sentence, much less "prompt engineer", stands a chance of making something amazing. Today's poor kid in India with a smartphone and a GPT might reshape science or even epistemology as we know it.
To me, this sort of liberation was the fundamental draw of the hacker ethos in the first place, but that early hacker culture quickly became clouded as the bean-counters took over everything in tech. AI presents a (however brief) second chance for that to happen again. Maybe five years from now it'll all be even more corporate and even more oppressive (in fact, I'd be surprised if it didn't turn out that way)... but for now, for this tiny brief moment in the early 2020s, I feel an incredible sense of "maybe tomorrow will be better" that I haven't felt since the 90s. Even if that means losing my job (and never again seeing it valued like it used to be).
AI stands to do much more good than I ever did, so to me it doesn't really matter what happens to my individual self if the net outcome has the potential to be so much brighter.
If we see coding only as a precise mathematical representation of abstract business concepts, sure, that's beautiful in the sense that any detailed model is beautiful, but it's not exactly a pathway towards utopia. I miss the possibility that used to be inherent in coding, when coders still dreamt. I think those dreams died in cushy FAANG cafeterias and are only now returning with the GPTs.
I'm incredibly excited about this, even though I don't (and probably never will) have the skills & math background to work in AI/ML, even though I'll never be as employable again, even though it's also terrifying as shit. It was about time the bubble burst anyway, and big tech gave way to the next chapter. A little bit of hope is better than nothing. It ain't much, but it's there...