(no title)
comeondude | 7 months ago
I’m an artist who’s always struggled to learn how to code. I can pick up on computer science concepts, but when I try to sit down and write actual code my brain just pretends it doesn’t exist.
Over like 20 years, despite numerous attempts, I could never get past a few beginner exercises. I viscerally can’t stand the headspace that coding puts me in.
Last night I managed to build a custom CDN to deliver cool fonts to my site a la Google Fonts, create a gorgeous site with custom code injected CSS and Java (while grokking most of it), and best part … it was FUN! I have never remotely done anything like that in my entire life, and with ChatGPT’s help I managed to do it in like 3 hours. It’s bonkers.
AI is truly what you make of it, and I think it’s an incredible tool that allows you to learn things in a way that fits how your brain works.
I think schools should have a curriculum that teaches people how to use AI effectively. It’s truly a force multiplier for creativity.
Computers haven’t felt this fun for a long time.
nomel|7 months ago
This is actually what I'm most excited about: in the reasonably near future, productivity will be related to who is most creative and who has the most interesting problems rather than who's spent the most hours behind a specific toolchain/compiler/language. Solutions to practical problems won't be required to go through a layer of software engineer. It's going to be amazing, and I'm going to be without a job.
ryandv|7 months ago
Why stop at software? AI will do this to pretty much every discipline and artform, from music and painting, to law and medicine. Learning, mastery, expertise, and craftsmanship are obsolete; there's no need to expend 10,000 hours developing a skill when the AI has already spent billions of hours in the cloud training in its hyperbolic time chamber. Academia and advanced degrees are worthless; you can compress four years of study into a prompt the size of a tweet.
The idea guy will become the most important role in the coming aeon of AI.
deadbabe|7 months ago
Consumer apps may see fewer sales as people opt to just clone an app using AI for their own personal use, customized to their preferences.
But there’s a lot of engineering being done out there that people don’t even know exists, and that has to be done by people who know exactly what they’re doing, not just weekend warriors shouting stuff at an LLM.
wwweston|7 months ago
atleastoptimal|7 months ago
Yes, AI currently has limitations and isn't a panacea for cognitive tasks. But in many specific use cases it is enormously useful, and the rapid growth of ChatGPT, AI startups, etc. is evidence of that. Many will argue that it's all fake, that it's all artificial hype to prop up VC valuations, etc. They literally will see the billions in revenue as not real, same with all the real people upskilled via LLMs in ways that are entirely unique to the utility of AI.
I would trust many people's evaluations of the impacts of AI if they could at least engage with reality first.
ants_everywhere|7 months ago
One person told me the other day that for the rest of time people will see using an AI as equivalent to crossing a picket line.
vlovich123|7 months ago
It works better than you for UI prototypes when you don’t know how to do UI (and maybe faster even if you do). It doesn’t work at all on problems it hasn’t seen. I literally just watched a coworker stare at code for hours and get completely off track trying to correct AI output, versus stepping through the problem step by step using how we thought the algorithm should work.
There’s a very real difference between where it could be useful in the future and what you can usefully do with it today, and you have to be very careful about utilizing it correctly. If you don’t know what you’re doing and AI helps you get it done, cool. But also keep in mind that you won’t know if it has catastrophic bugs, because you don’t understand the problem and the conceptual idea of the solution well enough to know if what it did is correct. For most people there’s not much difference, but for those of us who care it’s a huge problem.
alfalfasprout|7 months ago
If anything, HN is in general very much on the LLM hype train. The contrarian takes tend to be from more experienced folks working on difficult problems that very much see the fundamental flaws in how we're talking about AI.
> Many will argue that it's all fake, that it's all artificial hype to prop up VC valuations, etc. They literally will see the billions in revenue as not real
That's not what people are saying. They're noting that revenue is meaningless in the absence of looking at cost. And it's true, investor money is propping up extremely costly ventures in AI. These services operate at a substantial loss. The only way they can hope to survive is the promise of future pricing power: that they can one day (the proverbial next week) replace human labor.
> same with all the real people upskilled via LLMs in ways that are entirely unique to the utility of AI.
Again, no one really denies that LLMs can be useful in learning.
This all feels like a strawman-- it's important to approach these topics with nuance.
thousand_nights|7 months ago
Stop underestimating the amount of internalized knowledge people can have about projects in the real world; it's so annoying.
An LLM can't ever possibly get close to it. There's some guy on a team in another building who knows why a certain weird piece of critical business logic was put there 6 years ago. The LLM will never know this, and won't understand it even if it consumed the whole repository, because it would have to work there for years to understand how the business works.
smohare|7 months ago
[deleted]
llmthrow103|7 months ago
What you're describing is a dead simple hobby project that could be completed by a complete novice in less than a week before the advent of LLMs.
It's like saying "I'm absolutely blown away by microwaves, I can have a meal hot and ready in just a few minutes with no effort or understanding. I think all culinary schools should have a curriculum that teaches people how to use microwaves effectively."
Maybe the goal of education should be giving people a foundation that they can build on, not making them an expert in something with a low skill ceiling and diminishing returns.
RangerScience|7 months ago
zx8080|7 months ago
From the context, it's not Java, but JavaScript.
le-mark|7 months ago
My takeaway as an AI skeptic is that AI as human augmentation may really have potential?
comeondude|7 months ago
adamredwoods|7 months ago
comeondude|7 months ago
We are never ready for seismic changes. But we will have to adapt one way or another, so we might as well find a good use for it and develop awareness, as a child would around handling knives.
andsoitis|7 months ago
Enjoy the ride.
stevage|7 months ago
Have you tried using AI to make further changes to any of these projects down the line?
comeondude|7 months ago
Since I’ve literally been working on this project for two days, here’s a somewhat related answer to your question: I’ve been using ChatGPT to build art for a TCG. Initially I was resistant and upset that AI companies were hoovering up people’s work wholesale for training data (which is why I think now is an excellent time to have a serious conversation about UBI, but I digress).
But I finally realized that I could develop my own distinctive 3D visual style by feeding GPT my drawings and having it iterate in interesting directions. It’s fun to refine the style by having GPT simulate actual camera lenses and lighting setups.
But yes, I’ve used AI to make numerous stylistic tweaks to my site, including building out a tagging system that allows me to customize the look of individual pages when I write a post.
Hope I’ll be able to learn how to build an actual complex app one day, or games.
debugnik|7 months ago
"I'm a computer scientist who's always struggled to learn how to paint." "Last night I managed to create a gorgeous illustration with Stable Diffusion, and best part ... it was FUN!" "Art hasn't felt this fun for a long time."
edaemon|7 months ago
comeondude|7 months ago
But, basically I wanted a way to have a custom repository of fonts a la Google Fonts (found their selection kinda boring) that I could pull from.
Ran the fonts through Transfonter to convert them to .woff2, set up a GitHub repository (which is not designed for people like me), set up an instance on Netlify, then wrote custom CSS tags for my ghost.org site.
The thing that amazes me is that aside from my vague whiff of GitHub, I had absolutely no idea how to do this. Zilch. Nada. ChatGPT gave me a clear step-by-step plan, and exposed me to Netlify, how to write CSS injections, and how ghost.org tagging works from the styling side of things. And I’m able to have a back-and-forth dialogue with it, not only to figure out how to do it, but to understand how it works.
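For anyone curious what that setup boils down to: serving self-hosted fonts from a static host like Netlify is typically just an `@font-face` rule in a CSS injection. The font name and URL below are placeholders, not the actual ones from this project:

```css
/* Hypothetical sketch: load a self-hosted .woff2 from a Netlify deploy */
@font-face {
  font-family: "MyDisplayFont"; /* placeholder family name */
  src: url("https://example.netlify.app/fonts/my-display.woff2") format("woff2");
  font-weight: 400;
  font-style: normal;
  font-display: swap; /* show fallback text while the font loads */
}

/* Then apply it to elements, e.g. via Ghost's code injection */
h1, h2 {
  font-family: "MyDisplayFont", Georgia, serif;
}
```

The `.woff2` format is used because it's the smallest widely supported web font format, which is what a converter like Transfonter produces.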
msgodel|7 months ago
Those are probably near the top of the list of things you don't want to blindly trust an LLM with building.
alganet|7 months ago
Think of a version that is even more fun, won't teach your kids wrong stuff, won't need a datacenter full of expensive chips, and won't hit the news with sensationalist headlines.
comeondude|7 months ago