(no title)
eden_hazard | 2 years ago
I've been scared of AI since seeing ChatGPT a couple of years ago. I feel like it's only a matter of time until a dev can feed an AI its entire code base and business requirements, and a separate AI could then carry out the manual/integration testing tasks. AI could potentially cut down the number of devs required to maintain a web app or iOS app after it's built.
I feel especially triggered by this post because I've made a career of writing automation code haha.
xyst|2 years ago
Seeing the same patterns with AI. Every startup is now incorporating “AI” or “deep learning” or “OpenAI” into their decks/motto/pitch.
Have yet to see anything worth using beyond the initial hype. Using AI to me is like learning another programming language. Same shit. Different interface.
65|2 years ago
1. Type it out yourself
2. Copy and paste from StackOverflow
3. Find a library that does the thing you want
It's not like CoPilot is any better. It's like when Microsoft and Google tried to force text completion on emails, it just gets in the way and makes me lose my train of thought.
AI is really great for very specific tasks that would be difficult to incorporate into a traditional algorithm. I really like Photoshop's background removal tool for example. But general purpose AI to me is blown out of proportion in terms of hype. Not everything needs iPhone levels of scaling. 3D printing, VR, Web 3, Cryptocurrencies, NFTs, Metaverse, AI. The list goes on. These things have niche use cases. AI is great for a lot of niche use cases (video upscaling, for example). But for general purpose software interaction? Maybe not.
dotty-|2 years ago
seba_dos1|2 years ago
koliber|2 years ago
jameshart|2 years ago
Why do we need web apps or iOS apps? They're just task specific computer interfaces.
It's possible that this kind of AI tech eliminates the need for task specific computer interfaces at all.
You don't need to tell the LLM to code up a TODO list app so you can sell it in the App Store.
The user doesn't even need to tell an LLM to make them a TODO list app.
Given an LLM that can persist and restore context, the user can just use the LLM as a personal assistant that keeps track of their TODO list.
Whatever the software is that we're working alongside our AI colleagues to build in ten years' time, I don't think it's going to be automated tests for apps and websites.
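The persisted-context assistant idea above can be sketched without any real LLM. In this toy, the model call is replaced by a trivial rule-based parser and the "context" is just a JSON file; all names (`todo_context.json`, `handle`, etc.) are hypothetical, chosen only for illustration:

```python
import json
import pathlib

STATE = pathlib.Path("todo_context.json")  # hypothetical persistence file

def load_context():
    """Restore the assistant's context between sessions."""
    return json.loads(STATE.read_text()) if STATE.exists() else {"todos": []}

def save_context(ctx):
    """Persist the context so the next session can pick up where we left off."""
    STATE.write_text(json.dumps(ctx))

def handle(ctx, message):
    """Stand-in for the LLM: a trivial rule-based command parser."""
    if message.startswith("add "):
        ctx["todos"].append(message[4:])
        return "Added: " + message[4:]
    if message == "list":
        return "; ".join(ctx["todos"]) or "(empty)"
    return "(not understood)"

# Usage: load, talk, save -- no TODO app was ever written or installed.
ctx = load_context()
print(handle(ctx, "add buy milk"))
print(handle(ctx, "list"))
save_context(ctx)
```

The point of the sketch is the shape, not the parser: once the assistant can reliably persist and restore state like this, the task-specific app disappears into the conversation.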
amluto|2 years ago
I would expect better results from LLMs using programming languages, perhaps ones tailored to LLMs, to prepare tasks on behalf of their users.
(Also, LLMs doing anything direct are an incredibly inefficient use of computing resources. There are quite a few orders of magnitude of difference between the FLOPs needed to do basic calculations and the FLOPs needed to run inference on a large model that may be able to do those calculations if well trained.)
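The "orders of magnitude" claim above can be made concrete with a back-of-envelope calculation. A common rule of thumb is that a dense transformer spends roughly 2 × N FLOPs per generated token for N parameters; the model size below is an illustrative assumption, not a measurement:

```python
# Back-of-envelope: FLOPs to emit one LLM token vs. FLOPs for one addition.
params = 70e9                  # hypothetical 70B-parameter model
flops_per_token = 2 * params   # ~2N rule of thumb for a dense transformer
flops_per_addition = 1         # a single floating-point add

ratio = flops_per_token / flops_per_addition
print(f"one token costs ~{ratio:.1e}x the FLOPs of one addition")
```

Even if the constant is off by 10x in either direction, the gap remains around eleven orders of magnitude, which is the comment's point: having the model *write* a program that does the arithmetic is vastly cheaper than having it do the arithmetic token by token.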
karmasimida|2 years ago
wolverine876|2 years ago
Who can say how AI will develop, but beware of happily-ever-after stories. It could be a nightmare for all of civilization, or just for engineers, or it could simply not develop much further.
jhrmnn|2 years ago
hooverd|2 years ago
mtrovo|2 years ago
It's extra funny when you consider that resilience was the first thing Sam Altman answered when asked what kids should be learning today: https://youtube.com/shorts/OK0YhF3NMpQ
tomatohs|2 years ago
What's amazing is the social impact this has: often people don't believe it's real. It feels like when I had to explain to my parents that in my online multiplayer game, the other characters were other kids at home on their own computers.
I think it's a matter of denial. Yes, software is made for humans and we will always need to validate that humans can use that software. But should a human really be required to manually test every PR in 10k person teams?
Again, as the founder of an AI agent for E2E testing[1], we work with this every day. If I were a QA professional right now, I would watch this space closely over the next 6 months. The other option is to specialize in the emotional, human part, as in gaming: you can't test for "fun."
[1]: https://testdriver.ai. Demo: https://www.youtube.com/watch?v=HZQxgQ1jt4g
qlk1123|2 years ago
Sounds intuitive, but there are game researchers working in that area. Two related terms (learnt at the IEEE Conference on Games) come to mind:
1. Game refinement theory. The inventors of this theory see games as if they were evolving species, so it tries to describe how games become more interesting, more challenging, more "refined". Personally I don't buy the theory, because the series of papers had only a limited number of examples and it's questionable how the related statistics were generated (especially the recurring baselines, Go and Mahjong), but nonetheless there is theory on that.
2. Deep Player Behavior Modeling (DPBM): This is the more interesting one. Game developers want their games to be automatically testable, but the agents are often not ready or not true enough to human play. Take AlphaZero for Go or AlphaStar for StarCraft II: they are impressive, but super-human, so the agent's behavior gives us little insight into the quality of the game or how to further improve it. With DPBM, the signature of real human play can be captured and reproduced by agents, making auto-play testing possible. Balance, fairness, engagement, etc. can then be used as indirect proxies to reassemble "fun."
unknown|2 years ago
[deleted]
dvngnt_|2 years ago
But this solution only appears to do E2E testing, ignoring API and unit testing. Additionally, automated tests are mostly used for regression testing, not exploratory testing of new features, where most bugs will be found.
unknown|2 years ago
[deleted]
gerdesj|2 years ago
Amongst other things, one of the tasks I tried to put ChatGPT to was writing scripts in a not-so-popular dialect of a very popular language: PowerCLI (PowerShell). Gemini is even worse!
The issue is of course the relative scarcity of PowerCLI material versus the huge body of generic PowerShell content. Hallucinations include invented function parameters and much worse. It doesn't help that PowerCLI and MS's Hyper-V effort (whatever that is) both have a Get-VM cmdlet, etc.
These things are "only" next token/word guessers. They are not magic and they are certainly not intelligent. I do get great results in other domains and with a bit of creativity but you have to be really careful.
No need to feel triggered. Use these tools as best works for you and crack on but do be careful to be an engineer and critically examine the output from the tool.
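"Critically examine the output" can even be partly mechanized. A minimal sketch (in Python rather than PowerShell, and with `fast_inverse_sqrt` as an invented, plausible-looking hallucination) of checking whether LLM-suggested function names actually exist in a module before trusting them:

```python
import inspect
import math

def audit_suggestions(module, names):
    """Map each LLM-suggested callable name to whether it really exists."""
    real = {n for n, member in inspect.getmembers(module) if callable(member)}
    return {name: name in real for name in names}

# 'sqrt' and 'hypot' are real; 'fast_inverse_sqrt' is a hallucination.
print(audit_suggestions(math, ["sqrt", "hypot", "fast_inverse_sqrt"]))
```

This only catches invented names, not invented parameters or wrong semantics, so it complements rather than replaces the engineer's review.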
cjbprime|2 years ago
This is precisely the kind of vacuous "this is technology, I know technology, this is simple" hubristic underestimation that's being called out.
There is no upper bound to the intelligence of a "next token/word guesser". You can end up incorporating an entire world model into your predictions to improve their accuracy, and arguably this has already happened, at a currently unreliable and basic level. It is possible that no technological advances are required to reach better-than-human intelligence from this point -- only more compute, bigger models and datasets, and (therefore) better next-token predictions.
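For a sense of how simple the core mechanism is, a toy next-token "guesser" (a bigram counter, nothing like a real LLM) fits in a few lines; the argument above is that scale and data, not the mechanism itself, are what change the capability:

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count which token follows which: the entire 'model'."""
    counts = defaultdict(Counter)
    for a, b in zip(tokens, tokens[1:]):
        counts[a][b] += 1
    return counts

def predict_next(counts, token):
    """Guess the most frequent continuation, or None if never seen."""
    followers = counts.get(token)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

model = train_bigram("to be or not to be".split())
print(predict_next(model, "to"))  # prints 'be'
```

A frontier model differs from this toy in scale and architecture, but the training objective really is the same: predict the next token. The disagreement in the thread is over how much capability that objective can absorb.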
saurik|2 years ago
Swizec|2 years ago
Services! All sorts of things are going to become cost-effective that currently aren't.
Want a personal trainer? Motivational coach? Someone to sit next to you slapping your phone out of your hands every time you open social media? Personal runner doing errands? You can afford that now!
We're going to have an ever-shrinking pool of highly critical un-automatable people with an army of support folk keeping them running at peak productivity at all times.
You already see this trend in people who think of themselves as a business. They hire everyone from personal assistants to nannies. All in the name of "Well I make $200/h and there's this chore that costs only $50/h to delegate ..."
skydhash|2 years ago
twobitshifter|2 years ago
Jerrrry|2 years ago
OpenAI plus any half-assed data broker could easily infer the company, as I am sure they have already done.
All hail Microsoft; I am glad I chose the right AI megacorp overlord early.
littlestymaar|2 years ago
What's the useful part of my job? Bringing value to my customers: they want to do something, and they need help. But today, unfortunately, I must always tell them to reduce their ambition, because they wouldn't be able to afford it, or they'd be stuck waiting for the project to deliver even if they could fund it.
I've seen an enormous shift in developers' ability to actually ship valuable products to customers since open source became mainstream, thanks to GitHub and things like npm: no more custom half-baked libraries for all the features necessary for the product to work; we could just reuse an existing library. More than half of our job disappeared, yet nobody regrets the days when you had to write your own code for absolutely every feature, and the number of programmers has exploded since then[1].
I wish AI assistants could be as impactful as GitHub and npm, and I'm pretty sure they will be eventually, and that day will be a great day for developers, not a bad one.
We're not going to lose our jobs, because from the perspective of the dude who holds the money, our job is to be the weirdo who talks to machines to deliver what his ego wants his company to be. Hence, the more you are able to deliver, the happier the money holder, and the more money you make. Our jobs will be threatened when the guys with money are willing to actually make the effort of talking to the machine themselves, but I don't see that day coming anytime soon. The ambition of man is unlimited, but his will to make an effort is in scarce supply.
The only realistic risk with AI is big corporations grabbing all the benefits of the added productivity. This is a serious risk, and a very good reason not to use OpenAI, so we don't trigger a self-reinforcing feedback loop that gives them a monopoly position where we all end up losing because we depend on them.
[1]: this has caused lots of sustainability issues for the library authors, but not for the developers using the libraries.
cauliflower2718|2 years ago
g-b-r|2 years ago
There are plenty of managers who prefer hacking together an Excel sheet and VBA to relying on developers; those will probably be more than glad to use some AI tool to do something more advanced; of course mostly just for internal stuff, in the short term.
Anyway, many managers have a distaste for developers; they consider them overpaid slackers who'll waste time on anything rather than producing money for the business.
They'll be more than glad to hand the job to the cheapest unqualified guy around, if with an LLM he can make something that looks passable.
unknown|2 years ago
[deleted]
QuantumGood|2 years ago
dvngnt_|2 years ago
pton_xd|2 years ago
FirmwareBurner|2 years ago
Seriously, the demand for skilled handymen in cities is insane, as people can't do shit anymore. As per the South Park episode, you have to treat them well if you want them to pick up the phone or return your calls.