Tractor8626|4 months ago
20 years ago things weren't any better. Software didn't consume gigabytes of RAM because there weren't gigabytes of RAM to consume.
0xbadcafebee|4 months ago
We have a vastly different software culture today. Constant, churning change is prized above all else. I can't go two weeks without a mobile app forcing me to upgrade it so that it will keep operating. My Kubuntu 24.04 LTS box somehow has a constant stream of updates even though I've double-checked I'm on the LTS apt repos. Rolling-release distros are an actual thing people use intentionally (we used to call that the unstable branch).
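(If anyone wants to repeat that double-check, here's a minimal Python sketch -- not the original commenter's method, just one way to do it, assuming the stock Debian/Ubuntu sources layout. It prints every suite the configured apt sources point at, so anything that isn't noble/noble-updates/noble-security/noble-backports stands out.)

    import glob
    import re

    # Stock locations for apt source definitions on Debian/Ubuntu.
    paths = ["/etc/apt/sources.list"] + glob.glob("/etc/apt/sources.list.d/*")

    for path in paths:
        try:
            text = open(path).read()
        except OSError:
            continue
        # Legacy one-line format: "deb <url> <suite> <components...>"
        # (entries with an [options] bracket aren't handled in this sketch).
        for match in re.finditer(r"^deb(?:-src)?\s+(\S+)\s+(\S+)", text, re.M):
            print(f"{path}: {match.group(2)}")
        # deb822 format (*.sources files): "Suites: noble noble-updates ..."
        for match in re.finditer(r"^Suites:\s*(.+)$", text, re.M):
            print(f"{path}: {match.group(1)}")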
I could speculate on specifics, but I'm not a software developer, so I don't see exactly what's going on inside these teams. But software didn't use to be made or used this way. It felt like there were more adults in the room who would avoid making decisions that would clearly lead to problems. I think the values have changed to accept or ignore those problems. (I don't want to jump to the conclusion that "they're too ignorant to even know what potential problems exist", but it's a real possibility.)
lproven|4 months ago
Bad news... only the GNOME edition is a true LTS. None of the flavours are.
https://wiki.ubuntu.com/NobleNumbat/ReleaseNotes/Kubuntu:
> Support lifespan
> Kubuntu 24.04 will be supported for 3 years.
dilap|4 months ago
The main reason is the ability to do constant updates now -- it changes the competitive calculus. "Ship fast and fix bugs constantly" wins out over "go slower with fewer bugs", both in the market and within a company ("who ships faster?").
When you were shipping software on physical media, having a critical bug was a very big deal. Not so anymore.
alexjplant|4 months ago
Let's not even think about the absolute mess that the web was with competing browser box models and DHTML and weird shared hosting CGI setups. We have it easy.
jayd16|4 months ago
Sure, plenty of stuff didn't work. The issue is we're not bothering to make anything that does. It's a clear cultural shift, and all of this "nothing ever worked, so why try" talk here is not what I remember.
We're in a stochastic era of scale where individual experiences do not matter. AI turning computers from predictable to unpredictable pushes in the same direction, with yet more velocity.
cmrdporcupine|4 months ago
Companies offered such (expensive) services because they had no choice, and they made every effort to divert and divest from those activities. Google and companies like it made filthy profits because they figured out the secret sauce for scaling a business without the involvement of humans, but people had been trying that for literally decades, with mixed results (usually enraged customers).
Stupid red tape, paperwork, and call centre frustrations were the order of the day 20-30 years ago.
pornel|4 months ago
It's from 1995 and laments that computers need megabytes of memory for what used to work in kilobytes.
restalis|4 months ago
Today's disregard for computing-resource consumption is simply the result of those resources getting too cheap to be properly valued, plus a trend of taking their continued increase for granted. There's little to nothing in today's software functionality that actually requires gigabytes of memory.
bigstrat2003|4 months ago
Yes, they were. I was there. Most software was of much higher quality than what we see today.
lproven|4 months ago
No, I don't think that's right, because:
> 20 years ago things weren't any better.
I think you have the timeframe wrong.
20Y ago, no.
30Y ago, yes, somewhat. Win NT came out 32 years ago.
40Y ago, yes, very much.
No public internet, very slow point-to-point dialup comms for a tiny percentage of users, and tiny, simple software for very limited hardware meant better-quality software.
I installed multiple Novell Netware servers on company networks, both Netware 2.15 and Netware 3.1. They never ever got updated, and ran flawlessly for years on end.
I installed dozens, if not hundreds, of machines with DOS 3.3, and they ran it until they were scrapped.
I put in multiuser systems based around SCO Xenix: Unix boxes, but with no networking, no GUI or X11, no comms, no compiler. They had uptimes in years: zero crashes.
Stuff was more reliable because it had to be: shipping an update meant posting media to thousands of users and sending a human to install it. Nobody could afford that.
Software and hardware should be subject to the same laws as vehicles: if they fail in standard use, the maker is liable. So make them safe.
If that means it has to be 0.1% of the size and 0.1% of the functionality that it was 20Y ago, fine: so be it.
Because that's still huge and rich compared to the DOS stuff I started my career on. It is not some savage, brutal, unimaginable, utterly unrealistic limitation. It was the reality of late-20th-century software, around the time the PC industry moved to 32-bit hardware at the end of the 1980s.
don_neufeld|4 months ago
Therac-25?
Anyone?
Bueller? Bueller?
cmrdporcupine|4 months ago
On top of that, many things were simply hard for non-specialists to use, even after the introduction of the GUI.
They were also riddled with security holes that mostly went unnoticed because there was simply a smaller and less aggressive audience.
Anyway, most people's interaction with "software" these days is through their phones, where the experience is a highly focused, reduced set of interactions, and most "productive" work takes a SaaS form.
I do think that, as a software developer, things are in some ways worse. But I don't think it's technical; it's organizational. There are so many own goals against productivity in this industry now, frankly a result of management and team practices ... I haven't worked on a truly productive, fully engaged team in years. 20-25 years ago I saw teams writing a lot more code and getting a lot more done, but I won't use this as my soapbox to get into why. It's not the technology (it's never been better to write code!); it's the humans.
lproven|4 months ago
Bad news. We're older than we tend to remember.
Windows NT 3.1 shipped 32 years ago, the year after OS/2 2.0.
By 1994 NT 3.5 was out, and 30 years ago, NT 3.51 had been out for about 6 months.
I ran that and supported it in production and it was damned near bulletproof.
dghlsakjg|4 months ago
Computers crashed all the fucking time from dumb bugs. I remember being shocked when I upgraded to XP and could go a full day without a BSOD. Then I upgraded to Intel OS X and was shocked that a system could run without ever crashing.
Edit: this isn't to say that these issues today are acceptable, just that broken software is nothing new.