Whatever you think of Vulkan, it is NOT typical of modern software engineering. It's a massive outlier. It is very demanding of its users, it is very low-level, and it exposes the complexity of the hardware. It is large but not bloated, since its size reflects the hardware's complexity.
If you complain that modern software is slow (it is), it's because people are NOT using low-level APIs like Vulkan, which force you to think about how to use the hardware most effectively. It's because people don't want to know how anything works, load up on a mountain of dependencies, and run everything in a garbage-collected virtual machine in a browser.
IMHO, the most concerning sign about software engineering practice isn't exactly "bloat", but the inability of the industry to do anything securely.
Consider every software security update to be a bridge falling down, due to incompetence.
In this case, the fault isn't so much individual incompetence, as collective incompetence of the field. The ecosystem is toxic, as are conventional practices, as are market incentives. Individuals might be incompetent on top of that, but the situation is nigh impossible for competent ones as well.
And there are no professional engineer licenses to pull, and few individuals to send to jail.
Computer security was broken during the Vietnam conflict: there were systems that had to accommodate multiple classification levels at the same time, and the OSs of the day couldn't do it. Multilevel security was developed after that, which led to capability-based security.
Essentially, your current OS requires you to have absolute faith in any software that runs on your behalf. If it goes rogue, or gets confused, almost anything you could do to sabotage yourself... it can do in milliseconds.
On the other hand, if the OS didn't have that requirement, and instead let you choose what files to open, and enforced your decisions, you wouldn't have to trust your software at all.
From a GUI user's point of view, nothing would change: File Open works like it always has, but the logic behind the scenes is slightly different.
Command line usage, that's a tougher nut to crack. There needs to be a standard way of defining what files you're passing on the command line.
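The idea can be sketched in a few lines. This is a purely hypothetical illustration of the capability model being described, not any real OS feature: a trusted launcher opens exactly the files the user named and hands the untrusted program open handles, never raw paths, so the program cannot reach anything the user didn't choose.

```python
import os
import tempfile

def untrusted_word_count(files):
    """The 'application': it can only read what it was explicitly handed."""
    return sum(len(f.read().split()) for f in files)

def trusted_launcher(program, paths):
    """The 'shell': the only component allowed to touch the filesystem.
    It opens the user's chosen files and passes capabilities (handles)."""
    handles = [open(p, encoding="utf-8") for p in paths]  # the user's explicit choices
    try:
        return program(handles)
    finally:
        for h in handles:
            h.close()

# Usage: the program never sees a path, so it cannot open anything else.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("hello capability world")
print(trusted_launcher(untrusted_word_count, [tmp.name]))  # prints 3
os.remove(tmp.name)
```

The same shape would answer the command-line question: the shell, not the application, resolves the arguments into handles before the program ever runs.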
We could stop blaming everything but the OS... but we likely won't. 8(
This is pure hyperbole. You may as well compare my broken fridge to a bridge falling down. Engineers designed that too, but no one is going to jail over it.
> In this case, the fault isn't so much individual incompetence, as collective incompetence of the field. The ecosystem is toxic, as are conventional practices, as are market incentives. Individuals might be incompetent on top of that, but the situation is nigh impossible for competent ones as well.
Eh, I disagree. Software engineering allows for failure where the cost of failure is low. So your mail app stops working and you have to restart it, who cares, so long as you get some neat new features faster that on the whole make your life better?
In parts of the industry where failure is costly, like medical or aviation, we operate differently.
Speed and correctness are antithetical generally, and different parts of the industry accept different trade-offs based on their risk tolerance.
The goal isn't to build something that operates perfectly at all costs, the goal is to develop systems that operate as well as they need to given external constraints.
[edit] The truth is we wouldn't have 1/10th of the cool shit we have today if we demanded absolute perfection from things that just didn't need it because we're a "profession."
I respectfully disagree. I’ve passed the same thoughts back and forth in my mind before, but it’s mostly nostalgia.
Yes, some tools were much snappier on much crappier hardware, but they also lacked features, including safety and collaborative ones.
We have so many more people using computer devices and the internet now. We have to account for them to some extent and a lot of libraries do that for us, but it does make them heavier.
I’d say yes software has gotten fat and slow in a lot places, but also immensely more capable and more reusable.
> We have so many more people using computer devices and the internet now. We have to account for them to some extent and a lot of libraries do that for us, but it does make them heavier.
A significant percentage (majority?) of those devices and internet connections are low powered and slow compared to what most readers here are using
Those heavy libraries can often exclude those users simply because they can't adequately run the fancy features those libraries provide
There's a balance here that I think we, as an industry, are not managing well
> If you want a multiplatform graphics API, you should use a library which implements such API on top of these native OS-specific APIs.
Typical software approach: instead of fixing the fundamental problem, just put a plaster on top. That's how we deal with everything. That's why our industry can't be trusted.
Something similar happened back in the late '80s: we had painful EGA, then VGA/MCGA arrived with its glorious mode 13h, and everything about graphics became easy and cool! Then Super-VGA entered...
So the problems in the article are: Apple doesn't support Vulkan, Apple chips were incompatible with Docker, Apple deprecated OpenGL support, and Apple won't make Metal multiplatform. I think I see a pattern here, hehe
It pisses me off when I see software "engineers" more concerned with their vim configuration than with why their shell profile takes three to four full seconds to load.
I've been happily playing Baldur's Gate 3 on my MacBook at 1920x1200 resolution with most settings maxed out. It hasn't sucked. In fact, it's been pretty smooth.
I would argue the iPhone and iPad together probably form one of the biggest gaming markets on earth. There are, after all, 300M consoles and 900M iPhones out there.
The root of many successful revolutions is observing that what exists is simply unacceptable and working to build something transformatively new. Many such attempts fail, but not all of them do. Perhaps the author is identifying such a need, and might consider architecting a superior cross platform solution from first principles.
I highly doubt that building ten C++ files was ever as fast as it is now. I clearly remember C# apps taking much longer to build, and deployment of a trivial app on Azure took ~15 minutes. Meanwhile, we still don't have 5K displays at 120 Hz; the hardware isn't even here yet.
"Graphics programmer thinks that their field represents all of software, news at 11. In other news, systems programmers think that anyone who doesn't want to deal with memory management is a weenie."
Missed an option for cross platform graphics, which is IMO the best one: WebGPU. Supported on all platforms, much simpler than Vulkan, and with solid C++ (Dawn) and Rust (wgpu) libraries.
I've been saying this for some time. This is all in line with the bullshit jobs phenomenon which is likely the result of reserve bank 'full employment' agenda.
People used to think I was a conspiracy theorist for suggesting this. Yet it's clear as crystal that the incentives in the monetary system itself are set up this way. Can you think of a more powerful incentive than money to drive behavior?
Meh. Yet another unfocused rant about "complexity". PC gaming is essentially dying and so there's no money in it and no effort being put into it, and that goes triple for PC gaming on Apple systems. Stuff gets faster when you pay skilled people to make it faster, and gets slower when you don't care how fast it is. That's all it's ever been.
PC gaming is essentially dying?? Is that why the gaming market is larger than the markets for music and movies combined? Maybe Steam would like to comment.
mistrial9 | 2 years ago
For some definition of "people"... isn't the low-level detail sort of overwhelming on modern cards?
ClumsyPilot | 2 years ago
And reliably. I've been banging this drum for a while: the castle is built on quicksand.
As general-purpose software makes its way into cars, ever more critical systems start running JavaScript on non-realtime Linux with buggy drivers.
We will eventually have some kind of software-caused catastrophe, and then regulation will come down on us like a metric ton of bricks.
Const-me | 2 years ago
If you want a multiplatform graphics API, you should use a library which implements such an API on top of these native OS-specific APIs.
I have good experience with this one: http://diligentgraphics.com/diligent-engine/ I've used it a couple of times on Windows with the D3D12 backend, and on Linux with the GLES 3.1 backend.
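The pattern such libraries use can be sketched briefly. This is illustrative only, not Diligent Engine's actual API: the application codes against one abstract interface, and a platform-specific backend is selected once at startup.

```python
# Sketch of a cross-platform graphics abstraction layer (hypothetical API):
# application code sees only RenderBackend; the native API is an
# implementation detail chosen per platform.
from abc import ABC, abstractmethod
import sys

class RenderBackend(ABC):
    @abstractmethod
    def name(self) -> str: ...
    @abstractmethod
    def draw_triangle(self) -> str: ...

class D3D12Backend(RenderBackend):
    def name(self): return "D3D12"
    def draw_triangle(self): return "D3D12: record command list, ExecuteCommandLists"

class MetalBackend(RenderBackend):
    def name(self): return "Metal"
    def draw_triangle(self): return "Metal: encode MTLCommandBuffer, commit"

class VulkanBackend(RenderBackend):
    def name(self): return "Vulkan"
    def draw_triangle(self): return "Vulkan: record command buffer, vkQueueSubmit"

def create_backend(platform: str = sys.platform) -> RenderBackend:
    # The one place that knows about operating systems.
    if platform.startswith("win"):
        return D3D12Backend()
    if platform == "darwin":
        return MetalBackend()
    return VulkanBackend()
```

The trade-off the thread is arguing about lives entirely inside those backend classes: the abstraction is convenient, but it can only expose what all three native APIs have in common.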
jiggawatts | 2 years ago
E.g.: https://tftcentral.co.uk/news/asus-announced-rog-swift-pg32u...
Azure deployment pipelines are still slow as molasses. That hasn't changed.