winwang|9 months ago
SQL on GPUs is definitely a research classic, dating back to 2004 at least: https://gamma.cs.unc.edu/DB/
zackmorris|9 months ago
Set Theory is the classical foundation of SQL:
https://www.sqlshack.com/mathematics-sql-server-fast-introdu...
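To make the correspondence concrete, here is a minimal sketch using Python's stdlib sqlite3 (the tables and names are mine, not from the linked article): SQL's INTERSECT computes exactly the set-theoretic intersection of two relations.

```python
import sqlite3

# SQL's set operators map directly onto set theory:
# UNION is set union, INTERSECT is intersection, EXCEPT is difference.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE cats(name TEXT);
    CREATE TABLE pets(name TEXT);
    INSERT INTO cats VALUES ('felix'), ('tom'), ('garfield');
    INSERT INTO pets VALUES ('tom'), ('rex'), ('garfield');
""")

sql_intersect = {row[0] for row in
                 conn.execute("SELECT name FROM cats INTERSECT SELECT name FROM pets")}

cats = {'felix', 'tom', 'garfield'}
pets = {'tom', 'rex', 'garfield'}

# The SQL result is identical to Python's set intersection.
assert sql_intersect == cats & pets
print(sorted(sql_intersect))  # ['garfield', 'tom']
```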
It's analogous to how functional programming, expressed through languages like Lisp, is the classical foundation of spreadsheets.
I believe that skipping first principles (sort of like premature optimization) is the root of all evil. Some other examples:
- If TCP had been a layer above UDP instead of its own protocol beside it, we would have had real peer-to-peer networking this whole time instead of needing WebRTC.
- If we had a common serial communication standard analogous to TCP for sockets, then we wouldn't need different serial ports like USB, Thunderbolt and HDMI.
- If we hid the web browser's progress bar and used server-side rendering with forms, we could implement the rich interfaces of single-page applications with vastly reduced complexity by keeping the state, logic and validation in one place with no perceptible change for the average user.
- If there were a common scripting language bundled into all operating systems, then we could publish native apps as scripts with substantially less code and wouldn't have to choose between web and mobile, for example.
- If we had highly multicore CPUs with hundreds or thousands of cores, then multiprocessing, 3D graphics and AI frameworks could be written as libraries running on them instead of requiring separate GPUs.
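The first bullet's idea, reliability built as a thin layer on top of an unreliable datagram transport, can be sketched as a toy stop-and-wait protocol. This is a simulation, not real networking code, and all the names are mine:

```python
# Toy "TCP above UDP": a stop-and-wait reliability layer over an
# unreliable datagram channel. Purely illustrative.

class LossyChannel:
    """Drops every third datagram, like a flaky UDP path."""
    def __init__(self):
        self.count = 0

    def send(self, datagram):
        self.count += 1
        if self.count % 3 == 0:
            return None        # datagram silently lost
        return datagram        # datagram delivered

def reliable_send(channel, messages):
    """Retransmit each message until the (simulated) receiver sees it."""
    received = []
    for seq, payload in enumerate(messages):
        while True:
            datagram = channel.send((seq, payload))
            if datagram is not None:       # delivery stands in for an ACK
                received.append(datagram[1])
                break                      # move on to the next message
            # "timeout": retransmit the same (seq, payload)
    return received

channel = LossyChannel()
messages = ["hello", "peer", "to", "peer"]
# Despite drops, everything arrives exactly once and in order.
assert reliable_send(channel, messages) == messages
```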
And it's not just tech. The automotive industry lacks standard chassis types and even OEM parts. We can't buy Stirling engines or Tesla turbines off the shelf. CIGS solar panels, E-ink displays, standardized removable batteries, thermal printers for ordinary paper, heck even "close enough" contact lenses, where are these products?
We make a lot of excuses for why the economy is bad, but just look at how much time and effort we waste by having to use cookie cutter solutions instead of having access to the underlying parts and resources we need. I don't think that everyone is suddenly becoming neurodivergent from vaccines or some other scapegoat, I think it's just become so obvious that the whole world is broken and rigged to work us all to the grave to make some guy rich that it's giving all of us ADHD symptoms from having to cope with it.
winwang|9 months ago
I'm not sure about the rest of your comment, but we would likely still want GPUs even with highly multicore CPUs. Case in point: the upper-range Threadripper series.
It makes sense to have two specialized systems: a low-latency system and a high-throughput system, because it's a real tradeoff, and most people/apps need low latency.
As for throughput and efficiency... it turns out that shaving off lots of circuitry leaves you with less circuitry to power! GPUs have a lot of sharing going on and not a lot of "smarts". That doesn't even touch on their integrated, throughput-optimized DRAM (VRAM/HBM). So... not quite. We'd still be gaming on GPUs :)
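The tradeoff can be made concrete with a toy cost model (the numbers below are invented for illustration, not measurements): a latency-optimized path has tiny per-item cost, while a throughput-optimized path pays a large fixed launch overhead but processes items in wide batches.

```python
# Toy cost model for the latency/throughput tradeoff. All constants
# are made up; only the shape of the comparison matters.

def cpu_time(n_items, per_item=1.0):
    # Latency-optimized: each item finishes almost immediately.
    return n_items * per_item

def gpu_time(n_items, launch=100.0, width=1000, per_batch=2.0):
    # Throughput-optimized: fixed launch overhead, then batches of `width`.
    batches = -(-n_items // width)   # ceiling division
    return launch + batches * per_batch

# One item: the CPU-style path wins (low latency).
assert cpu_time(1) < gpu_time(1)

# A million items: the GPU-style path wins (high throughput).
assert gpu_time(1_000_000) < cpu_time(1_000_000)
```

Under this model the crossover point depends entirely on the launch overhead and batch width, which is why small, branchy workloads stay on the CPU while large uniform ones move to the GPU.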