deerpig | 7 years ago
I used these boxes in Hong Kong in the early '90s, then later in Japan in the late '90s for a number of projects, and I was still using an Indy as my main desktop until 2001.
SGI created hardware that almost took your breath away; you knew you were seeing a future that not many people had the privilege to see in person back then. To me, having the box sitting next to me every day, with the "InfiniteReality" label on top, reminds me of those days when anything seemed possible and all of it was magical. I miss that sense of wonder and infinite possibilities...
bcaa7f3a8bbc|7 years ago
In the past 3-5 years there has been a clear revival of the cyberpunk subculture online. Many related hobbyist websites have appeared, new cyberpunk-inspired independent art, music, and games are being made, new communities are forming, etc.
Themes include a general nostalgia for the '80s, especially vintage computers, and also the early-'90s pre-Web 1.0 era.
The reason? We can see it clearly: the lost future that never came...
erikpukinskis|7 years ago
It will come slowly at first, and then all at once.
AriaMinaei|7 years ago
Like, can you arrange, say, ten flagship graphics cards for realtime rendering? Do we have game engines that can scale to that number?
tachyonbeam|7 years ago
Sidenote: I've read that John Carmack and id Software liked to develop on workstations that were "ahead of the curve" in that way. It gave them an edge, in that they were able to develop future games for hardware that didn't yet exist, knowing that consumer PCs would eventually catch up.
I think what made these SGI computers really amazing is that at the time there was no such thing as accelerated 3D graphics in the consumer market (or much real-time 3D, for that matter). They also had a cool Unix operating system with a UI that was way ahead of anything you could get on a consumer PC. I can also imagine that it was a much, much more comfortable development environment than, say, MS-DOS, which didn't even have multitasking.
jamesfmilne|7 years ago
Unfortunately I wouldn’t say it feels like the future, more like a normal CentOS Linux desktop.
You’ll struggle to get a PC whose BIOS can handle much more than that too.
We used to build clusters for the same thing in the past; that was largely standard supercomputing stuff, but very similar to how the InfiniteReality machines were used. I believe our software once ran on Onyx machines in the dim and distant past.
So in short I wouldn’t say having loads of GPUs is enough to make it feel futuristic.
erikpukinskis|7 years ago
1) Find some graphics problems which people say are not possible on any near-term hardware.
2) Study the algorithms and identify low-level calculations which, if you could do orders of magnitude more of them, would allow you to solve the problem.
3) Get a bunch of FPGAs and try to design a machine which can (very slowly) run that architecture.
4) Once you've got it working, slowly replace the FPGAs with ASICs.
5) Build a box with 16-64 of everything.
I would avoid polygons, since current architectures are all extremely good at filling polygons. SDFs (signed distance fields) and raytracing are where you may find the "not on current gen" problems.
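For readers unfamiliar with SDF rendering, here is a minimal sketch of sphere tracing, the usual way SDFs are rendered: march along a ray, stepping each time by the distance the SDF reports, until you hit the surface. All names (`sphere_sdf`, `raymarch`) and parameters are illustrative, not from any particular engine.

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 3.0), radius=1.0):
    """Signed distance from point p to a sphere: negative inside, zero on the surface."""
    dx, dy, dz = p[0] - center[0], p[1] - center[1], p[2] - center[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz) - radius

def raymarch(origin, direction, max_steps=64, eps=1e-4, max_dist=100.0):
    """Sphere tracing: advance t by the SDF value, which is always a safe step size.

    Returns the hit distance t along the (unit-length) direction, or None on a miss.
    """
    t = 0.0
    for _ in range(max_steps):
        p = (origin[0] + direction[0] * t,
             origin[1] + direction[1] * t,
             origin[2] + direction[2] * t)
        d = sphere_sdf(p)
        if d < eps:
            return t  # close enough to the surface to count as a hit
        t += d        # the SDF guarantees no surface within distance d
        if t > max_dist:
            break
    return None  # ray escaped the scene
```

A ray from the origin straight down +z hits this sphere (centered at z=3, radius 1) at t = 2, while a ray along +y misses entirely. The appeal for "not on current gen" hardware is that each pixel is an independent chain of SDF evaluations, which maps poorly to polygon rasterizers but well to massively parallel custom logic.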
brianpgordon|7 years ago
https://www.nvidia.com/en-us/data-center/hgx/
It shows up to the host computer as one really big GPU. Of course, for games you're going to get worse performance than with just a single Titan V, because a single card can already handle any game and there's inevitably going to be latency added by doing work over NVLink/NVSwitch. Those massive GPU products are targeted toward offline rendering or machine-learning applications, not so much realtime simulation.