(no title)
evanweaver | 2 years ago
Early 90s: SGI invented OpenGL to make realtime 3D graphics practical, initially for CAD/CAM and other scientific/engineering pursuits, and started shipping expensive workstations with 3D-accelerated graphics. Some game artists used these workstations to prerender 3D graphics for game consoles. Note that 2D CAD/CAM accelerators had already been on the market for nearly a decade, as had game consoles with varying degrees of 2D acceleration.
Mid-90s: Arcades and consoles started using SGI chips and/or chip designs to render 3D games in real time. 3dfx, founded by ex-SGI engineers, created the Voodoo accelerator to bring the technology down-market to PC gaming, which was a rapidly growing market.
Late 90s: NVIDIA entered the already existing and growing market for OpenGL accelerators for 3D PC gaming. This was a fast-follow technical play. They competed with 3dfx on performance and won after 3dfx fell behind and made serious strategy mistakes.
Later 90s: NVIDIA created the “GPU” branding to draw attention to their addition of hardware transform and lighting (T&L) support, which 3dfx didn’t have. Really this was more of an incremental improvement in gaming capability.
Early 00s: NVIDIA nearly lost their lead to ATI with the switch to the shader model and DirectX 9, and had to redesign their architecture. ATI is now part of AMD and continues to compete with NVIDIA.
Mid 00s: NVIDIA released CUDA, which adapted shaders to general-purpose computation, completing the circle in a sense and making NVIDIA GPUs more useful for scientific work like the original SGI workstations. This later enabled the crypto boom and now generative AI.
Of course, along the way, OpenGL and GPUs have been used a lot for art, including art in games, but at no point did anybody say "hey, a lot of artists are trying to make 3D art, we should make graphics hardware for artists". Graphics hardware was made to render games faster with higher fidelity.
benbreen | 2 years ago
That said, starting in the early 1990s is missing the whole first half of the story, no? Searching Google Books with a 1980-1990 date range for things like "3d graphics" "art" or "3d graphics" "special effects" yields a lot of primary sources indicating that creative applications were driving demand for chips and workstations focused on graphics. For instance, this is from a trade journal for TV producers in 1987: "Perhaps the greatest dilemma facing the industrial producer today is what to do about digital graphics... because special effects, 2d painting, and 3d animation all rely on basically the same kind of hardware, it should be possible to design a 'graphics computer' that can handle several different kinds of functions." [https://www.google.com/books/edition/E_ITV/0JRYAAAAYAAJ?hl=e...]
It's not hard to find more examples like this from the 1985-1989 period.
evanweaver | 2 years ago
Of course graphics hardware was also used for more creative purposes, including desktop publishing, special effects for TV, and digital art, so you will find some people in those communities vaguely wishing for something better. But artistic creation, even for commercial purposes, was never the market driver of 3D acceleration. Games were. The hardware was designed for gamers first, game programmers second, game artists a distant third, and for nobody else.
The closest thing to an "art computer" around that time was the Amiga, which targeted the design/audio/video production markets.
sonicanatidae | 2 years ago
It was mostly gamers. As a gamer from that time, I can tell you the hardware was marketed to gamers, hard. I don't doubt that artists had an impact, but the world had many, many more gamers than artists, and gamers spend money on the best/mostest/etc.
I mainly know this from living through the CGA/EGA/VGA/SVGA/3D add-on card/3D era.
Thank you for taking the time to delve into this. While I may not agree with your conclusions, I respect your work and the effort put in. :)