top | item 7822246


pyalot2 | 11 years ago

OpenGL is the only thing you get on iOS. There is no Direct3D on iOS. Likewise, it's the only thing you get on PS4, Steambox, OSX etc.

But that's not my issue, I acknowledge freely that OpenGL drivers are bad. I just don't quite see how that's a failing of OpenGL, rather than the vendors who actually implement the drivers.


cwyers | 11 years ago

Well, no, Sony has its own low-level API that you can use on the PS4 [0]. And because all PS4s use the same GPU, you don't have to worry about much of what OpenGL offers in terms of abstracting away the underlying hardware, if all you care about is the PS4.

http://arstechnica.com/gaming/2013/03/sony-dives-deep-into-t...

kevingadd | 11 years ago

PS4 doesn't use OpenGL. No game console I'm aware of has ever used OpenGL. (The PS3 is the closest example, since it used to let you run Linux, so you could run Mesa - but the GPU wasn't accessible to you.) I don't know why people keep claiming that a given console runs OpenGL.

Jach | 11 years ago

People probably bring it up because you can often use OpenGL on a console, even if it's through a wrapper library rather than supported directly, or is only "OpenGL-like" rather than fully compliant -- which can still beat porting to the native API, depending on the game. (Though the original Xbox's D3D was pretty OpenGL-friendly: http://www.emuxtras.net/forum/viewtopic.php?f=193&t=4009) So for the PS2, PS3, Wii, and handhelds like the PSP and DS, there are OpenGL or OpenGL-like wrappers available.

Mikeb85 | 11 years ago

PS4 doesn't use 'OpenGL', just a low-level API and a higher-level API with features suspiciously close to OpenGL 4.3's...

It also uses Clang and a bunch of Unixy open-source stuff...

fixermark | 11 years ago

The bigger problem is a failing of the ecosystem at large; there's an insufficiently toothful agency policing the vendors and a lack of gold-seal certification that matters, leaving space for vendors to do whatever gets the card out the door before the competition. OpenGL's breadth as an API definitely doesn't help ("now that you've implemented direct-buffer rendering, let's go implement all that glVertex crap that lets you do the exact same thing, only slower! Your library coders have infinite time, right?"). But I doubt it's the root cause of the frustration; the "hit the benchmarks and beat ATI out the door this Christmas" ecosystem is my biggest gripe.

I've had to deal with cards that explicitly lie to software about their capabilities, claiming support for a shader feature that's actually implemented in software with no acceleration (!!!). There's no way to tell from the API that the driver is emulating the feature, short of enabling it and noticing your engine now performs in the seconds-per-frame range. So we blacklist the card from that feature set and move on, because that's what you do when you're a game engine developer.
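The probe-and-blacklist workflow described above can be sketched roughly like this. This is a hedged illustration, not real engine code: `probe_feature`, `render_frame`, and the frame-time threshold are hypothetical names and values I've invented for the sketch; a real engine would time actual draw calls with the questionable feature enabled and fall back to a cheaper render path on failure.

```python
import time

# Hypothetical threshold: if an allegedly hardware-supported feature pushes
# average frame time past this, assume the driver is emulating it in software.
MAX_ACCEPTABLE_FRAME_SECONDS = 0.1

def probe_feature(render_frame, warmup=2, samples=5):
    """Render a few frames with the feature enabled and time them.

    `render_frame` is a stand-in for the engine's real draw call.
    Warm-up frames are discarded to skip one-time shader-compile hitches.
    Returns True if the feature appears to run at an acceptable speed.
    """
    for _ in range(warmup):
        render_frame()
    start = time.perf_counter()
    for _ in range(samples):
        render_frame()
    average = (time.perf_counter() - start) / samples
    return average <= MAX_ACCEPTABLE_FRAME_SECONDS

# (card, feature) pairs we refuse to use together, regardless of what
# the driver claims to support.
blacklist = set()

def check_card(card_name, feature, render_frame, **probe_args):
    """Probe one feature on one card; blacklist the pair if it's too slow."""
    if not probe_feature(render_frame, **probe_args):
        blacklist.add((card_name, feature))
```

The point of the warm-up pass is that a driver may legitimately stall on first use (shader compilation), so only sustained slowness counts as evidence of software emulation.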