
Civ lead gfx engineer: "OpenGL is broken". T. Lottes posts reply again

Don't those places have their own APIs (RenderMan something), or am I mixing something up?
Different layers. Their APIs still need OpenGL to be able to use hardware acceleration, which they use for real-time previsualization. RenderMan isn't a hardware driver, it's a rendering system. It's the same as the difference between Unreal Engine and DirectX, to put it very simply.
 
we're just in that part where it sucks because people don't use it very often

Be forward-thinking enough to see there is no reason for it not to get better as more people complain about it and use it for things.

That's what they said about Ouya.

Started from the bottom, now we ... Still at the bottom.
 
OpenGL could use a reboot, this I agree with. But so far I haven't really encountered many obstacles and difficulties when developing with OpenGL. And those I did encounter existed only because of a lack of experience.

Yes. It's only a lack of experience, or those devs are too lazy to learn how to handle it right.

OGL isn't perfect, but neither is DX.
 
That depends on what you mean by "competitive". I have noticed a marked increase in Linux support for mid-sized games already since SteamOS was announced.

I do believe it would take MS shooting themselves in the foot even harder than with Windows 8 for it to truly take over. But that's not out of the question.

It's out of the question. The chance Linux had was in the 90s, when 3D graphics was starting to get standardized, before the casual market took off. Carmack was calling for Linux support around this time, and he was right. Companies didn't want to spend the money and manpower to maintain anything like that, though. MS was willing to, so they supported MS and fucked themselves in the process. MS would have to screw up really badly for Linux to have a chance.
 
Yes. It's only a lack of experience, or those devs are too lazy to learn how to handle it right.

OGL isn't perfect, but neither is DX.

But one must be more imperfect than the other.
There is no chance or conspiracy behind the current situation.
And if DX12 keeps its promises, there is no reason for that to change for AAA titles with technical ambition, because those games are not meant to be played on the mobile devices where OGL shines.
 
Reason #1:
OpenGL is highly fragmented across platforms. “Write once, run anywhere” is a myth. Mobile GL, Linux GL, Windows GL, and Mac GL are all different from one another, and offer varying levels of feature support.


http://unity3d.com/


But seriously. No one is saying OpenGL is better than DX. They are saying OpenGL is on EVERYTHING.
 
Just like HTML5 development, everything is fine and magical until you try to run your code on a different environment. That's when the hatred is born.
Heh. Funny you should mention HTML5. Because with WebGL things can also differ per environment. And not just per operating system, but also per browser and hardware vendor. Did you know Chrome on Windows uses Direct3D for WebGL? Or that browsers like Firefox and Chrome have their own set of extensions for WebGL?

The fact OpenGL can be found on soooo many platforms (and thus many hardware vendors) seems like one of the big issues OpenGL is facing. In that regard, it sometimes seems a bit unfair to compare OpenGL with D3D. D3D has way less platforms to account for.

Needless to say, one of the complaints Joshua Barczak has is that there can be many ways to do a single thing in OpenGL. And I agree with that. It's certainly a confusing aspect of OpenGL when learning it.

Yes. It's only a lack of experience, or those devs are too lazy to learn how to handle it right.

OGL isn't perfect, but neither is DX.
Whoa whoa whoa. That's not what I meant. I meant I'm nowhere near as experienced as the graphics engineers in the gaming business. I've never actually done things at the scale the folks in the games industry do, so I never encountered their issues. Lazy or inexperienced is the last thing I'd call them.
 
Sounds like OGL should get a reboot. The idea of splitting it in two for a legacy version and a "dog's bollocks" version is probably a lot more complicated than it sounds.

In the context of Valve, since Rich Geldreich was happy enough to make a point with a succinct post on the issues of OGL, I'd have to imagine that he has been as vocal or more about these grievances with Valve themselves. Should be interesting to see where this all goes.
 
Sounds like OGL should get a reboot. The idea of splitting it in two for a legacy version and a "dog's bollocks" version is probably a lot more complicated than it sounds.
That is, in fact, what recent versions of OpenGL have done. There is a "Core profile", with mostly just how stuff should be done in modern code, and a "Compatibility profile" with (almost) everything that was all the rage in the 90s still included.
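As a concrete illustration of that split: which profile you get is chosen at context-creation time. A minimal sketch below, assuming GLFW 3.x as the windowing library (any context-creation library exposes equivalent hints); error handling is trimmed for brevity.

```c
/* Sketch: requesting a Core vs. Compatibility profile context with GLFW 3.x. */
#include <GLFW/glfw3.h>

int main(void) {
    if (!glfwInit()) return 1;

    /* Ask for an OpenGL 3.3 Core profile context: deprecated fixed-function
       features (glBegin/glEnd, the matrix stack, ...) are unavailable. */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    /* Swapping in GLFW_OPENGL_COMPAT_PROFILE instead asks for the
       Compatibility profile, which keeps the 90s-era API available
       (driver support for it varies by vendor, notably on Mac). */
    GLFWwindow *win = glfwCreateWindow(640, 480, "core profile", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }

    glfwMakeContextCurrent(win);
    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```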
 
THIS

No, in all honesty, it really will. Microsoft has the best teams and talent in the industry, and they know how to reduce CPU load and just about everything else. DX pretty much single-handedly saved PC gaming in the 90s when it was first introduced. It gave game developers a specific engine and pipeline to work with, which was largely unheard of back then. With DX12 the Xbox One will have a solid 60fps at 1080p, I can almost guarantee it. It's just the way MS does things.

Wow... Just wow.
 
THIS

No, in all honesty, it really will. Microsoft has the best teams and talent in the industry, and they know how to reduce CPU load and just about everything else. DX pretty much single-handedly saved PC gaming in the 90s when it was first introduced. It gave game developers a specific engine and pipeline to work with, which was largely unheard of back then. With DX12 the Xbox One will have a solid 60fps at 1080p, I can almost guarantee it. It's just the way MS does things.
Oh god.... why?
I hope that's sarcasm.

btw.
Yep, the comment about the varying driver quality is true imo.
 
I love Valve, but I have serious doubts about the amount of staff they currently have to tackle a problem of this magnitude.

DirectX is a standard for a reason.
 
This is common for people to believe when they begin working with OpenGL instead of DirectX. OpenGL is currently far more complicated to get simple tasks done in, which means it has a steep learning curve. Graphics card vendors defend its capabilities, and in some ways it's true: OpenGL is basically as capable as DirectX, but lacks some intuitive API calls for the sake of backwards compatibility.

This isn't really news- or theory-worthy though; there's no cross-platform API better than OpenGL currently, and most software developers are seeking to write platform-agnostic code. Besides, open APIs are a common thing to bitch about because they're the most likely to change due to the open nature of their development.

I've been saying this for a long time, but OpenGL needs to remove its backwards compatibility for the sake of a more streamlined rendering process. Preserving the old fixed-function code has caused a lot of deprecated rendering practices to exist far longer than in something like DirectX, which can freely hit the reset button as it needs to.
 
Err, okay space cadets. It doesn't matter if the devs who are complaining are wrong, perception is the problem. No one gives a shit if you've built the most perfect thing ever if no one is using it, and if someone tells you why they don't want to use it, responding to them with "You're wrong" is not going to make converts. Either address the issues in a real way with substantive changes, or suffer the consequences of the incorrect perception. If OpenGL needs an evangelist to make everyone see the True Light, then get one. If the problems are real, then fix them.

man, ain't that the truth.
 
THIS

No, in all honesty, it really will. Microsoft has the best teams and talent in the industry, and they know how to reduce CPU load and just about everything else. DX pretty much single-handedly saved PC gaming in the 90s when it was first introduced. It gave game developers a specific engine and pipeline to work with, which was largely unheard of back then. With DX12 the Xbox One will have a solid 60fps at 1080p, I can almost guarantee it. It's just the way MS does things.

You know how a post suddenly takes a hard turn for the worse? I hope this is some sort of sarcasm.

GAF needs a database of all the various secret sauce claims.
 
martino said:
And if DX12 keeps its promises
As long as DX continues to be used as an OS sales vehicle, it'll not do anything for the market at large; it'll just continue ceding market share. MS's approach for the past 8 years has been remarkably effective at:
1) Holding back progress in the PC games market
2) More recently, giving the market away to OGL
 
I love Valve, but I have serious doubts about the amount of staff they currently have to tackle a problem of this magnitude.

DirectX is a standard for a reason.

DirectX is only present in Windows, Xbox, Xbox 360, and Xbox One. The GameCube, PS2, PS3, PS4, Wii, and Wii U are all based off OpenGL, and all mobile gaming on phones and tablets people actually own is OpenGL. Even for operating systems, you're using OpenGL on Mac and Linux.

Unless you're writing an Xbox exclusive or a Windows exclusive game, you have no real reason to base your code on DirectX.
 
Pretty sure it has already been covered that consoles use their own APIs and not OGL? PS4, for example, uses GNM and GNMX. Its GPU supports DX and OGL features.
 
FyreWulff said:
The GameCube, PS2, PS3, PS4, Wii, and Wii U are all OpenGL
Literally none of them are OGL.
Nintendo consoles have loose syntax similarities (as did the PSP), but none of their APIs are OGL-compliant, and they all allow low-level constructs that OGL has no way of doing.
PS2 had no vendor-defined API at all (not until 5 years into its lifespan anyway, and even then it only supported a small subset of the hw), and while PS3 had an OGLES implementation that was partially compliant, what really got used was half a step away from PS2 into a vendor-defined API.
 
THIS

No, in all honesty, it really will. Microsoft has the best teams and talent in the industry, and they know how to reduce CPU load and just about everything else. DX pretty much single-handedly saved PC gaming in the 90s when it was first introduced. It gave game developers a specific engine and pipeline to work with, which was largely unheard of back then. With DX12 the Xbox One will have a solid 60fps at 1080p, I can almost guarantee it. It's just the way MS does things.
Benny a, would be proud
 
Naughty Dog chimes in.

http://gearnuke.com/naughty-dog-programmers-talk-directx-api-optimization-ps4/

Christian Gyrling, who is the lead programmer at Naughty Dog, talked about optimizing for the PS4 API and how low-level access to the hardware is better.


Low-level access to the GPU really makes you understand why using DirectX is slow and why it is really just in your way.

Being able to unmap/remap memory pages from its virtual address space while maintaining its contents is absolutely amazingly useful.


Cort Stratton, a member of Sony's ICE Team and programmer at Naughty Dog, was also asked to comment on the recent DirectX 12 session held at GDC and on the statements made at that conference calling it the “biggest leap in technology”. Stratton had quite an interesting response to this question.


I was in that room : ) That session was pure marketing hyperbole. Later DX12 sessions at GDC were more realistic/interesting IMHO.

And but so yeah, these GDC sessions are exactly what I had in mind when I wrote the tweets I linked you to earlier.
 
from the same article

The tweets that he is referring to are what he earlier said regarding the DirectX 12 API when it was originally unveiled. He agreed that new SDKs do improve performance significantly, but he can't say anything about DirectX 12 since it was not his area of expertise.
 
THIS

No, in all honesty, it really will. Microsoft has the best teams and talent in the industry, and they know how to reduce CPU load and just about everything else. DX pretty much single-handedly saved PC gaming in the 90s when it was first introduced. It gave game developers a specific engine and pipeline to work with, which was largely unheard of back then. With DX12 the Xbox One will have a solid 60fps at 1080p, I can almost guarantee it. It's just the way MS does things.

lol
 
THIS

No, in all honesty, it really will. Microsoft has the best teams and talent in the industry, and they know how to reduce CPU load and just about everything else. DX pretty much single-handedly saved PC gaming in the 90s when it was first introduced. It gave game developers a specific engine and pipeline to work with, which was largely unheard of back then. With DX12 the Xbox One will have a solid 60fps at 1080p, I can almost guarantee it. It's just the way MS does things.

Penello's alt account?
 
The GameCube, PS2, PS3, PS4, Wii, and Wii U are all based off OpenGL, and all mobile gaming on phones and tablets people actually own is OpenGL.
Nintendo doesn't use OpenGL; they use GX. GX is a close relative of GL, the proprietary predecessor of OpenGL. Essentially, OpenGL is an open successor to GL, Silicon Graphics' proprietary 3D programming API, and GX is a fork of the original GL specifically created for games, designed by the same team (because the N64 used a Silicon Graphics GPU). Dr. Wei Yen was responsible for both GL and GX, and his own Project Reality team at SGI, the dudes responsible for the N64 GPU, later founded ArtX, the design team of the GameCube and Wii GPU.
 
Its out of the question. The chance Linux had was in the 90's when 3d graphics was starting to get standardized, before the casual took off. Carmack was calling for Linux support around this time and he was right. Companies didn't want to spend the money and man power to maintain anything like that though. MS was willing to so they supported MS and fucked themselves in the process. MS would have to screw up really bad for Linux to have a chance.

You have no idea what you are talking about.

Linux was only getting off the ground in the 90s (and Windows had a huge market share); today, on the other hand, Windows is losing its casual market share (mostly to Linux-powered tablets and phones), and Linux is in a much better place to support people running it on their desktops.

Unity dev chimes in.

Btw, isn't the problem with AZDO that the extensions it relies on are vendor-specific?

Where does this idea that windows gives you relevant GPU drivers (not just VGA) out of the box and that people use them come from?

If you got a prebuilt computer, it comes with Nvidia, AMD, or Intel's GPU drivers installed, and if you built it yourself, there is little likelihood that you do not know to download them.
 
Where does this idea that windows gives you relevant GPU drivers (not just VGA) out of the box and that people use them come from?

If you got a prebuilt computer, it comes with Nvidia, AMD, or Intel's GPU drivers installed, and if you built it yourself, there is little likelihood that you do not know to download them.

It does. I've installed Windows 8 on a PC with an AMD 5770 and started playing Skyrim without installing an AMD GPU driver. You don't get Catalyst, but Windows Update will give you a relatively new driver, and my fresh install definitely came with one.
 
It does. I've installed Windows 8 on a PC with an AMD 5770 and started playing Skyrim without installing an AMD GPU driver. You don't get Catalyst, but Windows Update will give you a relatively new driver, and my fresh install definitely came with one.

That still does not answer where the people that use that driver come from; no PC maker would sell you a computer without the full graphics drivers installed, and anyone that is into building their own PCs is likely to know to download the drivers from the GPU vendor's website.
 
Too Many Ways to Do The Same Thing

The other complaints are valid, but this one bothers me. Having a background in coding, I'd say choice is never a bad thing. You can do many things many different ways, with vastly different performance, code simplicity, and so on. Just because I can do something 100 different ways doesn't mean I shouldn't use my favorite way of doing it in my code.
 
Not sure where this "consoles use OGL" narrative is coming from. Have seen it multiple times now.

It comes down to ignorance. They have no understanding of the API (or APIs in general), so they also have no idea how terrible OpenGL would be on a console. But since Xbox uses DirectX, the others must clearly use the other major PC graphics API. Even though Sony and Nintendo spend hundreds of millions on R&D for these systems, they couldn't possibly have created their own APIs and tools...
 
OpenGL sure is broken all right, on AMD hardware. RAGE comes to mind. Wolfenstein seems like another contender wherein top-end AMD hardware isn't getting the performance it should be, thanks to crappy drivers.

Fortunately for them and their long-suffering users, the number of AAA commercial PC games made using an OGL renderer in the past decade can be counted on two hands.

Thankfully, on the other side of the fence, NVIDIA still gives a crap about making half-decent OGL drivers. One of the many reasons to go team green this past decade or so, especially if you're into the emulation scene, which is heavily OpenGL-based (e.g. Dolphin, PPSSPP, and so on).
 
OpenGL sure is broken all right, on AMD hardware. RAGE comes to mind. Wolfenstein seems like another contender wherein top-end AMD hardware isn't getting the performance it should be, thanks to crappy drivers.

Fortunately for them and their long-suffering users, the number of AAA commercial PC games made using an OGL renderer in the past decade can be counted on two hands.

Thankfully, on the other side of the fence, NVIDIA still gives a crap about making half-decent OGL drivers. One of the many reasons to go team green this past decade or so, especially if you're into the emulation scene, which is heavily OpenGL-based (e.g. Dolphin, PPSSPP, and so on).

I must be the only AMD user who had zero complaints with RAGE and Wolfenstein
(280X, which is a rebadge of the 7970 GHz)
 