
DirectX and OpenGL getting low-level access soon (GDC 2014)

penis

Banned
Nice, I hope it's for recent cards too, like the GTX 660, since it's the 3rd most used GPU behind the Intel HD 4000 and 3000.
 

ExcelTronic

Member
Sorry for going a bit off the main topic, but the OP's look into the GDC catalog made me wonder what else it could have. I found this.

Creating Unique Interactive Experiences with the PlayStation 4.

VR?

But to be on topic. This would probably affect Mantle.
 
I hope this results in OpenGL dealing a huge blow to DirectX: there is an open, multiplatform, standard graphics language that deserves much more support than a Windows-only implementation.
 

sirap

Member
Any performance they can squeeze out of OpenGL is welcome. Would love getting an extra boost when playing games with my Iris Pro.
 

Jtrizzy

Member
Well, this is certainly good news for Nvidia. I'm sure I'm not the only one who was considering switching to AMD for my next GPU. I just hope all this bitcoin shit goes away so I can get the true successor to my 580 in the 8xx series for ~$700 whenever they become available.
 
The number is fine. According to Timothy Lottes, draw call overhead is up to 100 times higher on PC than on consoles.

I hope this will allow heterogeneous processors to gain some momentum on PCs. AMD has already said that DirectX in its current form can't really use the features of HSA APUs.

Forgot about this. When will Windows first support such a system? To be clear, a system architecture similar to the PS4's.
 

DieH@rd

Banned
This is pretty much the best possible consequence of Mantle arriving on the scene. The DX and OGL driver divisions finally woke up and suddenly remembered that devs have begged them for years to remove overhead.

:)

Sucks for Mantle, awesome for everyone.

No, this was exactly the point of Mantle: either to get developers to use it, or to force the competition to improve.
 

AmyS

Member
This would be good. That means when I build my gaming PC in the next 7-8 months or so, I can get an Nvidia Maxwell card, have the peace of mind of solid NV drivers, and possibly get somewhat better DirectX performance without being restricted to AMD GCN GPUs.
 
I hope it still happens this year and the new Maxwell cards will support it then.

Mantle can be nice in non-optimal situations, but it only supports a few games (even with support coming for many others, the list is still short), and I'd prefer an Nvidia card next time.
 
I don't often code, but when I do I like to do it close to the metal

[Image: "I don't always drink beer, but when I do..." meme]
 
Does this mean that you can code to the metal/GDDR5 on PC?

You always could, it was just a pain in the dick. Back in the day every graphics vendor had its own API: 3dfx had Glide, S3 had Metal, and Matrox, I believe it was, had something that nobody used. Basically it meant you got insane performance in games using that specific API, but it also meant the developers had to work two, three, even four times as hard to support all these different graphics APIs. The reason OpenGL and especially Direct3D (DirectX) came to be was that developers were complaining about the difficulty of developing for so many different system configurations and wanted a "hardware agnostic API".
 

SmartBase

Member
Hope these improvements coincide with the release of Maxwell this year; I don't think I can hold off on building a system longer than that.
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
Sure, your points are true, but where are the developer support and enthusiasm? Mantle has them; OpenGL does not at the moment.

wat

OpenGL is much bigger overall. High-end CG houses like Pixar, WETA, ILM, etc. are primarily Linux houses running on OpenGL for anything real-time. They have also taken advantage of OGL's low-level extensions ever since they were available, but then again they also have industry-leading engineers in graphics and performance optimization, as well as access to the hardware vendors' top support engineers. Some of the real-time tests I saw at ILM before I left were mind-blowingly good, running at 30 FPS, cross-platform on Windows and Linux.

OpenGL is actually the most commonly used 3D API on the planet, just like Linux is the most commonly used operating system. Only a tiny fraction of the computers out there are PCs, not even all of those use Windows, and even on Windows PCs, the vast majority of pro applications and quite a few games use OpenGL instead of DirectX.

Yep!


And even with Direct3D and earlier versions of OGL, if you had the right talent you could still do the "code to the metal" nonsense if you wanted: just take the compiler's generated code and hand-tune the assembly. That could cause more problems if you didn't have a really, really good optimization wizard, given the potential mess of hardware/driver compatibility issues, but it was still possible.
 

Toski

Member
Huh? I believe they paid DICE to develop it.

They worked with DICE, or should I say "Frostbite." AMD had to do work on their side, and I would guess they pulled primarily from their OpenGL team, seeing as OpenGL wasn't a priority due to Linux's low gaming market share.
 
I have no clue what this is all about; my take from it... PC gamers may get a few more frames out of their hardware down the road!
 

Walshicus

Member
I'm sure there are improvements to be wrung from these systems, but I kind of get the impression some people forget *why* they were built the way they were in the first place.
 
Oh man, this is such promising news. I hope they can deliver on it; PC gaming is just getting better and better.

I am under the impression that it would not be at GDC if it was not promising in some sense of that word.

But then again, this is MSFT. Everything has to be written with a large *.
 

LiquidMetal14

hide your water-based mammals
This will change all the benchmark methodology once it happens. It's for the better, but if things become that much more optimized, then this type of advancement will have to be taken into account.
 

Zarx

Member
Mantle has DICE and Oxide Games, the former being notoriously bad at getting games working without being a buggy mess anyway. Not exactly the ideal poster child for a graphics API. There has yet to be a situation that really shows off Mantle's benefits. Battlefield 4 is not a hugely CPU-bound game; yes, it is fairly heavily threaded, but it is still largely GPU-bound, and the performance gain between DX11 and Mantle was only 10-15% on high-end PCs. I want to see something like an RTS which uses almost entirely the CPU and bottlenecks the ever-living fuck out of it using Mantle.



Unreal 4 has yet to see the light of day. The only games to actually use UE4 are games coming out for this generation of consoles and on PC in the coming year or two, so it's hardly a valid example. Cryengine 3 still has DirectX 9 support, and Crysis 3 still ran in DX9 (I can't remember if it was 2 or 3, but one of them launched without DX11 support or 64-bit executables and caused a huge uproar); it wasn't built for DirectX 10/11 like Frostbite 3 was. Nitrous Engine is one of the few taking advantage of Mantle from the get-go, and most of their marketing has been questionable. Snowdrop is about a year away from materializing, and I can't find any confirmation on the specifics of Luminous.

The fact is, other than Frostbite 3 no major engine has really taken advantage of DirectX 11. Civ V and WoW let you run in DX11 mostly for its increased efficiency, but that's probably the biggest use of it. Developers need to stop pretending DX is hamstringing them when they don't even use the newer versions of it...


Crysis 2 was DX9-based at the start, yes (DX11 was patched in later), but Crysis 3 was DX11-only, and the latest iteration of the engine is one of the most feature-rich on the market today. Ironically, the latest version of Cryengine probably supports more DX11 features than Frostbite at this point; Frostbite (or at least BF3/4) doesn't even use tessellation, iirc. And there are plenty of games/engines with great DX11 implementations even if they weren't designed from the ground up for it: Metro: LL and the 4A engine, Glacier 2 with Hitman Absolution (actually, that is another DX10/11-only engine), Aliens vs. Predator, and Max Payne 3 come to mind. In fact, games that only use it for performance improvements aren't really that common; most use it for tacked-on effects like tessellation and call it a day. And I don't see how using its efficiency gains doesn't count as really using the API, especially given the context of DirectX adding greater efficiency via lower API overhead. And you will probably find that the devs complaining about being hamstrung by DX are the ones that were first to use DX11 and now Mantle, like DICE.

Though that doesn't mean I don't think the adoption of DX11 has been slow, though that is more MS's fault than anything: first by botching DX10's launch, and then by limiting 10/11 to Vista and later. XP was the industry standard for a very long time and people were slow to move on, so most developers were forced to keep DX9 support around for a long time. The consoles basically being DX9-level machines also held devs back from going all out.
 
This is a good thing to hear because I also have an i7 950. :D

Perhaps my i7 930's lifetime also just got extended another 5 years (the end of this console gen). :D

Seriously, the first i7 processors are amazing. They still compete thanks to their overclocking potential. It is ridiculous.
 

Naminator

Banned
AMAZING NEWS!

Thank you AMD, really, thank you.

I hoped that Mantle would put a fire under MS's ass for them to start moving along with DirectX.

Here is the one thing I'm worried about though.

Since this project has ALL vendors supporting it, I'm afraid this brand new tech might be exclusive to new hardware. IF that's the case, then the benefits of this brand new API will only be fully utilized in a decade AT LEAST, and by then we will already have a couple more DX revisions.

They better make this shit AT LEAST Win 8.1 and DX11 hardware compatible!
 

-SD-

Banned
Good news.

I'm totally an NVIDIA/Intel user, but AMD has a good history of igniting interest in, and implementations of, important stuff.
 
More options are great, but I have zero faith in Microsoft. The sooner PC gaming can move away from their platform the better, and unfortunately this might just keep Windows trucking along a wee bit longer. Still, that aside - I don't mind DX being improved in the meanwhile.
 
Fantastic news. With the way things are shaping up with Win 8 and their philosophy concerning Windows as an OS, I'm pretty comfortable with MS moving forward.

Now the Xbox One, on the other hand...
 
This is great.

EDIT: It won't be that great if these new DirectX features are Windows 9 exclusive.

ftfy...

I don't have much faith that Microsoft won't pair any major change to DX with its next big Windows release.

(Edit) Beaten not just many times over, but by the post above me, no less.
 