
CES: Mantle API news thread [BF4 up to 45% faster, Star Swarm demo flies on Kaveri]

Kysen

Member
Fixed fps in sync are always better than fluctuating. I'd rather buy a more powerful card than a Gsync capable card + monitor. Gsync setups will work out far more expensive.

Same, especially as I just bought a 27" IPS non Gsync monitor.
 

DieH@rd

Banned
Even "up to" 45% is way, way better than I expected Mantle would ever do. I figured best case scenarios would be 10-12%, with 3-5% on average.

Color me surprised.

AMD reps said several times that they would never have made Mantle if it could only add a few percent of performance.
 
It is getting harder to decide between Mantle and G-Sync as the next-generation PC graphics feature to sell me a new card.

Up to 45% sounds nice but the benefits of Gsync are more visible and proven across all software.

Fortunately I don't have to decide for another 2-3 years. Will be interesting to see how it shakes out.
 

markot

Banned
Remember that time a gpu company put out a powerpoint about an upcoming product with accurate results and factual factualisations?

Me either!
 

AJLma

Member
So AMD built a demo solely for the purpose of showing off Mantle's strengths (lots of objects) and then we're shocked at the result? This isn't going to magically boost Battlefield 4 45%. It may come into play in the future, but it's one of those "will multiplats ever bother?" type things.

Not really, considering both platforms are AMD.

AMD stated that one of the goals of Mantle is to make porting between PC and console easier for developers.
 
Nope. Mantle takes full advantage of CPU scaling, and performance is completely dependent on GPU speed.

DX games are CPU limited, Mantle games are GPU limited. An FX-8350 downclocked to 2GHz introduces zero framerate loss in the Star Swarm demo.

Oxide Games devs have said that people with midrange CPUs will have no bottlenecks.

Interesting.
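The CPU-limited vs GPU-limited claim above can be illustrated with a toy frame-time model. All numbers here (draw-call counts, per-draw overheads, GPU render time) are made-up assumptions for illustration, not benchmarks:

```python
# Toy frame-time model (illustrative numbers only, not benchmarks):
# a frame takes max(cpu_submit_time, gpu_render_time), so cutting per-draw
# CPU overhead only raises framerate while the CPU side is the bottleneck.

def frame_ms(draw_calls, cpu_us_per_draw, gpu_ms):
    """CPU submission cost vs. fixed GPU render cost; the slower side wins."""
    cpu_ms = draw_calls * cpu_us_per_draw / 1000.0
    return max(cpu_ms, gpu_ms)

def fps(frame_time_ms):
    return 1000.0 / frame_time_ms

draws = 10_000   # lots of objects, as in Star Swarm
gpu_ms = 16.0    # assume the GPU itself needs 16 ms per frame

dx_like = fps(frame_ms(draws, cpu_us_per_draw=3.0, gpu_ms=gpu_ms))      # high per-draw overhead
mantle_like = fps(frame_ms(draws, cpu_us_per_draw=0.5, gpu_ms=gpu_ms))  # low per-draw overhead

print(f"high-overhead API: {dx_like:.1f} fps")      # CPU-bound: 30 ms/frame -> 33.3 fps
print(f"low-overhead API:  {mantle_like:.1f} fps")  # GPU-bound: 16 ms/frame -> 62.5 fps
```

Note that in the low-overhead case, even doubling the per-draw CPU cost (as if halving the CPU clock) leaves the frame GPU-bound at 16 ms, which is consistent with the claim that a downclocked CPU loses no framerate.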
 

bj00rn_

Banned
Sounds amazing, would be great if true. But I've seen that "up to" crap before, so it could end up being best-case-scenario PR babble. They said 10 to 20 percent (?) in the past; I think that's where the average will settle in the end, unfortunately. But I'm talking out of my ass here, so who knows.
 

DieH@rd

Banned
New Mantle slide - Kaveri A8-7600 [$120] vs i7-4770 :D

AMD-Kaveri-With-Mantle1.jpg
 
No, they couldn't. Mantle works like it does because it targets specific hardware. If MS did that, then Direct3D wouldn't have support for a lot of cards, which was the whole point of D3D.

Not saying that D3D couldn't do with an overhaul, but it couldn't and shouldn't work in the same way as Mantle.


If I'm not mistaken, AMD and NVIDIA also have shitty multicore support in their PC drivers.
And the DX11 multithreaded rendering model is kinda strange.
If I remember right, you can build/prepare command lists on different threads, but you have to replay those command lists on your main rendering thread. For OpenGL I'm not sure; I haven't looked into it much.

Mantle, if I'm not mistaken, allows you to push command lists to the GPU from every thread?
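The difference being described can be sketched as a plain thread-scheduling pattern. This is not a real graphics API — the "command lists" are just Python lists and "submission" is a thread-safe queue — it only illustrates who is allowed to submit work:

```python
# Thread-scheduling sketch of the two submission models described above.
# No real graphics API here: "command lists" are plain Python lists and
# "submission" is appending to a structure the GPU driver would drain.

import threading
import queue

def record_command_list(scene_chunk):
    """Each worker records its own command list in parallel (cheap in both models)."""
    return [f"draw {obj}" for obj in scene_chunk]

def dx11_style(chunks):
    # Workers build command lists concurrently...
    lists = [None] * len(chunks)
    def worker(i, chunk):
        lists[i] = record_command_list(chunk)
    threads = [threading.Thread(target=worker, args=(i, c)) for i, c in enumerate(chunks)]
    for t in threads: t.start()
    for t in threads: t.join()
    # ...but only the main/render thread may replay them, one after another.
    submitted = []
    for cl in lists:
        submitted.extend(cl)
    return submitted

def mantle_style(chunks):
    # Any thread may push its finished command list straight to the thread-safe queue;
    # no single replay thread is required.
    q = queue.Queue()
    def worker(chunk):
        q.put(record_command_list(chunk))
    threads = [threading.Thread(target=worker, args=(c,)) for c in chunks]
    for t in threads: t.start()
    for t in threads: t.join()
    submitted = []
    while not q.empty():
        submitted.extend(q.get())
    return submitted

chunks = [["ship1", "ship2"], ["ship3"], ["ship4", "ship5"]]
print(len(dx11_style(chunks)), len(mantle_style(chunks)))  # both submit all 5 draws
```

Both models end up submitting the same work; the difference is that the DX11-style path serializes the final replay onto one thread, which is where the claimed single-thread bottleneck would live.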
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Is it? A developer friend said that EA announcing they were supporting it with Frostbite, which they use for fucking everything now, was the biggest third-party news this gen in favor of the PS4 that no one paid attention to...

I need to ask further details I guess.

I may be misreading you, but I don't see why Mantle being on PlayStation 4, or Xbox One for that matter, would matter, either as a positive or a negative. I've been under the belief that Mantle was explicitly designed as an AMD-specific API to bypass DirectX on PC, giving console-like low-level control over programming and optimisation. Mantle wouldn't really be applicable to platforms already offering low-level API access and running AMD hardware, as I don't see why AMD wouldn't be supplying Microsoft and Sony Mantle-like optimisation for their devkits.

I'm also sure it could work on a Steambox, as long as that Steambox was using AMD hardware, which it very well could as Steambox is in no way a closed hardware environment. Worst case scenario, in the near future, you build your own Steambox.

The way I see it, the issue is a lot less about an operating system running it, or even having the right hardware, so much as the games and engines themselves benefiting from Mantle. A Steambox with AMD hardware and a bunch of games that don't recognise the Mantle API won't see any advantage. Frostbite 3 games won't see an advantage on Steambox, unless you're running Windows (negating the Steambox part), because there's no way EA is going to bail on Origin for Steam.
 

gofreak

GAF's Bob Woodward
Don't forget people, those are numbers for a Kaveri APU. I'm hoping for even better numbers for R9 290/290X GPUs. :)

It could end up being the inverse: that the horsepower of the bigger cards helps mask DX shortcomings more than on lower-end cards and APUs.

But even if that were the case, if something can lift the low and mid end, that is great.

We'll see though!
 

Tenck

Member
Fixed fps in sync are always better than fluctuating. I'd rather buy a more powerful card than a Gsync capable card + monitor. Gsync setups will work out far more expensive.

The up-front cost is of course more expensive, but I'd rather not worry about not hitting 60FPS. With G-Sync you're getting the full power of your card, and you don't have to worry about being capped by Vsync.

Helps that I already had a G-Sync-capable monitor before they even announced G-Sync.
 

DieH@rd

Banned
Mantle is not on PS4/Xbone, but it is similar because of the hardware architectures used there.

MS and Sony have crafted, with AMD, their own APIs and drivers that are very fast and optimized.
 

Durante

Member
"Up to 45%" is not very surprising to me, I guess you can construct scenarios where it's "up to" 200%. What I'm more interested in is how big the difference is in a typical game scenario on a fast CPU, and compared to a decent DirectX11 renderer. I guess we will see once it's finally public.
 

markot

Banned
Mantle will be dead by 2015.

Dollars to donuts the only support AMD has for it is out of pocket. None of these stupid things will ever take off if they need specific cards and limit themselves so much.

nvidia probably pays for PhysX shit to be put in games too.

The only things that will stick are the things that don't need extra money or resources from developers.
 

riflen

Member
but G-Sync will work on all games old and new.

That's not true; Nvidia have stated that some games do not support G-Sync mode.
If you have a G-Sync-capable monitor and GPU, you will always be able to select between G-Sync, Vsync or neither in the GPU config. It's important for compatibility.

I'm still very much looking forward to using the technology though.
 

Hasney

Member
New Mantle slide - Kaveri A8-7600 [$120] vs i7-4770 :D

AMD-Kaveri-With-Mantle1.jpg

Would love to know how they're measuring the i7 performance with no API, unless they threw in some AMD card to measure it?

If I'm not mistaken, AMD and NVIDIA also have shitty multicore support in their PC drivers.
And the DX11 multithreaded rendering model is kinda strange.
If I remember right, you can build/prepare command lists on different threads, but you have to replay those command lists on your main rendering thread. For OpenGL I'm not sure; I haven't looked into it much.

Mantle, if I'm not mistaken, allows you to push command lists to the GPU from every thread?

Oh yeah, the reason for the shitty multicore drivers is the shitty DX11 implementation too. Those are a couple of benefits of Mantle that I'm sure would help D3D, but you have to expect that a lot of the performance comes from targeting the hardware; otherwise, a lot of bitching to MS to fix the API would probably have been more beneficial than developing Mantle.
 

HariKari

Member
Everybody likes to quote John Carmack on this, but he said this before Steam Machines specs were released...

There is no singular steam machine spec. There will be boxes available with AMD cards in them. The drivers just weren't ready when SteamOS went public.
 

DieH@rd

Banned
Mantle will be dead by 2015.

Dollars to donuts the only support AMD has for it is out of pocket. None of these stupid things will ever take off if they need specific cards and limit themselves so much.

nvidia probably pays for physx shit to be put in games too.

The only things that will stick are the things that dont need extra money or resources from developers.

Oxide guys mentioned that the cost of implementation is 2 months of man-hours. So it's cheap.

Johan Andersson said that once they implement Mantle in Frostbite 3, it will be a standard feature of the engine, automatically used in all future FB3 games [15 of them are in the pipeline, including Star Wars games, Mass Effect, DA3, Mirror's Edge, PvZ Garden Warfare]...


Would love to know how they're measuring the i7 performance with no API, unless they threw in some AMD card to measure it?

A fixed Radeon GPU, tested on both the Kaveri and the i7. Mantle works on both AMD and Intel CPUs.
 
Let's see some actual numbers first. "Up to 45%" is much closer to my own estimations of 25-30% than to those of the people who expect magical 2-3x increases in performance.

45% is a big deal, and then add the fact that in a PC environment you are making a game to run on 40 different setups; that's another bottleneck. You can't make a game that properly takes advantage of 40 different setups, at least compared to making a game that takes advantage of 1 or 2 setups. The time they spend "optimizing" an Xbox One or PS4 version won't be the same as what they spend on each of those PC configurations, which is why it's so common to read about "unoptimized" PC versions of games.
 
Remember that time a gpu company put out a powerpoint about an upcoming product with accurate results and factual factualisations?

Me either!

lol. "Up to".

I'm guessing it'll be barely noticeable generally. 2 or 3 fps tops. Count on it!

Mantle will be dead by 2015.

Dollars to donuts the only support AMD has for it is out of pocket. None of these stupid things will ever take off if they need specific cards and limit themselves so much.

nvidia probably pays for PhysX shit to be put in games too.

The only things that will stick are the things that don't need extra money or resources from developers.

Somebody is working overtime.
 

markot

Banned
Oxide guys mentioned that the cost of implementation is 2 months of man-hours. So it's cheap.

Johan Andersson said that once they implement Mantle in Frostbite 3, it will be a standard feature of the engine, automatically used in all future FB3 games [15 of them are in the pipeline, including Star Wars games, Mass Effect, DA3, Mirror's Edge, PvZ Garden Warfare]...

2 months of man-hours isn't really cheap >.< especially considering there will be no increased sales to go with it.

I'm just trying to stop people setting themselves up for disappointment.

GPU guys are notoriously shady with their stuff before it can actually get into the public's hands.
 

Morph-0

Member
No, they couldn't. Mantle works like it does because it targets specific hardware. If MS did that, then Direct3D wouldn't have support for a lot of cards, which was the whole point of D3D.

Not saying that D3D couldn't do with an overhaul, but it couldn't and shouldn't work in the same way as Mantle.

I see... but they could have just deprecated the existing DirectX and added a DirectX+ that targets new hardware for increased performance whilst still supporting the existing standard. Couldn't they?

The point is a solution like Mantle would be better coming from a strictly software company.
 

wizzbang

Banned
45%. That's almost too good to be true, but at the same time that's what I was hoping for.

I'm shocked people believe all this, to be honest; once you've been following PC gaming and hardware for long enough, you see these kinds of claims every couple of years.

Someone actually clued up will investigate, identify how Mantle does X, Y and Z completely incorrectly, causing buffer blah blah something-or-others and majorly bad AA or really shitty lighting in some situation or something like that.
I could believe a proprietary API could get some benefits, but 45%? Overnight? Suuuuuuureeee.
Oh, and the developers then need to target a completely different output. The days of choosing OGL or DirectX when playing a game seem mostly gone now; it's a lot of work, but let's just add another one in, they won't mind that!

Yeah... no.
 
What I'm more interested in is how big the difference is in a typical game scenario on a fast CPU, and compared to a decent DirectX11 renderer. I guess we will see once it's finally public.
In hard GPU-limited situations, it doesn't seem that Mantle will increase your fps (it will still help with frame-time fluctuations, however).
a10-7850k-vs-core-i5-skstv.jpg

(source AMD)
 
So AMD built a demo solely for the purpose of showing off Mantle's strengths (lots of objects) and then we're shocked at the result? This isn't going to magically boost Battlefield 4 45%. It may come into play in the future, but it's one of those "will multiplats ever bother?" type things.

Psst... the first slide specifically shows Battlefield 4 with up to 45% improvement.

On games that were built specifically for Mantle, it is up to 300% at times.
 

Hasney

Member
I see... but they could have just deprecated the existing DirectX and added a DirectX+ that targets new hardware for increased performance whilst still supporting the existing standard. Couldn't they?

The point is a solution like Mantle would be better coming from a strictly software company.

They could have, but since they have no control over the hardware cycles, MS would then have to put in more development every time a new series of cards comes out to ensure they're supported. This pushes the onus to AMD, which makes a lot more sense, since they can develop the API for the new generation of cards as they develop the hardware.

There would be no real benefit for MS to develop it, unless they made it Windows 8 exclusive to try to push more people to upgrade. But I don't think they'll do that again, since it didn't help Vista too much.
 
I'm shocked people believe all this, to be honest; once you've been following PC gaming and hardware for long enough, you see these kinds of claims every couple of years.

Someone actually clued up will investigate, identify how Mantle does X, Y and Z completely incorrectly, causing buffer blah blah something-or-others and majorly bad AA or really shitty lighting in some situation or something like that.
I could believe a proprietary API could get some benefits, but 45%? Overnight? Suuuuuuureeee.
Oh, and the developers then need to target a completely different output. The days of choosing OGL or DirectX when playing a game seem mostly gone now; it's a lot of work, but let's just add another one in, they won't mind that!

Yeah... no.

What are you talking about? With Steam Machines around the corner, this is the best time to come out with something like Mantle. It's perfect for it, and if developers use it for SteamOS, they will have little reason not to go for the Windows crossover.

This could actually take off.
 

ViciousDS

Banned
Mmmmmmmm, I wonder how well a Kaveri system with an R9 would run in Dual Graphics mode... gosh, those benchmarks, I cannot wait.

I remember when the first AMD Trinity stuff came out and the graphics boost from the APU was enough for some insane FPS jump... so much anticipation for AMD this year.
 

bbyybb

CGI bullshit is the death knell of cinema
AMD knows where their bread is being buttered nowadays as well.


In regards to Mantle, any improvement that can be made over DirectX is a good thing, I think, as hopefully it will put more pressure on Microsoft to improve it in the future.

Edit: Added more images.
 
A10-7850K-vs-Core-i5-4670K1.jpg

The new APUs seem great for 720p gaming, but the gains of a Kaveri APU + dedicated GPU over an i5 + dedicated GPU seem non-existent. It can be a decent stopgap if you don't want to spend those extra 150€ right away on a new GPU, but if you can afford both, a baseline i5 is still a bit cheaper.
 

kharma45

Member
Mmmmmmmm, I wonder how well a Kaveri system with an R9 would run in Dual Graphics mode... gosh, those benchmarks, I cannot wait.

I remember when the first AMD Trinity stuff came out and the graphics boost from the APU was enough for some insane FPS jump... so much anticipation for AMD this year.

It'll probably only Crossfire with an R7 card, and even then I'd try to avoid a multi GPU set up.

AMD knows where their bread is being buttered nowadays as well.

VCWyNMM.jpg


In regards to Mantle, any improvement that can be made over DirectX is a good thing, I think, as hopefully it will put more pressure on Microsoft to improve it in the future.

Haha there is no such processor from Intel as the i5 3870K. The only 3870K processor that exists is an AMD FM1 based APU.
 

hesido

Member
There is no singular steam machine spec. There will be boxes available with AMD cards in them. The drivers just weren't ready when SteamOS went public.

I'm aware of that, but the Steam Box specs could require Mantle-compatible hardware; I think this is what John Carmack imagined when making that comment, because if Mantle isn't supported throughout the entire Steam Box line, it wouldn't pose the "SteamBox-boosting threat" to Sony and MS that Carmack thought it would.

If anyone can point to a more recent Carmack quote, from after the Steam Box beta hardware specs were released, I'd be more inclined to think otherwise. With the current Steam Box philosophy (no Mantle requirement), there's not much left for Sony to be against. MS's stance wouldn't change much, because SteamOS is already a direct competitor to DirectX and Games for Windows.
 

ViciousDS

Banned
It'll probably only Crossfire with an R7 card, and even then I'd try to avoid a multi GPU set up.



Haha there is no such processor from Intel as the i5 3870K. The only 3870K processor that exists is an AMD FM1 based APU.

Well, it's a hybrid Crossfire... it's a cool feature that I have always thought about using for my gaming rig, but I end up going Intel/Nvidia instead. However, with AMD really gaining ground in the power/performance ratio, I have thought about going back. My old computer, which my cousin has now, with the 6950 (I think that's what it was), is still kicking strong and playing games perfectly.


So it's not really full Crossfire; instead it borrows extra video rendering from the APU and combines it with the power of the discrete GPU.


Here is an example (although old) that shows the frame boost from Dual Graphics in Tomb Raider:

TombRaider.png
 

HariKari

Member
It'll probably only Crossfire with an R7 card, and even then I'd try to avoid a multi GPU set up.

Kaveri has some improvements aimed at synergy between the APU and GPU.

The most significant enhancement Kaveri adopts is HSA (Heterogeneous System Architecture), powered by the new hUMA enhancements, which allow coherent memory access between the GPU and CPU. hUMA ensures that both the CPU and GPU have uniform access to the entire memory space through the memory controller. This allows additional performance out of the APU in case the GPU gets bandwidth-starved, and also suggests that faster memory speeds will result in better overall performance from the graphics. Now, with all the architecture talk done, let's get on with the A10-7850K itself.

On the graphics side, we are getting the latest GCN architecture over the VLIW4 featured on previous AMD APUs. The die has up to 8 GCN compute units, which feature AMD TrueAudio technology, AMD Eyefinity tech, UVD, VCE, a DMA engine, and the addition of coherent shared unified memory. Being based on the same GPU architecture as Hawaii, the Kaveri APU die has 8 ACEs (Asynchronous Compute Engines), which can manage 8 queues and have access to the L2 cache and GDS.

http://wccftech.com/amd-kaveri-apu-...-revealed-241-billion-transistors-245mm2-die/
 

chaosblade

Unconfirmed Member
The new APUs seem great for 720p gaming, but gains over an i5 + dedicated GPU seem non-existent. It can be a decent stopgap if you don't want to spend those extra 150€ right away on a new GPU, but if you can afford both, a baseline i5 is still a bit cheaper.

I'm not sure where you're getting this from. A 4670K + 270X + Z87 motherboard will be more expensive than a 7850K + 270X + A88X motherboard.

If that graph is accurate and frametimes don't suck on the AMD side, Kaveri would be a better value.
 

kharma45

Member
Well, it's a hybrid Crossfire... it's a cool feature that I have always thought about using for my gaming rig, but I end up going Intel/Nvidia instead. However, with AMD really gaining ground in the power/performance ratio, I have thought about going back. My old computer, which my cousin has now, with the 6950 (I think that's what it was), is still kicking strong and playing games perfectly.

So it's not really full Crossfire; instead it borrows extra video rendering from the APU and combines it with the power of the discrete GPU.

It's still a multi-GPU solution and still brings the same issues. It's not really a good solution if you want a proper gaming machine.
 

Minion101

Banned
It was announced a while back that Mantle does not require GCN and works with NVidia cards as well.

This is good news going forward, provided the performance gains over the DirectX example aren't a best-case scenario.

Would it still beat DirectX on an Nvidia card?
 

bj00rn_

Banned
What are you talking about? With Steam Machines around the corner, this is the best time to come out with something like Mantle. It's perfect for it, and if developers use it for SteamOS, they will have little reason not to go for the Windows crossover.

This could actually take off.

Eh... or maybe we should wait and see if "Mantle" ends up in SteamOS? AFAIK, for now it's a Windows API.
 