Oh man, something tells me the GPU Wars are gonna get crazy again very soon.
Mantle going to be used on PS4 / Xbox One?
"NONONONO those benchmarks are not right ! Mantle is only on AMD, you can't use mantle to compare performance of game !"
Something tells me it will be like the Glide days, when most people who didn't have a Voodoo were moving the goalposts like crazy. Meanwhile, people who had 3dfx accelerators were playing their games with considerably better performance.
As for the article: it's natural that a lower-end GPU with Mantle should beat the Titan.
Back then we didn't have a proper internet, though...
I expect most sites will now take measurements with both DirectX and Mantle and post them in the same articles, comparing performance.
Durrherp derp
Educate yourselves, console kiddies.
Or keep telling yourselves that paying $400 - $500 for a closed box with paid online gaming and $60 games that struggle to hit 30 - 60 FPS at various resolutions that are all 1080p or lower is a "good deal."
Mandatory
It isn't really a fair fight anyway. The Titan has like 20% more die area than the 290X, if the AMD person's statement is taken at face value.
AMD is the minority in the PC market, so people using Mantle still have to use DirectX if they want to do well.
Mantle pretty much guarantees better performance for cards using the GCN architecture vs. the same cards using DX, doesn't it?
You've got to be kidding me. The 2900 series was the FX equivalent; that's hardly surprising. Most Xbox 360 games ran better on the 8800GTX than on the 2900XT, and the 360 uses an ATI GPU. It will be the same this time. It doesn't matter what hardware is in a console: PC hardware is already leagues above that, and the fight is completely different here.
Judging from what AMD is hinting at, it doesn't look like even a full Titan at ridiculous clocks can catch it, let alone hand it its ass. The GTX 800 launch is probably next year at 20nm, same as AMD with the R9-390X. As you said, Nvidia owns the high end and overall GPU market share, so I expect Nvidia to ride out the GTX 700 until 20nm. The only way I see Nvidia launching a "TITAN Ultra" at 28nm is if they think people will buy it, and it hands the R9-290X its ass when comparing DX vs. Mantle, just so they can claim the crown.
Is Mantle, AMD's API, open, so Nvidia could benefit from it?
Hopefully they deliver. It would be nice to see nvidia have really serious competition.
I just need some standards. OpenGL + Mantle + SteamOS = NextGen.
So is mantle a piece of hardware or software? People here are making it sound like an add-on...and I'm now confused.
Mantle was made from the ground up to work only with Radeon cards using the GCN architecture. It's not compatible with any Nvidia architecture.
FTFY. Basically, BF4 is specifically AMD-optimized with a new open AMD API.
It's open, but all that means is Nvidia is allowed to implement it on their own, i.e. they are allowed to copy all the parts that are exposed to developers without getting sued for copyright infringement. AMD has a massive head start in that regard, so Nvidia cards won't be seeing any benefits for a while.
AMD provides the GPU, so they will provide the software needed to run it; they have the benefit of many years writing software for the hardware they develop. MS or Sony wouldn't even know where to start, so yes is my answer.
Expect AMD to be nuts deep in this stuff.
This is quite a coup for AMD, no two ways about it. The whole developer community will be optimising for their hardware by default; the easier they make it to move code around the various platforms, the better it will be for them.
It's "open," yes. Nvidia will love trying to optimize around code that's designed to run on their competitor's hardware, but sure, it's "open."
I hope Nvidia does something about this for those of us who just dropped hundreds of dollars on their cards in the last few months.
Worst-case scenario is that devs don't even bother optimizing for Nvidia cards anymore.
I'd say the main point is that AMD obviously designed this to take full advantage of its own GCN GPUs, so just copying everything and using it on a current Nvidia GPU simply won't yield the same advantage (or might very well actually make it perform significantly worse).
http://www.ilsistemista.net/index.p...nd-vliw-a-retrospective-analysis.html?start=3
The 2900 uses the same architecture as the 3800 and then 4800 series, and with some modifications it was also the basis of the 5800.
FTFY
Point of diminishing returns.
400 bucks for a PS4, or 800 bucks for a video card to run games where you need to sit 4 inches from the screen to notice a difference.
I'm really getting tempted to switch back to AMD. I love my 670, but Nvidia has had a horrible time with their drivers for the last, I don't know, 5-6 months. Given that the consoles are going AMD-only, I'm starting to think we'll see a performance shift just by virtue of parallels in coding.
I wouldn't do anything until we see how this pans out over the next 6 months. I won't be changing my 670s until I see how AMD leverages their position.
I'm fully expecting Nvidia to suddenly start having problems with driver optimisation, and AMD to suddenly have drivers that just work, because games will have been coded on their hardware from the start.
This is very similar to how Xbone buyers feel MS will achieve parity. And then nVidia will release an optimized driver and the gap will close to nothing, or the Titan/780 will take the lead. AMD isn't fooling anyone with this ridiculous PR.
True; however, this move is designed as a bit of future-proofing, since most current-gen AMD cards and all of the upcoming ones are GCN (also, two out of three next-gen consoles are GCN). Besides, they've had (IIRC) a sales increase with GCN cards (people bought more 77xx/78xx/79xx cards compared to their '5'- or '6'-prefixed predecessors).
Benchmarks and price please AMD.
Mantle is a bigger thing thanks to AMD securing all the new consoles. Without that, you would still have developers coding for different, non-GCN hardware, and I doubt the Frostbite devs would do anything.
Thanks to securing the consoles, AMD now has the option to use Mantle as their bullet point.
It is the direct opposite of how things were done before, when 80-90% of devs used Nvidia as their primary lead platform. Thanks to that, games on Nvidia hardware had fewer bugs and better performance.
I don't think so.
AMD has a minority of the market share, and among those cards GCN is also only a part. It would be bad for AMD itself, as it would be shitting on everyone who bought VLIW-based cards in the last few years.
So for the next few years, making a game run badly on DirectX would be commercial suicide for a game maker.