
[rumour] AMD R9 390X, Nvidia GTX 980 Ti and Titan X Benchmarks Leaked

cyen

Member
What? Higher resolutions are where SLI shines.

Nope, since AMD implemented XDMA on the R9 series, even 290X Crossfire is neck and neck with 980 SLI, and the 290X is a slower single GPU.

http://www.hardocp.com/article/2014..._980_sli_4k_video_card_review/11#.VQcGQqh_se8

"AMD hit NVIDIA hard with this new XDMA technology. Everyone was expecting NVIDIA would strike back with Maxwell by offering its own evolved SLI technology. However, it did not for this generation. That may end up biting NVIDIA in the butt as far as 4K gaming goes in the future."
 

riflen

Member
So with these leaked benchmarks emerging, what do you folks think the odds are that nVidia releases the 980Ti alongside the Titan X? Or is the 980Ti still likely to be at least a few months away?

Timing is all wrong for a 6GB GM200 imo. We've seen nothing substantial in leaks regarding any other GM200 products. They'll want to get the very top-spending customers on board with Titan X while they can still boast about it. Titan X is a gigantic GPU. Yields are bound to be low, leading to limited availability and keeping the air of exclusivity. Those chips failing to make the cut as a Titan are being made ready for the 6GB GM200 variant.

March --> 12GB GM200 Titan X @ $1000 -- Fastest GPU available.
.
.
June --> Fiji XT (390X) @ $700+ -- Perhaps faster still at 4k and up.
Summer --> 6GB GM200 @ ~$700 -- Competitive with Fiji (390) @ ~$700.
 

viveks86

Member
Timing is all wrong for a 6GB GM200 imo. We've seen nothing substantial in leaks regarding any other GM200 products. They'll want to get the very top-spending customers on board with Titan X while they can still boast about it. Titan X is a gigantic GPU. Yields are bound to be low, leading to limited availability and keeping the air of exclusivity. Those chips failing to make the cut as a Titan are being made ready for the 6GB GM200 variant.

March --> 12GB GM200 Titan X @ $1000 -- Fastest GPU available.
.
.
June --> Fiji XT (390X) @ $700+ -- Perhaps faster still at 4k and up.
Summer --> 6GB GM200 @ ~$700 -- Competitive with Fiji (390) @ ~$700.

Pretty much my expectations at this point
 
My inner pessimist, combined with them calling the 390X an 'ultra enthusiast' card, makes me think they want to have a go at a 'Titan' brand too.

If they make a new lineup they can build a gap filler between the 290X and the 4000sp + HBM flagship's performance, call that the 390X, then call the full 4000sp + HBM card something stupid like 'fuckyougivemoney' and price it at 800+ euros.

If the 290x became the 380x they couldn't do this as they wouldn't have a gap filler.

Called it (the pricing, not the naming, but we'll see what they'll call it to match that stupid price when it's out; not that I'm happy about being right).

Fuck hardwareland man, seriously.
 

DieH@rd

Banned
Interesting that they demonstrate the hair tech mostly on an Adam Jensen model. I guess he is a (main?) character in the Deus Ex: Human Revolution sequel?

Unknown. His fate at the end of DE:HR may prevent him from returning.

He died at the bottom of the ocean inside an exploding science station. But this is cyberpunk, anything can happen. Brain activity can be saved, placed into new [natural or artificial] bodies, uploaded into the web/Matrix constructs as AIs... Perfect opportunity for Jensen to wake up in a regenerated body and say... "I never asked for this." :D
 

d00d3n

Member
Unknown. His fate at the end of DE:HR may prevent him from returning.

He died at the bottom of the ocean inside an exploding science station. But this is cyberpunk, anything can happen. Brain activity can be saved, placed into new [natural or artificial] bodies, uploaded into the web/Matrix constructs as AIs... Perfect opportunity for Jensen to wake up in a regenerated body and say... "I never asked for this." :D

Haha, I like the scenario that you spoiler tagged. But yeah, I guess the choice of the AJ model for the demonstration may just be because it was conveniently available to use.
 

thelastword

Banned
Actually, it is rumored that the 390X uses 3D stacked memory, which increases the memory bandwidth by an insane amount. This picture is a good example:

[Image: Hynix HBM slide]


Ultimately this means that the 390X could have a memory bandwidth of 512 GB/s – 1024 GB/s. Considering that the Titan X has a memory bandwidth of 336.0 GB/s, this could theoretically allow 8GB of 3D stacked memory to be accessed 2-3 times faster than the Titan X, so the end result would actually be that 8GB of 3D memory would destroy 12GB of "2D" memory.

If they really do have 3D stacked memory at 1024 GB/s, then 4GB would be pretty close to 12GB of 2D memory.
Interestingly enough, AMD has had higher performance at higher resolutions than Nvidia for a good minute. With this new memory technology and general bandwidth, they're just continuing from where they always had the edge. People will need to be schooled on the 8 vs 12GB spec though, as I'm sure Nvidia will be pushing the 12GB pretty hard in marketing their product. Of course, we all know it will not compare to 8GB of HBM, but 12GB will fool many.
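For a rough sense of scale, here is the arithmetic behind those figures. The HBM numbers are still only rumours, so this is purely illustrative:

# Rough arithmetic behind the figures above; the HBM range is rumoured.
titan_x_bw = 336.0                       # GB/s, Titan X memory bandwidth
hbm_bw_range = (512.0, 1024.0)           # GB/s, rumoured range for HBM on the 390X

for hbm_bw in hbm_bw_range:
    ratio = hbm_bw / titan_x_bw
    print(f"{hbm_bw:.0f} GB/s is about {ratio:.1f}x the Titan X's 336 GB/s")

# 512 GB/s is about 1.5x the Titan X's 336 GB/s
# 1024 GB/s is about 3.0x the Titan X's 336 GB/s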

In my recent experiences with the 970, I've had more driver crashes with it than I had with any AMD drivers and cards in the last 3 years combined. Not an exaggeration. I'd say I had maybe ~10 driver crashes in the last 2.5 months with the 970 and the latest Nvidia drivers. I can only think of a couple of occasions in the last 3 years that my AMD drivers crashed. They had other issues like making the mouse cursor bug out...but driver crashes? Only when stress testing OC's.

AMD may not release drivers as often, but they're for damn sure more stable and reliable than what Nvidia is putting out. Fuck what anyone else tells you. I'll be happy to go back to AMD rather than deal with this sketchy shit. We'll see how things go when the next Nvidia drivers are released.
I've rarely had an issue with AMD drivers, to be honest. On the other hand, I've had quite a few issues with my Nvidia GPUs over the years, so many game crashes. My PC has an Nvidia GPU at the moment, and there was a driver just a couple of months ago which caused a terrible scrambled image and would often freeze your PC; it wasn't fixed until two or three drivers later. Nvidia is certainly not this golden goose of drivers.
 
Unknown. His fate at the end of DE:HR will maybe prevent him to return.

He died at the bottom of the ocean inside exploding science station. But this is cyberpunk, anything can happen. Brain activity can be saved, placed into new [natural or artificial] bodies, uploaded into the web/Matrix constructs as AIs... Perfect opportunity for Jansen to wake up in regenerated body and say... "I never asked for this." :D

There are multiple endings; he might be alive.
 
Interestingly enough, AMD has had higher performance at higher resolutions than Nvidia for a good minute. With this new memory technology and general bandwidth, they're just continuing from where they always had the edge. People will need to be schooled on the 8 vs 12GB spec though, as I'm sure Nvidia will be pushing the 12GB pretty hard in marketing their product. Of course, we all know it will not compare to 8GB of HBM, but 12GB will fool many.

This is incredible, and I'm wondering: when will PCs start using memory like this?
 
Had two 8800 GTXs in SLI. I will never waste my money on mGPU again until AFR is dead and gone.

Since then they have added not only software frame pacing, but hardware frame pacing as well as the customizable profile system.

I also used SLI at the same time and it has improved drastically since then.
 
Since then they have added not only software frame pacing, but hardware frame pacing as well as the customizable profile system.

I also used SLI at the same time and it has improved drastically since then.

I've yet to see any concrete info that there is actually any hardware dedicated to frame pacing, just random Nvidia claims. Regardless, frametime variance in mGPU is far from a solved problem. In addition, very few games that matter even work properly with SLI these days. It's just glitch after glitch, bug after bug, terrible or completely missing performance scaling for months and months until Nvidia is able to hack around the engine's mGPU issues.
 

thelastword

Banned
This is incredible, and I'm wondering: when will PCs start using memory like this?
If you're talking about PC RAM, yes. There's a stacked DRAM technology similar to HBM from Micron; it's called HMC (Hybrid Memory Cube). It has insane bandwidth, but that comes at the expense of high cost and higher power draw.

If AMD popularizes HBM with this new card, I'm sure Nvidia will follow eventually. It will also make the possibility of HMC hitting consumers much more certain.
 
If you're talking about PC RAM, yes. There's a stacked DRAM technology similar to HBM from Micron; it's called HMC (Hybrid Memory Cube). It has insane bandwidth, but that comes at the expense of high cost and higher power draw.

If AMD popularizes HBM with this new card, I'm sure Nvidia will follow eventually. It will also make the possibility of HMC hitting consumers much more certain.

Pascal is set to use it. That was one of its main selling points.
 
I've yet to see any concrete info that there is actually any hardware dedicated to frame pacing, just random Nvidia claims. Regardless, frametime variance in mGPU is far from a solved problem. In addition, very few games that matter even work properly with SLI these days. It's just glitch after glitch, bug after bug, terrible or completely missing performance scaling for months and months until Nvidia is able to hack around the engine's mGPU issues.
There have been tests by PCPer into frame pacing for both Crossfire and SLI, and SLI more often than not is a higher-framerate version of the single GPU.

And as someone who has used SLI for 10 years, your claims are rather outrageous. The most recent problems of SLI not working in a number of high-profile cases do not reflect its entire catalogue of games. Hence why there are huge threads on Guru3D, 3DCenter, or ComputerBase dedicated to SLI users cataloguing older games and newer games in SLI configs.
Of the last 5 games I have played recently - Wolfenstein TNO, Darksiders, Alien Isolation, MGS GZ, and a super old DX8 game, Ravenshield - only one hasn't had working SLI (which is an engine problem, not NV's). The rest scaled perfectly with Tri-SLI.

Your opinion doesn't seem reinforced by any proof or even any recent anecdotal evidence.
 

martino

Member
There have been tests by PCPer into frame pacing for both Crossfire and SLI, and SLI more often than not is a higher-framerate version of the single GPU.

And as someone who has used SLI for 10 years, your claims are rather outrageous. The most recent problems of SLI not working in a number of high-profile cases do not reflect its entire catalogue of games. Hence why there are huge threads on Guru3D, 3DCenter, or ComputerBase dedicated to SLI users cataloguing older games and newer games in SLI configs.
Of the last 5 games I have played recently - Wolfenstein TNO, Darksiders, Alien Isolation, MGS GZ, and a super old DX8 game, Ravenshield - only one hasn't had working SLI (which is an engine problem, not NV's). The rest scaled perfectly with Tri-SLI.

Your opinion doesn't seem reinforced by any proof or even any recent anecdotal evidence.

I used to have a 9800GX2 and it had some problems.
Then I switched to CF 6850 and now 7870 XT, and I don't notice more problems.
Most of the time forcing CF works pretty well when there is no profile (which is often).
 
What is this hardware frame pacing? I've never heard of it.

It just means there is a dedicated hardware unit in NV GPUs which controls the frame pacing in SLI.

NV revealed their history of frame pacing when everyone started doing FCAT testing and it showed how far behind CFX was in terms of frame pacing (they had their gloating moment). Previously there was only anecdotal evidence from hardware review sites that the NV SLI rig felt "smoother" than the CFX rig at the same framerates.
I used to have a 9800GX2 and it had some problems.
Then I switched to CF 6850 and now 7870 XT, and I don't notice more problems.
Most of the time forcing CF works pretty well when there is no profile (which is often).

How does forcing a profile work in CFX? Do you rename .exes? Or add .exes to a profile? Or can you set up your own?
 

Ty4on

Member
What is this hardware frame pacing? I've never heard of it.

In SLI the GPUs render every other frame, and ideally one is done when the other is halfway finished. If there is no frame pacing they will at times get out of sync and you'll end up with them basically rendering the same frame, so you get the same effective FPS as with a single GPU. The extra frame the other GPU is rendering either turns into a tiny tear, or nothing at all if vsync is on.
[Image: example of a runt frame]
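A minimal sketch of why unpaced AFR wastes the second GPU's frames. The render times are hypothetical, just two GPUs alternating frames:

# Toy AFR model: two GPUs, each takes 33 ms per frame (30 fps on its own).
# Ideally GPU 1 starts half a frame time after GPU 0, so frames arrive every
# ~16.5 ms (60 fps). With no pacing the offset can collapse to almost zero and
# pairs of frames arrive back to back: the second one is the "runt" frame.

render_time = 33.0  # ms per frame per GPU (hypothetical)

def present_times(offset_ms, frames=8):
    """Timestamps at which frames become ready, two GPUs alternating frames."""
    times = []
    for i in range(frames):
        start = (i // 2) * render_time + (offset_ms if i % 2 else 0.0)
        times.append(start + render_time)
    return sorted(times)

for label, offset in (("well paced", render_time / 2), ("unpaced", 1.0)):
    t = present_times(offset)
    gaps = [round(b - a, 1) for a, b in zip(t, t[1:])]
    print(label, "frame-to-frame gaps (ms):", gaps)

# well paced -> every gap ~16.5 ms: smooth 60 fps
# unpaced    -> gaps alternate ~1 ms / ~32 ms: 60 fps on paper, feels like 30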
 

tuxfool

Banned
How does forcing a profile work in CFX? Do you rename .exes? Or add .exes to a profile? Or can you set up your own?

It is much the same as Nvidia. You can choose a premade profile (from a list of games) or manually choose from 3 different modes in case no profile exists (AFR, compatibility, and another which I'm forgetting as I no longer do CFX). There are no SLI bits, as there is no Nvidia Inspector equivalent.
 
There have been tests by PCPer into frame pacing for both Crossfire and SLI, and SLI more often than not is a higher-framerate version of the single GPU.

And as someone who has used SLI for 10 years, your claims are rather outrageous. The most recent problems of SLI not working in a number of high-profile cases do not reflect its entire catalogue of games. Hence why there are huge threads on Guru3D, 3DCenter, or ComputerBase dedicated to SLI users cataloguing older games and newer games in SLI configs.
Of the last 5 games I have played recently - Wolfenstein TNO, Darksiders, Alien Isolation, MGS GZ, and a super old DX8 game, Ravenshield - only one hasn't had working SLI (which is an engine problem, not NV's). The rest scaled perfectly with Tri-SLI.

Your opinion doesn't seem reinforced by any proof or even any recent anecdotal evidence.

Which of those games did SLI not work in? With the exception of Alien (somewhat), all of those games are graphically pretty ancient. SLI being near useless in today's big graphics titles is certainly relevant to my statement, as I qualified it with the words "these days". The situation is only going to get worse going forward. SLI mostly works well in graphically simple titles based around aging DX9 technology.
 
Which of those games did SLI not work in? With the exception of Alien, all of those games are graphically pretty ancient. SLI being near useless in today's big graphics titles is certainly relevant to my statement, as I qualified it with the words "these days". The situation is only going to get worse going forward. SLI mostly works well in graphically simple titles based around aging DX9 technology.

Graphically ancient games? MGS V? And that somehow makes SLI work better? Why are you qualifying your condemnation of SLI in such a goal-post-moving way?

FYI, the title it did not work in was Wolfenstein TNO; idTech 5 does not work with any multi-GPU. SLI works with many, many modern games. It is not limited to DX9 engines or something. Other games that I own which work perfectly fine with SLI: Far Cry 3, Crysis 1-3, all Battlefield games, the Metro games (all of them), all the Total War games. It doesn't work well under DX11 with games which share information between frames, for example Company of Heroes 2.

You are posing your criticism based upon old anecdotes (nigh on 10 years old at this point) and a handful of AAA games which have notoriously poor QA (Far Cry 4, Ubisoft games). SLI isn't perfect, but it definitely isn't "shit" as you put it. Just ask Maldo.
 
Graphically ancient games? MGS V? And that somehow makes SLI work better? Why are you qualifying your condemnation of SLI in such a goal-post-moving way?

FYI, the title it did not work in was Wolfenstein TNO; idTech 5 does not work with any multi-GPU. SLI works with many, many modern games. It is not limited to DX9 engines or something. Other games that I own which work perfectly fine with SLI: Far Cry 3, Crysis 1-3, all Battlefield games, the Metro games (all of them), all the Total War games. It doesn't work well under DX11 with games which share information between frames, for example Company of Heroes 2.

You are posing your criticism based upon old anecdotes (nigh on 10 years old at this point) and a handful of AAA games which have notoriously poor QA (Far Cry 4, Ubisoft games). SLI isn't perfect, but it definitely isn't "shit" as you put it. Just ask Maldo.

MGS V is a PS360 port, ancient, and yes, that makes SLI work better. As you have already stated, more modern DX11 titles reusing frame data (and this is just one of the many problems) tend to break SLI. All the other games you listed are PS360 titles at heart too. It's hardly a handful of games. Why don't you list all the graphically modern games you imply I'm ignoring, where SLI scales well without introducing bugs and breaking graphics features?

Btw, Epic isn't even developing for it as a use case in UE4, and I see that engine being very popular. It also took Nvidia a long-ass time to get SLI working in the recent Total War games, so for that entire waiting period your money was just wasted.
 

MaLDo

Member
MGS V is a PS360 port, ancient, and yes, that makes SLI work better. As you have already stated, more modern DX11 titles reusing frame data (and this is just one of the many problems) tend to break SLI. All the other games you listed are PS360 titles at heart too. It's hardly a handful of games. Why don't you list all the graphically modern games you imply I'm ignoring, where SLI scales well without introducing bugs and breaking graphics features?

Btw, Epic isn't even developing for it as a use case in UE4, and I see that engine being very popular. It also took Nvidia a long-ass time to get SLI working in the recent Total War games, so for that entire waiting period your money was just wasted.

Two days ago I was testing Assetto Corsa scaling, a new DX11 game.

Single GPU versus 3x SLI:

2880x1620 + 2xAA + 16xAF: 50 fps -> 145 fps. Pretty good if you ask me.

I can post similar screenshots of pCARS, Unity, Dragon Ball Xenoverse, Gas Guzzlers Extreme, Borderlands: The Pre-Sequel, NBA 2K15, Evolve, Dying Light, Sherlock Holmes: Crimes and Punishments or any game I've bought lately.
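For what it's worth, those two figures work out as follows (a quick back-of-the-envelope calculation, assuming nothing else is bottlenecking):

single_fps, tri_sli_fps, gpus = 50.0, 145.0, 3   # the figures quoted above
scaling = tri_sli_fps / single_fps               # 2.9x over a single GPU
efficiency = scaling / gpus                      # fraction of the ideal 3.0x
print(f"{scaling:.1f}x scaling, {efficiency:.0%} of ideal")   # 2.9x scaling, 97% of ideal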
 

Zaptruder

Banned
Apparently multi-GPU will be the new hotness with DX12.

Unified memory pool, developer-controlled access, asynchronous frame rendering (meaning that one GPU renders some parts of the scene, the other GPU renders other parts, and then the whole frame is stitched together into one image).

It also results in lower latency for each frame, as each GPU needs to render less of the environment.

And there's also VR multi-GPU, which just renders one GPU per eye.

Either way, the premium on fast single-GPU cards will probably start to draw down as we move more and more games onto DX12 and developers become acclimatized to the new development modality.
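A toy illustration of why splitting a frame across GPUs can lower per-frame latency compared to AFR. All numbers are hypothetical, and it ignores the cost of stitching the halves back together and of sharing data between the cards:

frame_work_ms = 30.0   # time for ONE GPU to render the whole frame (hypothetical)
gpus = 2

# AFR: GPUs alternate whole frames. Throughput roughly doubles, but each
# individual frame still takes the full render time before it can be shown.
afr_fps = 1000.0 / (frame_work_ms / gpus)        # ~67 fps
afr_latency_ms = frame_work_ms                   # 30 ms from start to display

# Split-frame: each GPU renders part of the same frame, results are stitched.
sfr_fps = 1000.0 / (frame_work_ms / gpus)        # ~67 fps (perfect split assumed)
sfr_latency_ms = frame_work_ms / gpus            # ~15 ms from start to display

print(f"AFR: {afr_fps:.0f} fps, {afr_latency_ms:.0f} ms per frame")
print(f"SFR: {sfr_fps:.0f} fps, {sfr_latency_ms:.0f} ms per frame")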
 

Marlenus

Member
So looking at some Titan X reviews.

TPU
Anandtech
HardOCP - Preview

It seems that the performance gain over the 980 at 4K is in the region of 30-35%, which is exactly where these benchmarks had it pegged. That suggests these benchmarks are fairly legit, so it is looking like it will be a very close scrap between the Titan X and the R9 390X WCE (what a mouthful that is) for the performance crown. I just hope the delay to the 390X is to get it to 8GB of HBM rather than sticking it out with just 4GB, because that is not going to cut it at 4K resolutions. I also cannot see the 390X being much more than $700, but that is just speculation on my part.
 
Apparently multi-GPU will be the new hotness with DX12.

Unified memory pool, developer-controlled access, asynchronous frame rendering (meaning that one GPU renders some parts of the scene, the other GPU renders other parts, and then the whole frame is stitched together into one image).

It also results in lower latency for each frame, as each GPU needs to render less of the environment.

And there's also VR multi-GPU, which just renders one GPU per eye.

Either way, the premium on fast single-GPU cards will probably start to draw down as we move more and more games onto DX12 and developers become acclimatized to the new development modality.

DX12 does indeed enable developers to make better use of mGPU, but it's extremely optimistic to think many developers are going to do the gargantuan amount of extra work for such a minuscule fraction of the market. DX12 is going to be much harder to code for than DX11.
 

MaLDo

Member
Assetto Corsa is not a graphically modern game, and neither is pCARS. The only titles you listed that are relevant from a graphics perspective are Unity and Dying Light. IIRC Dying Light has flickering problems with SLI and Unity has stuttering problems with SLI. Scratch that, SLI appears to be completely useless in Dying Light:

http://www.hardocp.com/article/2015/03/10/dying_light_video_card_performance_review/7#.VQlHA-FfB8E

Sorry, this review is bullshit. Dying Light flickering in SLI was only a thing when using custom SLI bits, I mean, modifying the default driver values in order to achieve another bump in performance. After patches, that modification was not necessary anymore. Dying Light works perfectly in SLI and I can prove it. Unity stuttering is not related to SLI but to a bug in the asset loading functions combined with how the save game files store lists of preloaded textures. It was a problem in the game that can affect every user, multi-GPU or single GPU.
 
Sorry, this review is bullshit. Dying Light flickering in SLI was only a thing when using custom SLI bits, I mean, modifying the default driver values in order to achieve another bump in performance. After patches, that modification was not necessary anymore. Dying Light works perfectly in SLI and I can prove it. Unity stuttering is not related to SLI but to a bug in the asset loading functions combined with how the save game files store lists of preloaded textures. It was a problem in the game that can affect every user, multi-GPU or single GPU.

I don't get stuttering in Unity on my 980. That HardOCP review is only 8 days old, btw. I also remember people saying SLI was broken countless times in the Dying Light performance thread.
 

MaLDo

Member
What is this hardware frame pacing? What does it do (beyond the obvious)? I've never heard of it.


Ideally, you want every output frame from your GPU array to be equidistant in time from its neighbors. But I'm talking about GAME WORLD time, not real time.

If you have 3 GPUs rendering a game with very small CPU tasks, you will have 3 frames with very little time difference in the game world, but they will be drawn on your screen 16.666 ms apart. The framerate output on screen will be perfect, with one frame for every refresh, but those frames are very similar because they represent three near-identical moments of the game world. And the fourth frame will then be too different. The framerate can be perfect, yet the feeling of smoothness is broken.

Frame pacing means the driver holds back each GPU and only lets it work when the driver thinks the rendered output will be equidistant from the next frame. Imagine playing an RTS where you want to send your gatherer units out at an even rhythm, because if you let them all go at once they pile up at the resource point.

For benchmarks with vsync off, frame pacing reduces the measured performance. But it enormously improves the feeling of smoothness.

Doing this math in hardware is very efficient because render times can be predicted at the source with a smaller error factor.
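A toy model of that idea, if it helps (all numbers hypothetical: three GPUs, a nearly free CPU step, and a 60 Hz display):

# Toy model: 3 GPUs, each frame takes 50 ms on the GPU, the CPU part is
# nearly free (1 ms), and the display refreshes every 16.7 ms. What matters
# for smoothness is the GAME-TIME gap between consecutive displayed frames.

render_ms, cpu_ms, refresh_ms, frames = 50.0, 1.0, 16.7, 9

def game_time_samples(paced):
    samples, t = [], 0.0
    for i in range(frames):
        samples.append(t)   # game-world time this frame represents
        if paced:
            t += refresh_ms                 # driver delays submission so samples land one refresh apart
        elif i % 3 != 2:
            t += cpu_ms                     # a GPU is free, CPU submits almost immediately
        else:
            t += render_ms - 2 * cpu_ms     # all GPUs busy, wait for the first to finish
    return samples

for paced in (False, True):
    s = game_time_samples(paced)
    gaps = [round(b - a, 1) for a, b in zip(s, s[1:])]
    print("paced" if paced else "unpaced", "game-time gaps (ms):", gaps)

# unpaced -> gaps like 1, 1, 48, 1, 1, 48 ...: three near-identical frames, then
#            a jump, even though the screen shows a steady frame every refresh
# paced   -> every gap 16.7 ms: frames are evenly spaced in game time as well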
 

wiggleb0t

Banned
Assetto Corsa is not a graphically modern game, and neither is pCARS. The only titles you listed that are relevant from a graphics perspective are Unity and Dying Light. IIRC Dying Light has flickering problems with SLI and Unity has stuttering problems with SLI. Scratch that, SLI appears to be completely useless in Dying Light:

http://www.hardocp.com/article/2015/03/10/dying_light_video_card_performance_review/7#.VQlHA-FfB8E

pCARS is not 'graphically modern'???
SLI with Dying Light doubling my framerate with no bugs - completely useless?

Troll on.
 
pCARS is not 'graphically modern'???
SLI with Dying Light doubling my framerate with no bugs - completely useless?

Troll on.

"Basically, SLI is not working. We are using GeForce 347.52, we have GeForce Experience installed and performed an update check for the latest profiles. However, as you can see, there is no performance improvement with SLI. We also tried forcing AFR from the driver control panel using AFR 1 and AFR 2 methods. All this did was make the game choppy and reduce performance. We verified SLI was working in other games.

SLI support in this game has been an issue since game launch. Early patches supposedly addressed some multi-GPU issues, and even NVIDIA has said newer profiles and game updates will be needed. However, as of right now, SLI is still a no-go for us. We know other gamers have reported SLI not working issues as well, and for others only minor improvements in performance when it does work. This is a shame, we really need SLI to play at 4K."

Sure thing, clown.
 