
NVIDIA Next Generation GPUs With Up To 7552 Cores Benchmarked 40% Faster Than TITAN RTX

pawel86ck

Banned
A GTX 970 got bottlenecked by a 9900K in They Are Billions and Anno 1800. So it's just whatever you want to do with it, man.



It won't; it's basically just a supercharged HDD and that's about it. The SSD's main problem is access times: they are roughly a factor of 1000 slower than RAM. They will never replace RAM or VRAM. And the fact that those consoles ship with VRAM should already tell you enough.

And yeah, HBM2 is probably a big mistake they are pushing; they should have gone for GDDR. I could see that card being expensive. But Nvidia will not let them take the crown even remotely. And I hope that AMD GPU pushes far past the 2080 Ti as a result, at an affordable price.
Like I have said, an SSD doesn't need to be as fast as RAM in order to make a difference. It will still provide 40x the streaming speed, and that's a huge difference. Also, Digital Foundry proved the SSD makes a difference. They compared a Radeon Pro (it has a built-in SSD as additional memory), and when the NV GPU ran out of VRAM and refused to run their test, the Radeon Pro ran it perfectly fine thanks to the SSD. So the SSD magic is obviously working.
 
Last edited:

Kenpachii

Member
Like I have said, an SSD doesn't need to be as fast as RAM in order to make a difference. It will still provide 40x the streaming speed, and that's a huge difference. Also, Digital Foundry proved the SSD makes a difference. They compared a Radeon Pro (it has a built-in SSD as additional memory), and when the NV GPU ran out of VRAM and refused to run their test, the Radeon Pro ran it perfectly fine thanks to the SSD. So the SSD magic is obviously working.

40x means nothing when RAM is 1000-4000 times faster in access speed for actually reading and writing information. That is a different league entirely.

The streaming speed that's actually useful is already in use with HDDs too. The reason HDDs are limited is that they can only push a certain amount of data forward without bottlenecking hard. SSDs solve this because they are faster, so data can be loaded into your memory, which the game can then actually read and write an insane number of times. They are just super HDDs really and nothing more. Remove the RAM and boom, the framerate is dead.
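For reference, some rough order-of-magnitude numbers behind that claim (ballpark assumptions only; real parts vary a lot):

```cpp
#include <cstdio>

int main() {
    // Ballpark random-access latencies (assumptions, not measured figures).
    const double dram_ns = 100.0;         // DDR4 random access
    const double nvme_ns = 100'000.0;     // NVMe flash read (~100 us)
    const double hdd_ns  = 10'000'000.0;  // HDD seek + rotation (~10 ms)

    std::printf("NVMe vs DRAM: ~%.0fx slower\n", nvme_ns / dram_ns);  // ~1000x
    std::printf("HDD vs DRAM:  ~%.0fx slower\n", hdd_ns / dram_ns);   // ~100000x
    std::printf("NVMe vs HDD:  ~%.0fx faster\n", hdd_ns / nvme_ns);   // ~100x
}
```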

The DF video you talk about was a demonstration about as useful as running Crysis entirely on a CPU instead of a GPU: sure, you can do it with 64 cores and 128 threads, but a single utter-garbage iGPU with barely any capacity will demolish that performance tenfold. With RAM it's a thousandfold. This is why you see 16 GB pools of RAM in those boxes; otherwise they wouldn't spend a dime on those expensive chips.

If you were at the demonstration, you could pick up the mouse, turn the camera, and watch the screen freeze up for multiple minutes before showing you the next frame. It's a demonstration that it's possible to do, but useful? Not even remotely, especially not in games.
 
Last edited:

psorcerer

Banned
They are just super HDDs really and nothing more.

Yes and no.
HDD bandwidth and latency are much worse.
You cannot do an efficient gather operation with an HDD, while you can with flash.
In fact, flash can have comparable (to RAM) speed for gathers if your controller knows how to do it.
Obviously flash cannot replace RAM (for now), but its streaming capabilities are really game-changing.
For example, meshes and textures can be streamed pretty efficiently: KZ3 used a 500 MB buffer for HDD-streamable data; with flash it can be 50 GB for the same latency numbers.
That means streaming actually becomes "free".
You can prefetch new assets and they will be ready by the time you get to that area.
With careful prefetch you may never run out of RAM (if your game's immediate frames fit in RAM for at least 5 seconds or so).
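A minimal sketch of what such a prefetch path could look like; every name here (Asset, loadFromSsd, Prefetcher) is hypothetical, not any real engine's API:

```cpp
#include <future>
#include <string>
#include <unordered_map>
#include <vector>

struct Asset { std::vector<char> bytes; };

// Stub standing in for a blocking flash read.
Asset loadFromSsd(const std::string& path) {
    (void)path;
    return Asset{std::vector<char>(1024)};
}

class Prefetcher {
    std::unordered_map<std::string, std::future<Asset>> inFlight_;
public:
    // Kick off an async SSD read when the player nears an area.
    void prefetch(const std::string& path) {
        if (!inFlight_.count(path))
            inFlight_[path] = std::async(std::launch::async, loadFromSsd, path);
    }
    // Fetch when the asset is actually needed; normally the read has long
    // since finished, so this returns without stalling the frame.
    Asset acquire(const std::string& path) {
        if (!inFlight_.count(path)) prefetch(path);  // cold miss: pay full latency
        Asset a = inFlight_[path].get();
        inFlight_.erase(path);
        return a;
    }
};
```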
 

Kenpachii

Member
Yes and no.
HDD bandwidth and latency are much worse.
You cannot do an efficient gather operation with an HDD, while you can with flash.
In fact, flash can have comparable (to RAM) speed for gathers if your controller knows how to do it.
Obviously flash cannot replace RAM (for now), but its streaming capabilities are really game-changing.
For example, meshes and textures can be streamed pretty efficiently: KZ3 used a 500 MB buffer for HDD-streamable data; with flash it can be 50 GB for the same latency numbers.
That means streaming actually becomes "free".
You can prefetch new assets and they will be ready by the time you get to that area.
With careful prefetch you may never run out of RAM (if your game's immediate frames fit in RAM for at least 5 seconds or so).

I don't get your yes and no. Streaming more data into memory faster is great and allows for more complex games, yet everything still has to go through the RAM pool, because SSDs are far too slow to deal with this directly. The 'endless VRAM' possibility only works if you have a static screen with barely any motion in it that would require swapping information out of that RAM.

In other words, if Sony wanted to get infinite amounts of RAM through a storage solution, they shouldn't have gone for SSDs; they should have gone for RAM that remembers data (non-volatile memory), which is actually on its way in the industry. The current SSD solution isn't even remotely going to replace RAM in any way; all we will see is more RAM.

Can you prove it? Star Citizen runs extremely badly on a standard HDD.

Because it has to load in more data, faster than an HDD can provide.
 
Last edited:

psorcerer

Banned
I don't get your yes and no.

In other words, if Sony wanted to get infinite VRAM, they shouldn't have gone for SSDs; they should have gone for an infinite RAM pool that remembers data (the RAM that stores data), which is actually on its way. The current SSD solution isn't even remotely going to replace RAM in any way.

If flash could replace RAM, Sony and MSFT would ditch RAM from consoles altogether. It's too expensive.
But they don't.
Which means that RAM will still hold the hot data for 5 seconds of the game. But the next 5 seconds can be loaded from the SSD.
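Back-of-the-envelope, assuming a ~5 GB/s NVMe drive and a 16 GB console RAM pool (both figures are assumptions):

```cpp
#include <cstdio>

int main() {
    // Assumed figures: fast NVMe drive and a unified console RAM pool.
    const double ssd_gb_per_s = 5.0;
    const double ram_gb       = 16.0;
    const double window_s     = 5.0;

    // Data the SSD can swap in during one 5-second window.
    const double refill_gb = ssd_gb_per_s * window_s;  // 25 GB
    std::printf("SSD delivers ~%.0f GB in %.0f s vs a %.0f GB RAM pool\n",
                refill_gb, window_s, ram_gb);
    // The drive can replace more than the whole pool inside the window,
    // so RAM only ever needs to hold the next few seconds of the game.
}
```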
 

pawel86ck

Banned
Because it has to load in more data, faster than an HDD can provide.
Before, you wrote that useful streaming speed is already achieved on an HDD, but now you say Star Citizen runs better on an SSD because it can provide more data. Aren't you contradicting yourself?
 
Last edited:

Kenpachii

Member
Before, you wrote that useful streaming speed is already achieved on an HDD, but now you say Star Citizen runs better on an SSD because it can provide more data. Aren't you contradicting yourself?

With that I mean the normal drive read/write performance, not the access time. Even though that is an improvement over an HDD for sure, it's nowhere near fast enough to replace RAM. SSDs are just super HDDs, basically.

Star Citizen needs an SSD because it needs access to more data, faster, and it probably also benefits a lot from the access time being improved over an HDD. There are more games on PC that already need SSDs because of the amount of data required. This doesn't mean that SSDs can replace RAM.

Anyway, it's late here and I'm seeing double, so I'm out.
 
On average the 2080 Ti is only around 30-40% faster than the 1080 Ti in raster; however, there are games that show even above 64%, not to mention the huge difference in RT performance (in Quake 2 RTX the 1080 Ti at 480p is slower than the 2080 Ti at 1440p).

[image: benchmark comparison chart]


So Turing wasn't as bad as people think (DLSS and VRS are also very important features). These cards were simply too expensive.

In my opinion, using the 2080 Ti as an indication of Turing performance is disingenuous. Its price is farcical and it's out of reach for 99% of gamers. ALL the other Turing cards had Pascal-equivalent performance; it took a refresh for the 2080 to be ~16% faster than a 1080 Ti, and even then you would be insane to spend $700 for that small increase. Utterly pointless cards.
 

psorcerer

Banned
On average the 2080 Ti is only around 30-40% faster than the 1080 Ti in raster; however, there are games that show even above 64%, not to mention the huge difference in RT performance (in Quake 2 RTX the 1080 Ti at 480p is slower than the 2080 Ti at 1440p).

[image: benchmark comparison chart]


So Turing wasn't as bad as people think (DLSS and VRS are also very important features). These cards were simply too expensive.

Sad story of current-day PC gaming. Cycles are spent on unimportant features like DLSS and a joke of an RT implementation, because the GPUs are so underutilized.
 
Sad story of current-day PC gaming. Cycles are spent on unimportant features like DLSS and a joke of an RT implementation, because the GPUs are so underutilized.
Are you implying that DLSS is not useful? That is very disingenuous, to say the least. Games play much better at much higher frame rates, and upscaling costs very little in quality while still achieving a HIGH FRAME RATE. Why play at 4K 60+ fps when you can upscale from 1440p, with barely any loss of fidelity, and get over 100 fps at 4K? Yeah, Turing wasn't as big of a performance leap as Pascal was over the previous gen... But to say those features are a joke? You can't be that foolish, can you? If Nvidia hadn't put this much effort in, would ray tracing even be a thing for next gen? AMD had no intention of jumping into ray tracing anytime soon if it weren't for Nvidia!!! Which affects consoles and PCs, right now, and for next gen. We can jump to 4K or even 8K in resolution, but it doesn't make as much of a difference as ray tracing or DLSS.
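The raw pixel math behind that trade-off (a quick sketch; DLSS's own runtime cost is ignored here):

```cpp
#include <cstdio>

int main() {
    const double px_4k   = 3840.0 * 2160.0;  // ~8.29 Mpix
    const double px_1440 = 2560.0 * 1440.0;  // ~3.69 Mpix

    // Rendering internally at 1440p shades ~2.25x fewer pixels per frame,
    // which is where the large frame-rate headroom comes from.
    std::printf("4K / 1440p pixel ratio: %.2fx\n", px_4k / px_1440);
}
```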
 
Last edited:

pawel86ck

Banned
Sad story of current-day PC gaming. Cycles are spent on unimportant features like DLSS and a joke of an RT implementation, because the GPUs are so underutilized.
DLSS 2 in Wolfenstein: Youngblood looks amazing; it's comparable to native 4K with much better performance. Also, RT makes a big difference in, for example, Control: lighting during gameplay looks much better than even cutscenes in other games. To me it looks like people who criticize RT just don't want better graphics.
 
Last edited:

Leonidas

Member
Sad story of current-day PC gaming. Cycles are spent on unimportant features like DLSS and a joke of an RT implementation, because the GPUs are so underutilized.

Yeah, thanks to Nvidia we already have ray-tracing on PC and GPUs more powerful than next-gen consoles.

The only sad story of PC gaming today is AMD Radeon graphics. Still catching up to Turing almost two years after the fact. Having a console GPU announced with ray-tracing and VRS before PC. Embarrassing.

Nvidia GPUs have saved PC gaming from falling behind consoles.
 

pawel86ck

Banned
In my opinion, using the 2080 Ti as an indication of Turing performance is disingenuous. Its price is farcical and it's out of reach for 99% of gamers. ALL the other Turing cards had Pascal-equivalent performance; it took a refresh for the 2080 to be ~16% faster than a 1080 Ti, and even then you would be insane to spend $700 for that small increase. Utterly pointless cards.
The 2070S is much cheaper than the 2080 Ti and has all the Turing features. VRS boosts performance in 3DMark by 75% on this card.
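For context, per-draw (Tier 1) VRS in D3D12 is only a couple of calls; a minimal sketch, assuming a device that reports D3D12_VARIABLE_SHADING_RATE_TIER_1 (the helper function is mine, not from any sample):

```cpp
#include <d3d12.h>

// Shade selected draws once per 2x2 pixel block (~4x fewer pixel-shader
// invocations for those draws), then restore full-rate shading.
void drawWithCoarseShading(ID3D12GraphicsCommandList5* cmdList)
{
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);

    // ... issue draws that tolerate coarse shading (distant/blurred geometry) ...

    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
}
```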



DLSS 2 looks amazing and also provides a big performance boost.



HW RT runs a few times faster compared to Pascal.

Turing GPUs are more expensive than Pascal, but price aside, Turing GPUs are also much better than Pascal. Maybe people still don't see it now because developers still don't use Turing features how they should (we've only now started seeing games like Control or Wolfenstein: Youngblood), but soon the next-gen consoles will launch and I'm sure the difference between Pascal and Turing will only get bigger.

The 2070S can run Wolfenstein: Youngblood with 4K DLSS + RT and the game still runs at 60 fps. I no longer have a 1080 Ti to test in this game, but I'm sure with comparable settings (RT enabled at 4K) it would be a slideshow on the 1080 Ti.

The RTX performance impact even on Turing is big, but that's first-gen HW RT. Shaders ran equally badly on the GeForce 3 (performance went from 80 to 20 fps in Half-Life 2), yet these days pretty much all games use shaders. In some ways Turing GPUs are comparable to the GeForce 3, because we will see much better effects in games now thanks to them.
 
Last edited:

Bootzilla

Banned
I'm not a huge resolution whore (1440p is fine for a monitor-sized screen), so I'm more interested in seeing what improvements they've made to the ray-tracing and tensor cores than in the traditional performance. RT is still in its infancy, but gen 2 could change that.

I'm feeling about due for an upgrade. I was considering one for Alyx but I'll probably hold off a little longer and see what the RTX 3070 looks like.
 

pawel86ck

Banned
Ray tracing and DLSS are not available in 99.99% of the games available today. Why should anyone buy/upgrade to expensive RTX cards with the rasterization performance of 2-year-old cards?
Pascal owners aren't forced to upgrade just yet, but the clock is ticking. Price aside, Turing GPUs are superior to Pascal GPUs (mesh shading, VRS, DLSS, HW RT), but these unique features have to be implemented first. If you haven't noticed, more and more new games are using these Turing features already. Just wait and see what happens when PS5 / XSX games are ported to PC.

…then? Seems like hogwash, particularly the part that RDNA2 was designed exclusively for MS. Yeah, right.

The whole transcript is very detailed and doesn't look fake to me, although some parts could be somewhat altered if an Xbox fanboy wrote it 😃.

For a fact, MS built DXR into Windows 10 and planned it a long time ago (NV only made the hardware for it). I think it's possible MS co-engineered HW RT with AMD, although I doubt RDNA2 will be a PC- and XSX-exclusive feature.
 
Last edited:

psorcerer

Banned
DLSS 2 in Wolfenstein: Youngblood looks amazing; it's comparable to native 4K with much better performance. Also, RT makes a big difference in, for example, Control: lighting during gameplay looks much better than even cutscenes in other games. To me it looks like people who criticize RT just don't want better graphics.

Youngblood has pretty low-frequency, stylized visuals. No wonder.
In Control the RT path was emphasized by gimping the non-RT path into oblivion.
Not convinced.
 

SmokSmog

Member
Another benchmark: 7936 CUDA cores. Compared to this, the RTX 2080 Ti looks like a slow GPU.

A stock RTX 2080 Ti scores 129,000; the new GPU is 72% faster.

 
Last edited:

pawel86ck

Banned
Youngblood has pretty low-frequency, stylized visuals. No wonder.
In Control the RT path was emphasized by gimping the non-RT path into oblivion.
Not convinced.

[image: Control RT on/off lighting comparison]


Lighting during normal gameplay in Control looks better than cutscenes in other games thanks to RT. It's a huge milestone in computer graphics, but go ahead and tell us you can't tell the difference 😅.
 

psorcerer

Banned
[image: Control RT on/off lighting comparison]


Lighting during normal gameplay in Control looks better than cutscenes in other games thanks to RT. It's a huge milestone in computer graphics, but go ahead and tell us you can't tell the difference 😅.

The left one is just ugly. On purpose.
P.S. Offline renderers use >500 rays per pixel; how many are here? 20?
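For what it's worth, that ~20 guess roughly matches a back-of-the-envelope budget, assuming NVIDIA's ~10 gigarays/s marketing figure for the 2080 Ti:

```cpp
#include <cstdio>

int main() {
    const double rays_per_s       = 10e9;             // ~10 gigarays/s (assumed)
    const double pixels_per_frame = 3840.0 * 2160.0;  // 4K
    const double fps              = 60.0;

    // Per-pixel ray budget if every ray went to shading at 4K60.
    const double rpp = rays_per_s / (pixels_per_frame * fps);
    std::printf("~%.0f rays per pixel\n", rpp);  // ~20
}
```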
 
Last edited:

pawel86ck

Banned
The left one is just ugly. On purpose.
P.S. Offline renderers use >500 rays per pixel; how many are here? 20?
So if you think it was gimped on purpose, then you should be able to provide many examples with better in-game character lighting. The best-looking game I have ever played is probably Uncharted 4, and its in-game character lighting was equally flat. Only the lighting during cutscenes looked good.
 

pawel86ck

Banned
Get a grip, AMD keeps grabbing market share and increasing its margins.
Navi has better perf/transistor than Turing.
AMD might or might not skip oversized, overpriced chips that less than 1% of gamers buy, but it will obviously be very present in the mid and upper-mid range.
A little reminder: Navi GPUs are built on 7 nm and NV GPUs on 12 nm.
 

psorcerer

Banned
So if you think it was gimped on purpose, then you should be able to provide many examples with better in-game character lighting. The best-looking game I have ever played is probably Uncharted 4, and its in-game character lighting was equally flat. Only the lighting during cutscenes looked good.

In-game Uncharted looked better than no-RT Control.
 

Kenpachii

Member

[image: Control RT on/off lighting comparison]


Lighting during normal gameplay in Control looks better than cutscenes in other games thanks to RT. It's a huge milestone in computer graphics, but go ahead and tell us you can't tell the difference 😅.

Control's models are butt-ugly by default, practically last-gen quality.

[image: Control character model screenshot]
 
Last edited: