
NVIDIA Next Generation GPUs With Up To 7552 Cores Benchmarked 40% Faster Than TITAN RTX

Eotheod

Member
Much like the last motherboard I bought, which claimed to be locked to Windows 10 only, I wonder how long it will be before they start locking GPUs behind Windows 10?
Uh what? I just tried to find any mention of recent motherboards that only allow Win10 images, but I can't find a single hit. Are you just making shit up?
 

Shin

Banned
wait for reviews
Solid stance and advice to take.
However, one needs to remember that NVIDIA is moving from 14/12nm to 7nm (EUV), so that sounds about right.
If they aren't switching to 5nm (which is another full node down from 7, AFAIK) next year, then I'd be skeptical as well, TBH.
We shall see, though; these things are always filled with hype prior to their official announcement but usually tend to disappoint upon release.
 

GymWolf

Member
 

kraspkibble

Permabanned.
Will be selling my 2080 to upgrade to the 3080 Ti.

The 2080 was fine for me until I got a 144Hz monitor. Now I need the absolute most powerful card I can get.
 

GamingKaiju

Member
Rumors aside, let's be realistic here. Does anyone actually think AMD with 'Big Navi' is going to dethrone whatever Nvidia is cooking up next? They (Nvidia) are continuously pushing the standard for graphics tech regardless of how much they want to charge. This isn't an Intel situation.

No chance. Nvidia are the GPU kings and are the market leaders for a reason. AMD have made great strides in their CPU department, but CPU tech isn't comparable to GPU tech. My experience with AMD GPUs, and I've had a few, is that they don't perform as well as their Nvidia counterparts: they run hotter, pull more power from the wall, and have a higher failure rate.

Whatever AMD cook up for their GPUs, Nvidia are already ahead of them and working on the next breakthrough.
 

diffusionx

Gold Member
AMD has higher IPC atm... AMD TFLOPS do more now, and will do even more after RDNA 2.

And besides, guys, the issue was never TFLOPS; higher TFLOPS are always better, but the problem was the architecture, or rather the efficiency and how much devs got out of it. Vega had more raw power than Nvidia; it's just that most of the time 50% of Vega's power was not utilized. NV in the past squeezed more out of its arch, its arch was more suited to gaming, and they also lowered IQ to push frames vs AMD. AMD always had better picture quality.

Teraflops were good shorthand this console gen because the two systems had the same architecture and their theoretical FLOP numbers mapped to their capabilities. But in reality it's not a good way to compare GPUs at all, especially with this 'muh RDNA2 efficiency' nonsense. We won't know anything until the actual systems/cards drop and get put through their paces; teraflops are all hype and PR at this point.

Read articles about GPUs on reputable PC gaming sites and you won't see all this talk about teraflops; it's about CUs/CUDA cores, clock speed, memory bandwidth, and most importantly, benchmarks. Teraflops is just hype shit. I recommend AnandTech to start.
 

diffusionx

Gold Member
Rumors aside, let's be realistic here. Does anyone actually think AMD with 'Big Navi' is going to dethrone whatever Nvidia is cooking up next? They (Nvidia) are continuously pushing the standard for graphics tech regardless of how much they want to charge. This isn't an Intel situation.

I hope so; we have had the overpriced and disappointing 2xxx series since 2018 because AMD can't get their shit together. The 5700 XT is a nice card, and Nvidia responded with the Super line, which is what the 2xxx should have been. Realistically, though, Nvidia isn't even on 7nm yet and hasn't shown anything in a year and a half. Ampere is probably going to be a monster.
 

nemiroff

Gold Member
Nah, heard it before, now immune. I literally have no hope for a reasonably priced product, in terms of performance gained, coming out of this. And this is coming from a loooong-time Nvidia buyer. I'm sick and tired of mouse-step performance gains and big price leaps.
 

psorcerer

Banned
The difference between NV cards and AMD is 80-90% in the drivers.
AMD cards have been better hardware-wise since GCN.
But AMD just doesn't have the manpower to tailor the drivers to each game like NV does.
 

thelastword

Banned

ZywyPL

Banned
Those specs don't make any sense at all. Maxwell was already built mainly with high frequencies in mind, Pascal built upon that even further, and both were extremely successful because of that, so it would be really hard to believe NV would all of a sudden basically cut the clock speeds in half just to build those massive GPUs, just to gain... literally 1-2TF compared to the 2080/2080Ti.

Speaking of which, a 2080Ti already does 100-150FPS at 4K, and you need one of those $1000 displays to actually utilize all that raw power, so again, what would be the point of making such big chips for little-to-no performance gain? I understand that going to the 7nm process will allow them to pack more CUDA cores into the same die size, but not with such a clock penalty; if anything, they would opt to keep the high clocks and just slightly increase the SP count over Turing GPUs, resulting in more performance on smaller dies, which translates into more profit. Seriously, an OCed 2080Ti already puts out 16-17TF; this rumored design looks like a serious step back.

My bet would be using 7nm to seriously boost the RT performance, because that's what's seriously lacking in their current GPU lineup. Going from those mentioned 100-150FPS at 4K to barely 60 at 1080p means nothing less than that RT performance needs to be boosted roughly 4-5 times, and I'm not sure the 7nm process node can even allow for that, TBH.
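As a sanity check on that 4-5x estimate, here is a minimal back-of-the-envelope sketch; it assumes RT cost scales roughly linearly with pixel count, which is a simplification:

```python
# Back-of-the-envelope check of the "4-5x" RT speedup figure above.
# Assumption: RT cost scales roughly with pixel count (a simplification).

pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
res_scale = pixels_4k / pixels_1080p  # = 4.0: 4K pushes 4x the pixels

# Holding 60 FPS while quadrupling the pixel count needs ~4x the RT throughput:
print(f"RT speedup needed for 4K60: ~{res_scale:.0f}x")

# Matching the 100-150 FPS raster figures instead pushes the estimate higher:
for target_fps in (100, 150):
    print(f"RT speedup for {target_fps} FPS at 4K: ~{res_scale * target_fps / 60:.1f}x")
```

So roughly 4x just to hold 60 FPS at 4K, and noticeably more to match the raster frame rates, which is consistent with the 4-5x ballpark above.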
 

base

Banned
AMD has higher IPC atm... AMD TFLOPS do more now, and will do even more after RDNA 2.

And besides, guys, the issue was never TFLOPS; higher TFLOPS are always better, but the problem was the architecture, or rather the efficiency and how much devs got out of it. Vega had more raw power than Nvidia; it's just that most of the time 50% of Vega's power was not utilized. NV in the past squeezed more out of its arch, its arch was more suited to gaming, and they also lowered IQ to push frames vs AMD. AMD always had better picture quality.
Oh yeah? Check Vega 64 ;)
 
I don't quite think you know what you're talking about. TERRORFLOPS. DO. NOT. MATTER. It's only in the console world that PR wants you to believe TF numbers matter. Just do a basic Google search on PC GPUs and you won't see mentions of TFs, because THEY DON'T MATTER. The 2080 (not even the Super) will beat out the XBSX; this is very obvious at this point. The PS4 is like a GTX 750/750 Ti. The XB1X is like an RX 480/580. Compare those GPUs to the consoles and you'll see why, even with equivalent GPUs, PC will always win, hands down, every single time.

It is an exciting time to be a gamer, that's for sure, and this big, powerful next jump in graphics cards feels like what the 20xx line should have been to me.

I believe there is potential for this generation to upset the 'PC wins every time' argument. The -only- reason will be the SSD and the ability for devs to use the SSD as slower memory.

You'll say PCs have had SSDs for years, and yes that's true, but devs creating games for PC need to keep the lowest common denominator in mind. Despite all the SSD speed we see on PCs these days, it doesn't make any difference to the games, whereas I believe it will make a significant difference to next-generation console games when it is utilised properly by the high-end devs. I'm not sure it's going to be possible to transition that to PC hardware. Maybe it will, and they'll make the minimum spec a PCIe 3.0+ SSD for the PC version, but I don't really believe it.
 

sendit

Member
It is an exciting time to be a gamer, that's for sure, and this big, powerful next jump in graphics cards feels like what the 20xx line should have been to me.

I believe there is potential for this generation to upset the 'PC wins every time' argument. The -only- reason will be the SSD and the ability for devs to use the SSD as slower memory.

You'll say PCs have had SSDs for years, and yes that's true, but devs creating games for PC need to keep the lowest common denominator in mind. Despite all the SSD speed we see on PCs these days, it doesn't make any difference to the games, whereas I believe it will make a significant difference to next-generation console games when it is utilised properly by the high-end devs. I'm not sure it's going to be possible to transition that to PC hardware. Maybe it will, and they'll make the minimum spec a PCIe 3.0+ SSD for the PC version, but I don't really believe it.

Agreed. We still don't know the details of how the SSD will be utilized in next-gen consoles. This (the SSD) is the more important factor. Will it be used in a traditional sense? Or will it be utilized as a central design element when developing a game? Consoles aren't going to beat a PC in terms of raw output (TFLOPS).
 
It is an exciting time to be a gamer, that's for sure, and this big, powerful next jump in graphics cards feels like what the 20xx line should have been to me.

I believe there is potential for this generation to upset the 'PC wins every time' argument. The -only- reason will be the SSD and the ability for devs to use the SSD as slower memory.

You'll say PCs have had SSDs for years, and yes that's true, but devs creating games for PC need to keep the lowest common denominator in mind. Despite all the SSD speed we see on PCs these days, it doesn't make any difference to the games, whereas I believe it will make a significant difference to next-generation console games when it is utilised properly by the high-end devs. I'm not sure it's going to be possible to transition that to PC hardware. Maybe it will, and they'll make the minimum spec a PCIe 3.0+ SSD for the PC version, but I don't really believe it.
You mind showing me where you can get the speeds of VRAM or RAM on an SSD? Also, it must be affordable, not something that costs several hundred dollars. Also, it would need to have almost unlimited read/write IOPS. If you think this is going to be a thing... I've got some oceanfront property I could sell you. I wish people would use critical thinking instead of eating up PR and rumors and passing them along as facts.
 
It is an exciting time to be a gamer, that's for sure, and this big, powerful next jump in graphics cards feels like what the 20xx line should have been to me.

I believe there is potential for this generation to upset the 'PC wins every time' argument. The -only- reason will be the SSD and the ability for devs to use the SSD as slower memory.

You'll say PCs have had SSDs for years, and yes that's true, but devs creating games for PC need to keep the lowest common denominator in mind. Despite all the SSD speed we see on PCs these days, it doesn't make any difference to the games, whereas I believe it will make a significant difference to next-generation console games when it is utilised properly by the high-end devs. I'm not sure it's going to be possible to transition that to PC hardware. Maybe it will, and they'll make the minimum spec a PCIe 3.0+ SSD for the PC version, but I don't really believe it.
Paging Gavin Stevens to answer this question for you.
 
Uh what? I just tried to find any mention of recent motherboards that only allow Win10 images, but I can't find a single hit. Are you just making shit up?


I'll find the details of my motherboard (Intel) and get back to you, but no.

When I asked the salesman at the store (I knew it was a waste of time, but I thought, hey, humor me), he said that you cannot install anything other than Windows 10 on it. I knew this would be bull even though he was backed up by what it said on the box, and sure enough, I've got Win 8.1 on there.
 

diffusionx

Gold Member
It's always the same 50%, 100% faster fake shit.

https://wccftech.com/nvidia-rtx-208...mance-50-faster-vs-pascal-but-is-it-worth-it/

And how did that turn out? 😂😂😂😂 Nvidia's most disappointing range of cards in years!

I for one will be waiting for reviews; a blind fanboy I am not.

The 2xxx line is very disappointing, and it got called out immediately. But sometimes it's true: the 1xxx line was 50-100% faster than the 9xx line across the board - see the benchmarks in the link below. With a die shrink and a new architecture, the 3xxx could get there.

 

Kenpachii

Member
Give it 24GB of VRAM, call it the Ti version, and release it for 700 bucks, and I will buy one the moment aftermarket coolers come out. Also, the thing needs to be 100% faster than my 1080Ti, so it could actually do ray tracing half decently in games.
 

pawel86ck

Banned
Exactly, which is why I said: I read an article saying 50% last week, now it's 40%; when they release it, most likely it will be 20-30%...
40%? You are talking about the Ampere leak? In the same article it was mentioned the Ampere GPU was running at 1.1GHz, 16.6TF. A 2080Ti OC also has over 16TF, so that suggests 40% IPC gains, and when we add a 2GHz clock on top of that, the 3080Ti should be mighty impressive (30TF with 40% higher IPC). This Ampere leak looks almost too good to be true, because it suggests a literally 2x faster GPU.
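For what it's worth, the leak's numbers are internally consistent under the standard theoretical FP32 formula (2 FLOPs per CUDA core per clock); a minimal sketch, where the 2.0 GHz retail clock is the speculation from the post above, not a confirmed spec:

```python
# Theoretical FP32 throughput: 2 FLOPs (one fused multiply-add) per core per clock.

def tflops(cores: int, clock_ghz: float) -> float:
    # cores * GHz = giga-ops/sec; x2 for FMA; /1000 converts GFLOPS to TFLOPS
    return 2 * cores * clock_ghz / 1000.0

print(f"Leaked part, 7552 cores @ 1.1 GHz: {tflops(7552, 1.1):.1f} TF")  # ~16.6 TF, as leaked
print(f"Same chip @ speculated 2.0 GHz:    {tflops(7552, 2.0):.1f} TF")  # ~30.2 TF
print(f"2080 Ti OC, 4352 cores @ ~1.9 GHz: {tflops(4352, 1.9):.1f} TF")  # ~16.5 TF
```

Note what this also shows: raw TFLOPS alone says nothing about the claimed 40% IPC gain; that part only falls out of the benchmark being 40% faster at roughly equal theoretical throughput.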
 

Kenpachii

Member
40%? You are talking about the Ampere leak? In the same article it was mentioned the Ampere GPU was running at 1.1GHz, 16.6TF. A 2080Ti OC also has over 16TF, so that suggests 40% IPC gains, and when we add a 2GHz clock on top of that, the 3080Ti should be mighty impressive (30TF with 40% higher IPC). This Ampere leak looks almost too good to be true, because it suggests a literally 2x faster GPU.

Man, if true, I need this shit badly.
 

pawel86ck

Banned
It's always the same 50%, 100% faster fake shit.

https://wccftech.com/nvidia-rtx-208...mance-50-faster-vs-pascal-but-is-it-worth-it/

And how did that turn out? 😂😂😂😂 Nvidia's most disappointing range of cards in years!

I for one will be waiting for reviews; a blind fanboy I am not.
On average the 2080Ti is only around 30-40% faster than the 1080Ti in raster; however, there are games that show even above 64%, not to mention the huge difference in RT performance (in Quake 2 RTX, the 1080Ti at 480p is slower than the 2080Ti at 1440p).



So Turing wasn't as bad as people think (DLSS and VRS are also very important features). These cards were simply too expensive.
 

thelastword

Banned
40%? You are talking about the Ampere leak? In the same article it was mentioned the Ampere GPU was running at 1.1GHz, 16.6TF. A 2080Ti OC also has over 16TF, so that suggests 40% IPC gains, and when we add a 2GHz clock on top of that, the 3080Ti should be mighty impressive (30TF with 40% higher IPC). This Ampere leak looks almost too good to be true, because it suggests a literally 2x faster GPU.
They also said on RedGamingTech that these early samples almost always show the wrong clock in the software, so 1100MHz isn't really the actual early clock.
 

Leonidas

Member
On average the 2080Ti is only around 30-40% faster than the 1080Ti in raster; however, there are games that show even above 64%, not to mention the huge difference in RT performance (in Quake 2 RTX, the 1080Ti at 480p is slower than the 2080Ti at 1440p).

Yup, Turing is quite impressive, especially considering it is on a similar manufacturing process to Pascal.

Can't wait to see what Nvidia does on 7nm.
 

pawel86ck

Banned
They also said on RedGamingTech that these early samples almost always show the wrong clock in the software, so 1100MHz isn't really the actual early clock.
That could explain a lot, but assuming the 7552 SP number is correct, it's still mighty impressive. I wasn't expecting a 30TF GPU in the near future, and even without additional IPC gains the 3080Ti should eat the 2080Ti for breakfast.
 
40%? You are talking about the Ampere leak? In the same article it was mentioned the Ampere GPU was running at 1.1GHz, 16.6TF. A 2080Ti OC also has over 16TF, so that suggests 40% IPC gains, and when we add a 2GHz clock on top of that, the 3080Ti should be mighty impressive (30TF with 40% higher IPC). This Ampere leak looks almost too good to be true, because it suggests a literally 2x faster GPU.
How would my i7-8700 pair up with a card like that?
 

Kenpachii

Member
Just wait until NV reveals the price 😅. To me, 30TF sounds like $3000.



Honestly, I expect the 3080Ti to be twice the 1080Ti's performance at an absolute minimum; with all the stuff like DLSS and that other thing I forgot, the performance could be going through the roof on those cards. The whole 2000 series has been a dud this generation, and it was Nvidia holding back big time because of no competition.

With AMD entering the ring again, I would not be shocked if the 3080 is already 2x 1080Ti performance and the 3080Ti 2x 2080Ti performance.

And that 1080Ti sold for 700 bucks, by the way, while the 2080Ti was pushing the $1200 area.

So yeah, the question is how far AMD pushes Nvidia. If we see an RTX 3080 card and not a 3080Ti, we know they are holding back and AMD isn't pushing them. If they release a 3080Ti straight out of the gate, we know they are pushing.
 

pawel86ck

Banned
If AMD is aggressive enough, we could see a better price this time around.
Something tells me the 80CU/$1000 rumor is correct, so I'm not expecting RDNA2 GPUs to be cheap either.

How would my i7-8700 pair up with a card like that?
There are always CPU bottlenecks; however, with a fast GPU you will be able to run games at much higher resolution. The 2080Ti can't run all games at 4K 60fps, especially games with RT.
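As a toy model only (the per-frame millisecond costs below are invented for illustration, not benchmarks): the frame rate is gated by whichever of the CPU or GPU takes longer per frame, and raising the resolution inflates only the GPU side, so a fixed CPU such as an i7-8700 matters less and less as resolution climbs:

```python
# Toy frame-time model: the slower of CPU and GPU sets the frame rate.
# All millisecond costs are invented for illustration.
CPU_MS = 8.0        # per-frame CPU cost (roughly fixed; doesn't scale with pixels)
GPU_MS_1080P = 4.0  # per-frame GPU cost at 1080p on a hypothetical big GPU

for name, pixel_scale in [("1080p", 1.0), ("1440p", 1.78), ("4K", 4.0)]:
    gpu_ms = GPU_MS_1080P * pixel_scale  # GPU cost grows with pixel count
    frame_ms = max(CPU_MS, gpu_ms)       # the slower unit gates the frame
    limiter = "CPU" if CPU_MS >= gpu_ms else "GPU"
    print(f"{name}: {1000 / frame_ms:5.1f} FPS ({limiter}-bound)")
```

In this sketch the CPU caps the frame rate at 1080p and 1440p, but at 4K the GPU becomes the limiter, which is the point above: the faster the GPU, the higher the resolution you can push before the CPU holds you back.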
 

pawel86ck

Banned
You mind showing me where you can get the speeds of VRAM or RAM on an SSD? Also, it must be affordable, not something that costs several hundred dollars. Also, it would need to have almost unlimited read/write IOPS. If you think this is going to be a thing... I've got some oceanfront property I could sell you. I wish people would use critical thinking instead of eating up PR and rumors and passing them along as facts.
The SSD doesn't need to reach RAM speed to make a difference. Streaming speed will be much improved and will allow 40x more data to be loaded on the fly. But go ahead and say the SSD will not make much difference in games besides loading times 😅
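For scale, that 40x lines up with the rough gap between a last-gen console HDD and an NVMe-class drive; a minimal sketch with ballpark throughput assumptions (illustrative numbers, not measured specs):

```python
# Rough per-frame streaming budget at 30 FPS for different storage tiers.
# Throughput values are ballpark assumptions for each class of drive.
drives_mb_per_s = {
    "console HDD (last gen)": 100,
    "SATA SSD": 550,
    "NVMe SSD (PCIe 4.0 class)": 4000,
}

FPS = 30
hdd = drives_mb_per_s["console HDD (last gen)"]
for name, mbps in drives_mb_per_s.items():
    print(f"{name:26s}: {mbps / FPS:6.1f} MB/frame ({mbps / hdd:.0f}x HDD)")
```

On that kind of per-frame budget, streaming assets just ahead of the player becomes viable instead of pre-loading whole levels into RAM, which is the design shift being argued about here.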
 

Kenpachii

Member
How would my i7-8700 pair up with a card like that?

A GTX 970 gets bottlenecked by a 9900K in They Are Billions and Anno 1800. So it's just whatever you want to do with it, man.

The SSD doesn't need to reach RAM speed to make a difference. Streaming speed will be much improved and will allow 40x more data to be loaded on the fly. But go ahead and say the SSD will not make much difference in games besides loading times 😅

It won't; it's basically just a supercharged HDD, and that's about it. The SSD's main problem is access times, which are like a factor of 1000 slower than RAM. They will never replace RAM or VRAM. And the fact those consoles ship with VRAM should already tell you enough.

And yeah, HBM2 is probably a big mistake they are pushing; they should have gone for GDDR. I could see that card being expensive. But Nvidia will not let them take the crown, even remotely. And I hope that AMD GPU pushes far past the 2080Ti as a result, at an affordable price.
 
NVIDIA said:
NVIDIA founder and CEO Jensen Huang will still deliver a keynote address, which will be available exclusively by livestream. We’re working to schedule that and will share details once they’re available.
 

Ironbunny

Member
I will be amazed if we see a 2080Ti under 1500 bucks. With corona running wild, prices will hike up even without NVIDIA's help. Add supply shortages and the 1080Ti crowd finally upgrading... it's gonna be a bloodbath.
 