
Unknown 3000 series NV GPU "leaks" on Time Spy, 30% faster than 2080Ti FE

llien

Member
An unknown NVIDIA GeForce "Ampere" GPU model surfaced in the 3DMark Time Spy online database. We don't know if this is the RTX 3080 (RTX 2080 successor) or the top-tier RTX 3090 (RTX 2080 Ti successor). Rumored specs of the two are covered in our older article. The 3DMark Time Spy score unearthed by _rogame (Hardware Leaks) is 18257 points, which is close to 31 percent faster than the RTX 2080 Ti Founders Edition, 22 percent faster than the TITAN RTX, and just a tiny bit slower than KINGPIN's record-setting EVGA RTX 2080 Ti XC. Futuremark SystemInfo reads the GPU clock speed of the "Ampere" card as 1935 MHz, and its memory clock as "6000 MHz." Normally, SystemInfo reads the memory's actual clock (e.g. 1750 MHz for an effective 14 Gbps on GDDR6). Perhaps SystemInfo isn't yet optimized for reading memory clocks on "Ampere."



TPU
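For anyone who wants to sanity-check the quoted figures, the arithmetic is simple enough to back out (a rough sketch; the baseline scores below are inferred from the article's percentages, not measured):

```python
# Back-compute the comparison baselines implied by the quoted percentages.
ampere_score = 18257
rtx_2080ti_fe = ampere_score / 1.31   # ~13,937 if "31% faster" is exact
titan_rtx = ampere_score / 1.22       # ~14,965 if "22% faster" is exact

# GDDR6 moves 8 bits per pin per memory-clock cycle, so SystemInfo's usual
# "actual" readout of 1750 MHz corresponds to the marketed 14 Gbps data rate.
actual_clock_mhz = 1750
effective_gbps = actual_clock_mhz * 8 / 1000  # 14.0

print(f"2080 Ti FE baseline: ~{rtx_2080ti_fe:,.0f}")
print(f"TITAN RTX baseline:  ~{titan_rtx:,.0f}")
print(f"GDDR6 effective rate: {effective_gbps:.1f} Gbps")
```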
 
30% is pitiful, hope this is the 3080 and not the 3080 Ti/3090.

Edit: it can't even beat the overclocked 2080 Ti :messenger_tears_of_joy::messenger_tears_of_joy::messenger_tears_of_joy:
 
Last edited:

Mister Wolf

Member
Compared to what? Isn't 30% on the high end when compared to historical improvements between generations? Wasn't that huge Pascal leap in this ballpark?

Compared to me already having a 2080TI that runs all these current games extremely well.
 

jigglet

Banned
If that upcoming DP 1.4 to HDMI 2.1 adapter works well, then I may keep my 2080TI until 2022. I was always happy with its power; I just need access to 4K 120Hz on my OLED TV.

I think a lot of people are going to keep a keen eye on this.

That being said, I'm using HDMI 2.1 as an excuse to upgrade my monitor, even though mine is only 12 months old. The transition to IPS is too tempting :)
 

Alexious

Member
Compared to me already having a 2080TI that runs all these current games extremely well.

What resolution do you use? Because at 4K, the 2080Ti (which I have owned for two years) does not run all current games extremely well, that is to say at max settings and constant 60 frames per second or more.

A 31% performance boost could be huge if this unknown card turns out to be the 3080. The rumored 3090/3080Ti could be around 50% faster, in that case.
 

Mister Wolf

Member
What resolution do you use? Because at 4K, the 2080Ti (which I have owned for two years) does not run all current games extremely well, that is to say at max settings and constant 60 frames per second or more.

A 31% performance boost could be huge if this unknown card turns out to be the 3080. The rumored 3090/3080Ti could be around 50% faster, in that case.

I don't care much about native 4K, especially in an era where most of these game engines have a resolution scaler. Having to play at 75-95% of 4K, which is still much more than 1440p (only 44% of 4K by pixel count), doesn't bother me at all. Certainly not enough for me to run out and drop another g-stack in 2020 for 30%.
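The pixel math there checks out, with one caveat: resolution scalers are usually specified per axis, so the fraction of pixels actually rendered is the square of the slider value (a quick sketch, assuming a per-axis scaler):

```python
# Pixel counts for the resolutions under discussion.
uhd = 3840 * 2160   # 8,294,400 px
qhd = 2560 * 1440   # 3,686,400 px
print(f"1440p is {qhd / uhd:.0%} of 4K by pixel count")  # ~44%

# A per-axis resolution scale of s renders s**2 of the native pixels.
for scale in (0.75, 0.90, 0.95):
    print(f"{scale:.0%} scale -> {scale ** 2:.0%} of native 4K pixels")
# Even the 75% case (56% of pixels) is still well above 1440p's 44%.
```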
 

Alexious

Member
I don't care much about native 4K, especially in an era where most of these game engines have a resolution scaler. Having to play at 75-95% of 4K, which is still much more than 1440p (only 44% of 4K by pixel count), doesn't bother me at all. Certainly not enough for me to run out and drop another g-stack in 2020 for 30%.

That also depends on the display you're using. For those who have larger ones, the lower resolution is going to be much more noticeable.

Anyway, as others have said above, in raytraced games the boost will likely be significantly larger.
 

Mister Wolf

Member
That also depends on the display you're using. For those who have larger ones, the lower resolution is going to be much more noticeable.

Anyway, as others have said above, in raytraced games the boost will likely be significantly larger.

I game on a 65" LG C9 OLED. Take Resident Evil 2 for example. At 6ft away you dont notice much difference between native 4K and 90% of 4K. None of us are using a magnifying glass while playing like those DF videos.
 
Last edited:

Leyasu

Banned
Would have been orders of magnitude faster had they included an SSD...

Nvidia didn’t get the memo
 
I game on a 65" LG C9 OLED. Take Resident Evil 2 for example. At 6ft away you dont notice much difference between native 4K and 90% of 4K. None of us are using a magnifying class while playing like those DF videos.
You're sitting quite close to your TV. Most people sit at 3 meters plus, and at that distance with a 55" you can't see the difference between 1080p and 4K.

Pushing 4K resolution for consoles is the biggest waste of hardware resources I've ever seen. 1440p is more than enough unless 75"-plus TVs become the norm in people's living rooms.
 
Last edited:

ZywyPL

Banned
The leak has already been posted and discussed in other Ampere threads. But anyway, a 21TF level of performance on a stock card is a ~5TF bump over an OCed 2080ti, and almost twice what the XBX will offer. And who knows what the OC potential will be, thanks to 7nm. Not to mention that for anyone jumping from a 1080/1080Ti the performance boost will be brutal; we're talking about a 65-70% bump.
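Those TF figures can be roughed out with the standard peak-FP32 formula, 2 ops (one FMA) per shader per clock (a sketch; the shader counts are the cards' published specs, while the overclock and boost clocks and the 21TF figure are the rumor's assumptions):

```python
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 throughput: 2 ops (one FMA) per shader per clock."""
    return shaders * 2 * clock_ghz / 1000

rtx_2080ti_oc = fp32_tflops(4352, 2.0)    # ~17.4 TF at an assumed 2.0 GHz OC
xbox_series_x = fp32_tflops(3328, 1.825)  # 12.15 TF, the official figure
gtx_1080ti = fp32_tflops(3584, 1.6)       # ~11.5 TF at a typical boost clock

ampere_rumor = 21.0
print(f"Bump over OC'd 2080 Ti: {ampere_rumor - rtx_2080ti_oc:+.1f} TF")  # ~+3.6
print(f"vs Series X: {ampere_rumor / xbox_series_x:.2f}x")                # ~1.73x
print(f"vs 1080 Ti:  {ampere_rumor / gtx_1080ti:.2f}x")                   # ~1.83x
```

At these assumed clocks the "~5TF" and "almost twice" claims come out slightly generous, but the ballpark holds, and peak TF is only a rough proxy for game performance anyway.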
 

Rikkori

Member
You're sitting quite close to your TV. Most people sit at 3 meters plus, and at that distance with a 55" you can't see the difference between 1080p and 4K.

Pushing 4K resolution for consoles is the biggest waste of hardware resources I've ever seen. 1440p is more than enough unless 75"-plus TVs become the norm in people's living rooms.

3m for 55" too close?! Might as well stream to a fucking tablet at that point. I'm sitting 1.4m away from my 55" and that's the sweet-spot, if I go to the couch that's 2m away then it's tiny.
Even RTINGS' calculator which has lower requirements says for 3m you need 70"+ (https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship) for mixed viewing, and for 3m for "cinema viewing" you need >90".
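The whole size-versus-distance argument reduces to angular resolution: 20/20 vision resolves about one arcminute, i.e. roughly 60 pixels per degree. A rough sketch of the setups mentioned in this exchange (the 60 px/deg threshold is the standard acuity assumption, not anything from RTINGS):

```python
import math

def pixels_per_degree(diagonal_in: float, distance_m: float, h_pixels: int) -> float:
    """Average horizontal pixels per degree of visual angle for a 16:9 panel."""
    width_m = diagonal_in * 16 / math.hypot(16, 9) * 0.0254  # 16:9 width from diagonal
    h_fov_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
    return h_pixels / h_fov_deg

# ~60 px/deg is the 20/20 acuity threshold; above it, extra pixels are invisible.
print(pixels_per_degree(55, 1.4, 3840))  # ~82: 4K at 1.4 m sits right at the limit
print(pixels_per_degree(55, 3.0, 1920))  # ~84: 1080p is already past it at 3 m
print(pixels_per_degree(55, 3.0, 3840))  # ~167: 4K detail is invisible at 3 m
```

By that yardstick both posters have a point: on a 55" at 3 m, 1080p is already past the acuity limit, while 1.4 m is roughly where 4K stops paying off.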
 

Allandor

Member
Ah, another leak.
Now it is only 30%; wasn't it 100% a few weeks ago?

Well, 30% seems more realistic to me for the top-end model.
The 2080Ti is already a monster chip. Even shrunk to 7nm it would still be big. So they might have packed in more front end and back end, a bit more clock speed, and a 300W power target, and voilà, you have your 30%.
As Time Spy doesn't use RT, I really think the next nvidia chip will focus much more on RT and will therefore need much more die space than the current chip. That's why the result won't be that much better.
 
3m for 55" too close?! Might as well stream to a fucking tablet at that point. I'm sitting 1.4m away from my 55" and that's the sweet-spot, if I go to the couch that's 2m away then it's tiny.
Even RTINGS' calculator which has lower requirements says for 3m you need 70"+ (https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship) for mixed viewing, and for 3m for "cinema viewing" you need >90".
Yes, I know the correct distances. I was talking about the average living room and how far away the TV is usually placed.
 
That's 30% at 1935MHz, which is a paltry clock when even a piece-of-crap 2080 Ti can easily boost to 2000MHz. Expect this card to reach 2.2GHz+, which would give it a 35-40% improvement over the 2080 Ti. Add much superior RT and you're looking at a substantially better card.

This is likely the 3080 Ti. I doubt a 3080 is 30% faster at 1935MHz.
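A sketch of that extrapolation, assuming performance scales linearly with core clock (it usually scales sub-linearly, so treat these as optimistic upper bounds; the 2.2 GHz target, the ~1860 MHz real-world FE boost clock, and the back-computed FE baseline are all assumptions):

```python
# Naive linear clock scaling: score * (new_clock / old_clock).
ampere_score, ampere_mhz = 18257, 1935
ampere_oc = ampere_score * 2200 / ampere_mhz  # ~20,758 at an assumed 2.2 GHz

fe_2080ti = ampere_score / 1.31               # ~13,937 stock FE baseline
oc_2080ti = fe_2080ti * 2000 / 1860           # assumed ~1860 MHz boost -> 2.0 GHz OC

print(f"OC'd Ampere vs stock 2080 Ti FE: {ampere_oc / fe_2080ti - 1:+.0%}")  # ~+49%
print(f"OC'd Ampere vs OC'd 2080 Ti:     {ampere_oc / oc_2080ti - 1:+.0%}")  # ~+39%
```

Measured against an overclocked 2080 Ti rather than the stock FE, the naive scaling lands right in the quoted 35-40% band.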
 

GymWolf

Member
Weren't people talking about huge performance gains? Like the 3070 being slightly more powerful than a 2080Ti?!

If the jump is not very big I may stay with my 2070 Super until the 4000 series or AMD's new series...
 
Last edited:

ethomaz

Banned
Weren't people talking about huge performance gains? Like the 3070 being slightly more powerful than a 2080Ti?!

If the jump is not very big I may stay with my 2070 Super until the 4000 series or AMD's new series...
I expect the 3080 to be on par with or a bit better than the 2080TI.
AMD will still lag way behind.
 

GymWolf

Member
I expect the 3080 to be on par with or a bit better than the 2080TI.
AMD will still lag way behind.
Paying 800 to 1000 dollars for a GPU that is only slightly more powerful (and not even that for certain) than a 2+ year old GPU?
Yeah, no.

If the jump is not a substantial one, I'm done with nvidia until the next series.

Maybe AMD will wake the fuck up this gen instead of always being 5 steps behind. We can only hope.
 

RespawnX

Member
Paying 800 to 1000 dollars for a GPU that is only slightly more powerful (and not even that for certain) than a 2+ year old GPU?
Yeah, no.

If the jump is not a substantial one, I'm done with nvidia until the next series.

Maybe AMD will wake the fuck up this gen instead of always being 5 steps behind. We can only hope.

Did you stop buying GPUs half a decade ago? "More power" is not the only thing. GPU efficiency has increased steadily over the past years, and so have technical capabilities. The focus on "more power" has been over for years; AMD and Nvidia are going for AI capabilities. And yes, AMD is several steps behind Nvidia. That's why prices will remain this high. The last time performance increased by more than 30% between two generations was 8 years ago, with the GTX 580 to 680. When you compare an X80 to an X80 Ti you always get a 20-30% performance increase, or less. People tend to forget that Ti cards are enthusiast cards with a 50-60%+ price premium for a 10-20% performance increase. At this point we could get a 3070 with 2080 Ti performance for 500 dollars, as the X70 series moved from the 400 price point to 500 with the RTX introduction. In Nvidia's world that is a 66% price cut, and the usual one when switching generations.
 

pawel86ck

Banned
30% sounds disappointing, but maybe that's just the standard 3080 and not the Ti.

Just for comparison, the 2080ti was 40% faster on average than the 1080ti (25-65% depending on the game) and up to 6x faster in RT, and that's not even including DLSS 2.0 mode.
 

diffusionx

Gold Member
It's always best to jump 2 gens, so you should wait for the 4080/90ti; people with a 1080ti will get exactly what they need with this GPU.

Compared to the 1080TI, you're looking at a 70%+ performance increase plus ray tracing, and potentially significantly better RT than the 2080TI.

I was honestly hoping for and expecting 50% over the 2080TI, but it's still a big enough leap over the 1080TI to make me still want it, if this is the high-end variant.
 
Last edited:

GymWolf

Member
Did you stop buying GPUs half a decade ago? "More power" is not the only thing. GPU efficiency has increased steadily over the past years, and so have technical capabilities. The focus on "more power" has been over for years; AMD and Nvidia are going for AI capabilities. And yes, AMD is several steps behind Nvidia. That's why prices will remain this high. The last time performance increased by more than 30% between two generations was 8 years ago, with the GTX 580 to 680. When you compare an X80 to an X80 Ti you always get a 20-30% performance increase, or less. People tend to forget that Ti cards are enthusiast cards with a 50-60%+ price premium for a 10-20% performance increase. At this point we could get a 3070 with 2080 Ti performance for 500 dollars, as the X70 series moved from the 400 price point to 500 with the RTX introduction. In Nvidia's world that is a 66% price cut, and the usual one when switching generations.
Yeah, I know that 30% is usually what nvidia GPUs gain from series to series, but a lot of people talked like this 3000 series was the holy grail and a substantial improvement, so seeing these results is a little bit sad for me.

I'm not gonna pay 1000 euros for a GPU that is still not enough for CURRENT GEN heavy games at 4K60 high details.
 

RespawnX

Member
Yeah, I know that 30% is usually what nvidia GPUs gain from series to series, but a lot of people talked like this 3000 series was the holy grail and a substantial improvement, so seeing these results is a little bit sad for me.

I'm not gonna pay 1000 euros for a GPU that is still not enough for CURRENT GEN heavy games at 4K60 high details.

You may want to change your perspective a bit, then you won't be so disappointed. We still don't have 100% confirmation, but leaks are pointing to the new Nvidia GPUs having more than 2x the ray tracing and AI performance. That way the frame rate is no longer so strongly affected by ray tracing at high resolutions. In combination with things like DLSS, I think we will see a bigger generational leap as a whole package. The 3080 should be more around 750-800 euros, which has been normal for years for this class of GPU. Could be 650-700 if AMD launches some good cards, but I don't think so.
 

GymWolf

Member
You may want to change your perspective a bit, then you won't be so disappointed. We still don't have 100% confirmation, but leaks are pointing to the new Nvidia GPUs having more than 2x the ray tracing and AI performance. That way the frame rate is no longer so strongly affected by ray tracing at high resolutions. In combination with things like DLSS, I think we will see a bigger generational leap as a whole package. The 3080 should be more around 750-800 euros, which has been normal for years for this class of GPU. Could be 650-700 if AMD launches some good cards, but I don't think so.
I just don't care enough about rtx tbh.
 

Krappadizzle

Gold Member
30% is pitiful, hope this is the 3080 and not the 3080 Ti/3090.

Edit: it can't even beat the overclocked 2080 Ti :messenger_tears_of_joy::messenger_tears_of_joy::messenger_tears_of_joy:
This was literally the sentiment of 1080ti owners when the 2080ti came out. Still is. Except now that jump is closer to 60% going to the 30xx, and raytracing won't absolutely crush performance either.
 
Last edited:

DGrayson

Mod Team and Bat Team
Staff Member
This was literally the sentiment of 1080ti owners when the 2080ti came out. Still is. Except now that jump is closer to 60% going to the 30xx, and raytracing won't absolutely crush performance either.

Yeah, for me it's like this: even if it's 30%, I have a GTX 1080 and I want to push 1440p at 144-165 fps in the newest games, and a 2080ti won't do that.
 
Last edited:

Krappadizzle

Gold Member
Yeah, for me it's like this: even if it's 30%, I have a GTX 1080 and I want to push 1440p at 144-165 fps in the newest games, and a 2080ti won't do that.
The 2080ti really felt like a stop-gap as opposed to a new series to me. If it didn't have RT support, I don't think anyone would have given a shit other than people freshly coming into PC gaming. I'd even wager that, looking back at the 2080ti, it's not a good buy, whereas the 10xx series has had phenomenal value. There just wasn't enough RT support to warrant it, and the performance hit was too noticeable to be worth it in most games. On top of that, they raised prices by $250+ almost across the board, which felt like a giant slap in the face compared to the previous series and made the performance delta even less worth it.
 
Last edited:

Myths

Member
Compared to what? Isn't 30% on the high end when compared to historical improvements between generations? Wasn't that huge Pascal leap in this ballpark?
This is exactly why I said what I did. 30% has been the consistent improvement from what I've seen. Any more, and it's a definite buy, assuming the price point doesn't jump overboard. $1300 would be the upper limit for me.
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
Too bad Leonidas is banned right now as he would be making his big old "Rip Big Navi" posts.

Exactly, it better be that or the 3070. Anything more and it's no good.

You clearly have no idea about the history of generational improvements. 30% for each equivalent tier is very much par for the course, except for last gen.
In previous gens, the 70 series would offer performance equivalent to the previous gen's Ti.

If the 3070 offers 2080 Ti performance, that will be an amazing video card.
 
Last edited: