
Indie Dev suggests both consoles exhibit RTX 2070 performance, says "we don't see many differences"

Kenpachii

Member
Mar 23, 2018
4,842
5,262
655
It's hard for me to believe when both also have mandatory SSDs in them. Unless he's ONLY talking about GPU performance.
We don't have plans for it, but plans can change! If we were to do it, we think the next-gen consoles can deliver results similar to what high-end PCs achieve.

The computing power of the new consoles is very promising, and we're very excited to see ray tracing come to next-gen consoles. It is difficult to say since we don't know the exact ray tracing specifications yet, but early snippets of info do suggest similar performance to an RTX 2070 Super, which will definitely be enough for similar results to what we have now on PC.

For now, we don't see too many differences; they seem to be competing well against each other, and both are pushing new boundaries.

Each next-gen console sporting an SSD will allow us to significantly shorten loading times, which is something we really look forward to.
 

hunthunt

Banned
Mar 22, 2019
942
1,687
360
Chile
I believe devs said the same thing at the start of this gen.
I don't remember it like that at all. They were silent about the Xbox One (lol) and happy about the PS4, but only because they had been fighting against Cell for almost 10 years.

Now most devs seem to be genuinely happy about these consoles, and for the right reasons.
 
  • Like
Reactions: shubhang

Spukc

Member
Jan 24, 2015
13,768
11,661
830
Let's be real though, indie devs are the last reason I'd get next-gen consoles... all that shit should run just fine on a PS3.
 

mckmas8808

Member
May 24, 2005
43,720
7,441
1,835
1) He claimed "2070 Super", not "2070".
2) That's about a 10-15% performance difference.
3) They don't have any of the consoles or kits; it's an assumption based on the specifications. A million other people have come to the same conclusion.
4) wccftech again? Why do people keep quoting this clickbait site?



Since a 2080 performs slightly better than a 2070 Super, that fits well.

12 TFLOPS are 12 TFLOPS, no more, no less, just as 10.2 remains 10.2 TFLOPS, the difference being that one console has far more compute units. But at some point even the last fanboys will come back down to earth...
There is no magic either. These consoles are based on PC hardware, and they will deliver comparable performance based on comparable hardware.
Yes, if the PC has bespoke hardware to decompress data from super-fast SSDs, then you'd be right.
 
  • Like
Reactions: luca_29_bg

RespawnX

Member
Oct 25, 2018
85
125
220
Yes, if the PC has bespoke hardware to decompress data from super-fast SSDs, then you'd be right.
I'm not talking about the SSD but about the processing power. The SSD will do some cool things, but it certainly won't render any shaders. The SSD will primarily make life easier for developers, and we will get rid of HDDs in a few years. It will take time for engines to natively take advantage of the SSDs' capabilities, and by then the technology will have moved on.
 

JLB

Member
Dec 6, 2018
1,349
1,535
405
If the PS5 is 10.2 TFLOPS, then it should be around a 2070 Super.
The XSX should be a little better than a 2080, somewhere between a 2080 and a 2080 Super.

It seems these devs don't have the devkits and are going by released info. An extra 2 TFLOPS should put the Xbox above a 2080 for sure.
But the PS5 isn't 10.2. Anyway, let's not even start discussing that; it won't end well for the thread.
 

mckmas8808

Member
May 24, 2005
43,720
7,441
1,835
I'm not talking about the SSD but about the processing power. The SSD will do some cool things, but it certainly won't render any shaders. The SSD will primarily make life easier for developers, and we will get rid of HDDs in a few years. It will take time for engines to natively take advantage of the SSDs' capabilities, and by then the technology will have moved on.
I seriously doubt the bolded. Epic has already shown us that they are taking advantage of the super-fast SSDs now, and UE5 will be released in Holiday 2021.

But the PS5 isn't 10.2. Anyway, let's not even start discussing that; it won't end well for the thread.
You're right. It's 10.28 TFs.
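For what it's worth, both paper figures drop straight out of the same arithmetic. A rough sketch (the CU counts and clocks are just the publicly announced specs; this says nothing about real-world efficiency):

```python
# Paper FP32 TFLOPS for an RDNA 2 GPU:
#   CUs x 64 shaders per CU x 2 ops per clock (FMA) x clock in GHz
# The CU counts and clocks below are the publicly announced console specs.
def rdna2_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000  # result in TFLOPS

print(f"Series X: {rdna2_tflops(52, 1.825):.2f} TF")  # ~12.15
print(f"PS5:      {rdna2_tflops(36, 2.23):.2f} TF")   # ~10.28
```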
 
Last edited:

wordslaughter

Member
Apr 17, 2019
568
1,449
405
Yeah, I'm sure when the Digital Foundry comparison videos start rolling out we'll see no difference in performance at all.

/s
 

Golgo 13

The Man With The Golden Dong
Jun 14, 2014
4,134
1,068
805
Then neither should have any issue hitting 60fps minimum @1440p
That'll never be what next-gen consoles aim for, as has been discussed ad nauseam. They will go for full 4K at 30 FPS 99% of the time. Want frame rates? Become a PC enthusiast; consoles will never be the place for universally high frame rates.
 
  • Like
Reactions: chilichote

thelastword

Member
Apr 7, 2006
10,544
6,950
1,760
It's simple. Stop comparing AMD flops to Nvidia flops; they are not 1:1. The only true barometer you have to go on right now is The Coalition demonstrating, with witnesses, that the Series X benchmarked Gears 5 the same as an RTX 2080. Now you can directly compare flops between the Series X and PS5 because they are using the same AMD architecture.
Flops are flops, whether it's a Texas Instruments calculator or a Titan RTX. The differences in gaming performance between NV and AMD in the past came down to architecture and efficiency. Nvidia did more with fewer TFLOPS simply because its architecture was better designed for gaming, even with some reductions in IQ at points, color richness, more compressed textures, etc. Just look at Vega: it was way more powerful than Pascal in raw compute, but the architecture didn't suit gaming as well as Nvidia's Pascal GPUs, or developers never really took advantage of Vega's CUs because NV had the most popular product.

The difference now is AMD has a dedicated gaming GPU; the architecture is purely gaming-focused and it's extremely fast. A 5700 XT is a smaller die than an RTX 2070/S but performs on par with it. The efficiency of Navi, its architecture, is what's impressive. NV no longer has the edge on the most efficient gaming GPUs. You will see that manifested when AMD brings larger dies and even better performance per watt on their new cards, with cards that go up against not only the upcoming 3070 but the 3080 and 3080 Ti/90 as well. AMD cards will run at higher clocks and run cooler. There won't be one feature NV has in their cards to boost performance that AMD won't have. VRS is just one of those, and we all know AMD has been on the ball with other methods like Radeon Chill and HBCC.
 

INCUBASE

Member
Jan 8, 2018
1,800
2,538
415
That'll never be what next-gen consoles aim for, as has been discussed ad nauseam. They will go for full 4K at 30 FPS 99% of the time. Want frame rates? Become a PC enthusiast; consoles will never be the place for universally high frame rates.
I already am, 144fps here lol

Just saying, there's really no excuse for not having 60 fps at 1440p.
Fuck 4K, tbh.
 
  • Thoughtful
Reactions: DoctaThompson

Thugnificient

Member
May 29, 2020
353
879
305
Flops are flops, whether it's a Texas Instruments calculator or a Titan RTX. The differences in gaming performance between NV and AMD in the past came down to architecture and efficiency. Nvidia did more with fewer TFLOPS simply because its architecture was better designed for gaming, even with some reductions in IQ at points, color richness, more compressed textures, etc. Just look at Vega: it was way more powerful than Pascal in raw compute, but the architecture didn't suit gaming as well as Nvidia's Pascal GPUs, or developers never really took advantage of Vega's CUs because NV had the most popular product.

The difference now is AMD has a dedicated gaming GPU; the architecture is purely gaming-focused and it's extremely fast. A 5700 XT is a smaller die than an RTX 2070/S but performs on par with it. The efficiency of Navi, its architecture, is what's impressive. NV no longer has the edge on the most efficient gaming GPUs. You will see that manifested when AMD brings larger dies and even better performance per watt on their new cards, with cards that go up against not only the upcoming 3070 but the 3080 and 3080 Ti/90 as well. AMD cards will run at higher clocks and run cooler. There won't be one feature NV has in their cards to boost performance that AMD won't have. VRS is just one of those, and we all know AMD has been on the ball with other methods like Radeon Chill and HBCC.
Lol
 

Mister Wolf

Member
Sep 21, 2014
3,314
1,386
580
Flops are flops, whether it's a Texas Instruments calculator or a Titan RTX. The differences in gaming performance between NV and AMD in the past came down to architecture and efficiency. Nvidia did more with fewer TFLOPS simply because its architecture was better designed for gaming, even with some reductions in IQ at points, color richness, more compressed textures, etc. Just look at Vega: it was way more powerful than Pascal in raw compute, but the architecture didn't suit gaming as well as Nvidia's Pascal GPUs, or developers never really took advantage of Vega's CUs because NV had the most popular product.

The difference now is AMD has a dedicated gaming GPU; the architecture is purely gaming-focused and it's extremely fast. A 5700 XT is a smaller die than an RTX 2070/S but performs on par with it. The efficiency of Navi, its architecture, is what's impressive. NV no longer has the edge on the most efficient gaming GPUs. You will see that manifested when AMD brings larger dies and even better performance per watt on their new cards, with cards that go up against not only the upcoming 3070 but the 3080 and 3080 Ti/90 as well. AMD cards will run at higher clocks and run cooler. There won't be one feature NV has in their cards to boost performance that AMD won't have. VRS is just one of those, and we all know AMD has been on the ball with other methods like Radeon Chill and HBCC.
Why is the 12 TFLOP GPU of the Series X only putting out performance equal to the 10 TFLOP RTX 2080?
 

Thugnificient

Member
May 29, 2020
353
879
305
Why is the 12 TFLOP GPU of the Series X only putting out performance equal to the 10 TFLOP RTX 2080?
Because NVIDIA uses conservative boost clocks to calculate the compute figure. On their page the 2080 is rated at 1710 MHz, but retail models usually clock at around 1900-1950 MHz out of the box, so it's really an 11-12 TFLOPS card.
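A quick back-of-the-envelope check of that (the core count is the published RTX 2080 spec; the 1950 MHz figure is just the typical retail boost assumed above):

```python
# FP32 TFLOPS estimate: CUDA cores x 2 ops per clock (FMA) x clock in MHz / 1e6
# 2944 cores is the published RTX 2080 spec; the clocks are the rated boost
# versus the typical out-of-the-box retail boost mentioned above.
def tflops(cores: int, clock_mhz: float) -> float:
    return cores * 2 * clock_mhz / 1e6

RTX_2080_CORES = 2944
print(f"Rated boost, 1710 MHz:  {tflops(RTX_2080_CORES, 1710):.2f} TF")  # ~10.07
print(f"Retail boost, 1950 MHz: {tflops(RTX_2080_CORES, 1950):.2f} TF")  # ~11.48
```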
 

Mister Wolf

Member
Sep 21, 2014
3,314
1,386
580
Because NVIDIA uses conservative boost clocks to calculate the compute figure. On their page the 2080 is rated at 1710 MHz, but retail models usually clock at around 1900-1950 MHz out of the box, so it's really an 11-12 TFLOPS card.
I see. Thanks for the insight.
 

GHG

Member
Nov 9, 2006
19,677
11,969
1,605
Not quite, though I did underestimate the PS5 a bit just based on a heuristic (5700 XT = 2060 Super, more or less, because those two are within 5% of each other).

2060 Super: avg clock is actually 1910 MHz, which means it's 8.3 TF
2070: 1934 MHz avg clock, 8.9 TF (but note how that 0.6 TF gap is under a 5% perf difference)
2070 Super: 1945 MHz avg clock, 9.95 TF
So it would be closer to a 2070 Super. I use the average clock for the TF calculation, but we don't know what the PS5's average is, so I'm OK with both of them being called 10 TF. Plus there are further OCs you can do to the GPUs, especially for memory, so I'm OK with it. And finally, a 5700 XT is close to 10 TF but falls short of a 2070 Super that's also 10 TF, so clearly Nvidia still has an advantage even if TFLOPS are equal; we'll see how that plays out.

(5700 XT = 85% of 2070 Super)



sources:
Why the fudge are you bringing factory overclocked cards into the equation?

Just stick to the reference cards for the sake of clarity.
 

geordiemp

Member
Sep 5, 2013
8,725
5,995
790
UK
Yeah, I'm sure when the Digital Foundry comparison videos start rolling out we'll see no difference in performance at all.

/s
Probably not much, no. A 22% faster-clocked GPU likely has faster triangle setup, faster culling, and faster ROPs and caches.

Why do you think Nvidia doesn't mention TF on their website? It's not just TF, is it.

/s
 
Last edited:
  • Like
Reactions: chilichote

Shmunter

Member
Aug 25, 2018
4,035
8,305
635
Why is the 12 TFLOP GPU of the Series X only putting out performance equal to the 10 TFLOP RTX 2080?
It's been long articulated by techies that TF numbers are only a small part of the picture. There is no such thing as running a GPU at maximum computational efficiency 100% of the time; the software workload varies, as does the overall cohesion of the entire system in delivering results.

The PS5 may be able to realise its peak performance more consistently. All speculation at this point.
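As a toy illustration of that point (the peak figures are the announced specs, but the utilisation numbers here are invented purely for the example, not measurements):

```python
# Toy model: sustained throughput = peak TFLOPS x average utilisation.
# Peak figures are the announced console specs; the utilisation
# percentages are made up for illustration only.
def sustained_tf(peak_tf: float, avg_utilisation: float) -> float:
    return peak_tf * avg_utilisation

# A wider, lower-clocked 12.15 TF GPU kept 70% busy ends up in the same
# ballpark as a narrower 10.28 TF GPU kept 85% busy.
print(f"{sustained_tf(12.15, 0.70):.2f} TF")  # ~8.51
print(f"{sustained_tf(10.28, 0.85):.2f} TF")  # ~8.74
```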
 
  • Like
Reactions: chilichote

Ascend

Member
Jul 23, 2018
1,901
2,768
485
Based on the quote in the OP, this is talking specifically about ray tracing performance and nothing else.
 
  • Thoughtful
Reactions: Falc67

Rikkori

Member
May 9, 2020
536
750
300
Why the fudge are you bringing factory overclocked cards into the equation?

Just stick to the reference cards for the sake of clarity.
Don't be triggered, bro, those are normal numbers. All Turing cards sit at 1900+ MHz on the core unless they're using paper as heatsinks. Boost behavior is standard on PC.
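For reference, those TF figures fall straight out of the usual cores x 2 ops x clock estimate (core counts are the published Turing specs; the clocks are the averages quoted above):

```python
# FP32 TF estimate: CUDA cores x 2 ops per clock x clock in MHz / 1e6
# Core counts are the published Turing specs; the clocks are the average
# boost clocks quoted in the earlier post.
cards = {
    "RTX 2060 Super": (2176, 1910),
    "RTX 2070":       (2304, 1934),
    "RTX 2070 Super": (2560, 1945),
}

for name, (cores, clock_mhz) in cards.items():
    print(f"{name}: {cores * 2 * clock_mhz / 1e6:.2f} TF")
# RTX 2060 Super: 8.31 TF
# RTX 2070: 8.91 TF
# RTX 2070 Super: 9.96 TF
```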
 

Skifi28

Neo Member
Jun 28, 2020
71
126
145
The difference does seem pretty minimal in the grand scheme of things when compared to previous generations. I doubt games will be identical; probably a higher resolution on the Xbox side, but with the usual TAA and dynamic res you're going to need a microscope to tell the difference next gen.
 
  • Like
Reactions: shubhang

Ar¢tos

Member
Oct 24, 2017
3,670
5,028
685
Krynn
I can imagine the difference between them being very small.
The difference in power is smaller than previous gen; then there is the effect of diminishing returns, and the use of techniques like CB and temporal reconstruction will obscure resolutions even more.
It will mostly come down to dev skills.
 

Thirty7ven

Member
Apr 13, 2020
1,247
4,816
405
I can imagine the difference between them being very small.
The difference in power is smaller than previous gen; then there is the effect of diminishing returns, and the use of techniques like CB and temporal reconstruction will obscure resolutions even more.
It will mostly come down to dev skills.
It's the closest these two have ever been, and the difference is significantly less than last generation according to developers. By far the biggest difference between the two is in I/O.
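To put rough numbers on that I/O gap (the throughput figures below are the publicly quoted raw and typical-compressed rates; the HDD line and the 20 GB payload are made up for the sake of the comparison):

```python
# Back-of-the-envelope load-time comparison. Raw/compressed GB/s figures are
# the publicly quoted console specs; the HDD line and the 20 GB payload are
# illustrative assumptions.
drives = {
    "PS5":          (5.5, 8.5),  # raw, typical compressed (quoted as 8-9 GB/s)
    "Series X":     (2.4, 4.8),  # raw, typical compressed
    "Last-gen HDD": (0.1, 0.1),  # ~100 MB/s, for contrast
}

payload_gb = 20  # hypothetical chunk of game data to stream in

for name, (raw, comp) in drives.items():
    print(f"{name}: {payload_gb / raw:.1f}s raw, {payload_gb / comp:.1f}s compressed")
```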
 

vkbest

Member
Jan 23, 2017
642
422
330
2070 Super performance, 16 GB RAM, a Ryzen CPU, the best SSD on the market, and we will still have games without 60 fps options.

On PS5 vs Xbox Series X, people are thinking only about TFLOPS. The PS5 looks to have several custom chips (a custom sound chip, dedicated SSD I/O hardware, the Geometry Engine), whereas Microsoft is trusting brute force in the GPU area.
 
Last edited:
  • Like
Reactions: shubhang

hyperbertha

Member
Nov 24, 2018
849
2,233
410
Aren't RDNA 2 flops supposed to be stronger than RTX 2000 series flops? That would mean both consoles could outdo even the 2080 or 2080 Ti. Didn't Digital Foundry claim the Series X outdid the 2080 Ti at some point?
 
  • LOL
Reactions: DoctaThompson

chilichote

Member
Jan 6, 2020
224
513
300
Germany
yes

but do you know that a bigger CU count has a bigger impact on performance?
Not necessarily. Everything in the PS5 runs about 20% faster, everything. It only has fewer shaders, while the XSeX GPU is behind in every other respect. It can't yet be said with any certainty how that will play out. If you trust Cerny and what he says, there is a reason he prefers to use fewer CUs and run them (like all the other parts of the GPU) at higher clocks.

And so, thanks for your comment.
 
  • Like
Reactions: shubhang

Bernkastel

Formerly 'Lady Bernkastel'
Mar 8, 2018
3,157
5,622
840
Didn't Grant Kot say that the Xbox Series X is more powerful than his 8700K + RTX 2080?



The dev of Scorn also said that the Xbox Series X is on par with an RTX 2080 Ti.
 
Last edited:
  • Like
Reactions: Ascend
Jun 10, 2019
320
369
320
Not necessarily. Everything in the PS5 runs about 20% faster, everything. It only has fewer shaders, while the XSeX GPU is behind in every other respect. It can't yet be said with any certainty how that will play out. If you trust Cerny and what he says, there is a reason he prefers to use fewer CUs and run them (like all the other parts of the GPU) at higher clocks.

And so, thanks for your comment.
Lol, so you're just a Cerny/Sony fanboy. Why should I trust just Sony and not MS?
 
  • LOL
Reactions: chilichote

VincentMatts

Member
Aug 21, 2014
6,089
650
495
Man, people need to chill out with this power stuff. Last gen, the Xbox One was really underpowered, and that definitely hurt them the first half of the gen.

This isn't that situation. Yes, one is more powerful, but neither of them is underpowered. So stop worrying, for Christ's sake.

So damn fed up of this power and spec talk.
 
  • Like
Reactions: shubhang

Croatoan

Member
Jun 24, 2014
3,975
1,797
845
When the 3000 series launches, these consoles will already be obsolete. Your "Super Special SSD" is not going to render shit for you.
 
Last edited:
  • Fire
Reactions: DoctaThompson

Kumomeme

Member
Mar 20, 2017
835
591
365
Malaysia
I don't know why some people still want to downplay one console's performance despite the devs here hinting at completely different things.

2070 Super performance for both consoles as the next-gen baseline... considering the devs haven't squeezed all the juice out of either console yet.
Of course each console has its strengths, but as a rough performance measure, this is already good.
 
Last edited:

Life

Member
Jul 25, 2019
383
346
365
Shhhhhhhhhhhhhh. Don't compare next-gen console stats to PC hardware. Compare them to preceding consoles instead!
 

Thugnificient

Member
May 29, 2020
353
879
305
Didn't Grant Kot say that the Xbox Series X is more powerful than his 8700K + RTX 2080?

The dev of Scorn also said that the Xbox Series X is on par with an RTX 2080 Ti.
No, the dev of Scorn never said that. The demo of Scorn was revealed to be running on a 2080 Ti.

Also, Grant Kot isn't even certain. He's saying he "thinks" the SX is more powerful than his PC. We're all expecting it to be in the ballpark of a 2080/S.

No one expects it to equal, let alone surpass, a 2080 Ti.