
Indie Dev suggests both consoles exhibit RTX 2070 performance, says "we don't see many differences"

Kenpachii

Member
It's hard for me to believe when both also have mandatory SSDs in them. Unless he's ONLY talking about GPU performance.

We don't have plans for it, but plans can change! If we would do it, we think the next-gen consoles can deliver a result similar to what high-end PCs achieve.

The computing power of the new consoles is very promising, and we're very excited to see ray tracing come to next-gen consoles. It is difficult to say since we don't know the exact ray tracing specifications yet, but early snippets of info do suggest similar performance to an RTX 2070 Super, which will definitely be enough for similar results to what we have now on PC.

For now, we don't see too many differences, they seem to be competing well against each other and both are pushing new boundaries.

Each next-gen console sporting an SSD will allow us to significantly shorten loading times, which is something we really look forward to.
 

hunthunt

Banned
I believe devs said the same thing at the start of this gen.

I don't remember it like that at all. They were silent about the Xbox One (lol) and happy about the PS4, but only because they had been fighting against CELL for almost 10 years.

Now most devs seem to be genuinely happy about these consoles, and for the right reasons.
 

Spukc

always chasing the next thrill
let's be real tho, indie devs are the last reason i'd get next-gen consoles.. all that shit should run just fine on ps3
 

mckmas8808

Banned
1) He claimed "2070 Super", not "2070".
2) That's about a 10-15% performance difference.
3) They don't have any of the consoles or kits; it's an assumption based on the specifications. A million other people have come to the same conclusion.
4) wccftech again? Why do people keep quoting this clickbait site?



Since a 2080 performs slightly better than a 2070 Super, that fits well.

12 TFLOPS are 12 TFLOPS, no more, no less, just as 10.2 remains 10.2 TFLOPS. With the difference that one console has far more processing units. But at some point even the last fanboys will hit the ground...
There is no magic either. These consoles are based on PC hardware, and they will deliver comparable performance based on comparable hardware.
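For what it's worth, those headline numbers fall straight out of the announced specs. A quick sketch, assuming the widely reported figures (52 CUs at 1.825 GHz for the Series X, 36 CUs at 2.23 GHz peak for the PS5, 64 shaders per CU, 2 FP32 ops per shader per clock):

```python
def fp32_tflops(cus, clock_ghz):
    # TFLOPS = CUs * 64 shaders/CU * 2 FP32 ops per shader per clock * clock (GHz) / 1000
    return cus * 64 * 2 * clock_ghz / 1000

print(f"Series X: {fp32_tflops(52, 1.825):.2f} TF")  # ~12.15
print(f"PS5:      {fp32_tflops(36, 2.23):.2f} TF")   # ~10.28
```

That's where the 12 and 10.28 TF figures in this thread come from; it says nothing about how efficiently either GPU turns peak FLOPS into frames.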

Yes, if the PC has bespoke hardware to decompress data from super-fast SSDs, then you'd be right.
 

RespawnX

Member
Yes, if the PC has bespoke hardware to decompress data from super-fast SSDs, then you'd be right.

I'm not talking about the SSD but about the processing power. The SSD will do some cool things, but it will certainly not render any shaders. The SSD will primarily make life easier for developers, and we will get rid of HDDs in a few years. It will take time for engines to natively take advantage of the SSDs' capabilities, and by then the technology will have moved on.
 

JLB

Banned
if the ps5 is 10.2 tflops then it should be a 2070 super.
xsx should be a little better than a 2080, somewhere between a 2080 and a 2080 super.

it seems these devs don't have the devkits and are going by the released info. 2 extra tflops should put the xbox above a 2080 for sure.

but the ps5 isn't really 10.2. anyway, let's not even try to start discussing this, it won't end well for the thread.
 

mckmas8808

Banned
I'm not talking about the SSD but about the processing power. The SSD will do some cool things, but it will certainly not render any shaders. The SSD will primarily make life easier for developers, and we will get rid of HDDs in a few years. It will take time for engines to natively take advantage of the SSDs' capabilities, and by then the technology will have moved on.

I seriously doubt the bolded. Epic has already shown us that they are taking advantage of the super fast SSDs now. And UE5 will be released in Holiday 2021.

but the ps5 isn't really 10.2. anyway, let's not even try to start discussing this, it won't end well for the thread.

You're right. It's 10.28 TFs.
 

Golgo 13

The Man With The Golden Dong
Then neither should have any issue hitting 60fps minimum @1440p
That'll never be what next-gen consoles aim for, as has been discussed ad nauseam. They will go for full 4K at 30 FPS 99% of the time. Want frame rates? Become a PC enthusiast; consoles will never be a place for universally high frame rates.
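As a rough check on that trade-off: in pure pixel throughput, 1440p at 60fps is actually slightly cheaper than 4K at 30fps (this ignores per-pixel shading cost and CPU load, so it's only a back-of-the-envelope comparison):

```python
# pixels pushed per second for each target
four_k_30 = 3840 * 2160 * 30   # 4K @ 30fps
qhd_60    = 2560 * 1440 * 60   # 1440p @ 60fps
print(four_k_30, qhd_60, round(qhd_60 / four_k_30, 2))  # 1440p60 is ~0.89x the 4K30 pixel rate
```

So the choice between the two modes is mostly a design preference, not a hardware wall.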
 

thelastword

Banned
It's simple: stop comparing AMD flops to Nvidia flops, they are not 1:1. The only true barometer you have to go on right now is The Coalition demonstrating, with witnesses, that the Series X benchmarked Gears 5 the same as an RTX 2080. You can, however, directly compare flops between the Series X and PS5, because they use the same AMD architecture.
Flops are flops, whether in a Texas Instruments calculator or a Titan RTX. The differences in gaming performance between NV and AMD in the past came down to architecture and efficiency. Nvidia did more with fewer TFLOPS because its architecture was better designed for gaming, even with some reduction in IQ at points, color richness, more compressed textures, etc. Just take a look at Vega: it was way more powerful than Pascal in raw terms, but the architecture did not suit gaming as well as Nvidia's Pascal GPUs, and developers never really took advantage of Vega's CUs because NV had the more popular product.


The difference now is that AMD has a dedicated gaming GPU: the architecture is purely gaming focused and it's extremely fast. A 5700 XT is a smaller die than an RTX 2070/S but performs on par with it. The efficiency of Navi, its architecture, is what's impressive. NV no longer has the edge on the most efficient gaming GPUs. You will see that manifested when AMD brings larger dies and even better performance per watt on their new cards, with cards that go up against not only the upcoming 3070 but the 3080 and 3080 Ti/90 as well. AMD cards will run at higher clocks and run cooler. There won't be a single feature NV has in their cards to boost performance that AMD won't. VRS is just one of those; we all know AMD has been on the ball with other methods like Radeon Chill and HBCC.
 

INC

Member
That'll never be what next-gen consoles aim for, as has been discussed ad nauseam. They will go for full 4K at 30 FPS 99% of the time. Want frame rates? Become a PC enthusiast; consoles will never be a place for universally high frame rates.

I already am, 144fps here lol

Just saying, there's really no excuse for not having 60fps at 1440p.
Fuck 4K tbh
 
Flops are flops, whether in a Texas Instruments calculator or a Titan RTX. The differences in gaming performance between NV and AMD in the past came down to architecture and efficiency. Nvidia did more with fewer TFLOPS because its architecture was better designed for gaming, even with some reduction in IQ at points, color richness, more compressed textures, etc. Just take a look at Vega: it was way more powerful than Pascal in raw terms, but the architecture did not suit gaming as well as Nvidia's Pascal GPUs, and developers never really took advantage of Vega's CUs because NV had the more popular product.


The difference now is that AMD has a dedicated gaming GPU: the architecture is purely gaming focused and it's extremely fast. A 5700 XT is a smaller die than an RTX 2070/S but performs on par with it. The efficiency of Navi, its architecture, is what's impressive. NV no longer has the edge on the most efficient gaming GPUs. You will see that manifested when AMD brings larger dies and even better performance per watt on their new cards, with cards that go up against not only the upcoming 3070 but the 3080 and 3080 Ti/90 as well. AMD cards will run at higher clocks and run cooler. There won't be a single feature NV has in their cards to boost performance that AMD won't. VRS is just one of those; we all know AMD has been on the ball with other methods like Radeon Chill and HBCC.
Lol
 

Mister Wolf

Member
Flops are flops, whether in a Texas Instruments calculator or a Titan RTX. The differences in gaming performance between NV and AMD in the past came down to architecture and efficiency. Nvidia did more with fewer TFLOPS because its architecture was better designed for gaming, even with some reduction in IQ at points, color richness, more compressed textures, etc. Just take a look at Vega: it was way more powerful than Pascal in raw terms, but the architecture did not suit gaming as well as Nvidia's Pascal GPUs, and developers never really took advantage of Vega's CUs because NV had the more popular product.


The difference now is that AMD has a dedicated gaming GPU: the architecture is purely gaming focused and it's extremely fast. A 5700 XT is a smaller die than an RTX 2070/S but performs on par with it. The efficiency of Navi, its architecture, is what's impressive. NV no longer has the edge on the most efficient gaming GPUs. You will see that manifested when AMD brings larger dies and even better performance per watt on their new cards, with cards that go up against not only the upcoming 3070 but the 3080 and 3080 Ti/90 as well. AMD cards will run at higher clocks and run cooler. There won't be a single feature NV has in their cards to boost performance that AMD won't. VRS is just one of those; we all know AMD has been on the ball with other methods like Radeon Chill and HBCC.

Why is the 12 TFLOP GPU of the Series X only putting out performance equal to the 10 TFLOP RTX 2080?
 
Why is the 12 TFLOP GPU of the Series X only putting out performance equal to the 10 TFLOP RTX 2080?
Because NVIDIA uses low boost clocks to state the compute power. On their page the 2080 is rated at 1710 MHz, but retail models usually clock at around 1900-1950 MHz out of the box. So it's really an 11-12 TFLOP card.
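That gap is easy to see on paper. A sketch, assuming the 2080's public 2944 FP32 cores and 2 FP32 ops per core per clock:

```python
def fp32_tflops(cuda_cores, clock_mhz):
    # TFLOPS = cores * 2 FP32 ops per core per clock * clock (MHz) / 1e6
    return cuda_cores * 2 * clock_mhz / 1e6

print(f"2080 at rated boost (1710 MHz):   {fp32_tflops(2944, 1710):.2f} TF")  # ~10.07
print(f"2080 at typical boost (1950 MHz): {fp32_tflops(2944, 1950):.2f} TF")  # ~11.48
```

So the same card is "10 TF" on the spec sheet and closer to 11.5 TF as it actually runs.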
 

MCplayer

Member
What about asking the big AAA developers? Indie games don't normally push heavily on graphical power and optimization compared to the big guns.
 

GHG

Gold Member
Not quite, though I did underestimate the PS5 a bit, just based on a heuristic (5700 XT = 2060 Super, more or less, because those two are within 5% of each other).

2060 Super: avg clock is actually 1910 MHz, which means it's 8.3 TF
2070: 1934 MHz avg clock, 8.9 TF (but note that the .6 TF difference is under a 5% perf difference)
2070 Super: 1945 MHz avg clock, 9.95 TF
So it would be closer to a 2070 Super. I use the average clock for the TF calculation, but we don't know what the PS5's average is, so I'm OK with both of them being ~10 TF. Plus there are further OCs you can do to the GPUs, especially for memory. And finally, a 5700 XT is close to 10 TF but falls short of a 2070 Super that's also 10 TF, so clearly Nvidia still has an advantage even when TFLOPS are equal; we'll see how that plays out.

(5700 XT = 85% of 2070 Super)

Why the fudge are you bringing factory overclocked cards into the equation?

Just stick to the reference cards for the sake of clarity.
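Whichever clocks you prefer, the per-card TF figures quoted above do check out against the public FP32 core counts (taken here as assumptions: 2176 for the 2060 Super, 2304 for the 2070, 2560 for the 2070 Super):

```python
# (FP32 cores, avg clock in MHz per the quoted post)
cards = {
    "2060 Super": (2176, 1910),
    "2070":       (2304, 1934),
    "2070 Super": (2560, 1945),
}
for name, (cores, mhz) in cards.items():
    tf = cores * 2 * mhz / 1e6  # 2 FP32 ops per core per clock
    print(f"{name}: {tf:.2f} TF")  # 8.31 / 8.91 / 9.96
```

With reference boost clocks instead of observed averages, all three numbers drop by roughly 1 TF, which is the whole disagreement here.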
 

geordiemp

Member
Yeah, I'm sure when the Digital Foundry comparison videos start rolling out we'll see no difference in performance at all.

/s

Probably not much, no. A 22% faster GPU likely has faster triangle setup, faster culling, faster ROPs and caches.

Why do you think Nvidia don't mention TF on their website? It's not just TF, is it.

/s
 

Shmunter

Member
Why is the 12 TFLOP GPU of the Series X only putting out performance equal to the 10 TFLOP RTX 2080?
It's been long articulated by techies: TF numbers are only a small part of the picture. There is no such thing as running a GPU at max computational efficiency 100% of the time. The software workload varies, as does the overall cohesion of the entire system in delivering results.

The PS5 may be able to realise its peak performance more consistently. All speculation at this point.
 

Rikkori

Member
Why the fudge are you bringing factory overclocked cards into the equation?

Just stick to the reference cards for the sake of clarity.

Don't be triggered, bro, those are normal numbers. All Turing cards sit at 1900+ MHz core unless they're using paper as heatsinks. Boost behavior is standard on PC.
 

Skifi28

Member
The difference does seem pretty minimal in the grand scheme of things compared to previous generations. I doubt games will be identical, probably a higher resolution on the Xbox side, but with the usual TAA and dynamic res you are going to need a microscope to tell the difference next gen.
 
I always enjoy these threads for the Nvidia benchmarks. It's no wonder these consoles are set to cost so much running the new RDNA architecture.
 

Ar¢tos

Member
I can imagine the difference between them being very small.
The difference in power is smaller than last gen, then there is the effect of diminishing returns, and the use of techniques like CB and temporal reconstruction will obscure resolutions even more.
It will come down to dev skill mostly.
 

Thirty7ven

Banned
I can imagine the difference between them being very small.
The difference in power is smaller than last gen, then there is the effect of diminishing returns, and the use of techniques like CB and temporal reconstruction will obscure resolutions even more.
It will come down to dev skill mostly.

It's the closest these two have ever been, and the difference is significantly less than last generation according to developers. By far the biggest difference between the two is in I/O.
 

vkbest

Member
2070 Super performance, 16 GB RAM, a Ryzen CPU, the best SSD on the market, and we will still have games without 60fps options.

On PS5 vs Xbox Series X, people are thinking only about TFLOPS. The PS5 looks to have several custom chips (a custom sound chip, dedicated SSD I/O hardware, the Geometry Engine), whereas Microsoft is trusting brute force in the GPU area.
 

hyperbertha

Member
Aren't RDNA 2 flops supposed to be stronger than RTX 2000-series flops? That would mean both consoles could outdo even the 2080 or 2080 Ti. Didn't Digital Foundry claim the Series X outdid the 2080 Ti at some point?
 

chilichote

Member
yes

but do you know that a bigger CU count has a bigger impact on performance?
Not necessarily. Everything runs 20% faster in the PS5, everything. It just has fewer shaders, while the XSX GPU is slower in every other respect. It cannot yet be said with any certainty how this will play out. If you trust Cerny and what he says, there is a reason why he prefers fewer CUs and prefers to run them (like all other parts of the GPU) at higher clocks.

Anyway, thanks for your comment.
 

Bernkastel

Ask me about my fanboy energy!
Didn't Grant Kot say that the Xbox Series X is more powerful than his 8700K/RTX 2080?





The dev of Scorn also said that the Xbox Series X is on par with an RTX 2080 Ti.
 

MCplayer

Member
Not necessarily. Everything runs 20% faster in the PS5, everything. It just has fewer shaders, while the XSX GPU is slower in every other respect. It cannot yet be said with any certainty how this will play out. If you trust Cerny and what he says, there is a reason why he prefers fewer CUs and prefers to run them (like all other parts of the GPU) at higher clocks.

Anyway, thanks for your comment.
lol, so you are just a Cerny/Sony fanboy. Why should I trust just Sony and not MS?
 
Man, people need to chill out with this power stuff. Last gen, the Xbox One was really underpowered, and that definitely hurt them the first half of the gen.

This isn't that situation. Yes, one is more powerful, but neither of them is underpowered. So stop worrying, for Christ's sake.

So damn fed up with this power and spec talk.
 

Croatoan

They/Them A-10 Warthog
When the 3000 series launches, these consoles will already be obsolete. Your "Super Special SSD" is not going to render shit for you.
 

Kumomeme

Member
I don't know why some people still want to downplay one console's performance despite the devs here hinting at completely different things.

2070 Super performance for both consoles as the next-gen baseline... considering the devs haven't squeezed all the juice from both consoles yet.
Of course each console has its strengths, but as a rough performance measure, this is already good.
 

Life

Member
Shhhhhhhhhhhhhh. Don't compare next-gen console stats to PC hardware. Compare them to preceding consoles instead!
 
Didn't Grant Kot say that the Xbox Series X is more powerful than his 8700K/RTX 2080?





The dev of Scorn also said that the Xbox Series X is on par with an RTX 2080 Ti.

No, the dev of Scorn never said that. The demo of Scorn was revealed to have been running on a 2080 Ti.

Also, Grant Kot isn't even certain. He's saying he "thinks" the SX is more powerful than his PC. We're all expecting it to be in the ballpark of a 2080/S.

No one expects it to equal, let alone surpass, a 2080 Ti.
 