
Indie Dev suggests both consoles exhibit RTX 2070 performance, says "we don't see many differences"

Jigsaah

Gold Member
Tech heads, get on this! How does a 10TF machine and a 12TF machine both exhibit 2070 performance? It should be noted that the developer says "for now", suggesting there may be more potential to unlock once they've had more experience with the kits. But to not see many differences?



The computing power of the new consoles is very promising, and we're very excited to see ray tracing come to next-gen consoles. It is difficult to say since we don't know the exact ray tracing specifications yet, but early snippets of info do suggest similar performance to an RTX 2070 Super, which will definitely be enough for similar results to what we have now on PC.

For now, we don't see too many differences, they seem to be competing well against each other and both are pushing new boundaries.
 

Abriael_GN

RSI Employee of the Year
If people keep asking this kind of stuff of developers who make smaller games that don't push hardware in any way, shape, or form, yeah, we'll keep getting this kind of answer.

A certain website started the trend "ask indie developers that aren't working on next-gen consoles about details on next-gen consoles that they can't possibly know about" early last gen. It does fairly well in trashy console war-ridden places like N4G, so other sites are now following suit.

Basically. This has no value whatsoever. It's meaningless clickbait written purposely to bait N4G into an angry flame war as this specific site (just like the one that started the trend) often does.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Because it's a reply to a general comment, they weren't giving you a precise estimate of the performance.

And it's the truth also. Note: they say 2070 SUPER, not 2070. Which is true: for XSX it would be a 2070 Super, and for PS5 a 2070 (2060 Super).

This is just click-bait.
if the ps5 is 10.2 tflops then it should be a 2070 super.
xsx should be a little better than a 2080, somewhere between a 2080 and a 2080 super.

it seems these devs don't have the devkits and are going by released info. an extra 2 tflops should put the xbox above a 2080 for sure.
 

Mister Wolf

Member


The Coalition ran the Gears 5 benchmark on the Series X in front of Digital Foundry and followed that up by running the same benchmark at the same settings on a PC with an RTX 2080, producing similar/equal results.
 

IntentionalPun

Ask me about my wife's perfect butthole
That statement is about ray tracing though... and appears to be based on "early snippets of info", not a dev actually working with both dev kits.

In a recent chat with KeokeN founder and game director Koen Deetman, we asked whether there are any plans to enhance Deliver Us The Moon for the upcoming PlayStation 5 and Xbox Series X next-gen consoles, and what the studio's thoughts are on the respective architectures.

We don't have plans for it, but plans can change! If we were to do it, we think the next-gen consoles could deliver a result similar to what high-end PCs achieve.

Dinky indie dev w/ no current plans to release next-gen = "I don't have dev kits."
 
Last edited:

Night.Ninja

Banned
Devs have to be very careful about what they say about the new consoles; one wrong word and the warriors start beating their war drums.

 

tkscz

Member
I'm assuming they mean GPU performance, since it would be really strange to compare an entire board to just a GPU. But I see that as being about right in terms of raw GPU performance. I don't think anyone really expects 2080 (Ti) performance from consoles. FLOPs are just one part of a GPU's overall performance, and as anyone will tell you, AMD FLOPs and Nvidia FLOPs don't translate into the same real-world performance, because the two use different compute architectures.
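For anyone wondering where the headline numbers even come from, here's a rough sketch of the math (the clocks are the official spec-sheet boost clocks, so the outputs are approximate, and the figure says nothing about real game performance):

```python
# Theoretical FP32 throughput: shaders x 2 ops per clock (FMA) x clock speed.
# Spec-sheet boost clocks; real cards usually run higher, and the number says
# nothing about how efficiently each architecture turns FLOPs into frames.
def tflops(shaders: int, boost_mhz: float) -> float:
    return shaders * 2 * boost_mhz / 1_000_000

cards = {
    "RTX 2070 (Turing)":       (2304, 1620),
    "RTX 2070 Super (Turing)": (2560, 1770),
    "RX 5700 XT (RDNA 1)":     (2560, 1905),
}

for name, (shaders, clock) in cards.items():
    print(f"{name}: {tflops(shaders, clock):.2f} TFLOPs")
# Roughly 7.5, 9.1 and 9.8 TF respectively -- the 5700 XT "out-flops" the 2070
# by over 2 TF yet benchmarks in the same ballpark, which is the whole point.
```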
 
Last edited:

Caio

Member


The Coalition ran the Gears 5 benchmark on the Series X in front of Digital Foundry and followed that up by running the same benchmark at the same settings on a PC with an RTX 2080, producing similar/equal results.


On XSX there are more particle effects, definitely a beast of a console.
 
Because it's a reply to a general comment, they weren't giving you a precise estimate of the performance.

And it's the truth also. Note: they say 2070 SUPER, not 2070. Which is true: for XSX it would be a 2070 Super, and for PS5 a 2070 (2060 Super).

This is just click-bait.

A 2060 Super is 7.2 Teraflops.
A 2070 Super is 9.2 Teraflops.
A 2080 Super is 11.1 Teraflops.

The PS5 GPU is 10.28 Teraflops. That would put it between a 2070 Super and a 2080 Super.

No idea why you suggest it's like a 2060 Super when that card is only 7.2 Teraflops.
 
Last edited:

martino

Member
A 2060 Super is 7.2 Teraflops.
A 2070 Super is 9.2 Teraflops.
A 2080 Super is 11.1 Teraflops.

The PS5 GPU is 10.28 Teraflops. That would put it between a 2070 Super and a 2080 Super.

No idea why you suggest it's like a 2060 Super when that card is only 7.2 Teraflops.
Still having a hard time with TFLOPs and how unreliable they are for direct comparisons between different architectures, after all this time?
But if this info is correct, it can tell us how RDNA 2 TFLOPs compare to Nvidia's (and, by transitivity, whether there's an improvement over RDNA 1.0).
 
Last edited:
Still having a hard time with TFLOPs and how unreliable they are for direct comparisons between different architectures, after all this time?
But if this info is correct, it can tell us how RDNA 2 TFLOPs compare to Nvidia's (and, by transitivity, whether there's an improvement over RDNA 1.0).

The PS5 being equal to a 2060 Super seems a bit low to me especially since the 2060 Super is 7.2 Teraflops.

Maybe you could explain this as I'm having trouble understanding this.
 

tkscz

Member
Still having a hard time with TFLOPs and how unreliable they are for direct comparisons between different architectures, after all this time?
But if this info is correct, it can tell us how RDNA 2 TFLOPs compare to Nvidia's (and, by transitivity, whether there's an improvement over RDNA 1.0).

Been wondering that myself. How do the RDNA 2 CUs compare to the CUDA cores in Turing? Or Ampere, for that matter, since those are going to be 7nm.
 
Last edited:
A 2060 Super is 7.2 Teraflops.
A 2070 Super is 9.2 Teraflops.
A 2080 Super is 11.1 Teraflops.

The PS5 GPU is 10.28 Teraflops. That would put it between a 2070 Super and a 2080 Super.

No idea why you suggest it's like a 2060 Super when that card is only 7.2 Teraflops.
When you use NVIDIA's official clocks, which are much lower than actual clocks. The 2080 Ti is rated at 1545MHz, which is pathetic. It easily does 1900MHz, and if you got one stuck at sub-1800MHz, return it.
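To put numbers on that, a quick sketch (the ~1900MHz figure is the real-world clock claimed above, not an official spec):

```python
# RTX 2080 Ti has 4352 shaders. Compare the TFLOPs figure at NVIDIA's rated
# boost clock (1545 MHz) versus the ~1900 MHz cards reportedly sustain.
def tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz / 1_000_000

rated    = tflops(4352, 1545)   # ~13.4 TF on the spec sheet
observed = tflops(4352, 1900)   # ~16.5 TF at the reported real-world clock
print(f"rated {rated:.1f} TF vs observed {observed:.1f} TF "
      f"(+{(observed / rated - 1) * 100:.0f}%)")
```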
 

Mister Wolf

Member
The PS5 being equal to a 2060 Super seems a bit low to me especially since the 2060 Super is 7.2 Teraflops.

Maybe you could explain this as I'm having trouble understanding this.

It's simple. Stop comparing AMD flops to Nvidia flops; they are not 1:1. The only true barometer you have to go on right now is The Coalition demonstrating, with witnesses, that the Series X benchmarked Gears 5 the same as an RTX 2080. You can, however, directly compare flops between the Series X and PS5, because they are using the same AMD architecture.
 
Last edited:

Rikkori

Member
A 2060 Super is 7.2 Teraflops.
A 2070 Super is 9.2 Teraflops.
A 2080 Super is 11.1 Teraflops.

The PS5 GPU is 10.28 Teraflops. That would put it between a 2070 Super and a 2080 Super.

No idea why you suggest it's like a 2060 Super when that card is only 7.2 Teraflops.

Not quite, though I did underestimate PS5 a bit just based on a heuristic (= 5700 XT = 2060 Super more or less; because those 2 are within 5% of each other).

2060 Super: average clock is actually ~1910 MHz, which puts it at 8.3 TF
2070: ~1934 MHz average clock, 8.9 TF (but note that the 0.6 TF difference works out to under 5% in actual performance)
2070 Super: ~1945 MHz average clock, 9.95 TF
So it would be closer to a 2070 Super. I use average clocks for the TF calculation, but we don't know what the PS5's average will be, so I'm fine treating both as roughly 10 TF. There's also further overclocking you can do on the PC GPUs, especially for memory. And finally, the 5700 XT is close to 10 TF yet falls short of a 2070 Super that's also about 10 TF, so clearly Nvidia still has an advantage even when the TFLOPs are equal; we'll see how that plays out.

(5700 XT = 85% of 2070 Super)
 
Last edited:

JMarcell

Member
Maybe for an indie game it won't make any difference. For AAA games the story will be a lot different. 52 compute units versus 36 is a lot and will make a big difference in bigger, more GPU-heavy games.
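For context, here's roughly how those CU counts map to the headline TFLOPs figures, using the publicly stated clocks and 64 shaders per CU (a sketch of the arithmetic, not a performance claim):

```python
# RDNA 2 compute unit = 64 shaders; FP32 TFLOPs = CUs * 64 * 2 ops/clock * clock.
def console_tflops(cus: int, clock_mhz: float) -> float:
    return cus * 64 * 2 * clock_mhz / 1_000_000

xsx = console_tflops(52, 1825)   # ~12.15 TF at a fixed clock
ps5 = console_tflops(36, 2230)   # ~10.28 TF at the variable-clock ceiling
print(f"Series X: {xsx:.2f} TF | PS5: {ps5:.2f} TF "
      f"({(xsx / ps5 - 1) * 100:.0f}% more raw throughput)")
```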
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
It's readily apparent this dev does not have dev kits and is literally speculating on the same info we all have... but this thread continues lol

I hate "games journalism"... read the damn interview, this dev isn't currently planning any next-gen games, AKA they do not have a fucking dev kit lol
 

martino

Member
The PS5 being equal to a 2060 Super seems a bit low to me especially since the 2060 Super is 7.2 Teraflops.

Maybe you could explain this as I'm having trouble understanding this.

The 5700 XT compares roughly to a 2070 across a collection of game benchmarks,
yet the 2070 is 7.4 TFLOPs
and the 5700 XT is 9.7 TFLOPs,
so in real-world efficiency you could say (between those two) 1 RDNA TFLOP is worth about 0.7 Turing TFLOPs.
If the PS5 is at 2070 Super level, then we can deduce that 1 RDNA 2.0 TFLOP is worth ~0.87 Turing TFLOPs, making RDNA 2.0 roughly a 20% improvement over RDNA 1.0.
That is actually good and believable for an architecture refresh.

edit: note that this real-world comparison between architectures evolves with games and with how games use their engines / GPU features.
edit 2: the fact that Nvidia cards boost beyond their advertised clocks makes the comparison trickier and flatters Nvidia a little here (cards often run 100-200 MHz above the official boost without any OC, so the real gap is smaller than the initial math suggests). What is certain is that actual game results reflect clocks above the advertised boost (unlike consoles, which will never go beyond theirs); it's one more reason the console GPUs won't perform that much better than equivalent desktop GPUs, because the advertised TFLOPs understate what you actually get from the card.
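As a quick sketch of that back-of-the-envelope math (the benchmark equivalences are the assumptions above, not measured data, and the exact percentages shift depending on which clock figures you plug in):

```python
# Per-TFLOP efficiency relative to Turing, following the reasoning above.
# Assumption 1: 5700 XT (RDNA 1, ~9.7 TF) performs like an RTX 2070 (~7.4 TF).
# Assumption 2: PS5 (RDNA 2, ~10.3 TF) performs like an RTX 2070 Super (~9.1 TF).
rdna1_per_tf = 7.4 / 9.7        # ~0.76 Turing TF of work per RDNA 1 TF
rdna2_per_tf = 9.1 / 10.3       # ~0.88 Turing TF of work per RDNA 2 TF
gain = rdna2_per_tf / rdna1_per_tf - 1
print(f"RDNA 1 ~{rdna1_per_tf:.2f}, RDNA 2 ~{rdna2_per_tf:.2f} "
      f"(~{gain * 100:.0f}% better per TFLOP)")
```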
 
Last edited:

Mister Wolf

Member
Again ... care to present your evidence for this?

XSX GPU 1.8GHz vs PS5 2.2GHz.

Please update me with your new data, or if your data is incorrect, delete it...

The teraflops are higher for the Series X, which is directly comparable since they are using the same AMD GPU architecture. The RTX 2080 is clocked higher than a 2080 Ti as well, so I hope you can understand that citing clock speed alone doesn't mean anything.

 

RespawnX

Member
1) He claimed "2070 Super", not "2070".
2) That's about a 10-15% performance difference.
3) They don't have any of the consoles or kits; it's an assumption based on the specifications. A million other people have come to the same conclusion.
4) wccftech again? Why do people keep quoting this clickbait site?



The Coalition ran the Gears 5 benchmark on the Series X in front of Digital Foundry and followed that up by running the same benchmark at the same settings on a PC with an RTX 2080, producing similar/equal results.


Since a 2080 performs slightly better than a 2070 Super, that fits well.

12 TFLOPs are 12 TFLOPs, no more, no less, just as 10.2 TFLOPs remain 10.2 TFLOPs, with the difference that one console has far more compute units. But at some point even the last fanboys will come back down to earth...
There is no magic either. These consoles are based on PC hardware, and they will deliver performance comparable to comparable hardware.
 

M1chl

Currently Gif and Meme Champion
I don't know, sounds like bullshit, but the RTX 2070 (OG) is an amazing card for its price. People really talk down the RTX 20-series line, although I think the consoles are more capable.

And for the record, I don't understand how that's a bad thing. Remember when this gen's consoles launched with low/mid-range hardware? These consoles are beasts in comparison for the day and age in which they're released.
 
Well, the Series X was shown by DF to actually outdo a 2080 Ti slightly.

I'm not surprised a tiny indie company can't extract power out of new consoles.
 

martino

Member
When you use NVIDIA's official clocks, which are much lower than actual clocks. The 2080 Ti is rated at 1545MHz, which is pathetic. It easily does 1900MHz, and if you got one stuck at sub-1800MHz, return it.
Aren't benchmarks using stock clocks (for reference cards)?
 
Last edited:

BlueAlpaca

Member
Wasn't there a developer who said a console GPU is equal to 2X its teraflops in a PC setup? Something about consoles being more efficient. So the 9TF PS5 is 18TF in a PC, a lot more than a 2070... Of course those are TF on different architectures from different companies, but still... And that SSD gives it a 3X multiplier to 54 TF. :messenger_winking_tongue:
 

ToadMan

Member
The teraflops are higher for the Series X, which is directly comparable since they are using the same AMD GPU architecture. The RTX 2080 is clocked higher than a 2080 Ti as well, so I hope you can understand that citing clock speed alone doesn't mean anything.


You said “faster”.

Tflops are not the same as “speed”.

So you may go and edit your incorrect post now.
 