It's hard for me to believe when both also have mandatory SSDs in them. Unless he's ONLY talking about GPU performance.
We don't have plans for it, but plans can change! If we were to do it, we think the next-gen consoles could deliver a result similar to what high-end PCs achieve.
The computing power of the new consoles is very promising, and we're very excited to see ray tracing come to next-gen consoles. It is difficult to say since we don't know the exact ray tracing specifications yet, but early snippets of info do suggest similar performance to an RTX 2070 Super, which will definitely be enough for similar results to what we have now on PC.
For now, we don't see too many differences; they seem to be competing well against each other, and both are pushing new boundaries.
Each next-gen console sporting an SSD will allow us to significantly shorten loading times, which is something we really look forward to.
I believe devs said the same thing at the start of this gen.
When was that?

Well, the Series X was shown by DF to actually outdo a 2080 Ti slightly.
1) he claimed "2070 Super" not "2070",
2) That's about a 10-15% performance difference
3) They don't have any of the consoles or kits, it's an assumption based on the specifications. A million other people have come to the same conclusion
4) wccftech again? Why do people keep quoting this clickbait site?
Since a 2080 has slightly better performance than a 2070 Super, that fits well.
12 TFLOPS are 12 TFLOPS, no more, no less, just as 10.2 TFLOPS remain 10.2 TFLOPS, with the difference that one console has far more processing units. But at some point even the last fanboys will come back down to earth ...
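Both headline figures do follow directly from the announced specs (Series X: 52 CUs at a fixed 1.825 GHz; PS5: 36 CUs at up to 2.23 GHz, with 64 shader ALUs per RDNA CU). As a rough sketch, peak FP32 throughput is just shader ALUs × 2 FLOPs per clock (one fused multiply-add) × clock speed:

```python
def tflops(cus: int, clock_ghz: float, alus_per_cu: int = 64) -> float:
    """Peak FP32 TFLOPS: ALUs x 2 FLOPs per clock (FMA) x clock in GHz."""
    return cus * alus_per_cu * 2 * clock_ghz / 1000  # 1000 GFLOPS = 1 TFLOPS

ps5 = tflops(36, 2.23)    # PS5: 36 CUs at up to 2.23 GHz (announced spec)
xsx = tflops(52, 1.825)   # Series X: 52 CUs at 1.825 GHz (announced spec)
print(f"PS5:      {ps5:.2f} TFLOPS")  # 10.28
print(f"Series X: {xsx:.2f} TFLOPS")  # 12.15
```

So the advertised numbers check out; the argument in the thread is only about how that raw throughput translates into frames.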
There is no magic either. These consoles are based on PC hardware, and they will deliver comparable performance to comparable hardware.
Yes, if the PC has bespoke hardware to decompress data from super-fast SSDs, then you'd be right.
To be fair, the CPU is only insignificantly better, and the bandwidth is superior only for what, 6 or 10 GB? I forget the amount that has the fast bandwidth.

Yeah, sure. One console has the better GPU, CPU and memory bandwidth, but we won't see many differences. Lol
If the PS5 is 10.2 TFLOPS, then it should be a 2070 Super.
XSX should be a little better than a 2080, somewhere between a 2080 and a 2080 Super.
It seems these devs don't have the devkits and are going by released info. 2 TFLOPS should put the Xbox above a 2080 for sure.
I'm not talking about the SSD but about the processing power. The SSD will do some cool things, but it will certainly not render any shaders. The SSD will primarily make life easier for developers, and we will get rid of HDDs in a few years. It will take time for engines to natively take advantage of the SSDs' capabilities, and by then the technology will have moved on.
But the PS5 is not 10.2. Anyway, let's not even try to start discussing this; it won't end well for the thread.
That'll never be what next-gen consoles aim for, as has been discussed ad nauseam. They will go for full 4K at 30 FPS 99% of the time. Want frame rates? Become a PC enthusiast; consoles will never be a place for universally high frame rates.

Then neither should have any issue hitting 60 FPS minimum at 1440p.
Flops are Flops, whether it's a Texas Instruments calculator or a Titan RTX. The differences in gaming performance between NV and AMD in the past came down to architecture and efficiency. Nvidia did more with fewer TFLOPS simply because the architecture was better designed for gaming, even with some reduction in IQ at points: color richness, more compressed textures, etc. Just take a look at Vega: it was way more powerful than Pascal in raw power, but the architecture did not suit gaming as well as Nvidia's Pascal GPUs, or developers never really took advantage of the CUs in Vega because NV had the most popular product.

It's simple: stop comparing AMD flops to Nvidia flops; they are not 1:1. The only true barometer you have to go on right now is The Coalition demonstrating, with witnesses, that the Series X benchmarked Gears 5 the same as an RTX 2080. Now you can directly compare flops between the Series X and PS5, because they are using the same AMD architecture.
Lol.

The difference now is AMD has a dedicated gaming GPU; the architecture is purely gaming-focused and it's extremely fast. A 5700 XT is a smaller die than an RTX 2070/S but compares on par with it. The efficiency of Navi, its architecture, is what's impressive. NV no longer has the edge on the most efficient gaming GPUs. You will see that manifested when AMD brings larger dies and even better performance per watt on their new cards, with cards that go up not only against the upcoming 3070 but against the 3080 and 3080 Ti/90 as well. AMD cards will run at higher clocks and run cooler. There won't be one feature NV has in their cards to boost performance that AMD won't. VRS is just one of those; we all know AMD has been on the ball with other methods like Radeon Chill and HBCC.
Yes, we can.

What's the point of comparing a console GPU to a PC GPU? Different architectures. Are we still comparing the PS4 GPU to a 750 Ti? Can we imagine The Last of Us 2 and God of War on a 750 Ti? Doubt it.
Because NVIDIA uses low boost clocks to determine the compute power. On their page the 2080 is rated at 1710 MHz, but retail models usually clock at around 1900-1950 MHz out of the box. So it's really an 11-12 TFLOPS card.

Why is the 12 TFLOP GPU of the Series X only putting out performance equal to the 10 TFLOP RTX 2080?
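The rated-vs-retail clock point is easy to check with arithmetic. The RTX 2080 has 2944 CUDA cores (public spec), and peak FP32 is cores × 2 FLOPs per clock × clock speed:

```python
def tflops(cores: int, clock_mhz: float) -> float:
    """Peak FP32 TFLOPS: cores x 2 FLOPs per clock (FMA) x clock in MHz."""
    return cores * 2 * clock_mhz / 1e6

rated  = tflops(2944, 1710)  # NVIDIA's official boost clock for the 2080
retail = tflops(2944, 1950)  # typical out-of-the-box clock on retail cards
print(f"Rated:  {rated:.1f} TFLOPS")   # 10.1
print(f"Retail: {retail:.1f} TFLOPS")  # 11.5
```

So the "10 TFLOP 2080" and the "really an 11-12 TFLOPS card" claims are both right; they just use different clocks.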
Do you even know what CUs are?

Most parts in both GPUs scale with clock speed. In all those parts, the PS5 GPU is 20% faster than the XSeX ^^
Not quite, though I did underestimate the PS5 a bit, just based on a heuristic (PS5 = 5700 XT = 2060 Super, more or less, because those two are within 5% of each other).
2060 Super: avg clock is actually 1910 MHz, which means it's 8.3 TF
2070: 1934 MHz avg clock, 8.9 TF (but note how a performance difference of 0.6 TF is under a 5% perf difference)
2070 Super: 1945 MHz avg clock, 9.95 TF
So it would be closer to a 2070 Super. I use the average clock for the TF calculation, but we don't know what the PS5's average is, so I'm OK with both of them being 10 TF. Plus there is further OC you can do to the GPUs, especially for memory, so I'm OK with it. And finally, a 5700 XT is close to 10 TF but falls short of a 2070 Super that's also 10 TF, so clearly Nvidia still has an advantage even if TFLOPS are equal; we'll see how that plays out.
(5700 XT = 85% of 2070 Super)
sources:
EVGA GeForce RTX 2060 Super SC Ultra Review (www.techpowerup.com)
MSI GeForce RTX 2070 Gaming Z 8 GB Review (www.techpowerup.com)
MSI GeForce RTX 2070 Super Gaming X Review (www.techpowerup.com)
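For what it's worth, the per-card TF figures above reproduce exactly from the public Turing core counts (2060 Super: 2176, 2070: 2304, 2070 Super: 2560) and the average clocks quoted from those reviews:

```python
# cores x 2 FLOPs per clock (FMA) x average clock in MHz -> TFLOPS
cards = {
    "2060 Super": (2176, 1910),
    "2070":       (2304, 1934),
    "2070 Super": (2560, 1945),
}
for name, (cores, mhz) in cards.items():
    print(f"{name:>10}: {cores * 2 * mhz / 1e6:.2f} TFLOPS")
# -> 8.31, 8.91, 9.96, matching the 8.3 / 8.9 / 9.95 figures above
```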
Yeah, I'm sure when the Digital Foundry comparison videos start rolling out we'll see no difference in performance at all.
/s
It's been long articulated by techies: TF numbers are only a small part of the picture. There is no such thing as running a GPU at max computational efficiency 100% of the time. The software workload varies, as does the overall cohesion of the entire system in delivering results.
Why the fudge are you bringing factory overclocked cards into the equation?
Just stick to the reference cards for the sake of clarity.
I can imagine the difference between them being very small.
The difference in power is smaller than in the previous gen; then there is the effect of diminishing returns, and the use of techniques like checkerboard rendering and temporal reconstruction will obscure resolution differences even more.
It will be down to dev skills mostly.
Yes ^^
Aren't RDNA 2 flops supposed to be stronger than RTX 2000-series flops? That would mean both consoles could outdo even the 2080 or 2080 Ti. Didn't Digital Foundry claim the Series X outdid the 2080 Ti at some point?
Yes.

Edit: but do you know that a GPU isn't only CUs?
Not necessarily. Everything runs 20% faster in the PS5, everything. It only has fewer shaders, while the XSeX GPU is inferior in all other respects. It cannot yet be said with any certainty how this will play out. If you trust Cerny and what he says, there is a reason why he prefers to use fewer CUs and to run them (like all other parts of the GPU) higher.
But do you know that a bigger CU count has a bigger impact on performance?
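The CU-count vs clock-speed argument comes down to which ratio a given workload follows. A quick sketch from the announced specs: clock-bound stages (rasterization, command processing, caches) track the clock ratio, while shader-bound work tracks total CU throughput:

```python
ps5_cus, ps5_ghz = 36, 2.23    # PS5 (variable, "up to" clock)
xsx_cus, xsx_ghz = 52, 1.825   # Series X (fixed clock)

clock_adv   = ps5_ghz / xsx_ghz - 1                          # PS5's clock advantage
cu_adv      = xsx_cus / ps5_cus - 1                          # Series X's CU-count advantage
compute_adv = (xsx_cus * xsx_ghz) / (ps5_cus * ps5_ghz) - 1  # net FP32 advantage

print(f"PS5 clock advantage:        {clock_adv:.0%}")    # 22%
print(f"Series X CU advantage:      {cu_adv:.0%}")       # 44%
print(f"Series X compute advantage: {compute_adv:.0%}")  # 18%
```

So "everything runs 20% faster" is roughly right for the clock-bound parts, while the net compute gap is much smaller than the raw CU count alone would suggest.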
I can hardly wait for TF comparisons!!
lol, so you are just a Cerny/Sony fanboy. Why should I trust just Sony and not MS?
And so, thanks for your comment.
YOU can trust whoever you want ^^
Didn't Grant Kot say that the Xbox Series X is more powerful than his 8700K + RTX 2080?
The dev of Scorn also said that the Xbox Series X is on par with an RTX 2080 Ti.