
Ali Salehi, a rendering engineer at Crytek, contrasts the next-gen consoles in an interview (Update: tweets/article removed)

oagboghi2

Member
No, but what if he had retweeted the Xbox reveal instead of the PS5 reveal, Halo Infinite instead of TLOU2, and Gears 2 instead of MGS4: Guns of the Patriots, then went on to talk about X and Y reasons why the Xbox is better... you wouldn't be even a little suspicious?

Could you imagine the reaction if that was the case?
No, that would mean he likes to play videogames.

I kinda assume if you work in this industry, you like videogames.

I mean, seriously? You're making my point for me right now. You don't take him seriously because he doesn't have an equal number of tweets referencing each company. That is stupid.
 
Last edited:
The problems arise when the GPU needs more than 10GB; I have no clue how often that will happen.
Literally never, and as I told the other guys, because of the bandwidth uplift in the Series X's 10GB priority pool, it can cache and flush as much data as 12.5GB of the PlayStation 5's memory in the same amount of time.

You also have to remember that many other facets of a system need memory while running a game; it's not just the GPU. There's no VRAM constraint here, just hopes and dreams of a future bottleneck that will never present itself.
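The arithmetic behind that 12.5GB figure can be checked directly from the publicly stated bandwidth specs (560 GB/s for the Series X's fast 10GB pool, 448 GB/s for the PS5's unified pool). This is strictly a back-of-the-envelope sketch, not a model of real memory behavior:

```python
# Back-of-the-envelope check: in the time the PS5 takes to cycle 10GB,
# how much data can the Series X's faster pool cycle?
ps5_bw = 448.0   # GB/s, PS5 unified GDDR6 pool (public spec)
xsx_bw = 560.0   # GB/s, Series X 10GB "GPU-optimal" pool (public spec)

t = 10.0 / ps5_bw        # seconds for the PS5 to move 10GB
xsx_data = xsx_bw * t    # data the Series X moves in that same window
print(xsx_data)          # → 12.5
```

So the claim is just the 560/448 = 1.25x bandwidth ratio applied to 10GB; whether real workloads ever cycle the whole pool like this is a separate question.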
 
He doesn't have a PS4.
To play Journey he had to rely on a cyber café with a PS4.
He has owned an Xbox One X since last year... he even prefers to play PES on Xbox because of the issues on PC.
The Xbox One X is the first console he has bought in years (he doesn't give a reason, but said he went some years without playing games... obviously he plays games as a hobby rather than working on them).

That is all in his tweets.

How does that sound?
Sounds like we just had a 20 page discussion on an opinion based article written by an entry-level employee.
 

Clear

CliffyB's Cock Holster
It doesn't make any sense because it's just wrong.

2080 Super

[benchmark screenshots]


2080 Ti

[benchmark screenshots]



With the logic circulating here, the 2080 Super should be the superior piece of hardware because it's got a much higher frequency. The 20 additional SMs are wasted, the 160 extra tensor cores aren't being utilized properly, the 20 additional RT cores don't reflect heavily on RT performance, the teraflops don't matter, the ROPs don't matter, the TMU uplift doesn't matter, the increase in fill rates is all for naught, the bus width doesn't matter, the bandwidth means nothing.

The frequency clearly dictates the performance of a GPU, not a culmination of literally every other function of the GPU and the bus. Imagine that. People have been wrong all this time and have been wasting money and throwing extra hardware at a task for no reason when all they had to do was increase their clock speeds.


The difference is that we aren't talking about graphics cards sat on identical buses and mobos here; we're talking about custom SoCs and I/O stacks where the APUs contain specifically paired CPU and GPU cores, plus numerous intermediary/ancillary parts.

It's not the same thing. At all.

Which is why these consoles cost a fraction of the exorbitant price tag of a 2080 Ti.
 

onQ123

Member
For anyone worried about bandwidth, I think the cache scrubbers should help with that.


Not Sony's paper, but:

Cooperative Cache Scrubbing


ABSTRACT
Managing the limited resources of power and memory bandwidth while improving performance on multicore hardware is challenging. In particular, more cores demand more memory bandwidth, and multi-threaded applications increasingly stress memory systems, leading to more energy consumption. However, we demonstrate that not all memory traffic is necessary. For modern Java programs, 10 to 60% of DRAM writes are useless, because the data on these lines are dead - the program is guaranteed to never read them again. Furthermore, reading memory only to immediately zero initialize it wastes bandwidth. We propose a software/hardware cooperative solution: the memory manager communicates dead and zero lines with cache scrubbing instructions. We show how scrubbing instructions satisfy MESI cache coherence protocol invariants and demonstrate them in a Java Virtual Machine and multicore simulator. Scrubbing reduces average DRAM traffic by 59%, total DRAM energy by 14%, and dynamic DRAM energy by 57% on a range of configurations. Cooperative software/hardware cache scrubbing reduces memory bandwidth and improves energy efficiency, two critical problems in modern systems.
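The core idea in that abstract can be illustrated with a toy model: if software tells the cache which dirty lines are dead, those lines can simply be invalidated on eviction instead of written back to DRAM. Everything below (the `ToyCache` class and its methods) is invented for illustration and has nothing to do with the paper's actual hardware design or with the PS5's scrubbers:

```python
# Toy sketch of software-assisted cache scrubbing: dirty-but-dead lines
# are dropped on eviction instead of generating DRAM write-back traffic.

class ToyCache:
    def __init__(self):
        self.lines = {}        # addr -> "dirty" (clean lines omitted for brevity)
        self.dead = set()      # addrs software has marked as never-read-again
        self.dram_writes = 0   # count of DRAM write-backs

    def write(self, addr):
        self.lines[addr] = "dirty"
        self.dead.discard(addr)    # a new write revives the line

    def scrub(self, addr):
        # software hint: this line's data will never be read again
        self.dead.add(addr)

    def evict(self, addr):
        state = self.lines.pop(addr, None)
        if state == "dirty" and addr not in self.dead:
            self.dram_writes += 1  # normal dirty write-back
        # dead dirty lines are simply dropped: no DRAM traffic

c = ToyCache()
for a in range(10):
    c.write(a)       # 10 dirty lines
for a in range(6):
    c.scrub(a)       # 6 of them are declared dead
for a in range(10):
    c.evict(a)
print(c.dram_writes)  # → 4
```

The paper's 10-60% "useless DRAM writes" figure is exactly this effect measured on real Java workloads; the toy just shows why dropping dead lines cuts write traffic.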
 

Hobbygaming

has been asked to post in 'Grounded' mode.
So where did it all go if it was so legit? You don't just take down an interview and tweets if it all pans out. He also isn't even working with devkits for the PS5 and XSX.
How do you know he doesn't have a devkit? And him taking the tweet and interview down doesn't mean the info isn't true. He probably got one of those calls.
 

ClearMind

Report me for console warring (Xbot, Xbro etc.)
Wow Xbox fanboys certainly didn't disappoint in this thread. The damage control exceeds Baghdad Bob level. I must say I enjoyed drinking every single tear.

I can't wait for the next dev interview stating the obvious again: PS5 is at least on par with XSX. Higher frequency, unified memory pool, better software (no Windows-inherited constraints), fewer bottlenecks (the SSD being the biggest one nowadays; it kinda helps being more than 100% faster there).

If a small interview like that produces such epic meltdowns, I can't imagine what the first video comparisons will produce. We must put some kids on suicide watch right now.
It's not really because of this interview in itself, but an accumulation of things not going their way. Xbox fanboys are slowly realising that this situation is not as one-sided as they hoped it would be, so they lash out.

They thought they had a home run with 12 > 10 TF but did not anticipate the twice-as-fast SSD in the PS5, for example. That's why you see the SSD getting labeled as "secret sauce", dismissed when brought up, and people getting mocked for it. Heck, they bring up the SSD more than PlayStation fans do.

They also did not anticipate that developers would be showering the PS5 with praise. This is not the first time a reputable developer has spoken up about the PS5, and Xbox fanboys hate them for it. Their countermeasure? They shit on the devs that speak positively about the PS5 and push bottom-of-the-barrel sources, like some trash YouTube comments or an ex-PlayStation artist who is all buddy-buddy with Xbox extremists on Twitter, to change the narrative in a coordinated effort.
 
Last edited:

Hobbygaming

has been asked to post in 'Grounded' mode.
The difference is that we aren't talking about graphics cards sat on identical buses and mobos here; we're talking about custom SoCs and I/O stacks where the APUs contain specifically paired CPU and GPU cores, plus numerous intermediary/ancillary parts.

It's not the same thing. At all.

Which is why these consoles cost a fraction of the exorbitant price tag of a 2080 Ti.
This. Why some people don't understand that these things aren't apples-to-apples comparisons with graphics cards is beyond me.
 

Krisprolls

Banned
This. Why some people don't understand that these things aren't apples-to-apples comparisons with graphics cards is beyond me.

Yeah, that and their buzzwords like "power of the cloud" or "teraflops"; they're so tiring. They don't understand anything about it, but that won't stop the debate.

PCs aren't consoles; stop comparing PC GPUs with consoles. And if teraflops were meaningful, the PS3 would have crushed the 360 from day one with the Cell, and we know it was the exact opposite for several years.
 
It's not really because of this interview in itself, but an accumulation of things not going their way. Xbox fanboys are slowly realising that this situation is not as one-sided as they hoped it would be, so they lash out.

They thought they had a home run with 12 > 10 TF but did not anticipate the twice-as-fast SSD in the PS5, for example. That's why you see the SSD getting labeled as "secret sauce", dismissed when brought up, and people getting mocked for it. Heck, they bring up the SSD more than PlayStation fans do.

They also did not anticipate that developers would be showering the PS5 with praise. This is not the first time a reputable developer has spoken up about the PS5, and Xbox fanboys hate them for it. Their countermeasure? They shit on the devs that speak positively about the PS5 and push bottom-of-the-barrel sources, like some trash YouTube comments or an ex-PlayStation artist who is all buddy-buddy with Xbox extremists on Twitter, to change the narrative in a coordinated effort.
100% incorrect, you're trying to project your delusional viewpoint to demoralize people and shift mindshare.

Do you think we don't see this? The PS5 is going to get trounced computationally, no debate involved.
 

ClearMind

Report me for console warring (Xbot, Xbro etc.)
100% incorrect, you're trying to project your delusional viewpoint to demoralize people and shift mindshare.

Do you think we don't see this? The PS5 is going to get trounced computationally, no debate involved.
No, I'm 100% correct.

Also I would like to see your quantification of "get trounced computationally" in numbers.

Is 1900p+ vs 2160p getting trounced?
How about 1800p vs 2160p?
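For scale, the hypothetical resolution gaps above translate into pixel counts like this (assuming 16:9 frames, with widths rounded down to whole pixels; the resolutions themselves are the poster's hypotheticals, not real benchmarks):

```python
# Pixel-count ratios for the hypothetical resolution gaps mentioned above.
def pixels(height):
    """Total pixels of a 16:9 frame at the given height (width rounded down)."""
    return (height * 16 // 9) * height

for lo, hi in [(1900, 2160), (1800, 2160)]:
    print(f"{lo}p vs {hi}p: {pixels(hi) / pixels(lo):.2f}x pixels")
```

Even the larger 1800p-vs-2160p gap is only about a 1.44x pixel difference, which frames the "trounced" question in concrete terms.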
 
Last edited:

Krisprolls

Banned
100% incorrect, you're trying to project your delusional viewpoint to demoralize people and shift mindshare.

Do you think we don't see this? The PS5 is going to get trounced computationally, no debate involved.

lol "trounced computationally", we read it all.

You're not a dev. We have hundreds of guys like you talking about things they don't understand, like "teraflops", which means nearly jack shit in gaming unless you're calculating the distance to the sun. The irony is they didn't even know this word 5 years ago. The buzzwords change every gen. Last time it was the power of the cloud, and I told everybody it didn't make sense (because local computing was already more than enough for everything in gaming, so cloud services are very rarely useful for intense calculations).

I'm not a game dev, but at least I work in IT security and understand a bit more about how computers work. It's a lot more complex than looking at a teraflops number; otherwise the PS3 would have crushed the X360 in performance with the Cell, and we all know it was the opposite, at least at the start. And no, it's not easy using parallelism on more CUs... Most people don't even understand the idea of a constant power budget like on the PS5. Why fight so much over things you don't understand? At least wait for the videos.

The SSD doesn't matter because of loading screens; it matters because of constant asset streaming, the way modern games work. Ah yes, I know, MS fanboys still think you load the game once and then the drive goes to sleep while they play. Too bad it doesn't work like that at all.

So yes, the XSX SSD is fast (very fast compared to an HDD), but it really helps being able to stream tons more. 120% more asset streaming MAY help more than 18% more GPU compute; it depends on the game, I'd say. You may get better textures, more details. Even a fast SSD is a bottleneck in gaming; GPU computing is that fast nowadays.

Spoiler: 3rd party games will look exactly the same on both consoles and will be impossible to differentiate with the naked eye. Fanboys will fail blind tests telling which version is which. That's because even if one were 1800p+ dynamic res and the other 2160p (I highly doubt there will be that big a difference either way), you would be hard pressed to see the difference. We've reached the point where even a 20% difference in GPU compute (it will be much less overall, one way or the other) can't be seen on screen unless you zoom in ten times and freeze the pic like DF does. When you reach that point, why should it still matter, unless you can change your eyes like in Cyberpunk?

1st party games will probably look best on PS5, since that's generally the case. That will have more to do with Sony's 1st parties being better than with the PS5 being more powerful, though.
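The "120% more streaming" figure above tracks the raw SSD throughput specs (5.5 GB/s for the PS5 vs 2.4 GB/s for the Series X); strictly, the raw numbers come out closer to 129%. A quick check, treating these public figures as the only inputs:

```python
# PS5 vs Series X raw SSD throughput, per the published specs.
ps5_ssd = 5.5   # GB/s raw (public spec)
xsx_ssd = 2.4   # GB/s raw (public spec)

advantage = (ps5_ssd / xsx_ssd - 1) * 100
print(round(advantage))  # → 129
```

Compressed-throughput figures would shift this further, but those depend on the data being compressed, so the raw ratio is the conservative comparison.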
 
Last edited:
lol "trounced computationally", we read it all.

You're not a dev. We have hundreds of guys like you talking about things they don't understand, like "teraflops", which means nearly jack shit in gaming unless you're calculating the distance to the sun. The irony is they didn't even know this word 5 years ago. The buzzwords change every gen. Last time it was the power of the cloud, and I told everybody it didn't make sense (because local computing was already more than enough for everything in gaming, so cloud services are very rarely useful for intense calculations).

I'm not a game dev, but at least I work in IT security and understand a bit more about how computers work. It's a lot more complex than looking at a teraflops number; otherwise the PS3 would have crushed the X360 in performance with the Cell, and we all know it was the opposite, at least at the start. And no, it's not easy using parallelism on more CUs... Most people don't even understand the idea of a constant power budget like on the PS5. Why fight so much over things you don't understand? At least wait for the videos.

The SSD doesn't matter because of loading screens; it matters because of constant asset streaming, the way modern games work. Ah yes, I know, MS fanboys still think you load the game once and then the drive goes to sleep while they play. Too bad it doesn't work like that at all.

So yes, the XSX SSD is fast (very fast compared to an HDD), but it really helps being able to stream tons more. 120% more asset streaming MAY help more than 18% more GPU compute; it depends on the game, I'd say. You may get better textures, more details. Even a fast SSD is a bottleneck in gaming; GPU computing is that fast nowadays.

Spoiler: 3rd party games will look exactly the same on both consoles and will be impossible to differentiate with the naked eye. Fanboys will fail blind tests telling which version is which. That's because even if one were 1800p+ dynamic res and the other 2160p (I highly doubt there will be that big a difference either way), you would be hard pressed to see the difference. We've reached the point where even a 20% difference in GPU compute (it will be much less overall, one way or the other) can't be seen on screen unless you zoom in ten times and freeze the pic like DF does. When you reach that point, why should it still matter, unless you can change your eyes like in Cyberpunk?

1st party games will probably look best on PS5, since that's generally the case. That will have more to do with Sony's 1st parties being better than with the PS5 being more powerful, though.
You guys are going to have a really bad time, this discussion is over.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
Nice

Seems to reflect what other devs have said: both consoles are very close. So much unnecessary bickering over a meager 17% difference in peak performance; literally the closest consoles have ever been.
Some people's reaction when they read this:

[reaction image]

And that's straight from a DICE dev's mouth
 

Shmunter

Member
Even if efficiency is off the table, the compute power difference is not large enough to be a massive talking point. Fast data delivery is what will define the gen and provide the next-gen tech bump gaming needed. The interview did not explore the potential this offers and focused on what is, to be honest, negligible. We wait for better insights and more interesting discussions.
 
Last edited:

ClearMind

Report me for console warring (Xbot, Xbro etc.)
You're forgetting Ray Tracing, Variable Rate Shading and Machine Learning capabilities. Resolution will no longer be the differentiator.
Go ahead and quantify those then.

I have been reading "getting trounced" and other hot air statements from xbox fanboys for a while now.
 

ClearMind

Report me for console warring (Xbot, Xbro etc.)
I asked two separate Xbox fanboys to quantify the "clear hardware implications".
Yet neither could come up with anything.
🤷‍♂️
 

Hobbygaming

has been asked to post in 'Grounded' mode.
Old chaps like me played on the Atari 2600. I can tell you that yes, at that time, there was a big difference with the CBS ColecoVision. But now? Give me a break. Even the 40% deficit of the Xbox One vs the PS4 wasn't that bad, and the difference will be ten times smaller next gen on a 4K screen. People are so crazy.
Yep, 900p to 1080p wasn't even a huge deal, and the gap is shrinking, especially at these higher resolutions.
 

Gamerguy84

Member
We wait for better insights and more interesting discussions.

I can't see much more information coming out unless it's from Sony or MS.

Every time a dev opens their mouth they get attacked by rabid fanboys. People try to discredit them, send death threats, or at minimum want that person fired.
 

Krisprolls

Banned
Yep, 900p to 1080p wasn't even a huge deal, and the gap is shrinking, especially at these higher resolutions.

Of course, diminishing returns are real. What do you expect to see as a difference on a 4K screen? Like you'd say, "oh, it's 1900p, that looks so, so bad!" You can't see the difference from 2160p in most cases with the naked eye.
 
Old chaps like me played on the Atari 2600. I can tell you that yes, at that time, there was a big difference with the CBS ColecoVision. But now? Give me a break. Even the 40% deficit of the Xbox One vs the PS4 wasn't that bad, and the difference will be ten times smaller next gen on a 4K screen. People are so crazy.
It should be noted that the Xbox One had fewer ROPs, much slower RAM bandwidth, a really tiny amount of ESRAM, and a GPU that lagged far behind the PS4's in GPGPU computing, on top of the TFLOP disadvantage.

With the XSX and PS5, we're talking about a 17% difference in TFLOPS, with the XSX having better RT capabilities while the PS5 has some advantages of its own, like the faster (and more parallel) SSD and I/O, and the GPU cache scrubbers.
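The "17%" figure quoted throughout the thread comes straight from the published peak-FP32 numbers, and depends on which console you use as the baseline; the thread's round number sits between the two ways of expressing the gap:

```python
# Published peak FP32 figures (marketing numbers, not sustained throughput).
xsx_tf = 12.155   # Series X: 52 CUs at 1825 MHz
ps5_tf = 10.275   # PS5: 36 CUs at up to 2233 MHz (variable clock)

print(round((xsx_tf / ps5_tf - 1) * 100, 1))   # XSX advantage over PS5 → 18.3
print(round((1 - ps5_tf / xsx_tf) * 100, 1))   # PS5 deficit vs XSX → 15.5
```

Note these are theoretical peaks; as other posts in the thread point out, sustained real-world throughput depends on occupancy, bandwidth, and the rest of the pipeline.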
 
Last edited:

BluRayHiDef

Banned
For those of you trying to reason with the Xbox fanboys, you're wasting your time. They will never accept the truth that the reported maximum performance level of a GPU in terms of teraflops is theoretical as it is attainable only under ideal conditions, which rarely occur in real-world scenarios. Furthermore, they will never accept that the PS5's simple design and blazingly fast data transfer rates make its likelihood of performing at its theoretical level of power much greater than that of the Xbox Series X since the latter has a rather traditional, complex architecture that's compounded by the asymmetrical bandwidth allocation of its volatile memory pool. They'll never accept these truths, so just leave them alone.
 
For those of you trying to reason with the Xbox fanboys, you're wasting your time. They will never accept the truth that the reported maximum performance level of a GPU in terms of teraflops is theoretical as it is attainable only under ideal conditions, which rarely occur in real-world scenarios. Furthermore, they will never accept that the PS5's simple design and blazingly fast data transfer rates make its likelihood of performing at its theoretical level of power much greater than that of the Xbox Series X since the latter has a rather traditional, complex architecture that's compounded by the asymmetrical bandwidth allocation of its volatile memory pool. They'll never accept these truths, so just leave them alone.
This isn't ARM, Cell, or some custom hardware with unknown theoretical ceilings; it's just reconfigured x86 PC hardware.

What is the obsession around here with trying to make generic PC hardware in a closed system appear elegant?
 
We also aren't creating topics on an interview with Lady Bernkestel, since there we would likewise see the bias. Just like we do now. Developers can also be fanboys, and looking at his Twitter, this one clearly is.
He gave us the reasons why he assesses the PS5 as the better console; this does not sound like your run-of-the-mill fanboy drivel to me... A fanboy dev could also have said something like: Sony really dropped the ball this time around, this and that is weaker than it should be, but it's easier to work with, etc. Otherwise, being a Sony fan has been pretty easy; since the PS4's release they have provided the goods.
 

HawarMiran

Banned
For those of you trying to reason with the Xbox fanboys, you're wasting your time. They will never accept the truth that the reported maximum performance level of a GPU in terms of teraflops is theoretical as it is attainable only under ideal conditions, which rarely occur in real-world scenarios. Furthermore, they will never accept that the PS5's simple design and blazingly fast data transfer rates make its likelihood of performing at its theoretical level of power much greater than that of the Xbox Series X since the latter has a rather traditional, complex architecture that's compounded by the asymmetrical bandwidth allocation of its volatile memory pool. They'll never accept these truths, so just leave them alone.
I have most of them on ignore already 😂. After my one-week ban I'm not into talking to delusional people anymore, or I might slip a ban-worthy comment again.
 
Nice

Seems to reflect what other devs have said: both consoles are very close. So much unnecessary bickering over a meager 17% difference in peak performance; literally the closest consoles have ever been.

This is amazing. The PS5's solution managed to deliver performance similar to an obviously bigger and more expensive XSX solution, and it will probably surpass it in texture fidelity once the ultra-fast SSD is used to its fullest. Cerny is a genius.
 