Crytek engineer praising PS5 and Chris "Series X PS5 difference is staggering" Grannell may both be complete and utter frauds

pawel86ck

Banned
There's no need to be desperate; he's telling us lies: that the PlayStation 5 will struggle with open-world games and won't have real-time ray tracing.

You want me to ignore those lies and believe him on everything else? Let's not forget that we can see the consoles are close in power, yet he still says the difference between them is staggering.

We can look at these three main factors and conclude that he's not reliable, even if he's a former Sony dev.

But look at some members of XboxGAF. They know Lockhart will be weaker than the PS5 and XSX, but somehow Lockhart will not struggle with open-world games?
On paper the XSX has 44% more RT hardware capability, and with such a difference you can't say Grannell is just lying.

Alex from Digital Foundry also does not seem to have many doubts

Xbox Series X will be superior in terms of ray tracing. It has a better GPU with more shader power and higher bandwidth, and this corresponds to better ray-tracing performance. It is only common sense
 
Last edited:

Shmunter

Member
What does the "higher quality" exactly mean? Higher-poly models? That's what the GPU allows for. Better shaders? Same. Better shadows? Same. Better lighting? Same. Higher draw distance? Same. More particles? Same. Better tessellation? Same. The list just goes on and on.

There's absolutely zero point in storing/streaming 20M-polygon models when the GPU can only draw 15M, unless you're aiming for a serious framerate or resolution drop.

The only graphical aspect that isn't compute-bound is textures. The X1X already does 4K textures, so I cannot believe the next-gen consoles will have a single issue with that. Let's assume the PS5 will indeed be able to stream 8K textures instead of 4K (which I doubt, given their size and the 825 GB of drive space): will anyone be able to see any extra detail at 4K or less? Press X to doubt.
Or the XSX will tile a 4K texture whereas the PS5 will use two different 4K textures. Or the XSX will have one character model duplicated on screen, where the PS5 will have two unique ones. I assume you can agree the rendering load is equivalent in these scenarios despite one side having more assets? Or is the concept too complex?
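To put the 4K-vs-8K texture argument in numbers, here is a rough back-of-envelope sketch in Python. The bytes-per-texel figures are assumptions (roughly BC7-style block compression at 1 byte/texel, uncompressed RGBA8 at 4 bytes/texel), not confirmed console specs:

```python
# Rough texture-size arithmetic for the 4K-vs-8K texture debate.
# Byte-per-texel figures are assumptions (BC7-like compression ~1 B/texel,
# uncompressed RGBA8 = 4 B/texel), not confirmed console specs.

def texture_size_mb(side, bytes_per_texel):
    """Size of a square texture in MiB, ignoring mipmaps."""
    return side * side * bytes_per_texel / (1024 ** 2)

for side in (4096, 8192):
    print(f"{side}x{side}: "
          f"compressed ~{texture_size_mb(side, 1):.0f} MiB, "
          f"uncompressed {texture_size_mb(side, 4):.0f} MiB")
# An 8K texture holds 4x the texels of a 4K one, so storage cost
# quadruples while visible detail at 4K output barely changes.
```

Under these assumptions, a single 8K texture already costs ~64 MiB compressed, which is why the drive-space objection above has teeth.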
 

Shmunter

Member
I thought the geometry engine is just a fancy name for a universal feature of the RDNA 2 architecture. Can't mesh shaders on the XSX (or even on NV GPUs) do the same job?

Also, according to developers the XSX has a dedicated audio chip too, although we don't know much about it, unlike the Tempest chip.
We don't know enough about the geometry engine, so we can only speculate. It being baseline RDNA 2 is possible but unlikely, considering what Sony presented.

Mesh shaders or primitive shaders are part of RDNA 2. They extrapolate additional detail into geometry in real time. This is the opposite of the geometry engine, which seemingly culls workload before it is sent to the GPU. Both can work in tandem.

The XSX audio chip is unknown, as you rightly say.
 
Last edited:

pawel86ck

Banned
Regarding SSD speed and assets it's plain: e.g. instead of holding the graphics assets of a full room in RAM (say 9 GB), on PS5 you can stream another 9 GB as you turn, making the room 18 GB, aka twice the assets. Directly from Sony's streaming
But where do you want to hold the 9 GB for the next second? If the GPU can't use (replenish) this data in real time, it must hold it somewhere. 10 TF demands 448 GB/s of bandwidth in order to use data in real time, not 9 GB/s, because that would result in an extreme bottleneck.
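The bandwidth-vs-compute claim can be sanity-checked with simple arithmetic. This sketch only turns the figures quoted in the thread (10 TF, 448 GB/s RAM, ~9 GB/s compressed SSD) into ratios; none of these numbers are my own:

```python
# Back-of-envelope ratio between GPU compute, RAM bandwidth and SSD speed,
# using the figures quoted in the thread (10 TFLOPS, 448 GB/s GDDR6,
# ~9 GB/s compressed SSD throughput). Not official specifications.

gpu_tflops = 10.0   # trillions of FP32 operations per second
ram_bw_gbs = 448.0  # GDDR6 bandwidth, GB/s
ssd_gbs = 9.0       # compressed SSD throughput, GB/s

bytes_per_flop = (ram_bw_gbs * 1e9) / (gpu_tflops * 1e12)
print(f"RAM delivers ~{bytes_per_flop:.3f} bytes per FLOP")        # ~0.045
print(f"RAM is ~{ram_bw_gbs / ssd_gbs:.0f}x faster than the SSD")  # ~50x
# The SSD cannot feed the shaders directly; it refills RAM, and RAM
# feeds the GPU. That gap is the bottleneck argument being made here.
```

The ~50x gap between the two buses is the whole disagreement in this exchange: one side treats the SSD as a RAM replacement, the other as a RAM refiller.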
 

Yoshi

Headmaster of Console Warrior Jugendstrafanstalt
He was a designer and worked on Killzone 2; how is that evidence he was not a designer on Killzone 2? That being said, having been a game designer (not even a primarily tech-oriented job) for Sony a few years back is of course not the ultimate authority on the tech of the PS5. It is just, like, his opinion, man.
 

ZywyPL

Banned
Or the XSX will tile a 4K texture whereas the PS5 will use two different 4K textures. Or the XSX will have one character model duplicated on screen, where the PS5 will have two unique ones. I assume you can agree the rendering load is equivalent in these scenarios despite one side having more assets? Or is the concept too complex?

I agree, more variety, like for example the infamous megatextures from Rage, might be the case. But then again, just as with the 8K-textures example: how far could it really be pushed with limited drive space? I think no one would want to see 150-250 GB games with ~750 GB of usable drive space...
 

DForce

NaughtyDog Defense Force
On paper the XSX has 44% more RT hardware capability, and with such a difference you can't say Grannell is just lying.

Alex from Digital Foundry also does not seem to have many doubts


“There’s only one next-gen console that will be able to do real time ray tracing. That’s a bit of a controversial statement…but Cerny skipped over it and didn’t talk about it…it comes down to CUs, consistency, what’s on the hardware.”


How is this not a lie? He said ONLY ONE next-gen console will be able to do real-time ray tracing. That's a lie, regardless of the XSX having 44% more CUs.
 

Shmunter

Member
But where do you want to hold the 9 GB for the next second? If the GPU can't use (replenish) this data in real time, it must hold it somewhere. 10 TF demands 448 GB/s of bandwidth in order to use data in real time, not 9 GB/s, because that would result in an extreme bottleneck.
You hold it on the SSD.

If you turn in 1 second, over that span the SSD can put a new 9 GB into RAM. Rudimentary example.

The 448 GB/s RAM bandwidth is needed almost entirely to read and store calculations generated by the GPU, not just to read assets.
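The "replace 9 GB in one second" example can also be framed per rendered frame. A small sketch, assuming the thread's 9 GB/s figure and standard 30/60 fps targets (illustrative arithmetic, not a confirmed streaming budget):

```python
# How much data a 9 GB/s stream can deliver per rendered frame.
# 9 GB/s is the compressed-throughput figure used in this thread;
# the frame rates are ordinary 30/60 fps targets, assumed for illustration.

ssd_gbs = 9.0

for fps in (30, 60):
    per_frame_mb = ssd_gbs * 1000 / fps  # MB of fresh data each frame
    print(f"At {fps} fps: ~{per_frame_mb:.0f} MB of new data per frame")
# ~300 MB/frame at 30 fps: over a one-second turn this adds up to the
# 9 GB being swapped in, while the 448 GB/s bus handles the GPU's own
# reads and writes in parallel.
```

Seen this way, the SSD is topping up a sliver of RAM each frame rather than competing with the GPU for the full memory bus.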
 

Shmunter

Member
I agree, more variety, like for example the infamous megatextures from Rage, might be the case. But then again, just as with the 8K-textures example: how far could it really be pushed with limited drive space? I think no one would want to see 150-250 GB games with ~750 GB of usable drive space...
You're right, storage space is a consideration. We do know discs are 100 GB this time. And there's no need to duplicate assets on the SSD any longer, so some new space will become available next gen. But seeing 200 GB games would be eye-watering.
 
Last edited:

pawel86ck

Banned
You hold it on the SSD.

If you turn in 1 second, over that span the SSD can put a new 9 GB into RAM. Rudimentary example.

The 448 GB/s RAM bandwidth is needed almost entirely to read and store calculations generated by the GPU, not just to read assets.
Exactly, the SSD can put a new 9 GB into RAM every second (those are your own words), BUT you still need to allocate 9 GB of RAM for that, so the streaming pool would have to be equally large.

Who knows, maybe the PS5 GPU will be able to read non-essential data directly from the SSD (without even copying anything into RAM) and somehow use it without a performance penalty, but I imagine all the essential textures visible in the current frame must already be loaded into RAM.

How is this not a lie? He said ONLY ONE next-gen console will be able to do real-time ray tracing. That's a lie, regardless of the XSX having 44% more CUs.
Theoretically the PS5 has hardware RT, but Grannell was probably referring to its real usability from a developer's perspective. Many people expect the RDNA 2 RT implementation to be even worse than the already poor RTX implementation in Nvidia's Turing GPUs. I will be surprised if the XSX matches even a standard 2080 in RT calculations, and if PS5 RT performance is 44% worse on top of that, developers will not even bother to use it (maybe for RT sound or RT AI). I hope to be wrong though.
 
Last edited:

Shmunter

Member
Exactly, the SSD can put a new 9 GB into RAM every second (those are your own words), BUT you still need to allocate 9 GB of RAM for that, so the streaming pool would have to be equally large.

Who knows, maybe the PS5 GPU will be able to read non-essential data directly from the SSD (without even copying anything into RAM) and somehow use it without a performance penalty, but I imagine all the essential textures visible in the current frame must already be loaded into RAM.


Theoretically the PS5 has hardware RT, but Grannell was probably referring to its real usability from a developer's perspective. Many people expect the RDNA 2 RT implementation to be even worse than the already poor RTX implementation in Nvidia's Turing GPUs. I will be surprised if the XSX matches even a standard 2080 in RT calculations, and if PS5 RT performance is 44% worse on top of that, developers will not even bother to use it (maybe for RT sound or RT AI). I hope to be wrong though.
But you're replacing the 9 GB that was in your vision; it's useless now that it's out of sight. Rinse and repeat.

The no RT on PS5 conversation is lol.
 

DeepEnigma

Gold Member
Why? He's not in the industry and hasn't been for years. He's most famous for sharing a podcast with this creep:



Context: Timdog was stalking customers to see how many picked up an Xbox so he could brag about a one-month NPD win a few years back.

Yes, I'm not joking.


Holy fuck, that is some of the cringiest shit I have ever seen in gaming. Astroturfing at a whole new level.

Listen to the mouth-breathing. :pie_roffles:
 
Last edited:
You say Lockhart won't hold back games, but I don't see how that's the case unless Lockhart has the exact same CPU and SSD as the Series X.

Also, there are certain aspects of graphics that just cannot be scaled back to a lower-end system. For example, a lighting model designed to take full advantage of the Series X cannot work on Lockhart, which means the only way is to design the visuals around Lockhart and then bump up the scalable parts for Series X hardware. That means the Series X and PS5 versions could be compromised just because Lockhart exists, even if its GPU is the only downgraded component.

Honestly, IMO Lockhart is such a braindead idea that it makes the whole thing far more complicated for both developers and gamers. MS gets to tap into the super-casual market at the expense of gamers who care more about what next gen should truly deliver.

Everything points to Lockhart having the same CPU and SSD, so those aren't a concern from a technology POV.

I agree there are parts of the graphics pipeline that can't easily scale down, but that's why alternative techniques exist. If there's a lighting model in the XSX version that can't scale down properly to Lockhart, a simpler lighting model would be used instead. Many engines these days have that type of contingency built right into them, so choosing one or the other is as simple as setting a mode in the engine. For example, if the Series X can get away with a strong RT effect but Lockhart can't, then the effect can either be "faked" or simply not used.
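The "set a mode in the engine" idea can be sketched as a tiny settings table. Everything here (the tier names, `lighting_model`, the values) is a hypothetical illustration, not any real engine's API:

```python
# Hypothetical per-tier graphics settings, illustrating how an engine can
# swap a full RT lighting model for a cheaper fallback on weaker hardware.
# All names and values are illustrative, not from a real engine.

QUALITY_TIERS = {
    "series_x": {"lighting_model": "ray_traced_gi", "shadow_res": 4096},
    "lockhart": {"lighting_model": "baked_probes",  "shadow_res": 2048},
}

def settings_for(platform):
    # Unknown platforms fall back to the lowest tier.
    return QUALITY_TIERS.get(platform, QUALITY_TIERS["lockhart"])

print(settings_for("lockhart")["lighting_model"])  # baked_probes
```

The point is that the fallback path is data, not a rewrite: the same game queries one table and gets the lighting model its tier can afford.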

The whole point of a budget console, IMO, is to use it as a hand-me-down for people who can't afford the premium next-gen system. Which means if the gaming experience on the lower-end has to sacrifice a few bells and whistles, so be it, as long as it doesn't hold back the proper next-gen system via being the baseline. Which, again, I would never assume MS would do in this situation, as it would render all goodwill generated through XSX wasted.

Again, though, we probably have similar reservations about Lockhart, just for different reasons. Mine are, pretty much, logistical and economic. They have to estimate market demand for TWO systems now instead of just one (XSX), meaning production budget, unit allocation and marketing resources are split between two systems. If they overproduce Lockhart but fall short on XSX production numbers, that could create a base of would-be XSX owners who, instead of settling for Lockhart, pick up a PS5 instead. That's a lost sale in the ecosystem simply because you undervalued demand for the higher-end system, and the worst time to make that type of mistake is at the start of a generation.

Not to mention, BOM-wise Lockhart could end up either selling at a loss at $299 or at a profit at $350. Both would hurt Lockhart: the former because MS loses money (potentially a lot if they overestimate demand for it), the latter because it could put Lockhart in a bad spot against a potentially aggressive PS5 price of $450 or even $399, at which point I doubt the traditional early adopters (who make up the vast majority of early buyers) would have issues paying a bit more for a PS5.

Trying to make Lockhart the "price" part of the strategy also means they could price the XSX at cost or at a profit, putting it in a weaker value proposition relative to the PS5, when the vast majority of would-be next-gen Xbox owners who are early adopters would vastly prefer an XSX over Lockhart. In fact, the fact that MS has spotlighted the XSX for SO long while basically hiding Lockhart like it doesn't exist suggests they know this too, which is one strong reason I suspect Lockhart actually isn't happening, at least for the first year or two.

But hey, maybe we are wrong about Lockhart in this regard. Maybe it is significantly scaled back but is a streaming-optimized budget next-gen console. In that case MS could cut a lot of the local hardware (disc drive, much smaller SSD, much smaller GPU, much less GDDR6 memory, etc.) and maybe get it to $199, which would be great for budget-conscious gamers who want in early on next gen, albeit through streaming. And in that case it absolutely wouldn't hold the XSX back technically either, since it would just stream games from servers. That would ALSO be an even bigger reason to increase XSX production numbers and benefit from economies of scale, which would ALSO help reduce the XSX's BOM and make it that much more price-competitive with the PS5, so they win on "power" at the high end and on "price" with BOTH systems, essentially.

That's the ideal situation for Lockhart, IMHO, but I guess we will find out next month (June at the latest) where things stand. If it's a 4 TF, 1080p60-focused budget system for local gaming, then I hope it doesn't exist, though for logistical/economic/financial reasons rather than technical ones. If it does exist, I hope it doesn't arrive until about two years after the XSX, meaning they wouldn't need to reveal it this year or next. And if it does exist as a streaming-only device, that's great: the BOM would be very low, it could be priced very low, there'd be no angle for anyone to latch onto about it holding back the XSX technically (none of the game code would run locally), and it'd justify even more XSX production units, which would also help that system scale down its BOM and be even more price-competitive with the PS5.

What's worse is you get people like Matt from ResetEra who say the difference between the two systems is hardly anything, and he has working knowledge of both systems.

People won't believe that guy, but they will believe this Grannell chap, who, for the reasons you stated, is not even in the loop at all.

People like Dealer really are the gaming equivalent of Russian propaganda.
They do so much damage to gaming.

That's because Matt (Era Matt) immediately dismissed the GitHub leaks, basically saying "disregard it" (or something to that effect), even though in hindsight they were pretty accurate regarding both next-gen systems. He said that while never once telling people to put away ridiculous notions of a 13 TF PS5 or similar numbers; if he was uncertain enough to let those speculations float, why did he feel certain enough to outright throw away the GitHub leaks (and downplay the testing data)? What, because they "didn't tell the full picture"? Well, obviously, but neither did the other speculation regarding PS5. Somehow those were implied to be more worthy of genuine speculation, however.

It is kind of like how some people, after the Road to PS5 event, still tried saying GitHub was wrong the whole time because it got the PS5 clock wrong (never mind that Oberon C0/E0 did not list a GPU clock in that particular revision), or the XSX active CU count wrong (forgetting that the Scorpio devkit APU did the exact same thing, i.e. all CUs were active on the devkit and 4 were disabled for the retail unit), or because some of the Oberon data mentioned RDNA 1 (never mind that Oberon may have been running an Ariel iGPU profile, because it's either that or Oberon is not a "full" RDNA 2 chip, i.e. RDNA 1 is the base. I'm not offering that as my own speculation, just laying out the possible explanations for why some Oberon testing data mentioned RDNA 1 when Oberon is the PS5 chip and Sony have already confirmed RT), etc. At some point, you just have to accept what is most likely.

There were some insiders who were very close on PS5 specs, namely Heisenberg (10.5) and possibly O'dium (Sony may have tried enabling the full chip and clocking it even higher than the PS5's current GPU clock, which would push it to 11.6, but that obviously wasn't sustainable and was probably beyond the limits of the silicon and their cooling system). The one insider who got the XSX's general specs virtually on point was Tommy Fischer. The insiders essentially got the SSD part correct (at least its paper specs), but I don't know which one started that part of the rumor.

Point being, there are reasons to have some doubt about Matt's speculation; as long as it is healthy doubt, I see no problem with it.
 
Last edited:

DForce

NaughtyDog Defense Force
Theoretically PS5 has HW RT, but Grannell was probably refering to it's real usability from a developers perspective. Many people expect RDNA 2 RT implementation will be even worse than already poor RTX implementation in Nvidia Turing GPU's. I will be surprised if XSX will match even standard 2080 in RT calculations, and If PS5 RT performance will be even 44% worse on top of that developers will not even bother to use it (maybe for RT sound or RT AI). I hope to be wrong though.
But we don't know how much worse it will be in comparison to the PS5. If devs are able to do RT on PS5 hardware, then maybe it's not even as bad as he thinks it is.
 
But we don't know how much worse it will be in comparison to the PS5. If devs are able to do RT on PS5 hardware, then maybe it's not even as bad as he thinks it is.
We still don't know how many TFLOPS the PS5 has for ray tracing (the XSX has 13). If that number is 9, for example, that's roughly a 31% disadvantage for the PS5.
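Percentage claims like these depend on which console is taken as the baseline. A quick sketch using the numbers quoted in the post (13 RT TFLOPS for XSX, a hypothetical 9 for PS5, neither confirmed):

```python
# Percentage difference depends on the baseline you divide by.
# 13 and 9 are the RT-TFLOPS figures quoted in the post (the 9 is
# hypothetical), not confirmed specs.

xsx, ps5 = 13.0, 9.0

more = (xsx - ps5) / ps5 * 100  # XSX relative to PS5 as baseline
less = (xsx - ps5) / xsx * 100  # PS5 relative to XSX as baseline

print(f"XSX has {more:.0f}% more")  # ~44% more
print(f"PS5 has {less:.0f}% less")  # ~31% less
# The same 4-TFLOPS gap reads as "44% more" or "31% less" depending on
# which machine sits in the denominator.
```

This is also why the thread's "44% more" and "40% less" figures cannot both be right: they describe the same gap from different baselines, and only one of each pair is correctly computed.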
 