
DF: All Developers Extremely Happy With PS5 Development But Not All Are Happy With XSX Development

LordOfChaos

Member
Isn’t this the same person people here were calling a MS shill for getting early access to the Series S?

[Image: 4fnd9o.jpg]
 

Romulus

Member
I simply don’t agree. Outside of still great-looking games like RDR on 360 or the Dead Space series, features from some FP titles like Killzone 2, Drake’s Fortune or Drake’s Deception are not even found in current-gen games. Bluepoint themselves had to cut back and downgrade some stuff in their remasters.
Other games like GoWIII have been ported well and in fact still look amazing.


Not a big loss. KZ and Uncharted are extremely linear path pushers. GoWIII same thing. You could not even pan the camera, to save processing power. Basically great devs using tricks to push shit hardware; everything was extremely confined. Top 5 worst hardware ever. Without the best devs in the world and monster budgets, the PS3 was a complete turd.
 
Last edited:

Garani

Member
I remember seeing something similar in Ratchet with the instant teleportation.

[GIF: 250px-Ratchet_and_Clank_-_Rift_Apart_gameplay.gif]


Notice the pop-in on the left of Ratchet?

The video shows it better:

Basically it's not an asset pop-in but a procedural one. Kinda interesting, because you can see that there are light reflections on those gold bars. In 4K you can really see them shine and reflect.
 

geordiemp

Member
Nothing. The console was already working and playing games in March when they gave one to Digital Foundry and Austin Evans, who tested it with Gears 5, Minecraft DXR and BC games.

I am not the one claiming power or superiority; my stance is to wait and see the games.

Many more CUs vs faster clocks, and a different shader arrangement and caches. It's unclear to me either way.
 
As I have said, wait for the games so we can compare how they run; there are as many differences in the designs as similarities, so we need benchmarks. Both Sony and MS have left out a lot of information: Sony touts fast caches but no cache sizes, MS has lots of CUs but did not disclose the L1 at Hot Chips, which is feeding a lot of CUs....

We probably know less about PS5 than XSS, so it's unknown, but we do have more games to observe running.

So as we don't know, it's an unknown. You started the power narrative, not me.
Because it's a given. They wouldn't have more CUs by a landslide and not use them. Compare any mid-end to high-end GPU. The high-end GPU always has more CUs and a lower frequency. Unless Cerny has some magic that defies physics, it should be a given which is the more performant console. And comparisons will be the deciding factor for sure.
 
Last edited:

geordiemp

Member
Because it's a given. They wouldn't have more CUs by a landslide and not use them. Compare any mid-end to high-end GPU. The high-end GPU always has more CUs and a lower frequency. Unless Cerny has some magic that defies physics, it should be a given which is the more performant console. And comparisons will be the deciding factor for sure.

You don't get it, that's fine; look below and figure it out yourself. CUs need fast feeding to be efficient in clock cycles and cache misses. What would be better at feeding those hungry CUs?

[Image: YOFxDwk.jpg]

Internal cache bandwidth and utilisation is very important. PC RDNA2 has shader arrays of 10 CUs...

Look below: 9.7 TF smokes 12.66 TF at gaming. It's about cache bandwidth....

[Image: nDTu59O.png]

How do we compare bandwidth per CU (L2 to L0) on XSX and PS5, and the relative sizes/arrangement? We cannot yet, it's unknown.... and so performance is not defined unless you have a dev kit of both.
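For anyone who wants to see why that per-CU number can't be pinned down yet, here is a minimal back-of-envelope sketch (Python, purely illustrative). The clocks and CUs-per-array are the figures quoted in this thread; BYTES_PER_CLOCK is a made-up placeholder, since neither Sony nor MS have published their L1 bus widths:

# Hypothetical sketch: per-CU share of one shader array's L1 bandwidth.
# BYTES_PER_CLOCK is an assumed placeholder, NOT a disclosed spec.
def l1_bw_per_cu(clock_ghz, cus_per_array, bytes_per_clock):
    array_bw_gbs = clock_ghz * bytes_per_clock  # GHz * bytes/clock = GB/s
    return array_bw_gbs / cus_per_array

BYTES_PER_CLOCK = 128  # placeholder guess

print(l1_bw_per_cu(2.23, 10, BYTES_PER_CLOCK))   # PS5 figures quoted above
print(l1_bw_per_cu(1.825, 14, BYTES_PER_CLOCK))  # XSX figures quoted above

Swap in a different bus width for each machine and the per-CU numbers move around completely, which is exactly the unknown being argued here.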
 
Last edited:

Vick

Gold Member
Not a big loss. KZ and Uncharted are extremely linear path pushers. GoWIII same thing. You could not even pan the camera, to save processing power. Basically great devs using tricks to push shit hardware; everything was extremely confined. Top 5 worst hardware ever. Without the best devs in the world and monster budgets, the PS3 was a complete turd.
Well, results speak, and you’re objectively wrong if you think the tech in the games I’ve listed is somehow not impressive just because they appear linear. I would read more on GoWIII and U3 moving levels for instance, literal mathematical nightmares only possible thanks to the PS3's unique architecture.
Just like I wouldn’t consider a Rolex or Ferrari engine a turd due to its complicated nature, I can’t really do it with a beast like Cell. An extremely obtuse beast maybe, but a beast nonetheless.
 
Last edited:
You don't get it, that's fine; look below and figure it out yourself. CUs need fast feeding to be efficient in clock cycles and cache misses. What would be better at feeding those hungry CUs?

[Image: YOFxDwk.jpg]

Internal cache bandwidth and utilisation is very important. PC RDNA2 has shader arrays of 10 CUs...

[Image: nDTu59O.png]
Aren't you missing the speeds of the cache though? Can't really argue something without the missing data, right? We don't know either, but we do know certain things that make a better GPU.
 

geordiemp

Member
Aren't you missing the speeds of the cache though? Can't really argue something without the missing data, right? We don't know either, but we do know certain things that make a better GPU.

Yes we do, and RDNA2 is +50% perf per watt; the 5700 was around 1.9 GHz, so we have never had such a jump in speeds.

GPU clock speed is the cache speed; add in cache coherency and cache scrubbers, and each L1 cache on PS5 feeds 10 CUs. Also there are Cerny and ND patents on compressing vertex data before it gets to the shaders....

And XSX has each L1 cache feeding 14 CUs at a 20% lower clock, and there are almost 50% more CUs to be fed by each L1 cache. There could be more unknowns about XSX for sure.

We don't know, but it's certainly entertaining. :messenger_beaming: Let's wait and see how shit runs, eh?
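As a very naive illustration of that 10-vs-14 point (this assumes, purely for the sake of argument, that both machines' L1s are the same width and can serve one CU per clock, which nobody has confirmed):

# Naive L1 clock cycles available per CU per second, using the figures above.
# Assumes identical L1 width on both consoles, which is NOT confirmed anywhere.
ps5_cycles_per_cu = 2.23e9 / 10
xsx_cycles_per_cu = 1.825e9 / 14
print(ps5_cycles_per_cu / xsx_cycles_per_cu)  # roughly 1.7x under this assumption

If the real bus widths differ, that ratio means nothing, which is why the honest answer is still "wait for benchmarks".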
 
Last edited:

FeiRR

Banned
Unlike the State of Decay demo, the pop-ups don't just occur at the beginning. As the player moves around you notice some pop-in as well.

I remember seeing something similar in Ratchet with the instant teleportation.

[GIF: 250px-Ratchet_and_Clank_-_Rift_Apart_gameplay.gif]

Notice the pop-in on the left of Ratchet?
Sorry, but in full resolution I can now see it's not a pop-up, just a change of brightness/lighting.
 
Yes we do, and RDNA2 is +50% perf per watt; the 5700 was around 1.9 GHz, so we have never had such a jump in speeds.

GPU clock speed is the cache speed; add in cache coherency and cache scrubbers, and each L1 cache on PS5 feeds 10 CUs. Also there are Cerny and ND patents on compressing vertex data before it gets to the shaders....

And XSX has each L1 cache feeding 14 CUs at a 20% lower clock, and there are almost 50% more CUs to be fed by each L1 cache. There could be more unknowns about XSX for sure.

We don't know, but it's certainly entertaining. :messenger_beaming: Let's wait and see how shit runs, eh?
Doubt Microsoft would still say "fastest, most powerful console", if it wasn't factually true. They would be sued for sure for false advertising. So I would definitely agree with what they say, as well as developers, and tech sites, over pure speculation by randoms. We'll definitely see the results soon though.
 

geordiemp

Member
Doubt Microsoft would still say "fastest, most powerful console", if it wasn't factually true. They would be sued for sure for false advertising. So I would definitely agree with what they say, as well as developers, and tech sites, over pure speculation by randoms. We'll definitely see the results soon though.

MS don't anymore; go to the website, it's changed to "most powerful Xbox ever".
 
MS don't anymore; go to the website, it's changed to "most powerful Xbox ever".
Because PS5 has the speed advantage, due to the SSD. But power still remains in Xbox's court.

It's a piss-poor comparison to use Vega 64 against the 5700 XT. Why not look at the 5700 vs the 5700 XT? The XT is the better performer.
 

geordiemp

Member
Because PS5 has the speed advantage, due to the SSD. But power still remains in Xbox's court.

It's a piss-poor comparison to use Vega 64 against the 5700 XT. Why not look at the 5700 vs the 5700 XT? The XT is the better performer.

I never mentioned the SSD, did I? Are you on the same page?

Speed as in clock and cache speed is 20% more on PS5, nothing to do with the SSD. 2.23 GHz vs 1.825 GHz GPU clock, cache clock and shader clock..... shall I continue?

The purpose of 5700 vs Vega is to show you the importance of the L1 cache and internal cache speeds.

We don't have a 20% clock disparity to compare, do we? Most people post the 5700, and its RDNA1 has a perf limit around 1.9 GHz; RDNA2, Cerny tells you, has a 2.23 GHz cap.

If someone clocked a 5700 down from 1.9 GHz by 20% we could talk, and if the slower one had 4 CUs stuck on the end of each shader array..... but we don't, do we.

So PS5 has the cache and cache bandwidth to feed its CUs faster, and XSX has more CUs being fed slower. How it works out, we need benchmarks to see.
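Just to put a number on that clock gap (plain arithmetic from the public clocks, nothing vendor-specific):

# Public GPU/cache clocks
ps5_clock_ghz = 2.23
xsx_clock_ghz = 1.825
print((ps5_clock_ghz / xsx_clock_ghz - 1) * 100)  # PS5 is ~22% higher
print((1 - xsx_clock_ghz / ps5_clock_ghz) * 100)  # XSX is ~18% lower

So the "20%" people throw around is really ~22% one way or ~18% the other, depending on which console you use as the baseline.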
 
Last edited:
I never mentioned the SSD, did I?

Speed as in clock and cache speed is 20% more, nothing to do with the SSD. 2.23 GHz vs 1.825 GHz GPU clock, cache clock and shader clock..... shall I continue?

The purpose of 5700 vs Vega is to show you the importance of the L1 cache and internal cache speeds.

So PS5 has the cache and cache bandwidth to feed its CUs faster, and XSX has more CUs..... Go figure it out.
I mentioned PS5 is the fastest in regard to the SSD. I never claimed the SSD has anything to do with the speeds of the GPU. Not sure where you got that from.

So you know the cache sizes and speeds of both consoles? If not, you aren't showing me anything, besides FUD examples between different architectures that hold no weight in the scenario of XSX vs PS5. Unless you are referring to RDNA 1 vs 2?
 

geordiemp

Member
I mentioned PS5 is the fastest in regard to the SSD. I never claimed the SSD has anything to do with the speeds of the GPU. Not sure where you got that from.

So you know the cache sizes and speeds of both consoles? If not, you aren't showing me anything, besides FUD examples between different architectures that hold no weight in the scenario of XSX vs PS5. Unless you are referring to RDNA 1 vs 2?

No, I do not know the cache sizes; all we know is the GPU clocks.

We also know that each PS5 L1 cache is feeding 10 CUs at 2.23 GHz.

We know each XSX L1 cache is feeding 14 CUs at 1.825 GHz.

I am just saying you are jumping to a power crown win; I am stating that the delta in REAL WORLD gaming performance between the 2 consoles is unknown.

If you claim XSX is much more powerful, show us something other than a simple TF number. You don't know, do you? NEITHER DO I.

The examples I gave were just trying to get you to consider that a GPU is more than the one number MS seem transfixed on.
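To make the "more than one number" point concrete, here is the standard arithmetic behind the teraflop figures everyone quotes (this is just the generic RDNA FLOPs formula, not anything from either vendor's documentation):

# TFLOPS = CUs x 64 shaders per CU x 2 FLOPs per clock (FMA) x clock in GHz / 1000
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(36, 2.23))   # PS5: ~10.28 TF
print(tflops(52, 1.825))  # XSX: ~12.15 TF

The single TF number folds CU count and clock into one product, so on its own it says nothing about how well those CUs are kept fed.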
 
Last edited:
You don't get it, that's fine; look below and figure it out yourself. CUs need fast feeding to be efficient in clock cycles and cache misses. What would be better at feeding those hungry CUs?

[Image: YOFxDwk.jpg]

Internal cache bandwidth and utilisation is very important. PC RDNA2 has shader arrays of 10 CUs...

Look below: 9.7 TF smokes 12.66 TF at gaming. It's about cache bandwidth....

[Image: nDTu59O.png]

How do we compare bandwidth per CU (L2 to L0) on XSX and PS5, and the relative sizes/arrangement? We cannot yet, it's unknown.... and so performance is not defined unless you have a dev kit of both.
Weird post considering they're two completely different architectures and are absolutely incomparable...
 

TTOOLL

Member
If you're wise you'll just remember "Road to PS5" and how Cerny said it was designed with developers' input in mind. You can't go wrong like that.
 

Romulus

Member
Well, results speak, and you’re objectively wrong if you think the tech in the games I’ve listed is somehow not impressive just because they appear linear. I would read more on GoWIII and U3 moving levels for instance, literal mathematical nightmares only possible thanks to the PS3's unique architecture.
Just like I wouldn’t consider a Rolex or Ferrari engine a turd due to its complicated nature, I can’t really do it with a beast like Cell. An extremely obtuse beast maybe, but a beast nonetheless.

What results specifically though? I just don't see anything, even as a Sony gamer. Those games are extremely limited in what they're doing onscreen, but with massive budgets and top-tier devs. Of course the limited scope that is displayed will look good (at the time).

The Ferrari comparison is awful. They perform at a high level with smaller engines. The PS3's Cell was a hulking processor that could barely push 720p 30fps. And even after years of experience, the best devs in the world could not do much with it. The RAM sucked, and the GPU was terrible. The Cell was supposedly good, but it never got much traction.
 
Last edited:

sinnergy

Member
If you're wise you'll just remember "Road to PS5" and how Cerny said it was designed with developers' input in mind. You can't go wrong like that.
True, but if rumours are true the PS5 was originally coming in 2019, so if, and that's a big if, it's not weird that their SDKs are better; they had at least 1 more year to work on them, and the developers with the games.
 
Last edited:

thelastword

Banned
I know, I don't want to imply studios like Ninja Theory are on the same level, but they're not that far off either. You really think we're not going to see amazing-looking games on Series X?
Ninja Theory is OK. I still think Heavenly Sword was their most ambitious game, but then they had help from Santa Monica on that title........ I think the only studio that could surprise now is The Initiative..... MS needs new blood; they need new games and franchises, and they need them to be of AAA quality....

I see most of their studios working on current and old franchises still, sequels to AA games for the most part. I'm not sure we will get a phoenix from that lot, but we shall see......
 
No, I do not know the cache sizes; all we know is the GPU clocks.

We also know that each PS5 L1 cache is feeding 10 CUs at 2.23 GHz.

We know each XSX L1 cache is feeding 14 CUs at 1.825 GHz.

I am just saying you are jumping to a power crown win; I am stating that the delta in REAL WORLD gaming performance between the 2 consoles is unknown.

If you claim XSX is much more powerful, show us something other than a simple TF number. You don't know, do you? NEITHER DO I.
Guess even Digital Foundry feels XSX has the power crown. But nothing will convince you either way, even when the comparisons come out. Just more excuses. But we shall see.
 
Guess even Digital Foundry feels XSX has the power crown. But nothing will convince you either way, even when the comparisons come out. Just more excuses. But we shall see.

The real question is by how much though? I've seen claims of 18% all the way up to 100% more powerful. The most recent thing being said is that the PS5 will downclock all the way to 5 TF. But I think that's not true.
 
Last edited:
Not a big loss. KZ and Uncharted are extremely linear path pushers. GoWIII same thing. You could not even pan the camera, to save processing power. Basically great devs using tricks to push shit hardware; everything was extremely confined. Top 5 worst hardware ever. Without the best devs in the world and monster budgets, the PS3 was a complete turd.

So we all agree talented devs > hardware? Within reason of course.
 
The real question is by how much though? I've seen claims of 18% all the way up to 100% more powerful.
It may not be by much at all, but it's apparent it will have it, even if it's only a fraction of a single percent, which is what started this whole debate. With it having the better performance, and being able to also port to PC, it only makes sense for it to be the lead platform in development. It would be more work to do it the other way around.
 

thelastword

Banned
Will be fun to visit these threads later, now with everyone claiming XSX version of multiplats will look worse.

He literally says they develop for XSX and then scale back, contrary to your claims.
No one is saying they will look worse. People are asking..."Where is the backing behind the most powerful console rhetoric?".......

Whilst PS5 was supposed to be the weakest console by far according to some, it is the PS5 that has shown the most games running on its kit. Whilst the PS5 was said to not have proper raytracing hardware, it is the PS5 that has shown next-gen footage with raytracing intact..... Some people even recently said PS5 was hard to program for and devs were struggling with the PS5 kit....

Yet, if we accepted all of that and believed it, why are PS5 games being shown and running well while Series X games are AWOL?..... If MS games were running at twice the framerate of a PS5 game, all at 4K, whilst PS5 struggled with 1080p below 30fps, surely MS would be nailing that coffin as we speak..... Yet that's not the case. People want PS to show everything to prove a case, but with Xbox, a "trust me dawg" ticket seems to suffice in all scenarios..... They really don't have to prove anything for people to believe MS, it would seem. Phil's words seem to be the honey on the proverbial tip...
 
It may not be by much at all, but it's apparent it will have it, even if it's only a fraction of a single percent, which is what started this whole debate. With it having the better performance, and being able to also port to PC, it only makes sense for it to be the lead platform in development. It would be more work to do it the other way around.

Don't they have to make sure that whatever they make for the XSX also works on the XSS?
 

geordiemp

Member
Guess even Digital Foundry feels XSX has the power crown. But nothing will convince you either way, even when the comparisons come out. Just more excuses. But we shall see.

I am convinced by facts or data. I have neither, just people saying 12 is bigger than 10, but PS5 has more L1 cache per CU and it's faster.

Comparisons will be good, I'll also wait. But I have seen the PS5 next-gen games, and XSX is a very quiet nada........ Have we seen any uncompressed 4K videos of XSX running anything demanding yet?

DF seem to be starting to make excuses for XSX; looks like they are not so confident anymore in the real-world performance of the paper specs. "Let's wait and see" is the correct approach; there are too many unknowns.
 
Last edited:

Vick

Gold Member
What results specifically though?
I really don’t have time at the moment, nor the means, being on an iPad, so I’ll just refurbish an old post and call it a day. What is said here for Uncharted 3 can be said as well for the other FP titles I mentioned.

[Uncharted 3 screenshots: adam-littledale-forest, hold-fight-start, underwater, stairwell-door-spray, stairwell-burst, hallway-vent, cabin-climb, plus Tate Mosesian's Chapter 18 "The Rub' al Khali" and Chapter 19 "The Settlement" shots]


Results such as a game from 10 years ago looking better than a shameful amount of current-gen games, making extensive use of global illumination, HDR, volumetric lighting, an ambient occlusion with fewer artifacts than SSAO in some current-gen games, water more dynamic than most games in this generation with actual reflections with no visible artifacts (unlike most games on PS4, exclusives included), clothing simulations, the very highest-resolution textures on the system, extensive use of an insanely high-quality POM, high-quality DOF, lots of samples for its per-object and camera motion blur, the best sand simulation in any game on any platform to this day (with grains of sand falling into Drake’s footsteps!), the most mindblowing levels I’ll probably ever experience in a game like the burning chateau, sinking ship and falling plane, and it still felt fluid and ran at a stable framerate. Sure, MLAA, texture filtering and shadows haven’t aged too well, but that game was MENTAL technically. To be as impressed as I was with that game on a 500 MB RAM system, I’d probably have to play Avatar in real time on current gen. Maybe if PS4 went with the Cell processor again, cost 600€ and the ND team stayed the same.. who knows.

None of this could have happened without Cell.

Those games are extremely limited in what they're doing onscreen, but with massive budgets and top-tier devs. Of course the limited scope that is displayed will look good (at the time).
Except they’re not, unless you limit yourself to thinking only free-roam titles can be regarded as “not limited”, which would be a pretty weird stance.

How could you possibly define the entire first hour of GoWIII as limited, for instance? Or said levels in U3?

The PS3's Cell was a hulking processor that could barely push 720p 30fps. And even after years of experience, the best devs in the world could not do much with it. The RAM sucked, and the GPU was terrible. The Cell was supposedly good, but it never got much traction.
I don’t know what to tell you. There are making-ofs, GDC papers and Digital Foundry articles proving you wrong.
 
Don't they have to make sure that whatever they make for the XSX also works on the XSS?
Which is part of the GDK. They can easily work with both at the same time.

Comparisons will be good, I'll also wait. Have we seen any uncompressed 4K videos of XSX running anything demanding yet?

DF seem to be starting to make excuses for XSX; looks like they are not so confident anymore in the real-world performance of the paper specs. "Let's wait and see" is the correct approach; there are too many unknowns.
Just like we hadn't seen a PS5 till just now. Wouldn't be surprised if they have something to show soon, being that they supposedly learned from the last conference. Guess you love to believe DF when it's convenient, then blame them for making excuses. Pick one or the other lol.
 

geordiemp

Member
You'd have to explain to me first why you're trying to compare Vega to RDNA, it doesn't make any sense to begin with. You're out of your element.

I was not; I was pointing out to the poster the importance of not just TF, but also cache size, cache bandwidth, and how they feed the CUs, from the RDNA1 white paper. That is all, nothing more.

The RDNA1 white paper by AMD talks about cache speeds and the L1 cache....... that's what to take from it.

If you look at that table from the AMD white paper, AMD state the speed of each cache bus in RDNA1.

Now, go look at XSX vs PS5. What do you see? They both have 4 shader arrays; PS5 has a faster clock by 20%, XSX has more CUs to feed.

Do you think the CU utilisation and efficiency will be the same?

We can guess why MS went with 4 shader arrays and not 5 or 6: they wanted 4 as it's 4 for server.
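A quick sanity check on the per-array figures used earlier in the thread, using the physical CU counts (40 and 56 total, 4 shader arrays each, as stated above):

# Physical CUs sitting behind each shader array's L1
print(40 // 4)  # PS5: 10 CUs per array (36 active in total)
print(56 // 4)  # XSX: 14 CUs per array (52 active in total)

Which is where the "10 CUs vs 14 CUs per L1" comparison comes from.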
 
Last edited:

geordiemp

Member
Which is part of the gdk. They can easily work with both at the same time.


Just like we haven't seen a ps5 till just now. Wouldn't be surprised if they have have something to show soon, being that they supposedly learned from the last conference. Guess you love to believe DF when it's convenient, then blame them for making excuses. Pick one or the other lol.

I believe the DF video I am seeing; the conclusions and often non-factual commentary are up for debate sometimes, like Alex saying lighting is all that was needed to improve Halo Infinite..... fucking lol.

I would prefer Cerny to do a teardown, not DF.

They should stick to the facts, not try to guess at things or give us their possible take.
 
Last edited:
I have a stalker changing what I say in quotes, nice.

I don't need help; I have a good job, smart kids and a nice life. What's your problem?

You need to spend more time with your family and less time spamming literally the same tired shit over and over again on GAF, in that case.

It's boring and it makes you look unhinged.

You're an Xbox fan, so I don't stalk you, but you seem to be obsessed with what I post.

The only current gen console I own, and have ever owned, is a Switch. As I've said elsewhere, I'll not be getting either new console at launch.
 
Last edited by a moderator:

Max_Po

Banned
This is hard to believe, because Microsoft/Phil Spencer were parading the final specs of the Xbox Series X almost a year ago on that game show by the Doritos Pope......
 

geordiemp

Member
You need to spend more time with your family and less time spamming literally the same tired shit over and over again on GAF, in that case.

It's boring and it makes you look unhinged.



The only current gen console I own, and have ever owned, is a Switch. As I've said elsewhere, I'll not be getting either new console at launch.

Poster on GAF tells other poster on GAF to spend less time on GAF because they don't have a good life.

Also, ad hominem is the lowest form of argument, so you're doing well; I am impressed by your discourse.

And your post history suggests you are an Xbox fan, so stop lying.

Do people even engage their brains before they post?
 
Last edited:

geordiemp

Member
I don't own an Xbox and I'm not getting a new one.

Geordiemp lining up/readying his excuses after being proved a troll.

Ad hominem is for idiots without brains.

So if you're a Nintendo fan, why are your last 20 posts about Xbox and Game Pass, and liking Xbox posts?

I don't believe you.

I would talk with you about the RDNA2 architecture and what we expect from the RDNA2 white paper, but it's kinda pointless.

You're not here to discuss any points, just to insult people.
 
Last edited: