
DF - Assassin's Creed Valhalla: PS5 vs Xbox Series X/Series S Next-Gen Comparison!

Bitmap Frogs

Mr. Community
How do you like yours? Extra crispy? After all the stuff Sony had to walk back these last few months, you would think you guys would be a little more cautious than to get hot and heavy with multi-platform cross-gen games.

There's a reason MS dev tools are late. They waited until the absolute last moment to get the best silicon AMD had. When the games start pulling away, they will pull away.

Dirt devs say Microsoft's tools are fine.

How long will you have to cling to the "tools" raft before you accept that Microsoft have, once again, released weaker hardware than the competition?
 
He is still being dishonest. The clip can be seen here. You can see the Xbox framerate stay between 46-48 while the PS5 remains locked with just a couple of drops to 58 and 59. Even if you take the lowest PS5 number and the lowest Xbox number, the gap is 25% or more for the average throughout that clip.

Timestamped: 14:04 to 14:23. Xbox starts at 49, then stays at 47-48 for the rest of the 20-second clip, dropping to 46 once. PS5 literally drops down to 58 once for a second, then goes back to a locked 60 fps for the entire 20-second clip. One drop.



And regardless, in the article there is no mention of an average. He wrote 'at its worst' and still said 15%. Absolute nonsense. No, at its worst it's 46 vs 60 fps, which is a 30% difference.
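If anyone wants to sanity-check that, here's a quick back-of-the-envelope version. The per-second numbers below are just placeholders matching what I described from the clip, not a frame-by-frame capture:

```python
# Placeholder per-second readings approximating the 20-second clip described above,
# not actual captured data.
ps5_fps = [60] * 10 + [58] + [60] * 9   # locked 60 with a single dip to 58
xsx_fps = [49] + [47, 48] * 9 + [46]    # starts at 49, hovers 47-48, dips to 46 once

avg_ps5 = sum(ps5_fps) / len(ps5_fps)
avg_xsx = sum(xsx_fps) / len(xsx_fps)

print(f"average gap: {avg_ps5 / avg_xsx - 1:.0%}")               # ~26% over the clip
print(f"'at its worst': {max(ps5_fps) / min(xsx_fps) - 1:.0%}")  # 60 vs 46 -> ~30%
```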


Well it's very misleading then, has he not edited his article? That bit should be changed as it's flat out wrong.
 

SlimySnake

Flashless at the Golden Globes
Well it's very misleading then, has he not edited his article? That bit should be changed as it's flat out wrong.
I checked a few minutes ago. He hasn't. I tweeted him to point that out. Let's see what he says. Maybe if more people tweet him, he will actually change it.
 
So both John and Richard from DF are lying?

A technical director working on both dev kits, on camera, saying the Xbox software is NOT behind relative to PS5, is 100% legit compared to a Twitter PM from a gaming journo citing 'sources'. And I like John Linneman.

It seems DF asked devs, who said the GDK is behind, and misconstrued that to mean 'behind compared to PS5' when in fact both are behind, as these are new consoles with new software.

The other astounding thing about all that is the GDK being based on DX12, which has been bedded in with devs for years now, MS being a software company, and PS5's SDK having no equivalent on PC. It doesn't wash.
 

SlimySnake

Flashless at the Golden Globes
Imagine thinking that results from games nullify the value of a machine that has a 15% deficit compared to the other one.

I get it, this is a dick measuring thread. But 15% isn't that bad, BOTH machines are good.

Pick which one you like.

Imagine that.

The real tools in the console wars are the warriors.
15% isn't bad. But is 30% bad?

[screenshot attached]


DF's math is wrong. 'At its worst', which is what they said in the article, it's 30% worse AND has a lot of tearing.

60/46=1.30x or 30%.
 

VFXVeteran

Banned
I just watched this video and I must say that MS needs to get their act together. I didn't think a 2 TFLOP advantage would be much of a difference in a real world scenario but it should at least perform on par with the PS5. This just shouts to me that MS still has a lot to sort out with their API. That's most definitely the reason we haven't seen much from the Xbox developer camp.

Congrats PS5 owners on a solid system!
 

Andodalf

Banned
15% isn't bad. But is 30% bad?

[screenshot attached]


DF's math is wrong. 'At its worst', which is what they said in the article, it's 30% worse AND has a lot of tearing.

60/46=1.30x or 30%.

He said in a tweet he messed up and meant the average difference was 15%, not the worst difference, but also that it's obviously not a perfect metric due to v-sync.
 

SlimySnake

Flashless at the Golden Globes
Just a quick question.

How much of a performance hit is V-sync?

I'm gonna run some tests on my PC. I don't have AC Valhalla, but I will try Mafia, which only runs at 45 fps on my RTX 2080.
He said in a tweet he messed up and meant the average difference was 15%, not the worst difference, but also that it's obviously not a perfect metric due to v-sync.
See my post on the last page: that section lasts 20 seconds and you can literally see the XSX version hover around 47-48 fps with a drop to 46, while the PS5 is a locked 60 fps with just one drop to 58 in the entire 20-second video. Alex is making no sense here.
 
You do know that it's stuttering because the resolution is too high? The stuttering is a symptom not of a bug but of poor performance.

Also, the PS5 has v-sync and the Xbox Series X doesn't. V-sync is usually a hit on the GPU as well; not sure how much.

The stuttering I'm referring to is a bug related to the camera and appears to only affect cutscenes, which otherwise run at a locked 60fps. Tearing itself is a symptom of performance dropping under the target frame rate; the PS5 also tears, but not as much, since its general perf is slightly higher. My point was that the instance of particularly bad performance you are highlighting (equipping the torch) appears to be a software issue given the dramatic loss of frames (roughly 14fps).
 
TF is not the right indicator of performance. Look at the 3080 vs 6800 XT.

29 TF vs 21 TF. If you go by that, the 6800 XT is giving up almost 30% on paper, but they're going blow for blow in some cases (discounting RT performance).
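For reference, here's roughly where those headline numbers come from. This is just napkin math assuming the public shader counts and approximate boost clocks, not a benchmark:

```python
# Napkin math from public specs (approximate boost clocks assumed):
# FP32 TFLOPS = 2 ops/clock (FMA) x shader units x clock (GHz) / 1000
rtx_3080 = 2 * 8704 * 1.71 / 1000    # ~29.8 TF
rx_6800xt = 2 * 4608 * 2.25 / 1000   # ~20.7 TF

print(f"3080: {rtx_3080:.1f} TF, 6800 XT: {rx_6800xt:.1f} TF")
print(f"6800 XT paper deficit: {1 - rx_6800xt / rtx_3080:.0%}")  # ~30% down on paper, yet they trade blows
```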

This thread is interesting, and more and more I feel like if I want in on the Game Pass ecosystem I'm probably going to be going PC. That said, I still think there's something iffy going on here; I believe Ubisoft can probably patch-fix a few of these issues.

Despite not really caring who 'wins' myself, it is very interesting to see certain outspoken people on GAF completely avoiding this thread. These threads are certainly entertaining.

The issue here is Microsoft’s marketing claims. Boy do they have egg on their faces for those Twitter posts.
 

SlimySnake

Flashless at the Golden Globes
The stuttering I'm referring to is a bug related to the camera and appears to only affect cutscenes, which otherwise run at a locked 60fps. Tearing itself is a symptom of performance dropping under the target frame rate; the PS5 also tears, but not as much, since its general perf is slightly higher. My point was that the instance of particularly bad performance you are highlighting (equipping the torch) appears to be a software issue given the dramatic loss of frames (roughly 14fps).
Nah, the torch is actually indicative of a GPU performance issue. The torch affects the lighting, shadows and pretty much everything else, which impacts the GPU more than anything else. It's not a CPU issue or a software bug. It's a straight-up GPU power issue.
 
Nah, the torch is actually indicative of a GPU performance issue. The torch affects the lighting, shadows and pretty much everything else, which impacts the GPU more than anything else. It's not a CPU issue or a software bug. It's a straight-up GPU power issue.

That must be it. It’s not at all possible a 14fps frame drop may be a result of some inefficiency or performance impacting bug. I’m glad you cleared that up.
 


Damn, so he confirms that PS5 is using V-Sync, which has an FPS penalty on PS5 in favor of avoiding tearing, yet it's much more stable in both resolution and FPS! Without PS5's V-Sync and with unlocked framerates, the difference is going to be staggering if it's already 15% on average with more than a 30% max (as at one point it was running 1440p on XSX and 1728p on PS5, so it's not a clean cut).

PS5 delivering that multiplat spanking to Series X.

PS5: "RDNA2?, RDNA2?. What you say? Did you say RDNA2?"

 

kuncol02

Banned
MS just didn't apply the "blurring the line between new and old generations" logic to their dev tools. They should have been mature enough for devs to fully take advantage (if the tools are to blame).
MS is totally changing how games for Xbox are written. The purpose of that is to allow developers to write a game once for Xbox and PC. Before that, you had to use the XDK for Xbox and the DirectX SDK for Windows.

Nah, the torch is actually indicative of a GPU performance issue. The torch affects the lighting, shadows and pretty much everything else, which impacts the GPU more than anything else. It's not a CPU issue or a software bug. It's a straight-up GPU power issue.
No. No matter what you render, when you have a torch equipped the game stutters like crazy. You could be in a small cave or a room without windows, looking at a wall, and it's still around ~45 fps (tested that today). When you drop the torch, frames go back to normal. Other light sources like flames don't have that effect. In 40 hours of playing I've had stuttering only in some cutscenes and when a character has a torch and that torch is used as a light source (indoors or at night). There are lots of far more complicated scenes (night battles etc.) which run at a much higher frame rate.
 
Guess I should have waited a week and got it on PS5!

I noticed immediately how horrible the frame rate was with a torch out on XSX even by eye.

Maybe they will improve it on XSX but I'm not going to hold out any hope. It's perfectly playable on XSX but you'd be crazy to choose it over PS5 if you have both consoles.
That sucks man, but I'm amazed you were able to pick up both an XSX and a PS5. Some people have all the luck LOL

Hopefully Ubisoft can patch it to improve performance; the game seems to have sold really well, so it should be worth it to them. People laugh, but maybe there is something to those tools rumors. How hard is it really to utilize all 52 CUs? If they don't have the right tools for it, I'd imagine some CUs aren't even being used? That's my guess, and it's a complete guess because I'm not a hardware engineer.
 

Truespeed

Member
Unfortunately Series X uses RDNA1 CUs

Series X might have all the features of RDNA2, but it's powered by last-gen CU tech. It's a Ferrari body with a Ford Focus engine.

And now we are seeing that play out in the real-world comparisons

So the XSX CU is a Fiero?

 
PS5 was supposed to perform 100% better at 13 seconds to be exact.
Loading times are not the point of the SSD's throughput, otherwise they would have stuck a SATA III disk in there and called it a day. I don't even think Cerny mentioned it... not as the main benefit anyway. The SSD is about asset streaming; its controller is about streaming immense amounts of assets in real time to the memory subsystem without hitting the CPU in any significant manner.

Opening a game is probably 75% CPU (initializing different aspects of the engine, displaying logos, etc.). I don't think it's extremely data-intensive (this is why opening games from a regular SSD vs. an NVMe drive on PC takes almost the same time; it's very CPU-intensive).
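A toy model of that point, with made-up numbers purely for illustration: if most of a game's boot is CPU work (engine init, decompression, logos), a much faster drive barely moves the total.

```python
# Toy model only -- the numbers are invented to illustrate the argument, not measured.
def startup_seconds(read_mb_per_s, assets=200, mb_per_asset=5, cpu_ms_per_asset=40):
    io_s = assets * mb_per_asset / read_mb_per_s   # time spent reading from disk
    cpu_s = assets * cpu_ms_per_asset / 1000       # time spent initializing/decompressing
    return io_s + cpu_s

print(f"SATA SSD (~500 MB/s):  {startup_seconds(500):.1f} s")   # 10.0 s
print(f"NVMe SSD (~5000 MB/s): {startup_seconds(5000):.1f} s")  # 8.2 s -- 10x the bandwidth, barely faster boot
```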
 

sendit

Member
Regardless of tooling, logically this shouldn't be happening (for any comparison).

The PS5 is a 10 TF machine with a 3.5 GHz CPU
The XBSX is a 12 TF machine with a 3.8 GHz CPU

They're directly comparable as they're both RDNA2+Zen2 based. Scratching my head right now. It doesn't matter how minimal the difference is based on specs. The XSX should technically be offering better performance here.
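For what it's worth, the paper gap works out like this (napkin math from the public specs; the PS5's GPU clock is variable, so treat its figure as a ceiling):

```python
# FP32 TFLOPS = 2 ops/clock x CUs x 64 shaders x clock (GHz) / 1000
xsx_tf = 2 * 52 * 64 * 1.825 / 1000   # ~12.1 TF at a fixed 1.825 GHz
ps5_tf = 2 * 36 * 64 * 2.23 / 1000    # ~10.3 TF at the 2.23 GHz variable-clock ceiling
print(f"XSX paper advantage: {xsx_tf / ps5_tf - 1:.0%}")   # ~18%
```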
 

ethomaz

Banned
Regardless of tooling, logically this shouldn't be happening (for any comparison).

The PS5 is a 10 TF machine with a 3.5 GHz CPU
The XBSX is a 12 TF machine with a 3.8 GHz CPU

They're directly comparable as they're both RDNA2+Zen2 based. Scratching my head right now.
3.5 GHz is with SMT... 3.8 GHz is without SMT.

The actual Xbox CPU clock is 3.6 GHz with SMT.
 

sendit

Member
3.5 GHz is with SMT... 3.8 GHz is without SMT.

The actual Xbox CPU clock is 3.6 GHz with SMT.

Yep. However, it's still clocked faster (with or without SMT).

I can only think of two things here:

Tooling:
If it does come down to immature Microsoft tooling, then they royally fucked up. (Given how much better their BC support is, I would like to think this isn't the case)

TDP:
The SoC is being thermally throttled.
 
The PS5 is a 10 TF machine with a 3.5 GHz CPU
The XBSX is a 12 TF machine with a 3.8 GHz CPU

They're directly comparable as they're both RDNA2+Zen2 based. Scratching my head right now.
You clearly did not watch Cerny's sermon back in February.
- The PS5 has a lot of custom hardware
- The PS5 has a higher fill rate and a few other less-publicized specs that offer benefits (keeping the GPU busy doing actual calculations instead of waiting for requests to be sent to it).

Also, it came out recently that the CPU/GPU have a unified cache, which the Xbox systems are unlikely to have, meaning cache-coherency penalties occur much less often (it's a big deal in many situations).

On the bright side, the Xbox has Game Pass--assuming you like shovelware.
 

ethomaz

Banned
Yep. However, it's still clocked faster (with or without SMT).
100 MHz, but the Xbox OS has way more overhead from its hypervisor than the PS5's.
The same happened with PS4 vs XB1... the XB1 had a higher-clocked CPU, but the PS4 had more CPU available for games because it doesn't have hypervisors working in the background.

Plus, the PS5 CPU isn't used for a lot of render tasks because that work is offloaded to other co-processors.
 

sendit

Member
You clearly did not watch Cerny's sermon back in February.
- The PS5 has a lot of custom hardware
- The PS5 has a higher fill rate and a few other less-publicized specs that offer benefits (keeping the GPU busy doing actual calculations instead of waiting for requests to be sent to it).

Also, it came out recently that the CPU/GPU have a unified cache, which the Xbox systems are unlikely to have, meaning cache-coherency penalties occur much less often (it's a big deal in many situations).

On the bright side, the Xbox has Game Pass--assuming you like shovelware.

I've watched the video plenty of times. This doesn't apply here. This is Assassin's Creed we are talking about (a last-gen game), not something that would take advantage of the PlayStation 5's custom solutions.
 