
Face-Off: Zombi

Journey

Banned
Do people honestly think the marginal overclock on the Xbox One CPU is even capable of more than a 1-2 fps difference? And that's being generous.


You forgot the extra core available to devs on XB1. I'm not saying the CPU had anything to do with it, but since you brought it up, an extra free core can be put to good use. Likely the real differentiator would be SDK optimizations, or maybe XB1 is simply better with this kind of port. I mean, didn't we already see something similar with another old port? RE, IIRC.
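For scale, here's a rough back-of-the-envelope sketch using the commonly cited public figures (1.6GHz vs 1.75GHz Jaguar cores, six cores for games, with the seventh core assumed fully usable even though in practice it was only partially freed, and much later). It's illustrative only, not measured from this game:

```cpp
#include <cstdio>

int main() {
    // Commonly cited public figures; illustrative only, not measurements from this game.
    const double ps4_cores = 6.0, ps4_ghz = 1.60;   // six Jaguar cores available to games
    const double xb1_cores = 7.0, xb1_ghz = 1.75;   // assumes the 7th core is fully usable (it isn't)

    const double clock_only = xb1_ghz / ps4_ghz;                              // ~1.09x from the upclock alone
    const double best_case  = (xb1_cores * xb1_ghz) / (ps4_cores * ps4_ghz);  // ~1.28x with perfect scaling

    // At a 30fps target, assuming a fully CPU-bound frame that scales linearly with CPU throughput.
    printf("upclock alone : +%.1f%% -> ~%.1f extra fps\n", (clock_only - 1) * 100, 30 * (clock_only - 1));
    printf("upclock + core: +%.1f%% -> ~%.1f extra fps\n", (best_case - 1) * 100, 30 * (best_case - 1));
    return 0;
}
```

The upclock alone lands in the couple-of-frames ballpark being discussed; the extra core only matters on top of that if the engine actually spreads its work across it.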
 

Three

Member
I think the original Wii U version looks the best. It has more effects and seems less buggy. On the PS4 and XB1, textures seem to disappear randomly while walking, which is a bit jarring. The XB1 version seems darker than the others too; black crush or incorrect gamma? The PS4 and XB1 versions have dips during lightning/thunder as well, with the PS4 dropping more than the XB1 version.
 

c0de

Member
That has been proven wrong over and over again; the Xbox CPU has never given it any advantage in framerate outside of bad optimization on PS4 code. Some devs have even highlighted that you can get more out of the PS4 CPU, so I don't know why people keep repeating that. Well... I know why... but let's keep it honest and factual, please.

Some devs? On GAF it was only Matt, IIRC. Then we had a compression benchmark shortly after launch, and the Ubisoft benchmark that many like to show off for the GPGPU performance advantage the PS4 has over the Xbone also shows that the Xbone CPU was more performant than the PS4's. So yes, let's keep it honest and factual.
Whether there is any game currently out that performs better on Xbone because of its higher-clocked CPU, we don't know; the systems are complex in themselves, and you can't say that a better framerate is only down to a better CPU.
To say anything about which CPU is better we either need profiling data or the specialized benchmarks devs give us. Inferring anything from games will most probably be wrong.
 
To me it clearly sounds like a port that is only worth getting at a significant discount if at all. Actually this being Ubisoft I expect it to be a PS+ title down the line.

This is a very good point and I do have plenty of other games to play. Haven't even downloaded Rapture yet and Rocket League is just sitting on my hard drive...not to mention I still have most of LiS episode 4 to finish.
 

Fury451

Banned
This still seems like a very bare bones effort.

Undoubtedly, but it's not as poorly functioning as some accounts stated. Some made it sound like the thing barely worked.

Played for awhile last night, and found little in the way of hampering issues besides the aforementioned drops at the palace (and a zombie hit with the bat did fly into the ceiling like Team Rocket though). It is a very basic port, but I think the price reflects that, even though the lack of the WiiU multi features is a bummer.

PS4 here, I should mention. May get it on PC during a sale to compare, because it's a pretty good game.
 
Some devs? On GAF it was only Matt, IIRC. Then we had a compression benchmark shortly after launch, and the Ubisoft benchmark that many like to show off for the GPGPU performance advantage the PS4 has over the Xbone also shows that the Xbone CPU was more performant than the PS4's. So yes, let's keep it honest and factual.
Whether there is any game currently out that performs better on Xbone because of its higher-clocked CPU, we don't know; the systems are complex in themselves, and you can't say that a better framerate is only down to a better CPU.
To say anything about which CPU is better we either need profiling data or the specialized benchmarks devs give us. Inferring anything from games will most probably be wrong.

Wasn't that Matt comment also old, like pre-upclock, pre-SDK optimizations, and before the freed CPU core?
 

c0de

Member
Our very own Matt and Metro devs too. This was some time ago though so things may have changed.

Yep, the Metro devs explicitly said that draw calls were an issue on Xbone.

Digital Foundry: DirectX 11 vs GNMX vs GNM - what's your take on the strengths and weakness of the APIs available to developers with Xbox One and PlayStation 4? Closer to launch there were some complaints about XO driver performance and CPU overhead on GNMX.

Oles Shishkovstov:
Let's put it that way - we have seen scenarios where a single CPU core was fully loaded just by issuing draw-calls on Xbox One (and that's surely on the 'mono' driver with several fast-path calls utilised). Then, the same scenario on PS4, it was actually difficult to find those draw-calls in the profile graphs, because they are using almost no time and are barely visible as a result.

In general - I don't really get why they choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does.

But Microsoft is not sleeping, really. Each XDK that has been released both before and after the Xbox One launch has brought faster and faster draw-calls to the table. They added tons of features just to work around limitations of the DX11 API model. They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints.

http://www.eurogamer.net/articles/d...its-really-like-to-make-a-multi-platform-game
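To make the "few DWORDs vs. bookkeeping" contrast concrete, here's a purely illustrative C++ sketch; none of these type or function names are real GNM or Direct3D calls. It just shows why one model costs a handful of stores per draw while the other burns CPU time validating and translating state on every call:

```cpp
#include <cstdint>
#include <vector>

// Purely illustrative -- none of these names are real GNM or Direct3D APIs.

// "Thin" model: a draw is a few 32-bit words appended to a command buffer
// that the GPU front-end consumes directly.
struct CommandBuffer {
    std::vector<uint32_t> words;
    void draw(uint32_t vertexCount, uint32_t startVertex) {
        words.push_back(0x7D000001u);   // hypothetical "draw" packet header
        words.push_back(vertexCount);
        words.push_back(startVertex);
    }
};

// "Thick" model: an API-style draw that re-validates and re-translates state
// before anything reaches the hardware -- the per-call bookkeeping Shishkovstov
// describes, paid on the CPU for every single draw.
struct ApiContext {
    CommandBuffer cb;
    void draw(uint32_t vertexCount, uint32_t startVertex) {
        validateBoundResources();   // hazard tracking, lifetime checks
        resolveStateObjects();      // map API state to hardware state
        patchDescriptors();         // fix up bindings the hardware expects
        cb.draw(vertexCount, startVertex);
    }
    void validateBoundResources() { /* ... */ }
    void resolveStateObjects()    { /* ... */ }
    void patchDescriptors()       { /* ... */ }
};
```

Multiply the thick path by a few thousand draw calls per frame and it's easy to see how a single core ends up saturated on one console while the same workload barely shows up in the profiler on the other.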
 

Markitron

Is currently staging a hunger strike outside Gearbox HQ while trying to hate them to death
If I had ever gotten a Wii U, this would have been my first purchase. I'm a bit disappointed that it's a bare-bones port, but I suppose they priced it accordingly.

This should also serve as a good example to all of those calling GoW3 a straight port (of which there were plenty). I don't mind paying extra for a resolution and performance boost.
 

SerTapTap

Member
Oh noes! Whatever shall we do :-( /s

Was this game any good? I thought it bombed critically?

Significant bugs and perf issues at launch, but the game is mostly fixed on Wii U now. The gameplay isn't for everyone, and combat is very shallow, but it's very effective as a survival horror game. I would say it's a notably flawed game with a great idea. Worth checking out if the idea interests you. It might have been a rough sell at $60 (I assume it launched at that?), but for $20, go ahead.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Usually indicates a CPU-bound issue. Xbone CPU is a hair faster. Or it's just a result of the different tools; it's not like they went gung-ho on performance, after all.

Every time someone jumps to this conclusion, a patch comes out and throws the theory out the window. Also, explain how lightning takes CPU cycles.

The simplest assertion is that it is a cheap port with unoptimized code. With any amount of polish and technical skill this game would be 60fps with no drops. Trying to use it to judge hardware is console war nonsense.
 

Three

Member
Maybe before the uptick in clock, but definitely before the extra core, before the draw-call issue was addressed, and before the Kinect reservation was freed up. I would say the biggest impact would be the draw calls, based on what Oles Shishkovstov said; stuff like that would bring your CPU to a halt.

Definitely not before the upclock. That was in September 2013, before the console even launched.
Every time someone jumps to this conclusion, a patch comes out and throws the theory out the window. Also, explain how lightning takes CPU cycles.

The simplest assertion is that it is a cheap port with unoptimized code. With any amount of polish and technical skill this game would be 60fps with no drops. Trying to use it to judge hardware is console war nonsense.
That is probably the truth.

Lightning strikes likely have nothing to do with the CPU. If anything, I would guess it is the XB1's compressed render targets that help in that situation.
 

thelastword

Banned
Which Devs?

[Image: Substance Engine texture-generation benchmark]


Link.



Oles Shishkovstov: Well, you kind of answered your own question - PS4 is just a bit more powerful. You forgot to mention the ROP count, it's important too - and let's not forget that both CPU and GPU share bandwidth to DRAM [on both consoles]. I've seen a lot of cases while profiling Xbox One when the GPU could perform fast enough but only when the CPU is basically idle. Unfortunately I've even seen the other way round, when the CPU does perform as expected but only under idle GPU, even if it (the CPU) is supposed to get prioritised memory access.

Oles Shishkovstov: Let's put it that way - we have seen scenarios where a single CPU core was fully loaded just by issuing draw-calls on Xbox One (and that's surely on the 'mono' driver with several fast-path calls utilised). Then, the same scenario on PS4, it was actually difficult to find those draw-calls in the profile graphs, because they are using almost no time and are barely visible as a result.

In general - I don't really get why they choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does.
Link

There was another dev, but I can't dig it up right now. I know things have improved with the Xbox SDK after that (improvements to DX11 and the ushering in of DX12), but so has the PS4 API, even though we hear less about it. In any case, a processor that returns a CPU mark of 2350 is not going to give you a 7-frame advantage over one with a CPU mark of 2250 or 2300... if we have to go on the CPU clock alone.
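Taking those (hypothetical) CPU mark figures at face value, the scaling argument is simple to check: even a fully CPU-limited scene only speeds up in proportion to CPU throughput, and 2250 vs 2350 is about a 4% gap:

```cpp
#include <cstdio>

int main() {
    // Hypothetical CPU mark scores from the post above, not real measurements.
    const double slower = 2250.0, faster = 2350.0;
    const double base_fps = 30.0;               // a CPU-limited scene running at 30fps

    const double ratio = faster / slower;       // ~1.044
    printf("best-case gain: %.1f%% -> about %.1f extra fps, nowhere near 7\n",
           (ratio - 1.0) * 100.0, base_fps * (ratio - 1.0));
    return 0;
}
```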
 

psychotron

Member
so the Wii U is the worst version... good to know

Not exactly. If you're a fan of the lens effect and gamepad inventory system, Wii U is superior. Then again, radio transmissions come through the DualShock speaker, which is a nice touch on PS4. Framerate is best on Xbox One. All have their positives and negatives.
 
You forgot the extra core available to devs on XB1. I'm not saying the CPU had anything to do with it, but since you brought it up, an extra free core can be put to good use. Likely the real differentiator would be SDK optimizations, or maybe XB1 is simply better with this kind of port. I mean, didn't we already see something similar with another old port? RE, IIRC.

Yes, and the PS4 has a massively better GPU with twice the number of ROPs, yet that never seems to be mentioned in these cases. Lightning strikes aren't going to stress the CPU; they are going to use GPU power, which the PS4 has much more of than the XB1.
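For reference, the ROP gap is easy to put numbers on from the public specs (32 ROPs at 800MHz on PS4 vs 16 ROPs at 853MHz on XB1); peak pixel fill rate is just ROPs times clock:

```cpp
#include <cstdio>

int main() {
    // Public GPU specs; peak pixel fill rate = ROPs * core clock.
    const double ps4_rops = 32.0, ps4_clock_mhz = 800.0;
    const double xb1_rops = 16.0, xb1_clock_mhz = 853.0;

    const double ps4_gpix = ps4_rops * ps4_clock_mhz / 1000.0;   // 25.6 Gpixels/s
    const double xb1_gpix = xb1_rops * xb1_clock_mhz / 1000.0;   // ~13.6 Gpixels/s

    printf("PS4 %.1f vs XB1 %.1f Gpixels/s (%.2fx)\n",
           ps4_gpix, xb1_gpix, ps4_gpix / xb1_gpix);
    return 0;
}
```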
 
Every time someone jumps to this conclusion, a patch comes out and throws the theory out the window. Also, explain how lightning takes CPU cycles.

The simplest assertion is that it is a cheap port with unoptimized code. With any amount of polish and technical skill this game would be 60fps with no drops. Trying to use it to judge hardware is console war nonsense.

Lazy devs, right? Like in the good ol' times.
 

c0de

Member
Yes, and the PS4 has a massively better GPU with twice the number of ROPs, yet that never seems to be mentioned in these cases. Lightning strikes aren't going to stress the CPU; they are going to use GPU power, which the PS4 has much more of than the XB1.

How do you know when you don't have the code at hand?
 

SuomiDude

Member
Textures are way better on doors and walls and such, but the missing dirty lens effect actually makes the XBone4 versions look worse than the Wii U... That, plus the other stuff missing from the Wii U version (like multiplayer and gamepad use), and the fact that you can buy the Wii U version for less than $10/€10 pretty much everywhere, makes the Wii U version the best option by far.
 

c0de

Member
In general - I don't really get why they choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does.

But Microsoft is not sleeping, really. Each XDK that has been released both before and after the Xbox One launch has brought faster and faster draw-calls to the table. They added tons of features just to work around limitations of the DX11 API model. They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints.

They already said, in the same interview you linked, that they didn't use the low-level API, so we don't know how much there is to "win" here in terms of performance.
 

Fbh

Member
Sounds pretty disappointing.
You'd expect them to be able to port it over at 60fps; instead we get 30 with frame drops.
 

Mastperf

Member
My guess is that the game benefits from higher single-core CPU speed. On Wii U it only ran on 3 cores, after all.
Those Wii U cores are less powerful than the cores in the PS4. I'm not seeing any CPU-related improvements with this port, so I'm at a loss as to why the PS4 framerate is dropping so low.
 

c0de

Member
So a shitty port then? I mean, the CPU doing GPU work?

Well, it's a port from Wii U and we don't have any clue which tricks the devs used to get performance out of that machine. And given the state of the Wii U launch tools (which seemed to be a mess), I wouldn't be surprised if the code was "unusual" and was just ported over to the point that the SDK actually spit out an executable. And then they stopped.
 
Kinda disappointing. Looks like I have no reason to double dip. Wii U sounds like the superior version overall to me, especially considering you can routinely pick it up for under $10. That said, I hope PS4/Xone owners give it a shot. It was great on the Wii U, and a proper sequel, on any console, would make my day.
 

Javin98

Banned
Wow, yet another shit port on consoles and PC. Remember before this gen began, we were all hoping that the consoles using x86 architecture meant the PC and console versions would all perform very well? Fast forward two years and we still have shit ports.
 

Crom

Junior Member
so the Wii U is the worst version... good to know

No. It is the only version with the multiplayer mode, and the gamepad adds intangibles in the single-player mode.

I also read that they changed the limited FOV which is a bad thing because it makes the game less intense, easier, and less survival horror. That was the whole point of the FOV to begin with.
 

omonimo

Banned
Wow, yet another shit port on consoles and PC. Remember before this gen began, we were all hoping that the consoles using x86 architecture meant the PC and console versions would all perform very well? Fast forward two years and we still have shit ports.
It's a cheap port. x86 can't prevent that.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
I mean, people throw accusations at the developers very easily when a game doesn't perform as expected on their console of choice.

So you think this $20 stealth release of a Wii U game is a big-budget port? Occam's razor, my man. It's all about time + budget (man-hours) and experience.
 

c0de

Member
Wow, yet another shit port on consoles and PC. Remember before this gen began, we were all hoping that the consoles using x86 architecture meant the PC and console versions would all perform very well? Fast forward two years and we still have shit ports.

If you were one of those who believed this, then that's on you. I never believed x86 would mean great ports. It only ever meant less work for devs.
 

thelastword

Banned
Just watched the vid again; it would seem that the Wii U has a more powerful CPU than the PS4 as well... (at least a CPU that gives it a better framerate, like the Xbox One does). I saw that the game fell to 19fps on the PS4 but never got to such lows on the Wii U; that was the lowest drop across all the consoles benchmarked, and that has to mean something, I'm thinking...
 

StevieP

Banned
Just watched the vid again; it would seem that the Wii U has a more powerful CPU than the PS4 as well... (at least a CPU that gives it a better framerate, like the Xbox One does). I saw that the game fell to 19fps on the PS4 but never got to such lows on the Wii U; that was the lowest drop across all the consoles benchmarked, and that has to mean something, I'm thinking...

Both the Wii U and the Xbone have 32MB of embedded memory connected to a pool of DDR3? Lol
 

Jito

Banned
The only place I've noticed frame drops is the lightning outside Buckingham Palace; I can live with that if it's the only place in the whole game. Other than that it's been running well and looks great with the brightness adjusted. Really glad I can see now without those overblown lens effects. Well worth it for £15.
 
I just can't believe they ruined their true vision with this port

Just watched the vid again; it would seem that the Wii U has a more powerful CPU than the PS4 as well... (at least a CPU that gives it a better framerate, like the Xbox One does). I saw that the game fell to 19fps on the PS4 but never got to such lows on the Wii U; that was the lowest drop across all the consoles benchmarked, and that has to mean something, I'm thinking...

Wii U is also running it at 720p vs 1080p on the PS4.

What it likely means is that there wasn't much priority given to CPU optimizations, because the game should be running at 60fps on both machines.
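A quick sketch of the raw pixel counts behind that comparison, since resolution alone makes the Wii U vs PS4 framerate comparison lopsided:

```cpp
#include <cstdio>

int main() {
    // Per-frame pixel counts at the two output resolutions.
    const double wiiu_px = 1280.0 * 720.0;     //   921,600
    const double ps4_px  = 1920.0 * 1080.0;    // 2,073,600

    printf("PS4 pushes %.2fx the pixels of the Wii U version\n", ps4_px / wiiu_px);
    return 0;
}
```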
 

Javin98

Banned
It's a cheap port. x86 can't prevent that.
True, cheap ports will still be cheap ports even with very simple hardware to work with.

If you were one of those who believed this, then that's on you. I never believed x86 would mean great ports. It only ever meant less work for devs.
There's a difference between believing and hoping.
 

Chobel

Member
Just to correct some misinformation: the XBO CPU performs better than the PS4 CPU. That was not the case at launch (when Matt posted that comment), but after many SDK updates MS got their shit together and the XBO CPU is now outperforming the PS4's. Matt made a new post in which he said the situation had changed: http://www.neogaf.com/forum/showpost.php?p=134491298&postcount=298

Of course, this doesn't change the fact that the PS4 has the better GPU, which will give it an advantage over the XBO in most (all?) situations even with the XBO's CPU advantage.
 

omonimo

Banned
PC version seems like it came out mostly intact and with a non-shit framerate
They cut the dirty lens effects for no reason on PC too. Of course it runs better with double the CPU resources available; did you expect it to run like the PS4?
Hard to say; it's a great port if it was developed by a 5-person team in 6 months. We don't have enough info to say it's a lazy port.
So even Batman on PC could be a great port because we don't know how many people worked on it, right?
 

stuminus3

Member
The dirty lens filter is gone? That's a shame. I mean, it's silly and I can see why people don't like it, but why not make it an option? There's surely no technical reason for it to be omitted.
 