
Digital Foundry Face-Off: Child of Light

pixlexic

Banned
So you get clipping (black crush / blown highlights) on the Xbone whether it's sub-1080p, native 1080p or supersampled 1080p.

http://cfa.gamer-network.net/2013/a...1/PS4_004.bmp.jpg/EG11/quality/100/format/png
http://cfa.gamer-network.net/2013/a.../1/XO_004.bmp.jpg/EG11/quality/100/format/png

PS4: [screenshot]
XBO: [screenshot]

PS4: [screenshot]
XBO: [screenshot]

edit: Full platform comparison to show only 360 and XBO have incorrect levels.

[Full platform comparison screenshot]

They blow out the contrast because they think it makes it look better.

You can see the light areas are very light and the dark areas are very dark.

Very simple to fix. At least give devs a simple way to turn it off.
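
To make the levels issue concrete, here's a minimal NumPy sketch (my own illustration, not anything from the DF capture pipeline, and the function names are made up) of correct full-to-limited-range conversion versus the kind of mismatch that produces crushed blacks and blown highlights:

```python
import numpy as np

def full_to_limited(pixels):
    # Correct conversion: rescale full-range (0-255) values into the limited
    # video range (16-235), preserving shadow and highlight detail.
    return np.round(16 + pixels.astype(np.float32) * (235 - 16) / 255).astype(np.uint8)

def clamp_to_limited(pixels):
    # Mismatched handling: full-range data just gets clamped, so everything at
    # or below 16 collapses to one black and everything at or above 235 to one white.
    return np.clip(pixels, 16, 235)

samples = np.array([0, 8, 16, 64, 128, 192, 235, 250, 255], dtype=np.uint8)
print(full_to_limited(samples))   # [ 16  23  30  71 126 181 218 231 235] - detail kept
print(clamp_to_limited(samples))  # [ 16  16  16  64 128 192 235 235 235] - shadow and highlight steps gone
```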
 
They blow out the contrast because they think it makes it look better.

You can see the light areas are very light and the dark areas are very dark.

Very simple to fix. At least give devs a simple way to turn it off.

Wouldn't this affect darker games like Alan Wake, Thief, Dead Space, and Dark Souls?
 

pixlexic

Banned
Wouldn't this affect darker games like Alan Wake, Thief, Dead Space, and Dark Souls?

And brighter games as well.

Bottom line is: it is always better to let the consumer decide and let them set their TVs how they want to.


Also, what's even worse is that blowing out the contrast brings out aliasing issues more.
 

Handy Fake

Member
Wouldn't this affect darker games like Alan Wake, Thief, Dead Space, and Dark Souls?

Dark Souls on XB1 got an extra 0.1 score due to the system-specific contrast making the game more "exciting" and "perilous", with "unseen hundred foot drops and invisible one-hit killers hiding in the shadows being a major plus point, and a wonderful bonus to any seasoned quester".
 

jaypah

Member
Competition isn't about propping up the console that underperforms (however slight the gap might be).

Nah, the market always decides the value of a product. That's actual competition. XO doesn't have as much power as PS4, that's reality. Folks shouldn't feel offended by that. Or, if they do, buy a PS4. I think the annoying thing is people taking so much delight in another platform's shortcomings (XO sales/performance, Vita sales, WiiU everything, etc.). Doesn't bother me as I don't care and tend to own all of them but it does clog up decent threads.
 
So the takeaway is you really won't notice a difference between the PS4 and XB1 versions.

If the framerate ONLY drops to 52fps in some situations, who the hell is going to notice? Seriously.

I get that's what people want to talk about so they can put the Xbox One down, but as for this game, it doesn't really matter. They probably could've hit a rock-solid 60fps if they weren't downsampling.

I love Digital Foundry articles, but they even suggest these fluctuations are hard to notice when you're playing. So when you're talking about differences like that, it's kind of silly to even make a thing out of them, which is certainly what some people are trying to do in this thread for obvious reasons.

Keep in mind that the game runs at 1440p internally. I'm guessing that's not cheap, processor power-wise.

That said, the more likely culprit is Ubisoft not giving enough fucks about occasional minor framerate drops to pull programmers off of whatever else they were doing to do the optimizations necessary to get a smooth 60fps on the XBone, rather than the XBone straight-up not being able to handle it. Which is probably the right call, since this is something only a handful of nerds actually care about.

This. Who cares about tiny and very infrequent frame drops? The PS4 and Xbone version will be pretty much identical.

Honestly, why do you guys even bother to come into a Digital Foundry thread and make these comments? Doesn't it even cross your mind that if people didn't care about the minute details of technical performance, they wouldn't even bother to read the article in the first place? I doubt the average gamer even knows what Digital Foundry is, but here we are... and these comments stick out like a sore thumb.

and again, wrong thread perhaps?

It just seems like there are really defensive people coming out of the woodwork for no reason at all.
 

shinnn

Member
Honestly, why do you guys even bother to come into a Digital Foundry thread and make these comments? Doesn't it even cross your mind that if people didn't care about the minute details of technical performance, they wouldn't even bother to read the article in the first place? I doubt the average gamer even knows what Digital Foundry is, but here we are... and these comments stick out like a sore thumb.



It just seems like there are really defensive people coming out of the woodwork for no reason at all.
Those are comments regarding people saying those marginal fps dips are a big problem for the XB1 or something. Nothing regarding the DF analysis itself.
 

Handy Fake

Member
Those are comments regarding people saying those marginal fps dips are a big problem for the XB1 or something. Nothing regarding the DF analysis itself.

You could say they're a big problem on the grounds that they shouldn't be there at all.
 
A slightly less than stable framerate on the XB1 version is curious, although it doesn't sound like it affects gameplay much at all. It still suggests something was more challenging to get up and running stably, I would guess.

Am I reading this right about last-gen's resolution? DF suspects Ubi is supersampling down to 720p? Seems an odd choice if true.
 
Honestly, why do you guys even bother to come into a Digital Foundry thread and make these comments? Doesn't it even cross your mind that if people didn't care about the minute details of technical performance, they wouldn't even bother to read the article in the first place? I doubt the average gamer even knows what Digital Foundry is, but here we are... and these comments stick out like a sore thumb.

I can only speak for my comment, but I was not thread whining in the least. I was pointing out that some people are clinging to tiny framerate drops as a substantial difference maker for this particular game.


It just seems like there are really defensive people coming out of the woodwork for no reason at all.

More than defensiveness, I see people who try to make mountains out of molehills.
 

shinnn

Member
You could say they're a big problem on the grounds that they shouldn't be there at all.
Why not? It's the weaker console, and probably not the lead platform. We saw a lot of inconsistencies last gen with the PS3 ports. Also, the article says it is probably using supersampling, so it's not a common scenario.

Unlike last gen, most XB1 owners will accept (or have already accepted) that they have the weaker hardware. Don't know how those people constantly downplaying the XB1 are "surprised" by those differences.
 

Respawn

Banned
Do you want to see Xbox fail? As a guy who owns every console I have no favorites, but even if I did I would never want to see the competition fail. Competition is good for the industry.
Depends on the company and their ideals. Companies shouldn't be saved just for the sake of competition; it has to be more than that.
 

Eusis

Member
This image shows just how pointless it is having multiple system review scores like that.
I have to imagine they're there for the sake of "ok, checks out, none of them royally screwed up."

Though unless one of those has some really compelling, unique feature, I'd just dock half a point off of PS3/360/Wii U (that one may be salvaged by touch-screen use). Yes, the game's the same, but if you're going to line up all the scores like that you may as well do SOMETHING to mark them off as "not as good" and people can come to their own conclusions if it matters. You would need a $400+ system to get the better version, but that may be worth the wait, at least so you can have something you like immediately on it.
 

Anteater

Member
Well, yeah.


I have the game but I'm still impressed. I'm surprised the team would take the time to even play with that instead of just calling it a day at 1080p.

Yeah, I thought so too. Good thing they put the extra resources to use. TBH I didn't notice much of a difference from increasing the resolution, but it's already looking really perfect; the game looks lovely.
 

Tsundere

Banned
Do you want to see Xbox fail? As a guy who owns every console I have no favorites, but even if I did I would never want to see the competition fail. Competition is good for the industry.
There is competition; that's why we have one console that is more powerful than the other, and there is no need to hide that fact or downplay it. Xbox doesn't need to do well for competition to occur.
 

Metfanant

Member
Is this a legitimate mistake or is it intended due to some dumb focus-testing data?
intended

So, being under the impression that the "crushed blacks" issue has been removed, the suggestion is that this is stylistic? Further, it's specific to Xbox consoles? That seems strange.
it was never "fixed" because MS doesn't consider it to be a problem...it's a byproduct of their chosen gamma curve...

Dark Souls on XB1 got an extra 0.1 score due to the system-specific contrast making the game more "exciting" and "perilous", with "unseen hundred foot drops and invisible one-hit killers hiding in the shadows being a major plus point, and a wonderful bonus to any seasoned quester".
Is this for real?

Wouldn't this affect darker games like Alan Wake, Thief, Dead Space, and Dark Souls?
You bet....it will be spun as a positive though...more "atmospheric" I assume
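
On the gamma-curve point: even with the levels mapped correctly, a steeper transfer curve on its own will crush shadow detail. A rough sketch of that effect, purely illustrative (the exponent is a guess, not Microsoft's actual curve):

```python
import numpy as np

def apply_contrast_curve(pixels, exponent=1.6):
    # Raising normalised values to a power > 1 darkens the image overall;
    # values near black collapse toward 0 much faster than the mid-tones.
    normalised = pixels.astype(np.float32) / 255.0
    return np.round((normalised ** exponent) * 255).astype(np.uint8)

shadow_steps = np.array([4, 8, 12, 16, 24, 128], dtype=np.uint8)
print(apply_contrast_curve(shadow_steps))  # roughly [0 1 2 3 6 85] - six distinct shadow steps squeezed together
```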
 

ohlawd

Member
I srsly don't buy the tressfx flowy I'm your Venus hair bullshit they're selling as the reason 360/PS3/Wii U aren't 1080p.

ahh well it looked nice playing the U version anyway. Didn't encounter any save bugs during my 10 hours.
 
Sounds like they should have just made the game 720p and perfected it on all consoles.

I don't understand people's fetish with 1080p vs 720p when almost nobody would even notice the difference.
 

Anteater

Member
Sounds like they should have just made the game 720p and perfected it on all consoles.

I don't understand people's fetish with 1080p vs 720p when almost nobody would even notice the difference.

For this game the higher resolution eliminates the jaggies on the characters, and the character models blend in more with the background. It's already looking really perfect at above 1080p, so I'm glad they increased the resolution; the game isn't too demanding.
 

Jigolo

Member
I can only speak for my comment, but I was not thread whining in the least. I was pointing out that some people are clinging to tiny framerate drops as a substantial difference maker for this particular game.

Hm, I could've sworn you were the one who made a whole thread about frame rate issues regarding the PS4 version of CoD Ghosts.
 

Caayn

Member
Sounds like they should have just made the game 720p and perfected it on all consoles.

I don't understand people's fetish with 1080p vs 720p when almost nobody would even notice the difference.
Decrease the resolution from a possible 1440p to 720p just to eliminate a rare framedrop?
 

gruenel

Member
Sounds like they should have just made the game 720p and perfected it on all consoles.

I don't understand people's fetish with 1080p vs 720p when almost nobody would even notice the difference.

I wish this "almost no one sees the difference between 1080p and 720p" nonsense would stop already.

Upgrade your CRT TV maybe.
 

Shin-Ra

Junior Member
I don't understand people's fetish with 1080p vs 720p when almost nobody would even notice the difference.
What am I seeing?

PS4: [screenshot]
Wii U: [screenshot]

Crisp vs blurry. At even higher resolutions you'd make out little details like her fingers even more clearly. On PS360U it looks a bit like she's got a hoof.

(David Bierton's Digital Foundry 4:2:2 captures show some artifacting, mostly on red, not present in the source game)
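
For anyone wondering why capture artifacts would show up mostly on red: 4:2:2 capture keeps brightness (luma) at full resolution but colour (chroma) at half the horizontal resolution, so hard edges on strongly coloured areas smear slightly when the chroma is interpolated back up. A toy sketch of the idea (not DF's actual capture chain; the numbers are made up):

```python
import numpy as np

# One row of chroma samples across a hard coloured edge (illustrative values).
chroma_row = np.array([200, 200, 200, 200, 40, 40, 40, 40], dtype=np.float32)

# 4:2:2 keeps one chroma sample per two pixels...
subsampled = chroma_row[::2]

# ...and playback interpolates it back to full width, softening the edge.
upsampled = np.interp(np.arange(8), np.arange(0, 8, 2), subsampled)
print(upsampled)  # [200. 200. 200. 120.  40.  40.  40.  40.] - the hard step now ramps
```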
 
I wish this "almost no one sees the difference between 1080p and 720p" nonsense would stop already.

Well, it depends on your viewing distance, of course. Usually when I go over to somebody else's house they are sitting too far away from the TV. Even the gamers, for the most part.

Of course most people are going to notice the difference at a proper viewing distance, though in my experience most people don't sit as close as they should; the THX guidelines are a pretty solid recommendation IMO.

On a PC you'll of course notice because the screen is RIGHT THERE.
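
For reference, the THX number usually quoted is about a 40° horizontal viewing angle (treat that figure as an assumption here, not gospel). A quick sketch of the arithmetic:

```python
import math

def viewing_distance_inches(diagonal_inches, view_angle_deg=40.0, aspect=16 / 9):
    # Distance at which a 16:9 screen of the given diagonal fills the given
    # horizontal viewing angle.
    width = diagonal_inches * aspect / math.hypot(aspect, 1.0)
    return width / (2 * math.tan(math.radians(view_angle_deg) / 2))

print(round(viewing_distance_inches(50)))  # ~60 inches (about 1.5 m) for a 50" set
```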
 

anderedna

Member
Questions related to the corrupt save bug:

Does this game only allow the player to save to one specific save slot, with auto-saves at certain checkpoints, or does it allow multiple save slots?
If it allows multiple save slots, once the bug hits, will all the save slots be affected by the bug or just one specific slot?

I haven't bought this game yet, but I can say I'll buy it later. Thanks.
 
I srsly don't buy the tressfx flowy I'm your Venus hair bullshit they're selling as the reason 360/PS3/Wii U aren't 1080p.
It won't be the hair alone, but the fact that it and all the other characters are 3D rendered. I don't think this is a technical shortcoming, but rather an artistic choice. Indeed, I'm surprised DF are confused why the game isn't 1080p like the recent Rayman games. To me it makes good sense, given what the games are.

Rayman is built of hand-drawn sprites and backgrounds, so literally the only way to increase their smoothness and detail is to raise the resolution. Child of Light has hand-drawn backgrounds, but the characters are 3D objects. Simply raising the resolution won't help as much with them (some, but not much). What you want for 3D objects is good AA, and the most effective AA process is supersampling. By downsampling to 720p on the lower-tier machines, Ubisoft can achieve better edges and detail on the characters, which is mostly what you'll be looking at.

There is a cost of increased blurriness for the backgrounds (2D art is hurt by lower resolution, and downsampling doesn't change that). But the soft-pastel watercolor look of Child of Light is less damaged than the vibrant, squiggly Rayman world would be.

It seems Ubisoft chose a softer look for PS3/360/Wii U, rather than a sharper look but with worse jaggies. I think that's the right call for this game.
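
A minimal sketch of what the supersample-then-downsample step amounts to, assuming a simple 2x2 box filter (Ubisoft's actual filter isn't specified in the article): render at twice the output resolution on each axis, then average each 2x2 block down to one output pixel, which is what smooths the characters' edges.

```python
import numpy as np

def downsample_2x(frame):
    # frame: (H, W, 3) image rendered at twice the target resolution on each axis.
    # Average each 2x2 block into one output pixel (a box filter).
    h, w, c = frame.shape
    blocks = frame.reshape(h // 2, 2, w // 2, 2, c).astype(np.float32)
    return blocks.mean(axis=(1, 3)).round().astype(frame.dtype)

# e.g. a 2560x1440 render averaged down to 1280x720:
hi_res = np.random.randint(0, 256, (1440, 2560, 3), dtype=np.uint8)
print(downsample_2x(hi_res).shape)  # (720, 1280, 3)
```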
 

ohlawd

Member
It won't be the hair alone, but the fact that it and all the other characters are 3D rendered. I don't think this is a technical shortcoming, but rather an artistic choice. Indeed, I'm surprised DF are confused why the game isn't 1080p like the recent Rayman games. To me it makes good sense, given what the games are.

Rayman is built of hand-drawn sprites and backgrounds, so literally the only way to increase their smoothness and detail is to raise the resolution. Child of Light has hand-drawn backgrounds, but the characters are 3D objects. Simply raising the resolution won't help as much with them (some, but not much). What you want for 3D objects is good AA, and the most effective AA process is supersampling. By downsampling to 720p on the lower-tier machines, Ubisoft can achieve better edges and detail on the characters, which is mostly what you'll be looking at.

There is a cost of increased blurriness for the backgrounds (2D art is hurt by lower resolution, and downsampling doesn't change that). But the soft-pastel watercolor look of Child of Light is less damaged than the vibrant, squiggly Rayman world would be.

It seems Ubisoft chose a softer look for PS3/360/Wii U, rather than a sharper look but with worse jaggies. I think that's the right call for this game.

I can get behind this. Thank you.
 
I can only speak for my comment, but I was not thread whining in the least. I was pointing out that some people are clinging to tiny framerate drops as a substantial difference maker for this particular game.

More than defensiveness, I see people who try to make mountains out of molehills.

My very specific point is... that it comes with the territory of these articles. Some people may complain about minor drops in frame rate; for others it may be AA, AF, texture resolution or screen tearing (it has to be really bad for me to notice personally). Everyone has a pet peeve when it comes to gaming performance, and if some are making a big deal out of it, that is their preference. For me, I can't really tell the difference between 30 and 60 fps (yes, I am one of those) in most games, so minor frame drops don't bother me in the least. But black crush annoys me a bit.

So when you post in these threads and point out that a hang-up others may have is not "that big of a deal", it just seems needlessly defensive. I can't relate myself, but I have seen it often enough to understand there are people who really would view this as an issue, most visibly in a DF thread.
 

LordOfChaos

Member
It's a bit crazy that the PS3, 360, and especially the Wii U were relegated to 720p; this game is ridiculously light to run. My Core 2 Duo T6500 and Radeon 4570 (exceptionally weaksauce... 80 stream processors) manage 1920x1080 and a fairly consistent 60fps, with some scenes going as low as 40, but always very fluid.

The older consoles I can understand, but the Wii U? What's the deal with that? Apart from TMUs it seems like a doubling of my GPU. 160 shaders, 8 ROPs, 8 TMUs, compared to my 80, 4, 8, on 12.8GB/s graphics memory.
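
Back-of-the-envelope on just the resolution side of that comparison (raw pixel counts only, ignoring everything else that differs between the GPUs): 1080p60 is 2.25x the pixel work of 720p60, so a straight doubling of shaders and ROPs lands a little short of the resolution jump.

```python
# Pixels per second pushed at 60fps for each output resolution.
px_1080p60 = 1920 * 1080 * 60   # ~124.4 million
px_720p60  = 1280 * 720 * 60    # ~55.3 million
print(px_1080p60 / px_720p60)   # 2.25
```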
 