
Infinity Ward confirms: CoD Ghost is 720p on Xbox One, 1080p on PS4

I'm starting to think that people are ignorant enough to think that resolution is all that is different between games.

Things like MSAA, TXAA, or FXAA are just 3 differences in one game that can alter performance.

You could have a Wii U game running 60fps @ 720p but it may be completely missing

Enhanced Geometry
Dynamic Shadows
PCSS (soft shadows)
Depth of Field
Lens Flares
Distortion
Motion Blur
Real Time Reflections
Light Shafts
Ambient Occlusion
Actual Physics....etc.

Resolution only dictates so much.

I understand the difference but have you seen Call Of Duty: Ghosts??? I understand the hardware of the Xbone is far superior to the WiiU. My comment was aimed more towards IW.
 

Portugeezer

Member
It's always amusing how some magazines, twitter celebrities, websites and whatnot are telling us there's no difference after 7 years of everyone saying 360 was the place to go for multiplatform games.

And for minute details. Now we have a huge resolution difference and welp... "most people won't notice."

To me it's not whether people will notice; I'm sure they will, maybe just not as much as they would in a side-by-side comparison. But from BF4 and COD we can see a clear discrepancy in power between the two consoles, with the $100 cheaper console coming off far better. So much for that so-called "truth" Major Nelson spewed about knowing this moment would come regardless, and for what Penello said on the Xbox Direct.
 
I could understand BF4 at 720p vs 900p on PS4, the images looked comparable to me, but COD at 720p vs 1080p is the end for me for this generation. Have enjoyed the 360 for years and I don't have much love for Sony, so I am moving back to PC. Have a decent rig and just preloading BF4 now.

How has MS managed to screw this up? Unbelievable.

Wonder how many other folks are in the same boat.

welcome back brother...one of us....one of us
 
Also, this is one game. One cross-gen game by one developer on an engine that could use some work. DICE has BF4 running at 900p on Xbox One, and the game itself looks FAR better than CoD does.

This comparison will get much more interesting as we leave last gen behind, but I don't think we can come out and just straight up rip Penello yet when this could have more to do with the developer than we know.

BF4 = 720p
 
I'm still going Xbox One first but how can people even try to spin this? I mean, maybe you can spin 900p v 720p, but 1080p v 720p is just ridiculous.
 

J-P

Neo Member
[comparison screenshots: iLhcwsu5RxlMO.jpg, ifLTGVwIkjZUN.jpg]
The XB1 is so Vibrant!
 

B.O.O.M

Member
Yeah I knew this would happen during the BF4 comparison thread...Custom Upscaler has become the next buzzword for the Xbone

I swear this console has had more buzzwords attached to it in a shorter time than any other console ever.
 

GribbleGrunger

Dreams in Digital
I'm starting to think that people are ignorant enough to think that resolution is all that is different between games.

Things like MSAA, TXAA, or FXAA are just 3 differences in one game that can alter performance.

You could have a Wii U game running 60fps @ 720p but it may be completely missing

Enhanced Geometry
Dynamic Shadows
PCSS (soft shadows)
Depth of Field
Lens Flares
Distortion
Motion Blur
Real Time Reflections
Light Shafts
Ambient Occlusion
Actual Physics....etc.

Resolution only dictates so much.

I've been asking for someone to make a thread about this stuff because I'm sick and tired of resolution alone being the factor that decides what a next gen game will look like. Why don't you do it!?
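As a very rough starting point, raw pixel counts put a number on what resolution alone costs, and nothing more. A back-of-the-envelope sketch (my own arithmetic, not taken from any post in the thread):

```cpp
// Back-of-the-envelope: how much extra fill/shading work resolution alone implies.
// Pixel counts say nothing about shadows, AO, DoF, physics, or any other effect.
#include <cstdio>

int main() {
    const long long px720  = 1280LL * 720;   //   921,600 pixels
    const long long px900  = 1600LL * 900;   // 1,440,000 pixels
    const long long px1080 = 1920LL * 1080;  // 2,073,600 pixels

    std::printf("1080p vs 720p: %.2fx the pixels\n", (double)px1080 / px720); // ~2.25x
    std::printf(" 900p vs 720p: %.2fx the pixels\n", (double)px900  / px720); // ~1.56x
    return 0;
}
```

Everything in the list above (shadows, AO, DoF, physics and so on) is paid for on top of that, which is the point being made.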
 

adixon

Member
I'm still going Xbox One first but how can people even try to spin this? I mean, maybe you can spin 900p v 720p, but 1080p v 720p is just ridiculous.

Yeah, I think the most impressive thing coming out of this launch is going to be the new internet "knowledge" (and I hope to god this isn't parroted by the press) that resolution, lighting, and framerate are no longer major objective measures of the technical/graphical quality of a game.
 

daman824

Member
Their RAM sucks worse than their GPU.
The RAM issue was something that couldn't be fixed. They knew they needed 8 gigs, went with DDR3, and then added the move engines and ESRAM. The RAM solution isn't as good as Sony's, but it most likely was one of the first things set in stone and, with how things went, was unavoidable. But they could have added a better GPU. And that would have helped developers hit a better resolution without having to heavily optimize for the system.
 

GameSeeker

Member
Either the COD engine is shit, or IW really dropped the ball here.

Partially Microsoft's fault as well. Should have just bitten the bullet and gone with a better GPU.

Microsoft dropped the ball. CoD is 1080p @ 60fps on the PS4, so IW is doing fine with good hardware. The Xbone is just significantly less powerful than the PS4, and so will remain behind in either resolution or graphical effects for multi-platform titles.
 
My question, and what I believe the real issue to be, is this: what happens if (when) devs need to drop the resolution to add more and more effects on the PS4 version of games? So when the PS4 version of a game is 720p, what happens to the Xbox One version?

And I'll answer my own question with a prediction: the Xbox One version loses effects and possibly drops to sub-HD. The real question right now shouldn't be "could CoD:DoG run better on the Xbox One." It should be "what does this mean for games that actually max out the PS4 later in the gen."

All IMO, of course.

well, there are wizards who work on the playstation console. gt6 runs at 1440x1080, 60 fps, dynamic weather, dynamic time, adaptive tessellation, etc.

multiplatform games will always be different. not to say all of them aren't as good as the first-parties.
 

spwolf

Member
I was always of the opinion that the output is either scaled by the console or scaled by the TV: if I select 720p output on the console, there's no TV upscaling = no display lag; however, if I select 1080p output and the game is rendering at 720p, the TV upscales it and introduces input lag.

The X1 has a dedicated chip so the TV doesn't do the scaling, the console does. Basically: native 720p game > upscaled to 1080p by the console > received by the TV at 1080p = no display lag.

Unless there is a PS4 hardware scaler, it would be (however unlikely a 720p title on PS4 is): 720p game > TV upscaled to 1080p = display lag.

Appreciate your insight; I'm trying to get onto the same page here, but my understanding differs from yours.

what are you talking about? do you have any proof for anything you wrote above?
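Setting the lag claims aside, the scaling step itself, wherever it happens, is conceptually just a resample. Below is a minimal, purely illustrative bilinear 720p-to-1080p upscale in software; it is not how either console's fixed-function scaler is actually implemented, and the function name and single-channel format are assumptions for the sake of the example.

```cpp
// Illustrative only: bilinear upscale of a single-channel sw x sh buffer to dw x dh.
#include <algorithm>
#include <cstdint>
#include <vector>

std::vector<uint8_t> upscale_bilinear(const std::vector<uint8_t>& src,
                                      int sw, int sh, int dw, int dh) {
    std::vector<uint8_t> dst(static_cast<size_t>(dw) * dh);
    for (int y = 0; y < dh; ++y) {
        float fy = y * (sh - 1.0f) / (dh - 1.0f);          // map destination row to source row
        int   y0 = std::min(sh - 2, static_cast<int>(fy));
        float ty = fy - y0;
        for (int x = 0; x < dw; ++x) {
            float fx = x * (sw - 1.0f) / (dw - 1.0f);      // map destination column to source column
            int   x0 = std::min(sw - 2, static_cast<int>(fx));
            float tx = fx - x0;
            // Blend the four surrounding source pixels.
            float top = src[y0 * sw + x0] * (1 - tx) + src[y0 * sw + x0 + 1] * tx;
            float bot = src[(y0 + 1) * sw + x0] * (1 - tx) + src[(y0 + 1) * sw + x0 + 1] * tx;
            dst[static_cast<size_t>(y) * dw + x] =
                static_cast<uint8_t>(top * (1 - ty) + bot * ty + 0.5f);
        }
    }
    return dst;
}
// Usage for the case under discussion:
//   auto frame1080 = upscale_bilinear(frame720, 1280, 720, 1920, 1080);
```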
 
So this wasn't totally clear for me in the BF4 thread...are all the XB1 games (or at least the multiplats) going to have that mega contrast and blown out colors?

It didn't look as horrible in BF4 since the palette is a bit more muted to start with, but it looks god awful in Ghosts...I mean really bad. Unless that oversaturated image was made by someone here to make fun of it. In that case, well done.
 

Nafai1123

Banned
The RAM issue was something that couldn't be fixed. They knew they needed 8 gigs, went with DDR3, and then added the move engines and ESRAM. The RAM solution isn't as good as Sony's, but it most likely was one of the first things set in stone and, with how things went, was unavoidable. But they could have added a better GPU. And that would have helped developers hit a better resolution without having to heavily optimize for the system.

A better GPU without the bandwidth to feed it would've been a waste of money.
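For what it's worth, the publicly quoted peak figures bear that out. Peak bandwidth is roughly (bus width in bytes) x (transfer rate); rough arithmetic only, theoretical peaks rather than sustained numbers, and the ~109 GB/s ESRAM figure is Microsoft's own claim:

```cpp
// Peak theoretical bandwidth from the publicly reported specs (not sustained figures).
#include <cstdio>

// (bus width in bits / 8) * transfer rate in GT/s = GB/s
double peak_gb_per_s(int bus_bits, double gt_per_s) { return bus_bits / 8.0 * gt_per_s; }

int main() {
    std::printf("XB1 DDR3-2133, 256-bit bus : %.1f GB/s\n", peak_gb_per_s(256, 2.133)); // ~68.3
    std::printf("PS4 GDDR5-5500, 256-bit bus: %.1f GB/s\n", peak_gb_per_s(256, 5.500)); // ~176.0
    std::printf("XB1 ESRAM                  : ~109 GB/s, but only 32 MB of it\n");
    return 0;
}
```

A much bigger GPU sitting behind the 68 GB/s DDR3 pool would spend a lot of its time waiting, which is the point being made above.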
 
It's a weaker console period. People need to stop clinging on to the hope that it will ever be competitive with the PS4 as there will always be a significant gap in performance. Myself and others have been trying to drive home this point for a couple years now and the reality of the situation is finally revealing itself in all of its aliased and blurred glory. There's no light at the end of the tunnel.

Probably due to the crushed blacks.
 

shandy706

Member
I've been asking for someone to make a thread about this stuff because I'm sick and tired of resolution alone being the factor that decides what a next gen game will look like. Why don't you do it!?

Haha..if I felt like it would matter then I would.

I'm too tired and have too little time to do it well right now. I don't think it would hit home either way :).

I'd love to see a well drawn out breakdown written by a knowledgeable person though.
 

lord pie

Member
Thanks for the insight

If they dropped the res because they spent the GPU power they needed for native res on rounding the scopes and adding some tessellated rocks and bricks, that is by far the silliest choice.

Don't suppose you want to elaborate on the ESRAM?
Also, you said you assume it uses a forward renderer (it does; CoD games always have proper MSAA support).

Well. It's all guesses and speculation :) You know, internet expert :)

The problems you might encounter with ESRAM would relate to how you use (and reuse) render targets within a frame. There is no doubt that certain things would benefit greatly from being in ESRAM compared to DDR, especially temporary buffers that get overwritten entirely and then can be thrown away (depth buffers often fall into this category).

Most modern games will easily use more than 32MB of render targets during a single frame (remember the SF presentation where they had 800MB at one point?!). The difficulty would be choosing which RTs sit in ESRAM and when. If you aren't clearing the RT and regenerating it from scratch, then that would potentially mean copying the RT into ESRAM (from DDR at 68GB/s max) and then potentially copying it back to DDR to make room once you finish - which would call into question whether it should even go into ESRAM at all. In a situation like that, you'd need the average number of cache-missing reads/writes per pixel while in ESRAM to be pretty high to get past 'break even' on saved bandwidth and time.
It's a little hard to explain, but it'll be a very difficult balancing act (haha!) getting it right, especially if an engine likes to keep repeatedly accessing lots of big render targets over the course of a frame (eg, shadow maps). This is where deferred would help - you could generate the shadow map just before rendering the deferred light, then throw it away. In a forward render, you'd likely need to keep it around for most of the frame.

In some ways it's a more complex problem than the EDRAM on the 360 - that could only hold the current RT and always copied back out to GDDR when you were done, so the usage pattern was pretty simple and easy to understand.
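To make that 'break even' point concrete, here's a rough cost model in the spirit of the post above. Every name and number in it is an illustrative assumption (the struct, the access counts, the 720p buffer size), not a real platform API or measured data: a render target only pays its way in ESRAM if the DDR traffic it avoids while resident exceeds the DDR traffic spent copying it in and/or back out.

```cpp
// Hypothetical break-even estimate for placing a render target in ESRAM.
// All names and numbers are illustrative assumptions, not platform APIs or measurements.
#include <cstdio>

struct RenderTarget {
    int    width, height, bytes_per_pixel;
    bool   needs_copy_in;    // existing contents must be moved from DDR first
    bool   needs_copy_out;   // results must be moved back to DDR afterwards
    double accesses_per_px;  // avg reads+writes per pixel that would otherwise miss to DDR
};

double size_bytes(const RenderTarget& rt) {
    return static_cast<double>(rt.width) * rt.height * rt.bytes_per_pixel;
}

// DDR bytes saved by ESRAM residency minus DDR bytes spent shuffling the RT around.
double ddr_bytes_saved(const RenderTarget& rt) {
    double sz      = size_bytes(rt);
    double shuffle = sz * ((rt.needs_copy_in ? 1 : 0) + (rt.needs_copy_out ? 1 : 0));
    double saved   = sz * rt.accesses_per_px;
    return saved - shuffle;   // > 0 means ESRAM residency pays for itself
}

int main() {
    // A scratch depth buffer: cleared, used heavily, then discarded.
    RenderTarget depth{1280, 720, 4, /*copy_in*/ false, /*copy_out*/ false, 6.0};
    // A persistent buffer that must round-trip through DDR and is touched lightly.
    RenderTarget persistent{1280, 720, 4, true, true, 1.5};

    std::printf("depth buffer  : %+.1f MB of DDR traffic saved\n", ddr_bytes_saved(depth) / 1e6);
    std::printf("persistent RT : %+.1f MB of DDR traffic saved\n", ddr_bytes_saved(persistent) / 1e6);
    return 0;
}
```

With those made-up numbers the scratch depth buffer wins comfortably and the round-tripping target loses, which is exactly the balancing act being described.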
 