
150MHz CPU boost on XBO, now in production

Status
Not open for further replies.
People can't tell exact resolution. They can tell when one game is running at a lower resolution than another, unless the higher-resolution game has piss-poor image quality (in which case you've got bigger issues).

You're blind if you can't tell non-native resolution. Smart scaling does not fix a critical flaw of TFT screens: their inability to display non-native resolutions clearly.
 

Orca

Member
You're severely underestimating the impact of resolution.

Sure, many people can't tell exactly what is 720p and what isn't.

But if they are looking at the same game side by side, one 720p and one sub-720p, you can EASILY tell the difference.

The image is softer, and as a result the textures aren't as sharp and jaggies can appear more pronounced.

Many 360 games looked much sharper than PS3 titles thanks to better resolution. Your post is pretty disingenuous.

Is that amazing ability the reason you never posted once in the Killer Instinct thread during its three-month run, but seven times in the now five-day-old 'Killer Instinct is 720p' thread?

lol...talk about disingenuous.
 

James Sawyer Ford

Gold Member
Is that amazing ability the reason you never posted once in the Killer Instinct thread during its three-month run, but seven times in the now five-day-old 'Killer Instinct is 720p' thread?

lol...talk about disingenuous.

I'm not interested at all in Killer Instinct. But I am interested in Xbox One performance. That thread was about Killer Instinct being downgraded to 720p, which hopefully speaks more to a developer issue than a wider platform issue. If the Xbox One won't have many native 1080p titles, that would be quite disappointing.
 

vpance

Member
Even gaffers can't see the decrease... We literally wait for DF or pixel counters to tell us which games aren't running at 720p and which ones are. And spreading the word doesn't really affect people that are already playing the game and enjoying it regardless of what the resolution happens to be. Almost nobody on GAF caught on to the fact that COD4 wasn't 720p until pixel counters told them so. It took pixel counters for people to find out Halo 3 and Reach weren't 720p, and there are many more examples. Although, to me, it was very obvious that something was up with Halo 3 from the get-go, I didn't know what it was specifically.

And it's going to be even harder to see the decrease this gen, since games will likely be utilizing dynamic resolutions. It's a DirectX 11.2 feature and perfect for the display planes on the Xbox One. In fact, I think the DirectX 11.2 feature exactly describes the Xbox One's display planes. It'll be so much easier to hide these things this gen. We will have to rely on pixel counters yet again. For example, nobody realized that Killer Instinct wasn't 1080p until a dev apparently said so, although I don't know if it was ever confirmed. I have doubts that Ryse can look as good as it does on the Xbox One while being 1080p, but I guess we'll see.
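The dynamic-resolution idea described above boils down to a feedback loop on frame time. Here's a minimal sketch; every constant and function name is illustrative, not from any real engine or the DirectX 11.2 API:

```python
# Hypothetical dynamic-resolution controller: pick a render scale each
# frame so the GPU stays within its frame-time budget. All numbers and
# names are placeholders, not real engine or DirectX values.

TARGET_FRAME_MS = 33.3            # 30 fps budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0   # clamp between 50% and native res

def next_scale(current_scale, last_frame_ms):
    """Nudge the linear resolution scale toward the frame-time target."""
    # GPU cost is roughly proportional to pixel count (scale squared),
    # so adjust the linear scale by the square root of the error.
    error = TARGET_FRAME_MS / last_frame_ms
    new_scale = current_scale * error ** 0.5
    return max(MIN_SCALE, min(MAX_SCALE, new_scale))

# Simulated GPU frame times: the scale drops while frames run over
# budget, then recovers as the load eases.
scale = 1.0
for frame_ms in (40.0, 37.0, 34.0, 31.0):
    scale = next_scale(scale, frame_ms)
```

Because only the render target shrinks while the output stays at the display resolution, a counted screenshot is the only reliable way to catch it, which is the point being made above.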

Also, someone link me to the uncompressed, high-quality vid of Infamous. I need to show this game to a relative who's visiting, but I don't want him to see some compressed YouTube crap. He wants to just watch it on YouTube, but I'm not having any of it. Poor fool doesn't know any better. :)

You're downplaying it too much here. Especially when a lot of stink has been made around here in the last few days about whether multiplatform games would see much of a difference.

A resolution decrease in order to make up for a 30-50% performance difference would certainly be noticeable, from jaggies and overall muddiness to the picture-quality hit from upscaling. That is, if simply decreasing res would let them run "equally" as you say. In reality it won't. Cuts would have to be made in various areas depending on the game, not just res.
 

TechnicPuppet

Nothing! I said nothing!
You're downplaying it too much here. Especially when a lot of stink has been made around here in the last few days about whether multiplatform games would see much of a difference.

A resolution decrease in order to make up for a 30-50% performance difference would certainly be noticeable, from jaggies and overall muddiness to the picture-quality hit from upscaling. That is, if simply decreasing res would let them run "equally" as you say. In reality it won't. Cuts would have to be made in various areas depending on the game, not just res.

Why would they 'have' to make more cuts than resolution?
 

Metfanant

Member
Why would they 'have' to make more cuts than resolution?

if you're basing it off this generation... then clearly JUST a resolution drop was usually not enough, as PS3 multiplats still generally had lower-quality textures, lower average framerates, fewer objects on screen, lower levels of AA, lower-quality shadows/lighting...

or any combination of the above...
 

TechnicPuppet

Nothing! I said nothing!
if you're basing it off this generation... then clearly JUST a resolution drop was usually not enough, as PS3 multiplats still generally had lower-quality textures, lower average framerates, fewer objects on screen, lower levels of AA, lower-quality shadows/lighting...

or any combination of the above...

That was for other reasons though, was it not?

Are we not now going even above 50% extra power at this point?

Anyone know what resolution is 50% fewer pixels than 1080p?
 

Metfanant

Member
That was for other reasons though, was it not?

Are we not now going even above 50% extra power at this point?

Anyone know what resolution is 50% fewer pixels than 1080p?

yes, there were many reasons why PS3 versions of games performed/looked worse than their 360 counterparts... was just making the point that a simple resolution drop might not make up for power deficiencies
 

KidBeta

Junior Member
Why would they 'have' to make more cuts than resolution?

It depends on what you're doing, but the PS4 has a significant advantage, at least in theoretical terms, in nearly everything (flops, texturing, fillrate, etc.). To make up for the difference it might not be enough to just reduce the number of pixels.
 
Who is going to actually do that though?

Not just that, but we are no longer talking sub-HD vs. not. We are talking native 1080p, slightly less than native 1080p, or in some cases 720p, which still looks great.

And many of these games are likely to have quite strong image quality and graphics as a whole, which may make it even more difficult. But, again, that's just my assumption based on what I know about the differences between various graphics cards at different performance levels on PC. Usually when a reasonably capable GPU can't run a game at the same quality settings and resolution as a stronger GPU, a simple lowering of the resolution gains you a good deal of performance without having to change the graphics settings too much, if at all.

I know that's a big over-simplification of what we're dealing with here, but I think it's a very obvious solution for a lot of multi-plats on the Xbox One this gen. From my experience, the textures don't get that much uglier, and the image quality as a whole really does not get tremendously worse. In most cases, I can't even tell the difference at all. I guess it comes down to how gigantic your television is as to whether or not you'll be annoyed by what you see.
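The "just lower the resolution" reasoning above can be put in rough numbers. This is a back-of-the-envelope sketch that assumes shading cost scales linearly with pixel count, which real games only approximate:

```python
# If GPU cost scales with pixel count, a GPU with a fraction `ratio`
# of another's throughput can match it by rendering at sqrt(ratio) of
# the linear resolution. Rough model, not a benchmark.

def resolution_for_deficit(native_w, native_h, ratio):
    """Resolution at which the weaker GPU (throughput = `ratio` of the
    stronger one's) does the same per-pixel work per frame."""
    scale = ratio ** 0.5   # pixel count scales with the square of the edge
    return round(native_w * scale), round(native_h * scale)

# A 50% raw advantage puts the weaker GPU at ~2/3 throughput:
resolution_for_deficit(1920, 1080, 2 / 3)   # -> (1568, 882)
```

Under that toy model, a 50% gap closes at roughly 1568x882 instead of 1920x1080, which matches the point above: a modest resolution drop buys back a lot of per-frame work before other settings have to move.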

That was for other reasons though, was it not?

Are we not now going even above 50% extra power at this point?

Anyone know what resolution is 50% fewer pixels than 1080p?

According to Wikipedia, 1080p is 2,073,600 pixels.

And half of that is 1,036,800 pixels, which equals a resolution of 1152x900. Xbox One games definitely don't need to go that low, but it's entirely up to the dev. For example, if what was reported about Killer Instinct is true, then it's 720p, which is fewer pixels than 1152x900, but the dev made a decision that they felt was right for their game, and the game looks fantastic, so I have no complaints.
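The arithmetic above checks out; a quick script to verify it:

```python
# Verifying the pixel counts quoted above.
full_1080p = 1920 * 1080          # 2,073,600 pixels
half = full_1080p // 2            # 1,036,800 pixels
assert 1152 * 900 == half         # 1152x900 is exactly half of 1080p

p720 = 1280 * 720                 # 921,600 pixels
assert p720 < 1152 * 900          # 720p is fewer pixels than 1152x900
```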
 
It's amazing that the Xbox can squeeze out more power while the PS4 can't. Maybe they built the box big to harness the power. The tiny PS4 looks like a toy. No offense.
 

KidBeta

Junior Member
I know that's a big over-simplification of what we're dealing with here, but I think it's a very obvious solution for a lot of multi-plats on the Xbox One this gen. From my experience, the textures don't get that much uglier, and the image quality as a whole really does not get tremendously worse. In most cases, I can't even tell the difference at all. I guess it comes down to how gigantic your television is as to whether or not you'll be annoyed by what you see.

Don't forget that they need to reduce any GPGPU work they are doing by 50% as well, or more, and that doing any coherent GPGPU alongside normal graphics work on the XBONE will cause a massive hit at every coherent write to memory.
 

Chobel

Member
Damn Saturday night, I missed an important thread :D

I missed it too :(

OT the gif from Eltorro in that thread is brilliant

 
read my history.. never shit on the ps4. actually am buying one.. love infamous.. try again.

Don't care about your post history. Doesn't excuse you from anything.
You can't make a rather flagrant comment, and be excused by simply saying "no offence".
If I made a post saying "The latest upclock is akin to Microsoft polishing a turd", then simply followed it with "no offence", would it still be considered trolling?
 
Damn Saturday night, I missed an important thread :D

You clearly picked the wrong time to have a life. While you were out messing around, we were here on the frontlines. :p

Don't care about your post history. Doesn't excuse you from anything.
You can't make a rather flagrant comment, and be excused by simply saying "no offence".
If I made a post saying "The latest upclock is akin to Microsoft polishing a turd", then simply followed it with "no offence", would it still be considered trolling?

I seriously suggest you go read the Geneva Convention. You'll see the section on "no offense", right below the section on "with all due respect."

http://www.youtube.com/watch?v=Af-Id_fuXFA#t=0m17s
 

Klocker

Member
Don't forget that they need to reduce any GPGPU work they are doing by 50% as well, or more, and that doing any coherent GPGPU alongside normal graphics work on the XBONE will cause a massive hit at every coherent write to memory.


are you programming a game on Xbox One now?

I'm sorry, but otherwise you cannot possibly know that to be true. You can only guess without really knowing how the XBONE needs to be developed.
 

KidBeta

Junior Member
are you programming a game on Xbox One now?

I'm sorry, but otherwise you cannot possibly know that to be true. You can only guess without really knowing how the XBONE needs to be developed.

The cache structure of the XBONE means that any coherent GPGPU write to memory requires that the entire cache be flushed, that includes the cache for graphics operations.

We know this because vgleaks specifically stated that Sony added in some cache modifications in the PS4 to specifically combat this, only requiring GPGPU data to be flushed.
 

IN&OUT

Banned
You clearly picked the wrong time to have a life. While you were out messing around, we were here on the frontlines. :p

Damn life... Haha, actually I wasn't shocked at all by the thread, been saying that for months... well, someone legit spilled the beans, this was bound to happen sooner or later.
 

strata8

Member
The cache structure of the XBONE means that any coherent GPGPU write to memory requires that the entire cache be flushed, that includes the cache for graphics operations.

We know this because vgleaks specifically stated that Sony added in some cache modifications in the PS4 to specifically combat this, only requiring GPGPU data to be flushed.

How do you know that this will result in a "massive hit"?
 

KidBeta

Junior Member
How do you know that this will result in a "massive hit"?

Because flushing a cache is bad; it's something you want to avoid. It means that any reads for anything afterwards will result in a memory read instead of a chance of a cache hit, increasing off-chip bandwidth usage by quite a lot.
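To give a feel for why full flushes hurt, here's a rough model. Every constant is a placeholder, since the real Xbox One cache figures aren't public in this detail:

```python
# Rough model of the cost of full cache flushes: data evicted by a
# flush must be re-read from off-chip memory on next use. Every
# constant here is an assumption, not a real Xbox One specification.

CACHE_BYTES = 4 * 1024 * 1024     # hypothetical GPU L2 cache size

def refetch_bytes(warm_fraction=0.75):
    """Bytes re-fetched from memory after one flush, assuming
    warm_fraction of the cache held live data."""
    return int(CACHE_BYTES * warm_fraction)

def extra_bandwidth_gbps(flushes_per_frame, fps=30):
    """Extra off-chip bandwidth (GB/s) caused by flush-induced misses."""
    return refetch_bytes() * flushes_per_frame * fps / 1e9

# 100 coherent-write flushes per frame at 30 fps:
extra_bandwidth_gbps(100)   # -> ~9.4 GB/s of extra memory traffic
```

Even with these made-up numbers, frequent full flushes translate into gigabytes per second of avoidable memory traffic, which is the "massive hit" being argued about.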
 
Don't care about your post history. Doesn't excuse you from anything.
You can't make a rather flagrant comment, and be excused by simply saying "no offence".
If I made a post saying "The latest upclock is akin to Microsoft polishing a turd", then simply followed it with "no offence", would it still be considered trolling?
Get over it. MS has a higher CPU clock. It will be nice snapping apps while watching TV and playing games. Hit me up when Sony does that.
 

strata8

Member
Because flushing a cache is bad; it's something you want to avoid. It means that any reads for anything afterwards will result in a memory read instead of a chance of a cache hit, increasing off-chip bandwidth usage by quite a lot.

It seems like any data small enough to be cached would be an ideal fit for eSRAM rather than off-chip memory.
 
The cache structure of the XBONE means that any coherent GPGPU write to memory requires that the entire cache be flushed, that includes the cache for graphics operations.

We know this because vgleaks specifically stated that Sony added in some cache modifications in the PS4 to specifically combat this, only requiring GPGPU data to be flushed.

You did kinda make some pretty massive programming-related assumptions there. The PS4 will clearly have an edge on GPGPU-related stuff, but anything a dev does related to GPGPU would need to be reduced on the Xbox One by 50%? I'm no programmer, and even I know something sounds very off about what you said.

Who says the GPGPU related task, whatever it/they may be, will even be too much for the Xbox One to handle in the first place? And who says that any coherent write to memory alongside normal graphics operations will automatically cause a massive hit? The PS4 possibly (correction: I know the PS4 is better equipped) being better optimized and equipped to deal with such operations doesn't automatically mean any such attempt on Xbox One GPU architecture will cause some massive hit. By saying such a thing, I suspect you aren't just significantly underestimating the Xbox One GPU, but you're also very much underestimating a big part of the reason GCN was created in the first place, which was to enable this kind of simultaneous and efficient cooperation between graphics and GPGPU operations.

Without getting into a drawn-out discussion on this one, I'm not sure you're properly qualified to make assumptions at this level. Unless, of course, you're a games programmer, in which case you're qualified, and I'll instead just call you a "lazy developer." :)
 