
EuroGamer: More details on the BALANCE of XB1

Skeff

Member
Yes, tell me how much you can see that 44% edge that Killzone and Infamous have over Ryse graphically. That's where this whole discussion becomes hilarious. You won't have an actual quantifiable answer outside of resolution. You'll be left with art preference or a preference of game type whether you choose to acknowledge that fact or not.

You have to lean on numbers you want to believe are performing as you expect, rather than on what your eyes are telling you. Placebo in effect for the next gen. I feel it. Ryse? Quantum Break? Even Forza 5 at 60fps looking like it does. Hell, just look at what that wave race section in Kinect Sports Rivals looks like. The games are painting a different story about the capabilities of the hardware. Nobody denies Infamous and Killzone are some of the most unbelievable-looking games ever made. I have nothing against them; I'll own both. I already paid for Killzone, something that I didn't want to do, but was pushed into out of a desire to have something major that looked incredible for the PS4 at launch. Personally, Infamous is more my style of game, but I would still give Killzone the edge over it graphically.

Am I not allowed to also think Ryse looks amazing and is comparable to those games because it's on the "weaker" platform? Serious question.

1080p vs. 900p is 44% more pixels, not to mention that KZ:SF multiplayer is 60fps, which is 100% more frames, so it's actually pushing 188% more pixels per second, as well as IMO looking better.
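For anyone who wants to check the arithmetic, here it is as a quick Python sanity check (it assumes Ryse's 30fps as the baseline, which is what the "100% more frames" comparison implies):

```python
# Pixel-throughput comparison: KZ:SF multiplayer (1080p/60) vs Ryse (900p/30).
kz_pixels   = 1920 * 1080   # 2,073,600 pixels per frame
ryse_pixels = 1600 * 900    # 1,440,000 pixels per frame

per_frame  = kz_pixels / ryse_pixels                 # 1.44x
per_second = (kz_pixels * 60) / (ryse_pixels * 30)   # 2.88x

print(f"{per_frame - 1:.0%} more pixels per frame")    # 44%
print(f"{per_second - 1:.0%} more pixels per second")  # 188%
```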
 

artist

Banned
1080p vs. 900p is 44% more pixels, not to mention that KZ:SF multiplayer is 60fps, which is 100% more frames, so it's actually pushing 188% more pixels per second, as well as IMO looking better.
It's all of no use when Penello and others (even some PC gamers) can't tell the difference; tell this to Sony.
 
1080p vs. 900p is 44% more pixels, not to mention that KZ:SF multiplayer is 60fps, which is 100% more frames, so it's actually pushing 188% more pixels per second, as well as IMO looking better.

So Killzone looks 188% better than Ryse? Thanks for setting me straight. I didn't know that. I'll exit this thread now. I need to process what I just learned.
 

StuBurns

Banned
How good-looking games are is a pretty bad way to determine hardware performance. Before Halo 4, you'd have been pretty safe in assuming the PS3 was a much more capable machine; the best of the 360 had been consistently outclassed by Sony. But Halo 4 was a huge step up from anything on the system.

Just because a system is more powerful doesn't mean the results will show it. I think Killzone looks much better than Ryse, but even if it didn't, it wouldn't say anything about their respective capabilities.
 

Skeff

Member
So Killzone looks 188% better than Ryse? Thanks for setting me straight. I didn't know that. I'll exit this thread now. I need to process what I just learned.

Thanks, you'll be doing everyone a favour. You wanted a quantifiable reason that Killzone looks 44% better than Ryse; I over-delivered on your 44% figure.
 

Bundy

Banned
So Killzone looks 188% better than Ryse? Thanks for setting me straight. I didn't know that. I'll exit this thread now. I need to process what I just learned.
He said it looks 188% better than Ryse?
Well, it seems we saw two different posts.
He's talking about % more pixels.
Stop spreading FUD with your damage control.
And stop talking about launch games and the power difference at the same time.
As we have already said several times: you won't see "super differences" in launch games.
The developers are happy just to get their games out in time for launch.
Wait for the second wave of games to see bigger differences.
And don't forget what a lot of devs have said in the last couple of months:
50% faster (EDGE, a former People Can Fly dev, etc.)
 
So Killzone looks 188% better than Ryse? Thanks for setting me straight. I didn't know that. I'll exit this thread now. I need to process what I just learned.

Did you even bother reading people's replies? He never said it looks 188% better; he said KZ looks better (in his opinion) even while pushing 188% more pixels per second comparatively (most likely less, since 60fps isn't locked yet; closer to 45-50fps).
 

Sky78

Banned
1080p vs. 900p is 44% more pixels, not to mention that KZ:SF multiplayer is 60fps, which is 100% more frames, so it's actually pushing 188% more pixels per second, as well as IMO looking better.

Killzone is TARGETING 60fps. I've been gaming long enough to know that you shouldn't count any chickens until they are hatched. Right now no one has seen any Killzone mode at a steady 60fps so until they do, it can't be taken to be the case. I'm not being unbalanced here, I have the same approach to every game on every platform. Dead Rising 3 is a broken game frame rate wise, until they show me footage that proves otherwise.

900p on Ryse and 1080p on Killzone is certainly an indication of the PS4's power gap, but we play games, we don't play numbers. If you game on a TV from a sofa a few feet away, it will be impossible to tell. I know this as I tested it on myself. With a 42-inch TV from 180cm away (probably closer, but those were my conditions) they look identical.
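That claim is at least plausible on paper. A rough back-of-envelope, assuming the ~1 arcminute resolving limit usually quoted for 20/20 vision (an illustration, not a vision-science result):

```python
import math

# Angular size of one pixel on a 42" 16:9 panel viewed from 1.8m,
# compared against the ~1 arcminute limit for 20/20 vision.
def pixel_arcmin(diag_inches, horiz_pixels, distance_m):
    width_m = diag_inches * 0.0254 * 16 / math.hypot(16, 9)  # panel width
    pitch_m = width_m / horiz_pixels                          # one pixel
    return math.degrees(math.atan2(pitch_m, distance_m)) * 60

print(pixel_arcmin(42, 1920, 1.8))  # ~0.93 arcmin: a native 1080p pixel
print(pixel_arcmin(42, 1600, 1.8))  # ~1.11 arcmin: a 900p pixel, upscaled
```

Both values sit right around the 1 arcminute threshold, which is consistent with the two resolutions being hard to tell apart at that distance.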
 

StuBurns

Banned
900p on Ryse and 1080p on Killzone is certainly an indication of the PS4's power gap, but we play games, we don't play numbers. If you game on a TV from a sofa a few feet away, it will be impossible to tell. I know this as I tested it on myself. With a 42-inch TV from 180cm away (probably closer, but those were my conditions) they look identical.
Were you using per-pixel mapping?
 
They said something quite different.

Their commentary in the article tells us that 'a lot of titles', launch titles, tend to be GPU-bottlenecked, and moreover not CU-bound, i.e. most probably ROP-bound.

Their 6.6% GPU upclock would do nothing for performance in those games if they were CPU bound.


No, I believe that Microsoft was pushing the idea that the GPU really doesn't matter because the CPU is the bottleneck. That is their whole 'balance' argument. They admit that Sony has a better GPU, but say that extra power is wasted because it can't be used.

The team is also keen to emphasise that the 150MHz boost to CPU clock speed is actually a whole lot more important than many believe it is.

"Interestingly, the biggest source of your frame-rate drops actually comes from the CPU, not the GPU," Goosen reveals. "Adding the margin on the CPU... we actually had titles that were losing frames largely because they were CPU-bound in terms of their core threads. In providing what looks like a very little boost, it's actually a very significant win for us in making sure that we get the steady frame-rates on our console."

So my question still stands. Are games really more limited by the CPU than the GPU, as Microsoft is proposing? Or is it true only for the XB1 because Microsoft is trying to always run Kinect in the background and have the ability to 'snap' other apps to the side while playing games?
 

Bundy

Banned
Killzone is TARGETING 60fps. I've been gaming long enough to know that you shouldn't count any chickens until they are hatched. Right now no one has seen any Killzone mode at a steady 60fps so until they do, it can't be taken to be the case. I'm not being unbalanced here, I have the same approach to every game on every platform. Dead Rising 3 is a broken game frame rate wise, until they show me footage that proves otherwise.
Right now, Killzone: Shadow Fall's multiplayer is 60fps.

No, I believe that Microsoft was pushing the idea that the GPU really doesn't matter because the CPU is the bottleneck. That is their whole 'balance' argument. They admit that Sony has a better GPU, but say that extra power is wasted because it can't be used.
Which is 100% BS!
 

benny_a

extra source of jiggaflops
Hey look! A spec related thread going in circles.
What's going in circles?

Seems like the majority of people in this thread are on the same page:
Microsoft tried to say their system is balanced, people argued that they inadvertently revealed that this isn't the case.
 
"Oh, absolutely. And you can even make it so that portions of our your render target that have very little overdraw... for example if you're doing a racing game and your sky has very little overdraw, you could stick those sub-sets of your resources into DDR to improve ESRAM utilisation," he says, while also explaining that custom formats have been implemented to get more out of that precious 32MB.
So I was right :)
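To make the quote concrete: the placement decision Goosen describes amounts to a heuristic like the sketch below. This is purely illustrative pseudo-logic in Python, not the actual XDK API; all names and thresholds are made up.

```python
# Hypothetical sketch of ESRAM vs DDR3 render-target placement: give the
# 32MB of fast ESRAM to the targets with the most overdraw, and spill
# low-overdraw targets (e.g. a racing game's sky) to DDR3.
ESRAM_BUDGET = 32 * 1024 * 1024  # bytes

def place_render_targets(targets):
    """targets: list of (name, size_bytes, expected_overdraw)."""
    esram, ddr3, used = [], [], 0
    for name, size, overdraw in sorted(targets, key=lambda t: -t[2]):
        if used + size <= ESRAM_BUDGET:
            esram.append(name)
            used += size
        else:
            ddr3.append(name)
    return esram, ddr3

# Five 1080p 32-bit targets (~7.9MB each) overflow 32MB, so the
# lowest-overdraw one (the sky) spills to DDR3.
rt = lambda name, od: (name, 1920 * 1080 * 4, od)
print(place_render_targets([rt("albedo", 3.0), rt("normals", 2.9),
                            rt("depth", 2.8), rt("motion", 2.0),
                            rt("sky", 1.05)]))
# -> (['albedo', 'normals', 'depth', 'motion'], ['sky'])
```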

I disagree. PS4 was built for max GPGPU performance. The GPU is powerful on purpose.

Dunno about GPGPU in games, but in academic research, maximum GPGPU performance is far more dependent on memory access than on ALU resources... It's pretty common to see double or triple the ALU count give marginal performance improvements, while improved memory access yields 10- to 100-fold gains.

However, that only speaks to the difficulty of getting GPGPU up and running at a decent speed; assuming optimal memory access, a higher ALU count will provide more performance.
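That's essentially the roofline model: attainable throughput is the minimum of peak compute and (arithmetic intensity x bandwidth). A minimal sketch, using the PS4's public figures (1.84 TFLOPS, 176GB/s) purely for illustration:

```python
# Roofline model: attainable GFLOPS = min(peak ALU, intensity * bandwidth).
def attainable_gflops(peak_gflops, bw_gbs, flops_per_byte):
    return min(peak_gflops, flops_per_byte * bw_gbs)

# Memory-bound kernel (1 FLOP per byte moved): ALUs barely matter.
print(attainable_gflops(1840, 176, 1.0))      # 176
print(attainable_gflops(3 * 1840, 176, 1.0))  # still 176 with 3x the ALUs

# Compute-bound kernel (32 FLOPs per byte): now the ALUs are the limit.
print(attainable_gflops(1840, 176, 32.0))     # 1840
```

Which matches the observation above: tripling the ALUs does nothing for a bandwidth-bound kernel, while better memory access moves the ceiling directly.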
 

onanie

Member
http://www.giantbomb.com/forums/xbo...-2-from-32mb-to-6gb-worth-of-texture-1448545/

Maybe you should read.



This is well documented and covered, even on all the major tech sites. Microsoft showcased 3GB worth of texture data only needing to use 16MB worth of physical memory, and this is a feature on the Xbox One. Don't think for a second they aren't going to use it. 32MB is not as small as you'd think when technology like this is in play. And, yes, I know Sony has access to similar tech through their own API.

http://www.opengl.org/registry/specs/AMD/sparse_texture.txt





Nope, more like you're trying to deny what tiled resources do just to fit that agenda of yours. I know it allows textures to not actually be physically resident in memory, but the fact of the matter is that 16MB worth of physical memory, with the virtualization possible through tiled resources, is enough to manage 3GB worth of data.

Who cares how the Mars demo looked; it was still using 3GB worth of texture data. That's the point, the capacity, not how good Mars looked. How interesting was that supposed to look? Carmack has already announced that Doom 4 will use the tech too. It also has low bandwidth usage, another plus, and it helps a great deal with shadow maps.

These things aren't made up. I know it isn't REALLY 3GB of data suddenly fitting inside 16MB, but that's how it works regardless, and the tech is proven. End of story.

That was merely a showcase of unconventionally large textures in a system with limited memory. It does not make a case for better performance in a system with asymmetric memory pools.

Sure, you can stream 3GB through the 32MB of ESRAM tile by tile, but the ESRAM will not receive each tile faster than the source of the texture, i.e. the DDR3, or worse yet, the HDD. 3GB of data will not somehow fly through the ESRAM faster than 68GB/s, no matter how hard you tile. That is just physics.

There is a reason why you can't just have 1 byte of memory with 1000GB/s bandwidth and pretend it's better, like what Microsoft is doing.

It's all made up. End of story, indeed.
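Putting numbers on both halves of that argument (64KB is the real D3D11.2 tile size; the rest is back-of-envelope):

```python
# Tiled resources in numbers: huge virtual texture, tiny physical pool.
TILE = 64 * 1024                  # D3D11.2 tiled resources use 64KB tiles

virtual_bytes  = 3  * 1024**3     # ~3GB virtual texture (the demo's figure)
physical_bytes = 16 * 1024**2     # 16MB physically resident

# Capacity side: only ~0.5% of the tiles need to be resident at once.
print(physical_bytes // TILE, "of", virtual_bytes // TILE, "tiles resident")
# -> 256 of 49152

# Bandwidth side: residency doesn't speed up the source. Touching all 3GB
# still takes at least 3/68 of a second if it comes over 68GB/s DDR3.
print(virtual_bytes / (68 * 1024**3), "seconds minimum")  # ~0.044
```

So both posters are describing real properties: tiling solves the capacity problem, and it does nothing for the source bandwidth.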
 

Rebel Leader

THE POWER OF BUTTERSCOTCH BOTTOMS
Killzone is TARGETING 60fps. I've been gaming long enough to know that you shouldn't count any chickens until they are hatched. Right now no one has seen any Killzone mode at a steady 60fps so until they do, it can't be taken to be the case. I'm not being unbalanced here, I have the same approach to every game on every platform. Dead Rising 3 is a broken game frame rate wise, until they show me footage that proves otherwise.

900p on Ryse and 1080p on Killzone is certainly an indication of the PS4's power gap, but we play games, we don't play numbers. If you game on a TV from a sofa a few feet away, it will be impossible to tell. I know this as I tested it on myself. With a 42-inch TV from 180cm away (probably closer, but those were my conditions) they look identical.

Multiplayer is at 60fps
Singleplayer is at 30fps
 

LiquidMetal14

hide your water-based mammals
Only 2 more months to find out the truth about the two consoles.

Launch games (especially multi-platform games) will not be an indicator of the PS4's HW advantage.

Although you are already seeing some really good-looking, 1080p games on PS4. I'd say Ryse is the most impressive tech-wise on Xbone, but that 900p hurts its credibility when trying to compare it to KZ or Infamous.
 

Skeff

Member
No, I believe that Microsoft was pushing the idea that the GPU really doesn't matter because the CPU is the bottleneck. That is their whole 'balance' argument. They admit that Sony has a better GPU, but say that extra power is wasted because it can't be used.



So my question still stands. Are games really more limited by the CPU than the GPU, as Microsoft is proposing? Or is it true only for the XB1 because Microsoft is trying to always run Kinect in the background and have the ability to 'snap' other apps to the side while playing games?

Unoptimized games could be, however these machines have dedicated processors to deal with a lot of that. Also, the GPU in the PS4 is designed to do GPGPU work with very little performance cost to the graphics, so by increasing the power of the GPU you are also increasing the ability to do GPGPU, and therefore decreasing the chance of a CPU bottleneck.

You are also right in the sense that "snap" uses CPU and GPU resources, as does Kinect, but we simply don't know how much. The talk of the CPU upclock giving such great results is likely just marketing. There will be a small benefit, but not enough to make a real difference.
 
How good looking games are is a pretty bad way to determine hardware performance. Before Halo 4, you'd have been pretty safe in assuming the PS3 was a much more capable machine. The best of 360 had been consistently outclassed by Sony. But Halo 4 was a huge step up from anything on the system.

Just because a system is more powerful, doesn't mean the results will show it. I think Killzone looks much better than Ryse, but even if it didn't, it wouldn't say anything about their respective capabilities.

That's why multiplatform games are a much better basis for comparisons. It's certainly much better than trying to compare two games that have completely different design goals...
 
By the way, what happened to the 218/204 ESRAM stuff that Albert supposedly re-clarified with his colleagues?

Because based on this article, the 218GB/s number that he supposedly got corrected on by GAF/colleagues is not true, and they're still going with the 204GB/s number from Hot Chips, which now takes known limitations into account.
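For reference, the usual derivation of those two figures, per Microsoft's own explanation in the Digital Foundry coverage (853MHz ESRAM clock, 128 bytes per cycle in each direction, with a simultaneous read+write possible on only seven of every eight cycles). Treat this as a sketch of their claim, not an independent measurement:

```python
# Xbox One ESRAM bandwidth figures, as Microsoft describes them.
clk = 853e6                    # ESRAM runs at the 853MHz GPU clock
one_way = clk * 128 / 1e9      # 128B/cycle -> ~109.2 GB/s read- or write-only

print(one_way)                 # 109.2  -> the guaranteed minimum
print(one_way * 2)             # 218.4  -> naive read+write-every-cycle peak
print(one_way * (1 + 7 / 8))   # 204.72 -> peak with the 1-in-8 "bubble",
                               #           i.e. the Hot Chips 204GB/s figure
```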
 

StuBurns

Banned
That's why multiplatform games are a much better basis for comparisons. It's certainly much better than trying to compare two games that have completely different design goals...
They're also a crap way to compare. Is the 360 better because Bayonetta is much better on there? Is the PS3 better because FFXIII is better on there?
 

Skeff

Member
By the way, what happened to the 218/204 ESRAM stuff that Albert supposedly re-clarified with his colleagues?

Because based on this article, the 218GB/s number that he supposedly got corrected on by GAF/colleagues is not true, and they're still going with the 204GB/s number from Hot Chips, which now takes known limitations into account.

It seems some of Albert's numbers were wrong, and I doubt it was just this one, heh.
 
If Sony had stuck with the 4GB of RAM as planned, would they be on par?

I'd still put Sony way out in front, but there could potentially have been areas of games, and especially the OS (assuming the 512MB reservation stood), where MS could show an advantage. Launch titles would have been unaffected, IMO.

The way I see it, 8GB was the icing on the cake; it wasn't a critical requirement, which is why they were prepared to launch with 4GB.
 

gofreak

GAF's Bob Woodward
No, I believe that Microsoft was pushing the idea that the GPU really doesn't matter because the CPU is the bottleneck. That is their whole 'balance' argument. They admit that Sony has a better GPU, but say that extra power is wasted because it can't be used.

They're not saying that. That would be nonsense.

And saying the CPU upclock often helps avoid sudden drops due to CPU activity isn't the same as saying the game is typically CPU-bound. Maybe, for a microsecond, the CPU might become the bound if it's doing something suddenly intensive. Their comments elsewhere indicate that the bound in software typically resides elsewhere: in the GPU, somewhere other than the CUs.
 
1080p vs. 900p is 44% more pixels, not to mention that KZ:SF multiplayer is 60fps, which is 100% more frames, so it's actually pushing 188% more pixels per second, as well as IMO looking better.

On the other hand, the highest character poly count in Ryse is almost four times the one in KZ, and there are more characters on screen as well. And Ryse's lighting is 100% dynamic, even its GI, which is pre-baked in KZ...

I could go on and on, but my point is: you are constraining your comparison to a single spec, but there are lots of aspects where the goals and technologies of these two games differ, which means your performance comparison is flawed because you are assuming everything they are doing is the same.
 

KaiserBecks

Member
1080p vs. 900p is 44% more pixels, not to mention that KZ:SF multiplayer is 60fps, which is 100% more frames, so it's actually pushing 188% more pixels per second, as well as IMO looking better.

Let's hope the input lag stays under 188ms then.
 

jett

D-Member
By the way, what happened to the 218/204 ESRAM stuff that Albert supposedly re-clarified with his colleagues?

Because based on this article, the 218GB/s number that he supposedly got corrected on by GAF/colleagues is not true, and they're still going with the 204GB/s number from Hot Chips, which now takes known limitations into account.

Technical fellows don't know bout maths
 

artist

Banned
Seems like the same ground that's been covered for a while. Even the EG article doesn't cover much that wasn't already gone over in past threads.
We got the configuration of the GPU, the reasoning for the upclock, and clarity on the ESRAM's functionality, etc.

Unfortunately, the article could've been a lot better if it had been someone else asking tougher questions.
 

Green Yoshi

Member
When you see actual footage of Dead Rising 3, it's hard to believe that the Xbox One is a powerful console. Even Ryse by Crytek doesn't look very impressive.
 

vpance

Member
900p on Ryse and 1080p on Killzone is certainly an indication of the PS4's power gap, but we play games, we don't play numbers. If you game on a TV from a sofa a few feet away, it will be impossible to tell. I know this as I tested it on myself. With a 42-inch TV from 180cm away (probably closer, but those were my conditions) they look identical.

Mr-magoo.gif
 

pixlexic

Banned
Launch games (especially multi-platform games) will not be an indicator of the PS4's HW advantage.

Although you are already seeing some really good-looking, 1080p games on PS4. I'd say Ryse is the most impressive tech-wise on Xbone, but that 900p hurts its credibility when trying to compare it to KZ or Infamous.


Nah, you can never compare two exclusives, only multiplats from the same dev. Even if both were thrown together quickly, the better hardware that's easiest to program for will show itself just fine.
 