
EuroGamer: More details on the BALANCE of XB1

velociraptor

Junior Member
What I'm getting from this is as follows: close but inferior in most multiplats, and missing GPU-compute-heavy features in exclusives and in the occasional multiplatform game where the developers put in extra effort on PS4 (or accept help from a team of Sony programmers).

That's what Xbox One owners will find when they buy their system. It's far from a slap in the face. You can tell from the exclusives that it's a very capable system. It's just less capable.

Given Kinect, and that they're selling for a profit, Microsoft might be getting more bang for their buck, and can be legitimately proud of the hardware they've put together.

Gamers, however, will be getting more bang for their buck buying a PS4.


I suppose you could do some crazy maths with requiring PS+ to play a lot of the multiplayer titles, but it'd be just that. Crazy.
I think it will ultimately depend on the kind of value PS+ delivers.

If we are able to play a good selection of PS4 titles just like we have with the PS3, then it's worth its asking price.

If we don't get free games every month, then welp.
 
So wait a minute... not even 14 CUs but 12?!
It's always been 12.
They only stated that the DF numbers are wrong, but which part is wrong?

Is the amount of PS4 virtual RAM wrong?
Or is there guaranteed 6GB of GDDR5 RAM access for games?
The dev in question is Brian Provinciano.
https://twitter.com/BriProv/statuses/361140165026131968
Here's their blog on the subject:
http://retrocityrampage.com/blog/2013/07/522/

Thuway, BruceLeeRoy and Kagari were the GAF insiders who indicated a game was currently using 6GB iirc.
 

benny_a

extra source of jiggaflops
I think it will ultimately depend on the kind of value PS+ delivers.

If we are able to play a good selection of PS4 titles just like we have with the PS3, then it's worth its asking price.

If we don't get free games every month, then welp.
Unfortunately for us, PS+ only needs to suck less than XBL to be competitive in the console space.

And the 20% price difference between PS+ and XBLG is probably enough to qualify as sucking less.
 

KMS

Member
They only stated that the DF numbers are wrong, but which part is wrong?

Is the amount of PS4 virtual RAM wrong?
Or is there guaranteed 6GB of GDDR5 RAM access for games?

6.5GB is the popular guess (I think), with 512MB of it being a pagefile.
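Spelling that guess out as arithmetic (these are purely the speculated figures from this thread, not anything confirmed), it lines up with the 6GB number insiders have been pushing:

```python
# The speculated PS4 RAM split from this thread - a guess, not confirmed figures.
total_ram_gb = 8.0
available_to_games_gb = 6.5                                   # the "popular guess"
pagefile_gb = 0.5                                             # 512MB of that backed by a pagefile

os_reservation_gb = total_ram_gb - available_to_games_gb      # 1.5 GB left for the OS
guaranteed_physical_gb = available_to_games_gb - pagefile_gb  # 6.0 GB guaranteed physical

print(f"OS reservation: {os_reservation_gb} GB, "
      f"guaranteed physical for games: {guaranteed_physical_gb} GB")
```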
 

nib95

Banned
They only stated that the DF numbers are wrong, but which part is wrong?

Is the amount of PS4 virtual RAM wrong?
Or is there guaranteed 6GB of GDDR5 RAM access for games?

It is unknown, but Thuway and, I believe, one other insider (filopilo?), maybe more of them actually, seem to be pushing the 6GB number. What we know for a fact (ignoring insider posts and looking to a Sony dev instead) is that DF were wrong in their assessment and with their numbers. That should be enough that we don't go back to those figures or that article. Anything else is just semantics.
 
Microsoft said:
Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing, but we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did."

You don't inquire about a nose job unless you think you've got a problem with your nose. The fact that they explored demanding a 100% yield says a lot. They were scrambling in recent months to make up the serious difference once they became aware of it. Unfortunately the best they could possibly do was 14 CUs, while Sony's still sitting on a fat 18.

Microsoft is just trying to muddy the waters through articles like this to convince as many people as possible that things are closer than they really are.
 
So will we be getting a similar article exploring all the good options the PS4 has and how they'll help developers - just for, you know, balance?

Cerny's already talked some tech stuff on Gamasutra, and he already teased comparisons during his Gamelab talk.

Rationally though, we shouldn't expect one, at least not one directly addressed by Sony. The problem with articles and news like this is that they're often seen as a reactionary measure against a competitor's superior specs. You expose points of contention in your own hardware, and you give a competitor mentions that can't effectively be spun into "we're better than them."

I also don't think either company should talk about their competitor's specs, especially since neither of them can actually speak from an actual dissection of the competitor's hardware. For MS to rely on stuff like leaks and other spec docs as reference is simply inappropriate.
 
EA better not gimp BF4 on PS4

I know many people think EA will do it, but I'm really not worried about them. Just the fact that they continue to only show the PS4 and PC versions of BF4 says a lot. You'd think that Sony was the one with the timed exclusive DLC with the way they're handling things.
 

jett

D-Member
GDDR5 "uncomfortable"?

Bringing up the 14+4 CUs thing that was proven false months ago?

Microsoft really shouldn't have tried to talk about specs again. They've just been embarrassing themselves on that front after having finally gotten some of their shit together after E3 and Mattrick's departure. There's no info in this article, just a reiteration of nebulous phrases like "balance" that some people have already latched onto when discussing power differences, with no insight into what they actually mean.

Agreed, same ol' disingenuous BS from Microsoft.
 

artist

Banned
Makes it easier to understand the compromise/trade-off made in going for a higher clock with fewer CUs.

[image: ay1fhn.png]


(Using Xbone's current production configuration as a baseline)
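Since the chart itself is missing, here is a rough sketch of that kind of comparison: theoretical peak shader throughput only, assuming the standard GCN figures of 64 ALUs per CU and 2 FLOPs (one FMA) per ALU per clock, with the publicly reported 853MHz/800MHz clocks. None of this reflects real-world performance.

```python
# Theoretical peak shader throughput for the configurations under discussion.
# Assumes standard GCN numbers: 64 ALUs per CU, 2 FLOPs (one FMA) per ALU per
# clock. Illustrative only - real games are rarely limited purely by ALU rate.

ALUS_PER_CU = 64
FLOPS_PER_ALU_PER_CLOCK = 2  # a fused multiply-add counts as two ops

def peak_gflops(cus, clock_mhz):
    return cus * ALUS_PER_CU * FLOPS_PER_ALU_PER_CLOCK * clock_mhz / 1000.0

baseline = peak_gflops(12, 853)  # XB1 as shipped: 12 CUs @ 853 MHz

configs = {
    "XB1 shipped (12 CU @ 853 MHz)": peak_gflops(12, 853),  # ~1310 GFLOPS
    "XB1 option  (14 CU @ 800 MHz)": peak_gflops(14, 800),  # ~1434 GFLOPS
    "PS4         (18 CU @ 800 MHz)": peak_gflops(18, 800),  # ~1843 GFLOPS
}

for name, gflops in configs.items():
    print(f"{name}: {gflops:6.0f} GFLOPS ({gflops / baseline:.2f}x baseline)")
```

On paper the 14 CU option works out to roughly 9% more throughput than the shipped configuration, versus the 6.6% from the upclock alone, which is why the whole argument hinges on whether launch games are actually ALU-bound.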
 

Yoday

Member
This is obviously game dependent, but if the games they looked at scaled better with a 6.6% upclock than with a 16.6% increase in ALU, then it suggests they were being limited more by other parts of the pipeline. ROPs are the easy ones to point the finger at.
It is probably safe to assume the launch games they are looking at are internally developed games, and therefore console exclusives. If that is indeed the case, then wouldn't those games be coded to specifically work with 12 CUs rather than 14 CUs, and thus see a bigger performance increase from a clock bump rather than a CU bump?
 

gofreak

GAF's Bob Woodward
So powering up to 14 CUs doesn't perform better than raising the GPU clock by ~7%.

Of course you can say the X1's ROPs limit it, but they mentioned PC benchmarks as well.

If you have an ALU-bound game that continues to be ALU-bound when you increase the number of CUs by - say - 40%, then performance in that game will scale linearly with additional CUs.

It's a little bit wobbly to look at ten games today and extrapolate that 'real world performance doesn't scale with additional CUs' or that there is a 'law' of diminishing returns. It depends totally on the software, and the shape of that software may be different in the coming generation than in the 10 current-gen games Richard is drawing conclusions from.

And for those existing games, it seems too narrowly focused to make that point while ignoring the other factors that keep game perf from scaling linearly with extra CUs, factors that happen to be very different indeed in the two systems.

I just posted this to the "PS4 and Xbox One performance" thread before noticing this one.

From the article, it seemed like one of Microsoft's main defenses of their less powerful GPU was that most games are CPU-limited anyway, so the extra power of the GPU would not matter. My question is: is this true? Are most console games really CPU-limited?


They said something quite different.

Their commentary in the article tells us that 'a lot of titles', launch titles, tend to be GPU bottlenecked, and moreover not CU-bound - i.e. most probably ROP-bound.

Their 6.6% GPU upclock would do nothing for performance in those games if they were CPU-bound.
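To illustrate that last point with a toy model (made-up frame times, assuming CPU and GPU work overlap so the slower side sets the frame time): a GPU upclock only shows up when the GPU is actually the limiter.

```python
# Toy model: a GPU upclock only helps when the GPU is the limiter.
# Frame times below are made up purely for illustration.

def frame_ms(cpu_ms, gpu_ms):
    # Assume CPU and GPU work overlap, so the slower of the two sets the frame time.
    return max(cpu_ms, gpu_ms)

UPCLOCK = 1.066  # the 6.6% GPU clock bump

# GPU-bound case: the GPU is the limiter, so the upclock shows up directly.
before = frame_ms(cpu_ms=20.0, gpu_ms=33.0)
after = frame_ms(cpu_ms=20.0, gpu_ms=33.0 / UPCLOCK)
print(f"GPU-bound: {before:.1f} ms -> {after:.1f} ms")   # 33.0 -> 31.0

# CPU-bound case: the GPU was already waiting, so the upclock changes nothing.
before = frame_ms(cpu_ms=33.0, gpu_ms=20.0)
after = frame_ms(cpu_ms=33.0, gpu_ms=20.0 / UPCLOCK)
print(f"CPU-bound: {before:.1f} ms -> {after:.1f} ms")   # 33.0 -> 33.0
```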
 

gofreak

GAF's Bob Woodward
It is probably safe to assume the launch games they are looking at are internally developed games, and therefore console exclusives. If that is indeed the case, then wouldn't those games be coded to specifically work with 12 CUs rather than 14 CUs, and thus see a bigger performance increase from a clock bump rather than a CU bump?

You don't code for a specific number of CUs.

If there were more CUs, regardless of resolution they would chew through your vertex and fragment processing faster.

The frametime for vertex/geometry/fragment shading would be faster - was faster - when they tried 14 vs 12 CUs. But this is pointless if another point in the chain is the limiter and hasn't gotten any faster and is holding overall framerate down.
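A crude sketch of that point in code, under the simplifying assumption that the slowest GPU stage sets the frame time, with shader work scaling with CUs × clock and fill/ROP work scaling with clock only (the workload numbers are invented):

```python
# Crude illustration of "the limiter hasn't gotten any faster".
# Assumption: the slowest GPU stage dominates the frame; shader (CU) throughput
# scales with CU count x clock, fill/ROP throughput scales with clock only.
# Workload numbers are invented.

def gpu_frame_ms(shader_work, fill_work, cus, clock_mhz,
                 base_cus=12, base_clock=800.0):
    shader_ms = shader_work / ((cus / base_cus) * (clock_mhz / base_clock))
    fill_ms = fill_work / (clock_mhz / base_clock)
    return max(shader_ms, fill_ms), shader_ms, fill_ms

# A frame that is fill/ROP-bound at a 12 CU / 800 MHz baseline:
for label, cus, clock in [("12 CU @ 800 MHz", 12, 800),
                          ("14 CU @ 800 MHz", 14, 800),   # shading gets faster, the frame doesn't
                          ("12 CU @ 853 MHz", 12, 853)]:  # both terms shrink a little
    total, shader, fill = gpu_frame_ms(shader_work=25.0, fill_work=30.0,
                                       cus=cus, clock_mhz=clock)
    print(f"{label}: shading {shader:.1f} ms, fill {fill:.1f} ms -> frame {total:.1f} ms")
```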
 
MS still has no idea they have created an overly expensive PC from 5 years ago, with 32MB of essentially GDDR5-class memory. Big whoop. I nearly fell over when I saw the numbers and architecture of this system. How the hell can you try to make a next-gen system using only DDR3? Blows my mind. The only explanation is that the multimedia functionality is the most important thing in this system to Microsoft. To gimp your system this badly and create such a massive bottleneck, all because you wanted snapping and live fantasy sports updates on the TV, means this was made as a multimedia device first and the gaming aspect came in a distant second in design.

This is likely why so far every X1 game is purely about high model detail. When it comes to effects or anything remotely resembling next-gen tech, it's non-existent. Even the exclusive games built from the ground up for this hardware have nothing special going on, just high model detail and that's the end of it.

When Housemarque can make Crytek look lazy, that is pretty bad.
 

Bundy

Banned
Reading all that... sorry, there is a lot wrong in Leadbetter's article, plus a lot of spin by MS.
But plenty of people here have already pointed that out:
MS engineers referenced Vgleaks articles about the 14+4 PS4 CUs, which was later debunked by Cerny in an interview.
Hilarious. lol
I will try to find the Cerny interview where he was asked about this.
Exactly!
GDDR5, so good it's uncomfortable.
Their comment was really hilarious.
8GB of GDDR5 RAM is clearly the better choice over 8GB of DDR3 RAM + 32MB of ESRAM.

Ignoring the RAM, the XB1 does seem like a more balanced system; however, the power gap is pretty big and the PS4 is also cheaper. Is the XB1 selling at a profit or something?
It's the more balanced system for you because MS is telling you that.
If you look at both systems' architectures, the PS4 seems to be the more "balanced" one.

OK, let's see:
PS4 has unified memory with 18 CUs and a tight APU configuration

vs

X1 with fewer CUs, a tiny ESRAM pool, and slow DDR3 RAM
and yet we have one comment believing MS's "balance" spin.
Yep, MS has succeeded.
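For what it's worth, the commonly cited peak bandwidth figures behind that memory comparison look like this (2013 numbers; the ESRAM peak is Microsoft's own best-case claim for simultaneous read/write, not a sustained figure):

```python
# Commonly cited peak memory bandwidth figures (GB/s). The ESRAM "~204 GB/s"
# number is Microsoft's best-case claim for simultaneous read/write after the
# 853 MHz upclock; the sustained figures they quoted were lower. Illustrative only.

def bus_bandwidth_gbs(transfer_rate_mtps, bus_width_bits):
    # transfers per second * bytes per transfer
    return transfer_rate_mtps * 1e6 * (bus_width_bits / 8) / 1e9

ps4_gddr5 = bus_bandwidth_gbs(5500, 256)   # ~176 GB/s, one unified pool
xb1_ddr3  = bus_bandwidth_gbs(2133, 256)   # ~68 GB/s main memory
xb1_esram_peak = 204                       # GB/s claimed, 32MB scratchpad only

print(f"PS4 GDDR5 (8GB unified): {ps4_gddr5:.0f} GB/s")
print(f"XB1 DDR3 (8GB):          {xb1_ddr3:.0f} GB/s")
print(f"XB1 ESRAM (32MB):        up to ~{xb1_esram_peak} GB/s (peak claim)")
```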
And this!

It's kind of funny observing Microsoft's constant re-juggling of their PR message hoping something sticks. If at first you don't succeed, try, try again...

"Cloud is going to give you more computational power!" - No, it's not.
"We never targeted the best specifications on our machine." - Fair enough, but...
"We up-clocked the CPU, this makes us better!" - Well, it is a good start but...
"Everything is going to be even, 50% is not going to happen!" - Yeah, 50% is probably not what we'll see but...
"... this shit is balanced. It's been designed to work together. Basically, it's secret sauce." - Okay but... the whole initial push of the PS4 was how there is no bottlenecks and how it's perfectly balanced.


And yet every single step of the way they pick up some stragglers who have been waiting to be wrapped in the warm bosom of Microsoft once again.
This!
Most of the spin has been debunked now.
Their next "thing/spin" is "balanced" now.
"We are more balanced, so the extra power of the PS4 is useless."
Jesus..... it never ends.
 

daxter01

8/8/2010 Blackace was here
Am I crazy, or did I just see SenjutsuSage correct lherre on RAM allocations? Does the dude even realize that lherre was probably looking at both consoles' dev kits when he clicked submit reply?
 

Chobel

Member
MS said:
Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing, but we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did."

I call bullshit on this one. Why not do both?
It's not like upping the GPU clock and using 14 CUs are mutually exclusive.

Some of those 2 additional CUs are defective; that's the whole point of the manufacturing redundancy.
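For context, on the same GCN assumptions as earlier (64 ALUs per CU, 2 FLOPs per clock), doing both would only have closed part of the gap on paper, and it would have required every chip to ship with 14 working CUs, which is exactly the yield problem the redundancy is there to avoid.

```python
# What "doing both" (14 CUs AND the 853 MHz clock) would give on paper,
# using the same 64-ALU-per-CU, 2-FLOPs-per-clock GCN assumptions as above.
def peak_gflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1000.0

print(peak_gflops(12, 853))  # shipped config:      ~1310 GFLOPS
print(peak_gflops(14, 853))  # hypothetical "both": ~1529 GFLOPS (+~17%)
print(peak_gflops(18, 800))  # PS4, for comparison: ~1843 GFLOPS
```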
 
By the way, SenjutsuSage, why are you still peddling the 5.5GB number when several insiders have confirmed it was wrong, including a Sony dev themselves? Going down the route of pushing known misinformation is pretty frowned upon round these parts...

The information came straight from Sony's own documentation, and speaking of pushing known misinformation: Remember that Xbox One GPU downclock that some of those exact same insiders were so sure of that they were already drawing the chalk lines around the Xbox One, and calling for an autopsy?

Those same insiders? Listen, far be it from me to doubt Sony devs, but the information was derived from official documentation. Am I supposed to listen to certain insiders, one of whom, according to other posters (I haven't seen this myself), supposedly said that Sony planned on upping their GPU speed to 1GHz AFTER the PS4 launched, knowing how batshit insane that sounds? I won't deny that some of these guys have information or know people, but some, not all, of the insiders that have given us information have appeared, at least to me, and certainly to others if they're willing to speak up, to have a clear agenda behind some of what they say. I'm not referring to cboat. I'm not even referring to the devs and Sony employees.

I won't beat around the bush, because I think that's even more frowned upon: thuway has said some things that have turned out blatantly false. That isn't exactly a bad thing in itself, because there's no such thing as perfect second-hand or third-hand information, but he certainly didn't act that way when he thought the Xbox One GPU downclock was real. My view is that if you say something, then openly take a victory lap and use it to mock people, as I recall him doing around that time, then you should prepare to have the things you say called into question later if it turns out all that premature celebration and confidence was unfounded.

Fair is fair, and I think he was one of the primary individuals calling the reservation numbers listed in the PS4 document false. Even in Sony's response to Eurogamer, they ended up confirming some of the things labeled in that article but declined to confirm the reservations, which only seemed to lend credence to the validity of what was said more than it discredited the information.

Now, what I found funny is that many of his posts prior to that revelation (and please correct me if I'm wrong, because I do like to be corrected when I'm wrong about something and have no problem acknowledging my error) made it sound like the PS4 OS was taking nothing more than 512MB, or that upwards of 7GB was available to devs for games. But then, as official documentation comes out casting doubt on those figures, suddenly our insider is suggesting 6GB is the real number, almost on demand.

If I'm mistaken please correct me, but I don't think he was so forthcoming about that information before. I seem to recall him being more than willing to have people believe something he either knew was false or didn't actually know at all. So that's just one tiny example of what I mean. And there was one other fairly prominent insider who made some pretty borderline super-secret-sauce posts regarding the Xbox One, but then all of that suddenly evaporated once more and more information came out. Say what you will about me, but I've maintained from the very beginning, stubborn as I may be, that the specs as leaked on vgleaks were exactly what the Xbox One was, and that I didn't feel there was anything wrong with that. Did I believe there were things not fully understood about the system? Sure, but that didn't have to mean that the specs as reported by vgleaks were somehow so close for the PS4 but way off for the XB1. It didn't make sense, and besides, I knew firsthand they were real, so there's also that.
 