
EuroGamer: More details on the BALANCE of XB1

NBtoaster

Member
How much "good AA" did Leadbetter put into this one?

2xSMAA. Not very demanding.

720p is 26% more than 640p
1080p is 44% more than 900p
So there's no way in hell the difference between them is tiny compared to 640p vs 720p.

Also what do you mean by good AA?

The difference is clearly tiny in those screenshots.

If a game is running at 900p because it can't handle 1080, exactly why do you think there will be sufficient resources for good AA?

If you lower resolution good AA becomes more important. The cost for post process AA should be very small on both console GPUs no matter the res.
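Incidentally, the percentage figures quoted above can be sanity-checked in a couple of lines. This assumes every resolution is 16:9, so pixel count scales with the square of the line count (the exact sub-HD widths games used vary a little):

```python
# Sanity-checking the resolution jumps quoted above. Assumes a 16:9
# aspect ratio throughout, so pixel count scales with the square of
# the vertical resolution.
def area_increase(low_h, high_h):
    """Fractional increase in pixel count from low_h lines to high_h lines."""
    return (high_h / low_h) ** 2 - 1

print(f"640p -> 720p:  +{area_increase(640, 720):.1%}")   # roughly the 26% quoted
print(f"900p -> 1080p: +{area_increase(900, 1080):.1%}")  # the 44% quoted
```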
 

Skeff

Member
yeah i know, but why is it funnier today than ever before? doesn't make sense to me

edit
feel almost like kanye with fish sticks

Usually figures are given out with supporting info. If a graphics card was announced, they would say something like:

2 billion transistors providing roughly 2 TFLOPs, with a ROP count of 32 and a power consumption of 100 watts, for an efficiency of 20 GFLOPs per watt.

Microsoft said:

5 Billion transistors!
8 core CPU!

It's the additional details that were missing. Not necessarily the transistor count itself, but on its own that figure gave precisely no information other than that the chip was big. It was designed as pure obfuscation rather than to lend credence to the listed specs.
 

NBtoaster

Member
Where does it say this?

On one of the videos

Our game focus for our next-gen GPU tests is the phenomenal Crysis 3 - the closest thing we have to a next-gen title in the here and now. Here we study how the game runs at 1080p on high settings with 2x SMAA anti-aliasing and v-sync engaged. We compare our two test platforms head-to-head with the same content, then re-run the tests with the target Xbox One version scaled down first to 1776x1000, then to 1600x900.
 

AzerPhire

Member
The moment MS said "5 billion transistors" at their E3 conference, I realized that X1 is nowhere close to PS4 power-wise. Their next statement was "we didn't target the highest end specs". The following statements were pure PR, like "infinite power of the cloud", "balance" and bandwidth funny math.

Any rational person would have inferred that X1 is not as capable as PS4 right after E3; no need for months and months of debate and defence. Key members of the industry have already talked about this:

AMD (makers of the APUs in BOTH X1 and PS4) said that the PS4 APU is the most powerful they have ever created for anyone.

Multiple devs have said that PS4 has a clear advantage power-wise.

Sony is confidently saying that PS4 is the most powerful console ever created.

MS is touting PARITY with PS4. (Notice the difference between MS's and Sony's messages.)


and we still have people questioning those facts?

I don't think anyone really questions these; it's more a question of 'how big of a difference will this make in the final product (games)?'

And we won't know that for sure until launch.
 

jcm

Member
SHAPE is the new HANA. You always need that one mysterious secret chip that supposedly is super powerful but turns out to be just a whole lot of rubbish.

That reminds me of one of my favorite "clueless, credulous games writer swallows PR bullshit" stories ever:

They realize what I'm talking about, and Scott Henson opens a small package and shows me what's inside.

"Is that it?" I ask. He nods.

"We call it Ana. This is the scaling chip that's in the 360," he tells me.

It's odd to see it—a tiny little chip—but this may be one of the secret weapons the 360 has against the PS3. The PS3 has no internal hardware scaler, which means games that are 720p native can only be shown in 720p or 480p; there is no scaling up to 1080p or 1080i. This causes people with older HDTVs to have issues with the available resolutions, and keeps them from playing the games in anything but 480p. It's a vexing problem for a system that's supposed to be HD, and this issue is one of the most challenging that Sony faces. I ask the Microsoft guys how important it was for them to include a scaler in the 360.

"It was a critical design decision; we wanted the 360 to be high-definition, not just 1080p or some other standard. That's why we included component cables in the box; there is no HDTV that doesn't have a component in," said Greenberg.

They assume that Sony didn't include a hardware scaler to keep costs down, but get a little cagey when I ask how much it costs to put Ana into the 360. "This isn't a $1,000 scaler," Henson says, "but it's a good one."

It was apparently designed at the same time as the GPU, and the effortless scaling with different televisions was something that was important from the early design stages of the system. I ask if they think this is something that Sony can fix in software.

"It'll be hard," Greenberg answers, "and compatibility testing would be tough with existing software. I think as they update the hardware they'll add a hardware scaler."

Six months later Eurogamer reported:

The only other major change under the bonnet is the new HANA video display chip, replacing the old ANA version in the classic 360. This chip has erroneously been described as the silicon that deals with the 360's inbuilt hardware scaling. In truth, Microsoft has now confirmed to us that it's merely a video output chip - a means of transferring the framebuffer into all of the different signals: composite, s-video, RGB SCART, component, VGA and - the key addition with HANA - HDMI. Scaling itself is actually performed by the Xenos GPU (most likely using a variation of Lanczos resizing) so in that respect the Elite performs identically to the original Xbox 360.
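As an aside, the Lanczos resizing Eurogamer speculates Xenos uses is just a windowed-sinc reconstruction filter. A minimal sketch of the 1-D kernel (the `a=3` lobe count is a common default, not something Eurogamer specified):

```python
import math

def lanczos(x, a=3):
    """1-D Lanczos kernel: sinc(x) windowed by sinc(x/a), zero for |x| >= a."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

# Weight given to a source pixel half a pixel away under Lanczos-3.
print(round(lanczos(0.5), 4))
```

When upscaling, each output pixel becomes a weighted sum of nearby source pixels using these weights, which preserves edges much better than bilinear filtering.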
 

frizby

Member
Why do people keep saying that we have to "wait till launch" to see the power differences?

I get that we want to measure third party games, but their target is parity anyway, and they likely won't tell us all that much, certainly not at launch.

But the first party games are already showing some real differences, even in the short clips we see leading up to launch...at least to my eye. Combine that with the hardware math, and the "wait till launch" argument seems to have the sole purpose of further muddying the water.
 
Right. PS4 doesn't have any hardware to match SHAPE. The audio processor in the PS4 is just a compression/decompression piece and does nowhere near what SHAPE does. Anything advanced enough to match it or go beyond has to be handled in software using CUs, which has been estimated would drain 100-200 GFLOPs to match SHAPE, or even more if you're doing audio raycasting.



Lay off the keyboard son. Even one of the engineers that worked on Shape said it's mostly for Kinect and devs won't make much use of it at all.
 

FeiRR

Banned
It's funnier because MS didn't give out the other relevant specs.

Sort of like announcing a new Prius hybrid with a tank capacity of 40L but without any MPG figures. :p
Actually it's somewhat the opposite: the more transistors you have, the more difficult and expensive (yields, materials) the chip is to manufacture. But you're right, it says nothing about processing power.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I made this point earlier in the thread and got dismissed. 180cm from a 42 inch screen and it looked identical to me.

1080p vs 900p is an important distinction if you are gaming on a large monitor inches from your face, but in living room conditions it will be indiscernible in the average set up.

People will want to claim they have super human eye sight and can definitely tell the difference, but there is a point at which the bullshit siren starts ringing for me.

A series of blind tests on stuff like this would make a fascinating article.

Yes. You're right. And I would imagine the average consumer wouldn't notice the difference either. Not when the game is sitting in their living rooms. Only the pixel counters will care.

Which could be interesting. Given that Xbox One has a faster CPU.
 

Skeff

Member
Yes. You're right. And I would imagine the average consumer wouldn't notice the difference either. Not when the game is sitting in their living rooms. Only the pixel counters will care.

Which could be interesting. Given that Xbox One has a faster CPU.

Sweet, post the PS4 confirmed clock speed please.
 
Yes. You're right. And I would imagine the average consumer wouldn't notice the difference either. Not when the game is sitting in their living rooms. Only the pixel counters will care.

Which could be interesting. Given that Xbox One has a faster CPU.

Console wars don't get any sadder than this. Audio secret sauce, lower res => higher res... Yeah man, it's all very interesting how common sense just gets thrown down the toilet to justify our choices in life. I'm going to laugh my ass off when multiplat games run at a higher res, more stable framerate, with better IQ and the same sound... It's going to be hilarious.
 

Cesar

Banned
Why is it so surprising that balance can be better than raw power? Just like Cerny said about PS4: they could have used eDRAM to get 1000 GB/s but chose not to, for ease of programming. When the hardware of Xbox One is used just the right way (this will take some time), it can surpass the brute-forced graphics of "better" hardware.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Console wars don't get any sadder than this. Audio secret sauce, lower res => higher res... Yeah man, it's all very interesting how common sense just gets thrown down the toilet to justify our choices in life. I'm going to laugh my ass off when multiplat games run at a higher res, more stable framerate, with better IQ and the same sound... It's going to be hilarious.

Since when did I say that? Please point out that post.
 

gofreak

GAF's Bob Woodward
Just like Cerny said about PS4 that they could have used edram to have 1000 GB/s but chose not to for ease of programming. When the hardware of Xbox One is used just the right way (this will take some time) they can surpass brute forced graphics of "better" hardware..

Where's Xbox One's 1000 GB/s of bandwidth though?

You can argue for a more complex system if there's a performance gain to be had. Where's XB1's?

You potentially have a 'SPU part deux' kind of programming discomfort situation with XB1's bandwidth setup, but without the big competitive gain in performance once you do get over that hump. You're working harder for 'adequate' bandwidth.
 

mrklaw

MrArseFace
Yes. You're right. And I would imagine the average consumer wouldn't notice the difference either. Not when the game is sitting in their living rooms. Only the pixel counters will care.

Which could be interesting. Given that Xbox One has a faster CPU.

So then the PS4 could switch to 900p. Then what? 720p looks great?
 
Common sense really. Can't believe there are people who actually believe SHAPE would be up to 200 GFLOPs, some (on B3D) even believing 400 GFLOPs! (4x more than the CPU itself lol.) No wonder that forum has lost its credibility. At least on GAF the mods are super strict about checking claimed sources and insider links (thank God). On most other forums people can just get away with posting whatever shit they want and claiming they know someone on the inside lol.


Actually part of the issue is misinterpretation. Half the time I can't even be bothered explaining myself; then someone else takes my posts out of context, then it evolves into me making statements that I never made. It can get scary.

It starts like this:

Exhibit A) SHAPE is only a 15 GFLOPs part; no way you can do audio raycasting with just 15 GFLOPs.

Exhibit B) It's not accurate to measure a sound processor by FLOPs any more than it is to measure a CPU that way. It's a discrete part designed to do one thing, so the best way to measure it is to figure out what it would cost in CUs to replicate its workload, and THAT's where the 100-200 GFLOPs figure comes from.

Exhibit C) It takes somewhere between 100-200 GFLOPs to do SHAPE's work in CUs, but if one were to also include audio raycasting as suggested by Cerny, we could be looking at up to 400 GFLOPs used up for audio.


Notice it's still not being said that SHAPE can handle audio raycasting or that it's a 200 GFLOP or 400 GFLOP part, but you can see how the argument can evolve into people thinking that's what's being said.

What SHAPE will do besides echo cancellation for Kinect is allow up to 4x discrete stereo sound, for say 4 players each using S/PDIF headphones; that's pretty impressive. Chat audio is a lot less compressed and sent at a much higher bitrate than on 360, and it sounds great, way better than current chat audio on LIVE. It can also perform high-polyphony playback and complex signal routing with custom DSPs, all with minimal CPU impact.

Anyone who currently owns or plans to own high quality headphones for 5.1 gaming or Live chat can rejoice.

http://www.computerandvideogames.com/426985/xbox-one-vs-xbox-360-voice-chat-quality-comparison/
 
Honestly, I just can't wait for these god damn consoles to come out and the initial DF articles to hit. Whatever the result is in favor of, these silly arguments of resolution and framerate will at least start to die.
 
Since when did I say that? Please point out that post.

Oof, you caught me on my day off man:

I ran the game at 1080p and at 900p, both with AA. At my normal seating distance of 8 or so feet, I couldn't tell the difference, especially when the game is moving.

cyberheater said:
Ushae said:
What's the real difference between 900p and 1080p? I mean to the human eye on a (for example) 42 inch TV? I'm actually curious.
With good AA, realistically, you're not going to notice the difference.

So basically, Mr "I sit 2 miles away from my 40-inch TV", you are actually saying that 900p is essentially as good... and maybe even better, because what's the upside to 1080p? 900p costs less performance, and hell, if you sit a bit further away you won't even need AA.
 
I used to own a 42 inch. I feel like it's tiny now... My friend still has a 37 inch sammy plasma. And he sits like 10 feet away from it. I can't see shit.
 

nib95

Banned
If you're using a 40" TV and sitting quite far away, it's true, you might not notice as much of a difference. 50" here, about 8-10 feet away, and I do notice.
 

Cidd

Member
I used to own a 42 inch. I feel like it's tiny now... My friend still has a 37 inch sammy plasma. And he sits like 10 feet away from it. I can't see shit.

I sit like 7ft away from my 55" and I can see the difference between 1080p and lower resolutions quite easily, those who don't see a difference need to get their eyes checked.
 

AzerPhire

Member
Why do people keep saying that we have to "wait till launch" to see the power differences?

I get that we want to measure third party games, but their target is parity anyway, and they likely won't tell us all that much, certainly not at launch.

But the first party games are already showing some real differences, even in the short clips we see leading up to launch...at least to my eye. Combine that with the hardware math, and the "wait till launch" argument seems to have the sole purpose of further muddying the water.

Because these games are still in development and are constantly changing? See Driveclub as an example of a game that keeps improving technically. This same thing could be happening with all of the other launch games and we are just not seeing or hearing about it.

Also it is near impossible to compare first party games. Different gameplay, AI, art styles, etc, etc. The only definitive way is through 3rd party games and we won't be able to tell until launch.

Don't know why that is hard to understand.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
So basically, Mr "I sit 2 miles away from my 40-inch TV", you are actually saying that 900p is essentially as good... and maybe even better, because what's the upside to 1080p? 900p costs less performance, and hell, if you sit a bit further away you won't even need AA.

Hyperbole much? Ha ha.

I'm just putting forward that MS knew they had a performance gap to fill relative to the GPU in PS4, and they might have figured and tested that the average consumer won't notice a large difference between native 1080p and 900p scaled to 1080p.

As an experiment, I set up both resolutions on my gaming PC playing BioShock Infinite on a 40 inch TV. From my normal seating distance the difference was very slight in stills, and as soon as things were in motion I couldn't tell the difference at all.

To be honest, I was so impressed that I might just switch all my games to run at 900p, as the boost in frame rate is a better trade-off.


Why are you asking this?

Because I always thought the dynamic resolution switching in Wipeout HD was a very good way to maintain a rock-steady framerate. I know Xbox One supports it. I was just wondering if PS4 did.
 

frizby

Member
Because these games are still in development and are constantly changing? See Driveclub as an example of a game that keeps improving technically. This same thing could be happening with all of the other launch games and we are just not seeing or hearing about it.

Sure, but one console has games that are doing things the other console's games aren't even trying to do. I'm thinking specifically of Driveclub, Infamous and Killzone, which are all pushing both fidelity and effects I haven't seen on X1.

Also it is near impossible to compare first party games. Different gameplay, AI, art styles, etc, etc. The only definitive way is through 3rd party games and we won't be able to tell until launch.

Don't know why that is hard to understand.

I'm not judging the games themselves before launch. Rather, I reserve the right to use their relative IQ completely independent from their gameplay or genre as a factor in deciding which console has better hardware, and indeed, which one to preorder.
 

Binabik15

Member
Besides size and IQ, what about precision when it comes to lower resolution? Over 40% more pixels surely must have an impact on the accuracy of hit boxes?

Just a gut feeling, but console games always seem to have wonky hit detection compared to high-res games on PC, where a pixel is a pixel and not some sort of stretched blob that gets scaled, maybe even differently in the horizontal and vertical dimensions. Not even counting low-poly geometry that blocks your projectile thanks to being a solid rectangle or whatever, textured to look like a broken wall. Argh.


Of course that's comparing (sub-)720p games to 1080p, but still.
 
The moment MS said "5 billion transistors" at their E3 conference, I realized that X1 is nowhere close to PS4 power-wise. Their next statement was "we didn't target the highest end specs". The following statements were pure PR, like "infinite power of the cloud", "balance" and bandwidth funny math.

Any rational person would have inferred that X1 is not as capable as PS4 right after E3; no need for months and months of debate and defence. Key members of the industry have already talked about this:

AMD (makers of the APUs in BOTH X1 and PS4) said that the PS4 APU is the most powerful they have ever created for anyone.

Multiple devs have said that PS4 has a clear advantage power-wise.

Sony is confidently saying that PS4 is the most powerful console ever created.

MS is touting PARITY with PS4. (Notice the difference between MS's and Sony's messages.)


and we still have people questioning those facts?

I think the goal (of Microsoft and their defense force fans) is simply to try and minimize the advantage as much as possible. Everyone, even Microsoft, seems to acknowledge that the PS4 has some advantages.

It's all about trying to frame those advantages in the smallest possible light.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Besides size and IQ, what about precision when it comes to lower resolution? Over 40% more pixels surely must have an impact on the accuracy of hit boxes?

Hit boxes based on bounding boxes and spheres are completely decoupled from resolution.
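To illustrate why: hit tests run against world-space volumes before anything is rasterised, so the frame buffer's resolution never enters the calculation. A toy sketch (names and coordinates made up for illustration):

```python
# Toy axis-aligned bounding-box (AABB) overlap test. Everything here is
# world-space geometry; render resolution appears nowhere.
from dataclasses import dataclass

@dataclass
class AABB:
    min_x: float
    min_y: float
    max_x: float
    max_y: float

def intersects(a: AABB, b: AABB) -> bool:
    """True when the two boxes overlap on every axis."""
    return (a.min_x <= b.max_x and a.max_x >= b.min_x and
            a.min_y <= b.max_y and a.max_y >= b.min_y)

player = AABB(0.0, 0.0, 1.0, 2.0)
shot   = AABB(0.5, 1.0, 0.6, 1.1)
print(intersects(player, shot))  # same answer at 720p, 900p or 1080p
```

What resolution *can* affect is the player's aim (you see a coarser picture), but the collision math itself is identical.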
 
Oh dear. What nonsense this thread is. Do we have any case examples of this balance (I mean software usage cases)? Does this thread have any relevant or important info? This all seems to amount to nothing.

I have to revisit these from time to time to see if any interesting info comes along. Most likely a new thread will have been made anyway.

Anyway, have a nice evening everybody.
 

Mad_Ban

Member
Because I always thought the dynamic resolution switching in Wipeout HD was a very good way to maintain a rock-steady framerate. I know Xbox One supports it. I was just wondering if PS4 did.
Of course it supports it. It's an engine-related "feature". RAGE on both the PS3 and 360 changed resolution dynamically. :)
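The dynamic resolution trick being discussed boils down to resizing the render target frame by frame based on how close the last frame came to budget. A rough sketch of the idea; all thresholds and step sizes here are illustrative, not taken from WipEout HD, RAGE, or either console's SDK:

```python
# Rough sketch of per-frame dynamic resolution scaling.
# All numbers are illustrative, not any real engine's values.
FRAME_BUDGET_MS = 16.7             # ~60 fps target
MIN_WIDTH, MAX_WIDTH = 1280, 1920  # clamp the render target width

def next_render_width(width, last_frame_ms):
    """Shrink the render target when over budget, grow it back when under."""
    if last_frame_ms > FRAME_BUDGET_MS:           # missed the frame: drop res
        width = int(width * 0.9)
    elif last_frame_ms < FRAME_BUDGET_MS * 0.85:  # plenty of headroom: raise it
        width = int(width * 1.05)
    return max(MIN_WIDTH, min(MAX_WIDTH, width))

width = 1920
for frame_ms in (18.0, 19.5, 16.0, 13.0, 12.5):  # simulated frame times
    width = next_render_width(width, frame_ms)
print(width)
```

The scaler then stretches whatever was rendered back to the display resolution, so the framerate stays steady at the cost of a softer image in heavy scenes.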
 

AzerPhire

Member
Sure, but one console has games that are doing things the other console's games aren't even trying to do. I'm thinking specifically of Driveclub, Infamous and Killzone, which are all pushing both fidelity and effects I haven't seen on X1.



I'm not judging the games themselves before launch. Rather, I reserve the right to use their relative IQ completely independent from their gameplay or genre as a factor in deciding which console has better hardware, and indeed, which one to preorder.

But that just reinforces my point. We know the PS4 is better; MS has not denied this. However, right now it is almost impossible to put a number on how much better it really is.

And looking at Killzone, Driveclub and Infamous, it is hard to compare those to what the X1 has.

MS does not have an FPS to compare to Killzone; we will need to wait and see what Halo looks like before making that comparison.

Driveclub and Forza can be compared, but they are two different kinds of racers (one is more arcadey and the other is a full-on simulation with tons of additional calculations going on each second).

And there isn't an open-world game we can compare to Infamous. Dead Rising possibly, but it is not a true first party title and is going for quantity of objects on screen versus better image quality.

Again, I'm not trying to say the X1 is on par with PS4. I know PS4 is better; my point is that it is impossible at this point to say how much better it really is.
 