
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

Seriously, the hot RAMZ was FUD; I thought this was dispelled months ago?

Dispelled? It's fact, lol. The FUD was that the GDDR5 was unmanageable.

More pics:

[image: GPU-Z screenshot]
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Interesting, I wonder if Microsoft let them do it.

According to another tweet from that timeline, no. Wouldn't make any sense anyway. The whole point of the XBO's OS setup and resource reservation is to provide solid isolation between the Win8/apps-part and the game-part. Compromising that isolation would also compromise the concept of the console. (Assuming that it could even be done.)
 
According to another tweet from that timeline, no. Wouldn't make any sense anyway. The whole point of the XBO's OS setup and resource reservation is to provide solid isolation between the Win8/apps-part and the game-part. Compromising that isolation would also compromise the concept of the console. (Assuming that it could even be done.)

What Activision was asking was pretty much "Can you kill off snap-mode? This thing is irritating as hell."

No way MS would do that, even for CoD: Ghosts.

... Maybe if Respawn asked? :p
 

Special C

Member
If Xbox One provides me with a more streamlined OS, working snap, great app integration, and a well-implemented Kinect, then the lower resolutions won't matter a bit. I'll take those advantages over a graphical advantage any day. I'm aligned with Microsoft's vision on this stuff.

However, where I'm skeptical is whether they can actually implement this stuff and make it work. If they can't, I imagine I'll be getting a PS4 sometime in 2014.
 

Marlenus

Member
VRAM is the GDDR pool. VRM (Video RAM) is what you're looking for. It's 57°C (VRAM) to 39°C (GPU). GDDR runs very hot.

Your points are going everywhere.

Yes, again: RAM is a lot less sensitive to heat than logic is. We don't know which card is being used, but it's very clear what is being shown. GPU-Z pulls data directly from the GPU sensors, so it's as accurate as they are.

VRAM runs hottest in most high-end systems; it's well established. I can provide more pictures that all show the same conclusion if you want...

The VRM is the Voltage Regulator Module, not the memory. This does get mighty hot but is very rarely actively cooled. If you did not have active cooling on a GPU and disabled automatic throttling/shutdown, the damn thing would destroy itself from temperatures well in excess of 100 degrees C.
 
Yes, it is measuring the heat of the memory controller, which is not RAM. I'm glad we are in agreement now.

It's measuring the temperature of the RAM. There is hardly any logic even in an MC; there's no reason it would need a sensor, as there is generally no reason to know its temperature.
 

evilalien

Member
It's measuring the temperature of the RAM. There is hardly any logic even in an MC; there's no reason it would need a sensor, as there is generally no reason to know its temperature.

You're wrong about this just like you were wrong about what a VRM is, but there is clearly no convincing you of that fact so enjoy living in ignorance I suppose.
 

Nafai1123

Banned
Can anyone with more technical knowledge than me comment on the capability of the ESRAM with deferred rendering? It seems like forward-rendered games are capable of running at 1080p (Forza) while deferred engines are having problems. Does this have to do with the size of the framebuffer in relation to the 32MB of ESRAM? And if that is the problem, is there a possible workaround?
 
Has this already been discussed somewhere?

[image attachments]


I actually find that odd. I didn't think that the raw processing power was one of CoD's bottlenecks, but the ESRAM size alone.

That's quite telling then. It would certainly imply that the XB1's GPU had trouble handling Ghosts at higher than 720p resolutions when only 90% of it is available

That's odd
 

benny_a

extra source of jiggaflops
The voltage regulator is VREG. I'll take the real world if you don't mind
You first posted an image with GPU VRM Temperature circled while the GPU Memory temperature was at 31 degrees Celsius.

You then used the VRM acronym to mean Video RAM while the GPU memory obviously refers to the outside temperature. (My conjecture; you never said what it could mean if VRM is Video RAM.)

You doubled down on this misinterpretation.
 

Marlenus

Member
The voltage regulator is VREG. I'll take the real world if you don't mind

1) I do not see VREG anywhere, and the VRM is the Voltage Regulator Module. http://en.wikipedia.org/wiki/Voltage_regulator_module

2) The biggest heating device on a dGPU is the part that draws the most power; that would be the GPU itself, which is why it is actively cooled, unlike the VRM and the memory modules, which on stock configurations just settle for passive cooling.

3) The fact that the GPU core is at a lower temperature than the VRM/memory does not mean it is generating less heat; it is actively cooled, and a large portion of the heat generated is removed via the heatsink and fan.
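To put some rough numbers on point 3 (all figures below are made up for illustration, not measurements from any card): steady-state temperature is roughly ambient plus power times thermal resistance, and a heatsink plus fan slashes that resistance, so the part dumping ten times the watts can still read cooler than the passively cooled parts next to it.

# Illustrative only; wattages and thermal resistances are assumptions, not measurements.
ambient = 25.0                       # case air temperature, deg C
gpu_watts, mem_watts = 150.0, 15.0   # assumed heat dissipated by GPU core vs memory
gpu_rth, mem_rth = 0.15, 2.0         # deg C per watt: heatsink + fan vs passive cooling
print(f"GPU core: {ambient + gpu_watts * gpu_rth:.0f} C while dissipating {gpu_watts:.0f} W")
print(f"Memory:   {ambient + mem_watts * mem_rth:.0f} C while dissipating {mem_watts:.0f} W")
# GPU core: 48 C while dissipating 150 W
# Memory:   55 C while dissipating 15 W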
 

Skeff

Member
I don't see all the fuss. It's early and devs haven't even tapped into the cloud yet.

Hey, man, why don't you take a seat.

As it turns out, I've got some bad news: the cloud won't change anything.

Sorry. You want a lollipop?

OK, buh-bye, have fun and don't let the bad news weigh you down :)
 
1) I do not see VREG anywhere, and the VRM is the Voltage Regulator Module. http://en.wikipedia.org/wiki/Voltage_regulator_module

2) The biggest heating device on a dGPU is the part that draws the most power; that would be the GPU itself, which is why it is actively cooled, unlike the VRM and the memory modules, which on stock configurations just settle for passive cooling.

3) The fact that the GPU core is at a lower temperature than the VRM/memory does not mean it is generating less heat; it is actively cooled, and a large portion of the heat generated is removed via the heatsink and fan.

[image attachment]


You first posted an image with GPU VRM Temperature circled while the GPU Memory temperature was at 31 degrees Celsius.

You then used the VRM acronym to mean Video RAM while the GPU memory obviously refers to the outside temperature. (My conjecture; you never said what it could mean if VRM is Video RAM.)

You doubled down on this misinterpretation.

I said the captions aren't mine. I didn't circle anything.
 

Y2Kev

TLG Fan Caretaker Est. 2009
I don't think ESRAM makes sense as an excuse for CoD. The engine is not using deferred rendering.

Weak GPU makes more sense. Fillrate issue.
 
I don't think ESRAM makes sense as an excuse for CoD. The engine is not using deferred rendering.

Weak GPU makes more sense. Fillrate issue.

How much do you think the immaturity of the dev tools and SDK had an effect?

You think future iterations of COD on XB1 will have improved resolution?

Cboat did mention a while back that one of the SDK updates dropped performance by like 20% or something crazy like that. Wondering if they've got that sorted yet

If it is mostly down to the GPU though that's not a good sign for the future.
 

Y2Kev

TLG Fan Caretaker Est. 2009
I'm not a developer. I have no idea.

There are a lot of confounding factors. CoD: Ghosts should be higher than 720p on Xbox One. It is a marginally improved engine. My natural inclination is to say things will get better, but what if they upgrade their technology next year or the year after and the game becomes more technically demanding?

Keep in mind this gen the resolution of CoD got worse over time (until the end, when Blops 2 made it sorta wackily different).

The GPU is weak. Period. The PS4 GPU is not a superstar but the Xbone GPU is like worse than the RSX for its respective time period.
 

le.phat

Member
I'm not a developer. I have no idea.

There are a lot of confounding factors. CoD: Ghosts should be higher than 720p on Xbox One. It is a marginally improved engine. My natural inclination is to say things will get better, but what if they upgrade their technology next year or the year after and the game becomes more technically demanding?

Keep in mind this gen the resolution of CoD got worse over time (until the end, when Blops 2 made it sorta wackily different).

The GPU is weak. Period. The PS4 GPU is not a superstar but the Xbone GPU is like worse than the RSX for its respective time period.

[image: "You Died"]
 

killatopak

Gold Member
Can anyone with more technical knowledge than me comment on the capability of the ESRAM with deferred rendering? It seems like forward-rendered games are capable of running at 1080p (Forza) while deferred engines are having problems. Does this have to do with the size of the framebuffer in relation to the 32MB of ESRAM? And if that is the problem, is there a possible workaround?

I'm not technically inclined, but I've been reading up on this stuff, and size is the problem for the most part. Using deferred rendering on modern engines, 32MB is just enough for around 720p-900p.

I'm not sure if there are any workarounds other than not using deferred rendering, but other more technically inclined GAF members should be able to answer you.
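To give a rough idea of why 32MB is tight (the G-buffer layout below is just a common assumption, not anything confirmed for these specific engines): a deferred renderer keeps several full-resolution render targets plus a depth buffer resident at once, so the footprint scales directly with pixel count.

# Back-of-the-envelope only; render-target count and formats are assumptions.
def gbuffer_mb(width, height, render_targets=4, bytes_per_pixel=4, depth_bytes=4):
    """Approximate G-buffer footprint in MB: RTs at 32 bits per pixel plus a 32-bit depth buffer."""
    return width * height * (render_targets * bytes_per_pixel + depth_bytes) / (1024 * 1024)

for w, h in [(1280, 720), (1600, 900), (1920, 1080)]:
    print(f"{w}x{h}: {gbuffer_mb(w, h):.1f} MB vs 32 MB of ESRAM")
# 1280x720: 17.6 MB, 1600x900: 27.5 MB, 1920x1080: 39.6 MB

Under those assumptions 720p fits comfortably, 900p barely fits, and 1080p spills out of the ESRAM entirely, which lines up with the 720p-900p ceiling and with forward-rendered games like Forza not hitting the same wall.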
 
That's quite telling then. It would certainly imply that the XB1's GPU had trouble handling Ghosts at higher than 720p resolutions when only 90% of it is available

That's odd

I think MS will have to settle for no snap mode for some games. It may become a per-game feature. 10% of the GPU is a lot, especially when you're already underpowered.
 

twobear

sputum-flecked apoplexy
The GPU is weak. Period. The PS4 GPU is not a superstar but the Xbone GPU is like worse than the RSX for its respective time period.

Well, unless I'm mistaken, so is the PS4's GPU. Not to downplay the comparison too much, but yeah.
 
I don't think ESRAM makes sense as an excuse for CoD. The engine is not using deferred rendering.

Weak GPU makes more sense. Fillrate issue.

Do we know that? They've confirmed it has an all-new real-time lighting model on next-gen consoles. I know that doesn't necessarily mean deferred lighting, but I'd be surprised if it wasn't, given the circumstances.
 
I think MS will have to settle for no snap mode for some games. It may become a per-game feature. 10% of the GPU is a lot, especially when you're already underpowered.

I can't see that happening though. It would likely cause OS stability issues, I would imagine, as I thought MS kind of cordoned off the resources for snap and the like?

Well, unless I'm mistaken, so is the PS4's GPU. Not to downplay the comparison too much, but yeah.

Yes, the PS4 GPU isn't particularly powerful, but considering the differences, the XB1's is very, very weak.

As far as PS4 vs Xbox One GPU-wise, here's how it stacks up.

PS4: 1.84TF GPU (18 CUs)
PS4: 1152 shaders
PS4: 72 texture units
PS4: 32 ROPs
PS4: 8 ACEs / 64 queues
PS4: 8GB GDDR5 @ 176GB/s

Versus

Xbone: 1.31TF GPU (12 CUs)
Xbone: 768 shaders
Xbone: 48 texture units
Xbone: 16 ROPs
Xbone: 2 ACEs / 16 queues
Xbone: 8GB DDR3 @ 69GB/s + 32MB ESRAM @ 109GB/s
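For anyone wondering where the TF figures come from, it's just shaders x 2 FLOPs per clock x clock speed. A quick sketch (the 800MHz and 853MHz clocks are my own assumption, not part of the list above):

# Sanity check on the TF figures; clock speeds are assumptions, not from the spec list.
def tflops(shaders, clock_mhz, flops_per_clock=2):
    """Theoretical single-precision throughput in TFLOPS."""
    return shaders * flops_per_clock * clock_mhz * 1e6 / 1e12

ps4 = tflops(1152, 800)   # ~1.84 TF
xb1 = tflops(768, 853)    # ~1.31 TF
print(f"PS4: {ps4:.2f} TF")
print(f"XB1: {xb1:.2f} TF ({xb1 * 0.9:.2f} TF with the ~10% system reservation mentioned earlier)")
# PS4: 1.84 TF
# XB1: 1.31 TF (1.18 TF with the ~10% system reservation mentioned earlier)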
 

Melchiah

Member
Yes, but you know what, it's always been like that. Reviews, for example, are often based on personal preferences and are rarely ever objective. That's why I never read them. We are surrounded by it all year round, and now people are surprised because it concerns hardware? Where have these people been all this time?

The thing is, reviews are expected to include writers' subjective views, unlike hardware comparisons, which have recently been tainted by all kinds of excuses, whether it's balance or the indifference between resolutions. Those same tech articles didn't have similar derails when PS3 and 360 were compared.
 

Marlenus

Member
Well, unless I'm mistaken, so is the PS4's GPU.

It depends on the metric. In pure performance terms the PS4/Xbox One GPUs are slower compared to top-tier PC parts than the 360/PS3 were back when they launched. In terms of power consumption the PS4/Xbox One parts are roughly on par with what was in the 360 and the PS3. The issue is that on the PC front, power consumption for graphics cards has been going up.

When you want to design a box that is going to be small, quiet and stable, you cannot put a 200+ watt GPU into it without making a compromise on size or noise.
 

reKon

Banned
Why Xbox One Is Better Is Because They Are Focusing On Gamers Now... Just Like The New Controller. It Focuses On Family Gaming, And Media. Reason Why I Think PS3, And PS4 Sucks Is Because They Copied The Kinect, And Because It Doesn't Have Minecraft. The Xbox One Is Expensive But Is Worthed. PS3 Have Alot Of Hackers And It Ruins The Games. Another Thing Xbox One WILL SUPPORT USED GAMES. Xbox One Console Looks More Futuristic Too. Watching TV On Your Xbox One Was A Cool Idea Too.

- Youtube Comment from this "user": http://www.youtube.com/user/ayewiwita


Wow, this is like some Samsung fake-comment-level shit going on here, lol. For those who know what I'm talking about, you have to agree with me here.
 

Conor 419

Banned
The difference between the two systems is startling, and it seems to be a much greater gap than the one between the Xbox 360 and the PS3; a size almost half that of me member, har.
 
It depends on the metric. In pure performance terms the PS4/Xbox One GPUs are slower compared to top-tier PC parts than the 360/PS3 were back when they launched. In terms of power consumption the PS4/Xbox One parts are roughly on par with what was in the 360 and the PS3. The issue is that on the PC front, power consumption for graphics cards has been going up.

When you want to design a box that is going to be small, quiet and stable, you cannot put a 200+ watt GPU into it without making a compromise on size or noise.

In terms of architecture too, the PS4 GPU is more modern.
 

Sinthor

Gold Member
I think MS will have to settle for no snap mode for some games. It may become a per-game feature. 10% of the GPU is a lot, especially when you're already underpowered.

But there's no way they'd do this. They'd rather have games run at a lower resolution or framerate than the competition than completely break the user experience they're marketing. MS is marketing the XB1 as a TV extender that does games as well. All the running of apps and snap mode, etc., is key to their vision. If they break that...they trash their vision.

Imagine people's confusion and complaints when they can run their apps or use snap mode with every game EXCEPT for COD, let's say. Or if you just let developers kill that when they want, how long is it before none of the apps or snap mode works with ANY of the games you're playing?

No, MS won't do that. People just have to realize, I believe, that MS is targeting the XB1 differently than it did the 360. They also already said they didn't target the higher end of graphics, etc., for gaming, people just didn't listen. In the end, MS is banking on the app and TV functionality making the XB1 take off like the Wii did. Remember, the Wii's graphics capabilities were NOTHING like the X360 or the PS3, but it did very well. MS wants to do that, and the XB1 system is the way they believe they can make that happen.
 

Marlenus

Member
In terms of architecture too, the PS4 GPU is more modern.

That is also true; the 360 had a forward-looking GPU with the unified architecture, but the PS3 had the separate vertex and pixel pipeline design of the Nvidia 7xxx series GPUs.

At least both the X1 and the PS4 are using GCN, which is a current and up-to-date architecture.
 

Nafai1123

Banned
I'm not technically inclined, but I've been reading up on this stuff, and size is the problem for the most part. Using deferred rendering on modern engines, 32MB is just enough for around 720p-900p.

I'm not sure if there are any workarounds other than not using deferred rendering, but other more technically inclined GAF members should be able to answer you.

It seems like a major oversight by MS, unless they designed the box specifically targeting 720p. Many engines now and going into the future are using deferred rendering or a mix of forward and deferred, which will basically force them into running around that resolution throughout the generation.
 

longdi

Banned
MS can still win the tech wars by dropping the XB1 price to $349.
$499 for a clearly weaker console, lol; not even Apple is that bad. MS is trying to be the new Nintendo.
 