
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

CLEEK

Member
The second thing on the design board is the must-have 8GB of RAM, which gave them 2 potential solutions:

A unified 8GB DDR3 pool.
A split RAM pool: 4GB DDR3 and 4GB GDDR5.

The second solution is less attractive because a) GDDR5 costs more than DDR3, and we already know they must be making a profit on day one while also including Kinect and the audio block, and b) as the PS3 showed, a split memory architecture can be difficult to work with.

Having unified RAM seems like the obvious design choice, and Cerny has said that was the number 1 request from devs when planning the PS4.

But if you know your OS is going to reserve a significant chunk that will never be available to games, why didn't the Xbone go for split RAM? 4GB GDDR5 for games, 4GB DDR3 for the OS/apps. Games would lose ~1GB of RAM, but would gain access to memory perfectly suited to modern gaming. It would also do away with the band-aid ESRAM solution. As far as games go, it would still be unified memory: the gaming side wouldn't know of the existence of the DDR3, so it would just have 4GB of GDDR5 for the frame buffer and everything else.

I'm sure I know the answer to my question: cost. It would cost more to have two types of RAM (one being the more expensive GDDR5) with double the memory controllers etc. As gaming was never the priority for the Xbox, it wouldn't have been approved.
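For a rough sense of the bandwidth gap behind that trade-off, here's a back-of-envelope sketch (assuming the commonly reported 256-bit buses, DDR3-2133 on the Xbone and 5500 MT/s effective GDDR5 on the PS4):

```python
# Peak theoretical bandwidth = bus width (bytes) * transfer rate (MT/s)
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_mts):
    return (bus_width_bits / 8) * transfer_rate_mts / 1000  # GB/s

xbone_ddr3 = peak_bandwidth_gbs(256, 2133)  # ~68 GB/s
ps4_gddr5 = peak_bandwidth_gbs(256, 5500)   # ~176 GB/s
print(f"Xbone DDR3: {xbone_ddr3:.0f} GB/s")
print(f"PS4 GDDR5:  {ps4_gddr5:.0f} GB/s ({ps4_gddr5 / xbone_ddr3:.1f}x)")
```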
 

system11

Member
The thing is, there is competitive pressure from other publishers and even competition with first-party titles to disincentivize such actions.

If one gimps its games for the sake of "parity", it leaves them open to having a competitor or the first-party devs make their games look far inferior by being more willing to push things on a platform.

Interesting times ahead.

I don't understand why the publishers would care. What are MS going to do to them, deny a release when they need all the software they can get?
 
Really I'm mostly interested in what top-level Sony devs and top-level MS devs do, tbh... even with the multi-plat launch games running a bit better on PS4. That's the real test IMO... Destiny is the start, but I think it comes down to Halo and Uncharted. I'm ready to see what the ICE Team is capable of!
 

Melchiah

Member
In the last six months some of the people in my circles, who used to prefer Xbox, have gone from excited to judgmental, and distanced themselves from the whole subject. When the specs and graphics come up in a conversation, the usual reaction is that they don't matter, games are more important, and mentioning them is a fanboy's cry for a console war. Funnily enough, when the tables were reversed, the specs and graphics seemed to be a great deal. Just like it's been recently with tech sites like Ars Technica and Digital Foundry.


"No habit or quality is more easily acquired than hypocrisy, nor any thing sooner learned than to deny the sentiments of our hearts and the principle we act from: but the seeds of every passion are innate to us, and nobody comes into the world without them."
- Bernard de Mandeville

“Everyone loves a witch hunt as long as it's someone else's witch being hunted.”
- Walter Kirn
 
In the last six months some of the people in my circles, who used to prefer Xbox, have gone from excited to judgmental, and distanced themselves from the whole subject. When the specs and graphics come up in a conversation, the usual reaction is that they don't matter, games are more important, and mentioning them is a fanboy's cry for a console war. Funnily enough, when the tables were reversed, the specs and graphics seemed to be a great deal. Just like it's been recently with tech sites like Ars Technica and Digital Foundry.

Yes, but let the Sony boys stop acting as if they weren't doing the same exact thing every other generation. Last I checked they've been doing this for the past 2 console generations to the superior Xbox consoles, so let's get real here and stop being hypocrites.

For me what disappoints me the most isn't the resolution itself but the message it sends me. To me, this tells me that the Xbox already has trouble keeping up. And that (for a console that hasn't even launched yet) is a massive disappointment to me. COD in 720p? Really? That's the drop that did it for me.
 

FranXico

Member
Yes, but let the Sony boys stop acting as if they weren't doing the same exact thing every other generation. Last I checked they've been doing this for the past 2 console generations to the superior Xbox consoles, so let's get real here and stop being hypocrites.

Using the hypocrisy of some to justify the hypocrisy of others says a lot...
 
PS4 hardware isn't that powerful either; it'll be fine in that case...

A 30W CPU and an 80-100W GPU are easy enough to cool.

Laptop makers put far more powerful hardware in a 4x thinner laptop shell (and those do overheat, haha).

The PS4's thermal footprint should be a lot higher than the Xbone's: 18 compute units vs 12 compute units, and GDDR5 vs DDR3 (even if it is in the clamshell). Note that the memory is usually the biggest heat generator in a GPU card. You also can't compare to laptops, because those get a lot higher-quality (and higher-priced) parts.

If the PS4 works without any problems, it'd be one of the best engineering jobs of the last 5 years.
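To put very rough numbers on that (a sketch only: it assumes GPU dynamic power scales with compute units times clock at a fixed voltage, and uses the widely reported CU counts and clocks):

```python
# Crude dynamic-power comparison: P ~ CU count * clock, voltage assumed equal.
ps4_cus, ps4_mhz = 18, 800
xbone_cus, xbone_mhz = 12, 853

ratio = (ps4_cus * ps4_mhz) / (xbone_cus * xbone_mhz)
print(f"PS4 GPU vs Xbone GPU, crude power ratio: {ratio:.2f}x")  # ~1.41x
```

That ignores the ESRAM, the memory interfaces and leakage, so treat it as an intuition pump rather than a TDP estimate.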
 

Lazer

Neo Member
In the last six months some of the people in my circles, who used to prefer Xbox, have gone from excited to judgmental, and distanced themselves from the whole subject. When the specs and graphics come up in a conversation, the usual reaction is that they don't matter, games are more important, and mentioning them is a fanboy's cry for a console war....

"No habit or quality is more easily acquired than hypocrisy, nor any thing sooner learned than to deny the sentiments of our hearts and the principle we act from: but the seeds of every passion are innate to us, and nobody comes into the world without them."
- Bernard de Mandeville

I don't know whom you are referring to, but (at least) I am perfectly happy with PC, iPad and a future PlayStation 4. No Xbone here.

Calling people hypocrites over their preferences in gaming platforms (or phones, etc.) is rather insulting, though. Most people won't tolerate that and therefore get annoyed by the zealous pushing of the one true opinion™. Even if the specs and/or games were clearly better on machine A, people should be "allowed" to buy machine B if they choose to.
 
The PS4's thermal footprint should be a lot higher than the Xbone's: 18 compute units vs 12 compute units, and GDDR5 vs DDR3 (even if it is in the clamshell). Note that the memory is usually the biggest heat generator in a GPU card. You also can't compare to laptops, because those get a lot higher-quality (and higher-priced) parts.

If the PS4 works without any problems, it'd be one of the best engineering jobs of the last 5 years.

BUT what if it isn't? Holy shit, massive failures confirmed. I am concerned.
 

Shahed

Member
Yes, but let the Sony boys stop acting as if they weren't doing the same exact thing every other generation. Last I checked they've been doing this for the past 2 console generations to the superior Xbox consoles, so let's get real here and stop being hypocrites.

And they are just as silly and juvenile as the people who are doing it now. If people prefer the weaker system because that's where their preferences lie, then that's perfectly valid, and that should be the main reason for choosing between the two consoles rather than the specs. But there's no need to be disingenuous.
 

DrM

Redmond's Baby
When the specs and graphics come up in a conversation, the usual reaction is that they don't matter, games are more important, and mentioning them is a fanboy's cry for a console war. Funnily enough, when the tables were reversed, the specs and graphics seemed to be a great deal.
Yeah, I have been watching the same behavior pattern on several local online communities.

Suddenly, gameplay is all-important, and if you discuss graphics, they urge you to go with PC. Also, almost all the 'hardcore' X360 guys are missing. No wonder; they were the most vocal back when piracy was something normal on X360 around here.
 

Melchiah

Member
Yes, but let the Sony boys stop acting as if they weren't doing the same exact thing every other generation. Last I checked they've been doing this for the past 2 console generations to the superior Xbox consoles, so let's get real here and stop being hypocrites.

For me what disappoints me the most isn't the resolution itself but the message it sends me. To me, this tells me that the Xbox already has trouble keeping up. And that (for a console that hasn't even launched yet) is a massive disappointment to me. COD in 720p? Really? That's the drop that did it for me.

Everyone has their preferences; what makes it hypocritical is when people, let alone tech sites, try to change and distort the subject when things aren't going their way. I don't recall there being such attempts with the PS2 and the first Xbox, as there was no denying that the latter had more capable hardware. I think it's been a hard pill to swallow for some people, after being two generations in the winning camp when it comes to specs and graphics.

I also find it odd that people are claiming resolutions will come down from 1080p later on during next gen, based on this gen's games. I don't remember newer titles from the same developers going below 720p during the course of this gen, apart from Resistance 3.

When it comes to the downplayed difference between 720p and 1080p, the difference between the resolutions was obvious to me when playing something like WipEout HD.
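For reference, the raw pixel counts behind that comparison:

```python
# 1080p pushes 2.25x the pixels of 720p every frame.
px_720p = 1280 * 720     #   921,600 pixels
px_1080p = 1920 * 1080   # 2,073,600 pixels
print(f"1080p / 720p = {px_1080p / px_720p:.2f}x")  # 2.25x
```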



EDIT:
I don't know whom you are referring to, but (at least) I am perfectly happy with PC, iPad and a future PlayStation 4. No Xbone here.

Calling people hypocrites over their preferences in gaming platforms (or phones, etc.) is rather insulting, though. Most people won't tolerate that and therefore get annoyed by the zealous pushing of the one true opinion™. Even if the specs and/or games were clearly better on machine A, people should be "allowed" to buy machine B if they choose to.

Hello, old friend. =)

I was mostly referring to people's posts and comments on my FB news feed. It seems to me some people get annoyed when facts (specs) that don't align with their preferences are brought up, and as a result they try to downplay them by various means. No-one's trying to tell anyone what they should buy, though.
 
The PS4's thermal footprint should be a lot higher than the Xbone's: 18 compute units vs 12 compute units, and GDDR5 vs DDR3 (even if it is in the clamshell). Note that the memory is usually the biggest heat generator in a GPU card. You also can't compare to laptops, because those get a lot higher-quality (and higher-priced) parts.

If the PS4 works without any problems, it'd be one of the best engineering jobs of the last 5 years.
I'm not so sure. If it is, it probably won't be by a lot. The XB1's CPU and GPU are clocked higher, and its APU is packed with transistors due to the ESRAM. I have no numbers, but I wouldn't be surprised if their thermal footprints end up being similar.
 
I'm not so sure. If it is, it probably won't be by a lot. The XB1's CPU and GPU are clocked higher, and its APU is packed with transistors due to the ESRAM. I have no numbers, but I wouldn't be surprised if their thermal footprints end up being similar.

I don't think clock matters as much as ALU count. Heat and cooling are more about dissipation, which is harder with more units. I forgot all about the ESRAM though.
 

AmyS

Member
It seems Microsoft felt alright about giving the Xbox One about the amount of GPU performance, and not really any more, that Epic said would allow Unreal Engine 4 to be 'interesting', and that was 1+ TFLOPS.


What are the key design goals for Unreal Engine 4?



We have three big goals:

First, to define the next generation of graphics capabilities that are achievable with DirectX 11 PCs and future consoles.
Second, to deliver a toolset with an unprecedented mixture of power, ease-of-use, and polish.
And finally, to scale the technology and its feature set up and down the spectrum of future computing devices, including iOS and Android, and mainstream PC.


What is the target platform for UE4? What kind of hardware are gamers going to need to run UE4 based games?



Unreal Engine 4's next-generation renderer targets DirectX 11 GPUs and really starts to become interesting on hardware with 1+ TFLOPS of graphics performance, where it delivers some truly unprecedented capabilities. However, UE4 also includes a mainstream renderer targeting mass-market devices with a feature set that is appropriate there.

Link
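For context, the usual back-of-envelope that places both consoles against Epic's 1+ TFLOPS line (using the widely reported CU counts and clocks, and the GCN figure of 64 shaders per CU doing 2 FLOPs per cycle via fused multiply-add):

```python
# Peak single-precision throughput for a GCN GPU.
def peak_tflops(compute_units, clock_mhz):
    return compute_units * 64 * 2 * clock_mhz * 1e6 / 1e12

print(f"Xbox One: {peak_tflops(12, 853):.2f} TFLOPS")  # ~1.31
print(f"PS4:      {peak_tflops(18, 800):.2f} TFLOPS")  # ~1.84
```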
 

DBT85

Member
The PS4's thermal footprint should be a lot higher than the Xbone's: 18 compute units vs 12 compute units, and GDDR5 vs DDR3 (even if it is in the clamshell). Note that the memory is usually the biggest heat generator in a GPU card. You also can't compare to laptops, because those get a lot higher-quality (and higher-priced) parts.

If the PS4 works without any problems, it'd be one of the best engineering jobs of the last 5 years.

I beg your pardon. Memory is usually the biggest heat generator in a GPU card?

GDDR5 RAM will run fine with nothing but a bit of air blowing over the chips; try that with the actual GPU.
 
I beg your pardon. Memory is usually the biggest heat generator in a GPU card?

GDDR5 RAM will run fine with nothing but a bit of air blowing over the chips; try that with the actual GPU.

Doesn't stop me worrying about the console in general... Tbh, I might have even preferred it to be the size of the Bone. Extra cooling never hurt anyone.
 

FranXico

Member
I beg your pardon. Memory is usually the biggest heat generator in a GPU card?

GDDR5 RAM will run fine with nothing but a bit of air blowing over the chips; try that with the actual GPU.

Conversely, I heard elsewhere that the ESRAM actually is going to generate quite a lot of heat.
 
I don't think clock matters as much as ALU count. Heat and cooling are more about dissipation, which is harder with more units. I forgot all about the ESRAM though.
You're right. If MS upclocked at the same voltage level, power consumption rises linearly with clock. I hadn't done the maths, but after checking, it's probably very manageable considering how "oversized" (which is a plus to me) the system is.
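Doing that maths (a sketch assuming dynamic power P ≈ C·V²·f and the reported 800 MHz to 853 MHz GPU upclock at unchanged voltage):

```python
# At fixed voltage, dynamic power scales linearly with clock (P ~ C * V^2 * f).
base_mhz, upclocked_mhz = 800, 853  # reported XB1 GPU clocks
increase = upclocked_mhz / base_mhz - 1
print(f"GPU power increase from the upclock: ~{increase:.1%}")  # ~6.6%
```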
 

DBT85

Member
Doesn't stop me worrying about the console in general... Tbh, I might have even preferred it to be the size of the Bone. Extra cooling never hurt anyone.

But a) we don't know how much cooling the PS4 has and b) the Xbone has a cavernous empty space inside the shell that isn't doing anything for cooling at all.
 
Everyone has their preferences; what makes it hypocritical is when people, let alone tech sites, try to change and distort the subject when things aren't going their way. I don't recall there being such attempts with the PS2 and the first Xbox, as there was no denying that the latter had more capable hardware. I think it's been a hard pill to swallow for some people, after being two generations in the winning camp when it comes to specs and graphics.

I also find it odd that people are claiming resolutions will come down from 1080p later on during next gen, based on this gen's games. I don't remember newer titles from the same developers going below 720p during the course of this gen, apart from Resistance 3.

When it comes to the downplayed difference between 720p and 1080p, the difference between the resolutions was obvious to me when playing something like WipEout HD.

Yes, but you know what? It's always been like that. Reviews, for example, are often based on personal preferences and are rarely objective. That's why I never read them. We are surrounded by it all year round, and now people are surprised because it concerns hardware? Where have these people been all this time?
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Has this already been discussed somewhere?

[two screenshots]


I actually find that odd. I didn't think raw processing power was one of CoD's bottlenecks, but the ESRAM size alone.
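A quick sketch of why the 32MB size gets blamed (the render-target layout here is purely illustrative; real engines vary):

```python
# Does a deferred G-buffer fit in 32MB of ESRAM? Assume 4 colour targets
# plus depth, each at 4 bytes per pixel (an illustrative layout, not CoD's).
def target_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

for w, h in [(1280, 720), (1920, 1080)]:
    gbuffer = 5 * target_mb(w, h)
    verdict = "fits" if gbuffer <= 32 else "does NOT fit"
    print(f"{w}x{h}: ~{gbuffer:.1f} MB -> {verdict} in 32MB ESRAM")
```

At 720p the whole set squeezes in (~17.6MB); at 1080p it doesn't (~39.6MB), forcing tiling or spilling targets out to the slower DDR3.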
 

RoboPlato

I'd be in the dick
It seems Microsoft felt alright about giving the Xbox One about the amount of GPU performance, and not really any more, that Epic said would allow Unreal Engine 4 to be 'interesting', and that was 1+ TFLOPS.




Link

Funny, they had that exact statement before, but said 2.4 TFLOPS.
 
Would that relatively small amount of additional compute power really make much of a difference? Especially if the culprit bottleneck is really the ESRAM size?

It makes sense that they wouldn't be able to let it be used for graphics rendering, or their OS functions wouldn't work - no Snap, etc.

Also, on a related note, isn't it pretty safe to assume that they'll want to preserve most of that allocation, and that very little of it is ever going to be given back? Wouldn't it break compatibility with apps etc. built around the reserved amount?
 

Skeff

Member
Would that relatively small amount of additional compute power really make much of a difference? Especially if the culprit bottleneck is really the ESRAM size?

It makes sense that they wouldn't be able to let it be used for graphics rendering, or their OS functions wouldn't work - no Snap, etc.

Also, on a related note, isn't it pretty safe to assume that they'll want to preserve most of that allocation, and that very little of it is ever going to be given back? Wouldn't it break compatibility with apps etc. built around the reserved amount?

Well, a 10% boost could make the difference between 45 and 50fps, and when you're aiming for 60fps, one is much more satisfactory than the other.

To be honest, they may have just wanted to go to 768p or something so they would be at a higher resolution than BF4 on both consoles. All we know is that they wanted more GPU power; it would be silly to imply that it would have let them hit 1080p.
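The arithmetic behind that, assuming a fully GPU-bound frame (so frame rate scales directly with GPU throughput):

```python
# A 10% GPU throughput boost on a GPU-bound 45fps game:
base_fps = 45
boost = 1.10  # hypothetical 10% extra GPU time reclaimed from the OS reserve
print(f"{base_fps}fps -> {base_fps * boost:.1f}fps")  # 45fps -> 49.5fps
```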

Sounds like optimizing would have taken too long, so IW just picked whatever res ran what they had at a solid 60 and called it a day.

Infinity Ward likely put more time and effort into the XB1 version than the PS4 version, to be honest. If either of them was just "called a day", it would have been the one at 1080p.
 
I beg your pardon. Memory is usually the biggest heat generator in a GPU card?

GDDR5 RAM will run fine with nothing but a bit of air blowing over the chips; try that with the actual GPU.
Sure. ALUs are a lot more sensitive to heat than RAM chips. But yeah, RAM runs hotter than anything.

[screenshot: Vapor-X 7950 modded VRM temps]


not my captions
[screenshot: VRM temps]


Funny, they had that exact statement before, but said 2.4 TFLOPS.
I don't think this is accurate...
 

le.phat

Member
See, I'm having a hard time accepting this. When the very talented MS engineers designed the Xbox One, they must have sat down and worked out what kind of performance was required to create a 1080p machine. I really don't believe they sat down and designed a 720p or 900p machine.
At this point, I'm blaming an immature XDK and time-starved developers trying to hit a launch deadline.

I know you and Senjutsu are going through a rough time, but unless you were at their meetings, you have to let go of thoughts like this. Nowhere was it said that Microsoft set out to create a 1080p console. They set out to create a multimedia vision. Maybe they banked on upscaled 720p being enough. Maybe they accepted that 1080p was hard to reach but didn't want to upgrade the hardware and sell at a loss. Maybe they simply figured it wasn't necessary. Maybe they figured Sony would settle for 720p as well.

The fact of the matter is, the PS4 is strong enough to render whatever the xOne is rendering at a much higher resolution. In order for the xOne to match the PS4 in terms of graphical quality, sacrifices have to be made. If it's not the resolution, it will be something else. You need to come to terms with the fact that the PS4 and PC are much stronger and that the xOne will always struggle to tag along.
 

Orca

Member
Has this already been discussed somewhere?

[two screenshots]


I actually find that odd. I didn't think raw processing power was one of CoD's bottlenecks, but the ESRAM size alone.

Haha, have a rumour you can't double-confirm, so you'll get banned if you post it? Just put it on Twitter or your blog and someone will link it here right away.
 

Skeff

Member
Sure. ALUs are a lot more sensitive to heat than RAM chips. But yeah, RAM runs hotter than anything.

[screenshot: Vapor-X 7950 modded VRM temps]


not my captions
[screenshot: VRM temps]



I don't think this is accurate...

Looks like both are OC'd. Also, on GPUs the VRAM is generally cooled passively, with the fans actually on the GPU rather than on the VRAM. And I have no idea what those pictures actually show, as we don't even know what GPU it is.

A GPU running at 1155MHz at 100% load would not be at 56 degrees with the fan speed at 43%.

Something is wrong with those screens.

Also, as pointed out by someone else, the GPU memory is noted to be at 31 degrees in one of those pics.
 
RAM doesn't run all that hot at all compared to a CPU or GPU. 31C for the VRAM in your pictures is nothing.
VRAM is the GDDR pool. VRM (Video Ram) is what you're looking for. It's 57C (VRAM) to 39C (GPU). GDDR runs very hot.
Looks like both are OC'd. Also, on GPUs the VRAM is generally cooled passively, with the fans actually on the GPU rather than on the VRAM. And I have no idea what those pictures actually show, as we don't even know what GPU it is.

A GPU running at 1155MHz at 100% load would not be at 56 degrees with the fan speed at 43%.

Something is wrong with those screens.

Also, as pointed out by someone else, the GPU memory is noted to be at 31 degrees in one of those pics.

Yes, VRAM is a lot less sensitive to heat than logic is.
 
RAM doesn't run all that hot at all compared to a CPU or GPU. 31C for the VRAM in your pictures is nothing.
VRAM is the GDDR pool. VRM (Video Ram) is what you're looking for. It's 57C (VRAM) to 39C (GPU). GDDR runs very hot.
Looks like both are OC'd. Also, on GPUs the VRAM is generally cooled passively, with the fans actually on the GPU rather than on the VRAM. And I have no idea what those pictures actually show, as we don't even know what GPU it is.



Also, as pointed out by someone else, the GPU memory is noted to be at 31 degrees in one of those pics.
Your points are going everywhere.

Yes, again: RAM is a lot less sensitive to heat than logic is. We don't know which card is being used, but it's very clear what is being shown. GPU-Z pulls data directly from the GPU sensors, so it's as accurate as they are.

VRAM runs hottest in most high-end systems; it's well established. I can provide more pictures that all show the same conclusion if you want...
 

Skeff

Member
VRAM is the GDDR pool. VRM (Video Ram) is what you're looking for. It's 57C (VRAM) to 39C (GPU). GDDR runs very hot.


Yes, VRAM is a lot less sensitive to heat than logic is.

Check your screenshots again: the GPU memory is at 31 degrees, and that is the GDDR5 on the graphics card.

The DDR3 on the motherboard would be system memory, not GPU memory. Your screenshots are fucked up.
 

DBT85

Member
VRAM runs hottest in most high-end systems; it's well established. I can provide more pictures that all show the same conclusion if you want...

VRAM running hot doesn't mean VRAM puts out more heat than the GPU. It means it's not being cooled as aggressively as the GPU.

The GPU has a massive heatsink with heat pipes wicking the heat away.
 
Check your screenshots again: the GPU memory is at 31 degrees, and that is the GDDR5 on the graphics card.

The DDR3 on the motherboard would be system memory, not GPU memory. Your screenshots are fucked up.

No, it's VRM.
Uh, no. VRM is a voltage regulator module; they are not RAM. VRMs are designed to run hot.



On AMD 6900s the voltage regulator is VReg.

 

Skeff

Member
VRAM is the GDDR pool. VRM (Video Ram) is what you're looking for. It's 57C (VRAM) to 39C (GPU). GDDR runs very hot.
.

Check your screenshots again: the GPU memory is at 31 degrees, and that is the GDDR5 on the graphics card.
.

Uh, no. VRM is a voltage regulator module; they are not RAM. VRMs are designed to run hot.


Seriously, the hot RAMZ thing was FUD; I thought this was dispelled months ago?
 