
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

Perkel

Banned
Really interesting to see this thread is still going. I figure the disparity right now is due mostly to eSRAM and immature dev tools.

It's mostly because of the sheer power difference. ESRAM adds problems, but it's the raw power difference that makes the difference here.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
A 1.31 TFLOP GPU should be good enough for 1080p, especially in a closed environment.
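For reference, that 1.31 TFLOP figure falls out of simple arithmetic: peak single-precision throughput for a GCN-class GPU is shader count × clock × 2 (one fused multiply-add per ALU per cycle). A rough sketch, using the widely reported shader counts and clocks for both consoles (treat the exact numbers as assumptions):

```python
# Peak single-precision throughput: shaders * clock * 2 ops (FMA) per cycle.
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * clock_ghz * 2 / 1000.0  # GFLOPS -> TFLOPS

xb1 = peak_tflops(768, 0.853)   # 12 CUs * 64 shaders, 853 MHz (reported)
ps4 = peak_tflops(1152, 0.800)  # 18 CUs * 64 shaders, 800 MHz (reported)

print(f"XB1: {xb1:.2f} TFLOPS, PS4: {ps4:.2f} TFLOPS, ratio: {ps4 / xb1:.2f}x")
```

On those assumptions the gap works out to roughly 40% in raw shader throughput.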
 

kitch9

Banned
A 1.31 TFLOP GPU should be good enough for 1080p, especially in a closed environment.

It's got a very low pixel fillrate, is bandwidth-starved and has less processing power. 1080p with modern engines is a stretch, no two ways about it.

It should produce some decent 720p stuff, but it will need AA or the jaggies will kill the IQ, so the ESRAM will quickly get saturated when trying to do larger resolutions.

Either that, or just turn your contrast and sharpness right up on the TV and pretend the problems don't exist.
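To put a number on "very low pixel fillrate": fillrate is just ROPs × clock. A quick sketch using the commonly cited ROP counts and clocks for both machines (assumptions, not official figures):

```python
# Pixel fillrate in Gpixels/s = ROPs * clock (GHz).
def fillrate_gpix(rops: int, clock_ghz: float) -> float:
    return rops * clock_ghz

xb1 = fillrate_gpix(16, 0.853)  # 16 ROPs commonly reported for Xbox One
ps4 = fillrate_gpix(32, 0.800)  # 32 ROPs commonly reported for PS4

# Final 1080p60 output is only ~0.12 Gpix/s, but overdraw, blending and
# AA resolves multiply the real demand many times over.
print(f"XB1: {xb1:.1f} Gpix/s, PS4: {ps4:.1f} Gpix/s")
```

On those assumptions the PS4 has nearly double the pixel fillrate despite the slightly lower clock.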
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
It's got a very low pixel fillrate, is bandwidth-starved and has less processing power. 1080p with modern engines is a stretch, no two ways about it.

It should produce some decent 720p stuff, but it will need AA or the jaggies will kill the IQ, so the ESRAM will quickly get saturated when trying to do larger resolutions.

See, I'm having a hard time accepting this. When the very talented MS engineers designed the Xbox One, they must have sat down and worked out what kind of performance was required to create a 1080p machine. I really don't believe they sat down and designed a 720p or 900p machine.
At this point, I'm blaming an immature XDK and time-starved developers trying to hit a launch deadline.
 

CLEEK

Member
See, I'm having a hard time accepting this. When the very talented MS engineers designed the Xbox One, they must have sat down and worked out what kind of performance was required to create a 1080p machine. I really don't believe they sat down and designed a 720p or 900p machine.
At this point, I'm blaming an immature XDK and time-starved developers trying to hit a launch deadline.

The Xbox One is a box full of compromises. The 'talented designers' had to start off with the baggage of Kinect and Win8 app support. The former pushed the cost up significantly and used up silicon budget on things like the SHAPE sound processor. The latter forced the 8GB of DDR3 RAM, which led to using ESRAM to compensate somewhat. ESRAM is huge on the APU, so 32MB was the absolute max they could put on there. And 32MB isn't enough for 1080p with MSAA, or 1080p and deferred rendering (i.e. a modern engine).

The PS4 was designed with a blank slate, with gaming performance at the fore. That's how they could produce a far more capable gaming platform which cost less to produce.
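The 32MB claim is easy to sanity-check with back-of-the-envelope math: a single 1080p 32-bit render target is about 8MB, and a typical deferred G-buffer needs several of them plus depth. A rough sketch (the four-colour-target-plus-depth layout is an illustrative assumption, not any specific engine's):

```python
MB = 1024 * 1024

def target_mb(width, height, bytes_per_pixel, samples=1):
    """Size of one render target in MB."""
    return width * height * bytes_per_pixel * samples / MB

one_target = target_mb(1920, 1080, 4)  # single 32-bit 1080p target
# Illustrative deferred setup: 4 colour targets + one 32-bit depth target.
gbuffer = 5 * target_mb(1920, 1080, 4)
print(f"single target: {one_target:.1f} MB, deferred G-buffer: {gbuffer:.1f} MB vs 32 MB ESRAM")
```

Under those assumptions a 1080p G-buffer alone overflows the 32MB pool, before any MSAA.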
 

kitch9

Banned
See, I'm having a hard time accepting this. When the very talented MS engineers designed the Xbox One, they must have sat down and worked out what kind of performance was required to create a 1080p machine. I really don't believe they sat down and designed a 720p or 900p machine.
At this point, I'm blaming an immature XDK and time-starved developers trying to hit a launch deadline.

All the software in the world will not make up for the lack of ROPs, bandwidth, stream processors, etc. It's just not happening. If you had the Xbox GPU in a PC, you would expect it to struggle at 1080p without massive compromise; it's as simple as that.

I've been building PCs for over 20 years and buy a lot of PC hardware, and my first thought when I saw the specs of the Xbox was that it's a 720p machine. It has a decent amount of processing power to produce some decent stuff at that res; the only problem is the competition can run the same stuff and turn the res right up without even breaking a sweat. IMO the PS4 and Xbox are not even in the same league.

MS wanted to create an all-in-one super TV, wafting, Skype box first and foremost, because that's where they somehow think there's massive pent-up demand for a games console. They think gamers don't want to play the best games, that they're happy to accept mediocrity as long as they can wave their arms around to their nan on Skype in the middle of a round of low-res COD. It's baking my noggin as to why they changed from the values of the 360, which placed games first with all the rest as a value-added proposition you could choose to pay for if you wanted, to this forced mess.

I'm disappointed. Most of my mates will buy Xbox, god knows why, and I had intended to join them this time round as long as it was a close enough fight, but nah, not a chance in hell. I couldn't do it to myself; I'd rather go it alone.

MS engineers have created a machine they were told to make by the guys in suits who pay their wages. I can imagine most of them were shaking their heads most days at the mishmash of broken ideas they have had to implement, along with the forced compromises.
 
The Xbox One is a box full of compromises. The 'talented designers' had to start off with the baggage of Kinect and Win8 app support. The former pushed the cost up significantly and used up silicon budget on things like the SHAPE sound processor. The latter forced the 8GB of DDR3 RAM, which led to using ESRAM to compensate somewhat. ESRAM is huge on the APU, so 32MB was the absolute max they could put on there. And 32MB isn't enough for 1080p with MSAA, or 1080p and deferred rendering (i.e. a modern engine).

The PS4 was designed with a blank slate, with gaming performance at the fore. That's how they could produce a far more capable gaming platform which cost less to produce.

To put it another way, the PS4 is the console you get if you let a games designer design it; the XB1 is what you get if you design by committee...
 

NBtoaster

Member
The Xbox One is a box full of compromises. The 'talented designers' had to start off with the baggage of Kinect and Win8 app support. The former pushed the cost up significantly and used up silicon budget on things like the SHAPE sound processor. The latter forced the 8GB of DDR3 RAM, which led to using ESRAM to compensate somewhat. ESRAM is huge on the APU, so 32MB was the absolute max they could put on there.

This is wrong. ESRAM was there before the 8GB.

http://forum.beyond3d.com/showpost.php?p=1782653&postcount=7
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
ESRAM is huge on the APU, so 32MB was the absolute max they could put on there. And 32MB isn't enough for 1080p with MSAA, or 1080p and deferred rendering (e.g. a modern engine).

All the software in the world will not make up for the lack of ROPs, bandwidth, stream processors, etc. It's just not happening. If you had the Xbox GPU in a PC, you would expect it to struggle at 1080p without massive compromise; it's as simple as that.

It's clear at this point that MS has royally fucked this up. I'm so pissed off right now. Fuck you MS.
 

DBT85

Member
See, I'm having a hard time accepting this. When the very talented MS engineers designed the Xbox One, they must have sat down and worked out what kind of performance was required to create a 1080p machine. I really don't believe they sat down and designed a 720p or 900p machine.
At this point, I'm blaming an immature XDK and time-starved developers trying to hit a launch deadline.

But as we have suspected, and as Buttocks also intimated, this console for the first time wasn't designed from the ground up by gamer people, but by a committee that wanted to squeeze lots of different things in.

Had it been the same bunch of guys that made the 360, I'm sure the Xbone would be a very, very different device. They wouldn't have demanded 8GB from the outset, which is what mandated 8GB of DDR3 early in the project. That one decision had incredible ramifications for the rest of the hardware.
 

CLEEK

Member
To put it another way, the PS4 is the console you get if you let a games designer design it, the XB1 if you design via committee...

MS just had completely different goals when designing their console.

It was just that after all the pieces had been put together and it was too late to redesign the hardware, they found out the public didn't exactly care for their goals, to put it mildly. Hence the 180s and the change in message from "performance doesn't matter" to the recent PR FUD campaign about balance and how MS would never allow a competitor to produce a console 30%+ faster.

But it was too late. The PR message might have changed recently, but the hardware was set in stone a long time ago, when MS still thought that people would jump at the chance to buy an always online, DRM filled, cable DVR with app and game support.


DDR3 was there first. They were still settling on how much to put in.
 
Yes, the ESRAM has always been there along with the DDR3 memory setup. It was there to try and bring parity against the fact that Sony were using GDDR5 memory (about 2GB) and MS were possibly going to use 4GB. Then Sony upped theirs to 4GB and MS had to go 8GB just to get close. However, when Sony announced 8GB, MS knew their memory configuration was toast; it would never be equal. The ESRAM is exactly how it's been described: a band-aid, and not a very good one...
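The bandwidth gap behind all of this is straightforward to compute: peak bandwidth = effective transfer rate × bus width. A sketch using the commonly quoted figures (DDR3-2133 on a 256-bit bus for the XB1, 5.5 Gbps GDDR5 on a 256-bit bus for the PS4; treat both as assumptions):

```python
def bandwidth_gbs(transfer_rate_gtps, bus_width_bits):
    """Peak bandwidth in GB/s = transfers/s * bytes per transfer."""
    return transfer_rate_gtps * bus_width_bits / 8

xb1_ddr3 = bandwidth_gbs(2.133, 256)  # ~68 GB/s main memory
ps4_gddr5 = bandwidth_gbs(5.5, 256)   # ~176 GB/s
print(f"XB1 DDR3: {xb1_ddr3:.1f} GB/s, PS4 GDDR5: {ps4_gddr5:.1f} GB/s")
```

That roughly 2.5x main-memory deficit is the gap the 32MB ESRAM pool is supposed to paper over.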
 

kitch9

Banned
It's clear at this point that MS has royally fucked this up. I'm so pissed off right now. Fuck you MS.

That's before we even start on the fact that the CPU in both machines is complete garbage (relatively), and only one had the foresight to ensure there was plenty of GPGPU processing available to help out, with properly implemented hUMA.

All the "balance" talk just made me LOL my head off!
 

Bundy

Banned
So yeah, I guess we will be seeing significant differences between launch games and all that.

Only question is how dev tools will improve on both systems over the generation, and what happens to the gap that is clearly very significant at present.
Of course we will see significant differences.
Dev tools will improve for the Xbone AND the PS4.
The visible difference will become even bigger in a few years.
The PS4 has the stronger hardware, the much easier-to-develop-for architecture and the clearly better/stronger/faster RAM solution.

Yes, the ESRAM has always been there along with the DDR3 memory setup. It was there to try and bring parity against the fact that Sony were using GDDR5 memory (about 2GB) and MS were possibly going to use 4GB. Then Sony upped theirs to 4GB and MS had to go 8GB just to get close. However, when Sony announced 8GB, MS knew their memory configuration was toast; it would never be equal. The ESRAM is exactly how it's been described: a band-aid, and not a very good one...
 
MS just had completely different goals when designing their console.

It was just that after all the pieces had been put together and it was too late to redesign the hardware, they found out the public didn't exactly care for their goals, to put it mildly. Hence the 180s and the change in message from "performance doesn't matter" to the recent PR FUD campaign about balance and how MS would never allow a competitor to produce a console 30%+ faster.

But it was too late. The PR message might have changed recently, but the hardware was set in stone a long time ago, when MS still thought that people would jump at the chance to buy an always online, DRM filled, cable DVR with app and game support.

Yes and no. Too many competing divisions inside MS wanted in, so the compromises had to come. I wonder if they horse-traded features with each other...
 

kitch9

Banned
I don't understand... I thought the whole point of the ESRAM was to make up for the crap speed of the DDR3? So logically wouldn't the decision for ESRAM come afterwards?

They could have started off with 4GB in mind if they had heard their competitor was considering 2GB of GDDR5.

Prices were a lot higher back then, so they may have decided just to go with the more cost-effective solution.
 

LCfiner

Member
Pretty much, yeah. I mean, someone said:
"it's gotta run Windows Metro apps"
"it's gotta snap apps like Win8 and run these alongside games"
"it's gotta have Kinect in the box and be usable at all times"
"it's gotta work with live TV broadcasts"
"it's gotta cost under 500 bucks"

You end up with lots of slower memory to handle all the apps and TV stuff, which is not ideal for graphics.
You end up with a large memory cache on chip to offset the slow RAM, reducing space for the GPU.
You end up with a max chip die size that can't be expanded to increase performance because of costs.
You end up with portions of the GPU being used to handle the side-by-side running and snapping of apps, games and TV.

If it turns out there are more people out there looking for $500 multimedia boxes these days than I think there are, it could do OK. But I think it's a misguided vision that leaves us with a box underpowered compared to the competition for actual games, and with secondary features that are uninteresting to the main audience of the box.
 

Amused

Member
So I have a question to the people smarter in this area than I am.

We all know, and have now seen with our own eyes, how the power difference results in significant differences in graphics. But is there a possibility that the power difference could result in gameplay/content differences in multi-platform games later in the generation?

At some point, I'm guessing (maybe stupidly) game devs will want to pack their games with bigger cities, more people with more AI scripts running at the same time, and a lot of other cool stuff, resulting in them targeting 720p or 900p rather than 1080p. What would a 30/40/50% power gap mean for the Xbone then?

Will this mean trouble for the Xbone gameplay-wise? Or is this where the "lowest common denominator" really comes into play, with devs packing their multi-platform games only to the point where they run at 720p on the Xbone, and all the PS4 gets out of the power advantage in these cases is better visuals? The first-party exclusives are another chapter, of course...

Anyway, would love to hear your input on this.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
We all know, and have now seen with our own eyes, how the power difference results in significant differences in graphics. But is there a possibility that the power difference could result in gameplay/content differences in multi-platform games later in the generation?

Highly unlikely. Both consoles have literally the same CPU and the same amount of main memory; hence both will be able to run the same game simulation. The differences in GFX performance will only impact visuals. The only variable in this equation is the importance of GPGPU in later games. Games like Resogun obviously depend heavily on GPGPU algorithms for gameplay. The XBO is certainly capable of running all the algorithms that the PS4 can run; nevertheless, the PS4 is quite substantially more capable in this department. In any case, multi-platform games will certainly take this potential limitation into account in their gameplay designs.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
I don't understand... I thought the whole point of the ESRAM was to make up for the crap speed of the DDR3? So logically wouldn't the decision for ESRAM come afterwards?

In the early design phase, the decision could nevertheless have been to go for DDR3+ESRAM for cost reasons and the potential to equip the system with any amount of memory necessary. The decision for the actual pool size was most probably made after some software and hardware prototypes. Concrete performance requirements are likely to be gathered empirically.
 

Amused

Member
Highly unlikely. Both consoles have literally the same CPU and the same amount of main memory, hence both will be able to run the same game simulation. The differences in GFX performance will only impact visuals. The only variable in this equation is the importance of GPGPU in later games. Games like Resogun obviously depend heavily on GPGPU-algorithms for gameplay. The XBO certainly is capable of running all algorithms that the PS4 can run, nevertheless the PS4 is quite substantially more capable in this department. In any case, multi-platform games will certainly take this potential limitation into account in their gameplay designs.


Around half of this makes sense to me, but I get the general gist of what you are saying, and I guess it's not unexpected.

Thanks for the reply!
 

Ploid 3.0

Member
It's got a very low pixel fillrate, is bandwidth-starved and has less processing power. 1080p with modern engines is a stretch, no two ways about it.

It should produce some decent 720p stuff, but it will need AA or the jaggies will kill the IQ, so the ESRAM will quickly get saturated when trying to do larger resolutions.

Either that, or just turn your contrast and sharpness right up on the TV and pretend the problems don't exist.

For all the balance talk MS PR has done, it seems like a very unbalanced system. The ESRAM seems like an annoying water dam.
 
The one major factor I am curious to see is the system quality. The Xbox One is slower and has a casing that's a massive vent, as well as an external power supply. The PS4 has more powerful hardware in a smaller casing with a power supply inside.

The PS4 hardware isn't that powerful either; it'll be fine in that case...

A 30W CPU and an 80-100W GPU are easy enough to cool.

Laptop makers put far more powerful hardware in a 4x thinner laptop shell (and that does overheat, haha).
 

QaaQer

Member
See I'm having a hard time accepting this. When the very talented MS engineers designed Xbox One. They must have sat down and worked out what kind of performance was required to create a 1080p machine. I really don't believe they sat down and designed a 720p or 900p machine.
At this point. I'm blaming a immature XDK and time starved developers trying to hit a launch deadline.

Blame all the MBAs, not MS engineers. PS4 was designed by a game developer and it shows.
 

RAIDEN1

Member
...at the end of the day, "it's the games that matter". As clichéd as it sounds, no matter how powerful a system is, if it hasn't got the games, then it's just glitter, not gold.
 

Skeff

Member
See I'm having a hard time accepting this. When the very talented MS engineers designed Xbox One. They must have sat down and worked out what kind of performance was required to create a 1080p machine. I really don't believe they sat down and designed a 720p or 900p machine.
At this point. I'm blaming a immature XDK and time starved developers trying to hit a launch deadline.

They didn't have the luxury of designing a games console to their own personal specifications. The suits at Microsoft laid down some objectives they had to meet, which compromised it for gaming:

Sold at a profit at $499 with Kinect.
8GB RAM goal set in 2010.
Windows 8 ecosystem and multitasking.

The first issue is that the PS4 probably had a higher budget for the console itself, as it is sold at $399 at a loss with no camera; this really took out a lot of the wiggle room the engineers had.

The second point is that it must have 8GB RAM minimum, so that the OS can have 3GB available.

The multitasking and Windows 8 ecosystem are responsible for the guaranteed 8GB RAM as well as the GPU reservation.

When designing the system, the first thing they had to put on the design board is the Kinect hardware, meaning both the Kinect itself and the audio block in the XB1, both of which are of course expensive. The second thing on the design board is the must-have 8GB RAM, which gave them two potential solutions:

Unified 8GB DDR3 RAM.
Split RAM pool: 4GB DDR3 and 4GB GDDR5.

The second solution is not great because it costs more to have GDDR5 than DDR3, and as we already know, they must be making a profit on day one as well as including Kinect and the audio block; also, as shown by the PS3, a split memory architecture can be difficult.

At this point they already have their memory layout and sound processing, but they still need an APU. They're already playing catch-up in regards to bandwidth, so they need a solution, and that solution is embedded RAM. So they need to make another decision: ESRAM vs. eDRAM.

They chose ESRAM. This could be because it takes fewer steps in the fabrication process, and they may have had difficulties with the x86 licensing and having it produced in one of these foundries, so it may not have been a choice by Microsoft; or it may have been chosen so they didn't have to put it on a daughter die, or plenty of other reasons, but mainly $.

Now they are left with ESRAM. It takes up around 3x the space of eDRAM, so they have to balance the amount of ESRAM against the size of the APU. From a financial perspective the maximum size of the APU was set; they decided to spend their silicon budget on 32MB of ESRAM, a 1.3 TFLOP GPU and an 8-core Jaguar. 32MB isn't a great amount, but any more than 32MB and the GPU or CPU would have to be cut down, so they had to come to a balance between ESRAM, GPU and CPU.

If the engineers from both companies sat down with the same budgets and the same goals, they'd likely design the same system. Microsoft's engineers did a very good job designing the system to meet their goals; I can't think of anything they could change to make it a better games console whilst meeting their goals from the suits.

....end of the day "its the games that matter"..as cliched as it sounds but no matter how powerful a system is, if it hasn't got the games, then its just glitter, not gold..

Unfortunately for you this is a spec thread and unfortunately for Microsoft, Sony has a vastly superior first party.
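The "around 3x the space" point above implies a concrete trade: the same silicon area spent on eDRAM would have bought roughly three times the capacity. A toy calculation built only on that ratio (the 3x figure is the post's own claim; the area units are arbitrary and only the ratio matters):

```python
ESRAM_AREA_PER_MB = 1.0  # arbitrary units; only the ratio matters
EDRAM_DENSITY = 3.0      # the post's claim: eDRAM ~3x denser than eSRAM

esram_mb = 32
area_budget = esram_mb * ESRAM_AREA_PER_MB       # area the 32MB pool occupies
edram_mb = area_budget * EDRAM_DENSITY           # capacity the same area buys as eDRAM

print(f"Same area as {esram_mb}MB eSRAM -> ~{edram_mb:.0f}MB eDRAM")
```

That ~96MB result is the same order of magnitude as the "128MB instead of 32MB" suggestion that comes up later in the thread.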
 

Chobel

Member
....end of the day "its the games that matter"..as cliched as it sounds but no matter how powerful a system is, if it hasn't got the games, then its just glitter, not gold..

Please take your "it's the games that matter" somewhere else. In this thread we're talking about specs, not games.
 
Please go with your "its the games that matter" somewhere else. In this thread we're talking about specs not about games.

....end of the day "its the games that matter"..as cliched as it sounds but no matter how powerful a system is, if it hasn't got the games, then its just glitter, not gold..

They kill me with that BS... like the Bone has all these great games announced... like the PS4 won't be getting new games from Sony's first-party devs... But hey, at least they can play MP3s, lol.
 
....end of the day "its the games that matter"..as cliched as it sounds but no matter how powerful a system is, if it hasn't got the games, then its just glitter, not gold..
Aww... did you get lost, little buddy? You see, this here is a specs thread. This isn't where you make unsubstantiated claims about games being the be-all and end-all for a console. Maybe you should find a thread that is more to your liking.

What's even sadder is that you probably believe that Microsoft is going to pump out more exclusives than Sony. This has never been the case, and the contrary has been even more evident these last three years.
 

Vizzeh

Banned
They didn't have the luxury of designing a games console to their own personal specifications. The suits at Microsoft laid down some objectives they had to meet which compromised it for gaming:

Sold at a profit at $499 with Kinect.
8GB RAM goal set in 2010.
Windows 8 ecosystem and multitasking.
...

Unfortunately for you this is a spec thread and unfortunately for Microsoft, Sony has a vastly superior first party.

Excellent post mate :)
 
They didn't have the luxury of designing a games console to their own personal specifications. The suits at Microsoft laid down some objectives they had to meet which compromised it for gaming:

Sold at a profit at $499 with Kinect.
8gb ram goal set in 2010.
Windows 8 ecosystem and multitasking
...

All sounds very plausible. I wonder how much the engineers knew about the development of the PS4? They must have been hoping to get close to Sony's specs during development, knowing they would realistically have to beat the PS4's raw specs just for parity, because GPU/CPU resources were going to be reserved for Kinect and Snap. I wonder how management reacted as more info came in about the PS4 last year?
 

KampferZeon

Neo Member
They didn't have the luxury of designing a games console to their own personal specifications. The suits at Microsoft laid down some objectives they had to meet which compromised it for gaming:

Sold at a profit at $499 with Kinect.
8GB RAM goal set in 2010.
Windows 8 ecosystem and multitasking.
...

Unfortunately for you this is a spec thread and unfortunately for Microsoft, Sony has a vastly superior first party.

Could it be possible that the Xbone hardware does support 1080p graphics, but the problem is in the OS/drivers?

Consider that Thuway says devs could not run COD higher than 720p; trying to do so would crash the Xbone every session. Doesn't this suggest that maybe the true problem is not a hardware issue but an OS/driver/software stability issue?

This is a very, very serious problem. I don't think even the most loyal console warrior could tolerate multiple crashes during one single gaming session.

The PS4 is traditional and straightforward: the OS has its own reserved CPU core and memory, bandwidth is relatively plentiful for both game and OS, and extra horsepower is available for the OS in the GPGPU queues.

In contrast, the Xbone has 3 OSes, multiple apps running and displaying (snap mode) in parallel, and has to support the Kinect GPGPU algorithms.

I think the Xbone is the first console ever to have this level of complexity in its OS. So many open questions:
How is the precious memory bandwidth shared between game, apps and OSes?
How does the context switching work?
How is priority handled?
What about realtime apps (Skype, HDMI-in overlay)? How are they handled?
 

coldfoot

Banned
When designing the system, the first thing they had to put on the design board is Kinect hardware, meaning both the Kinect itself and the Audio block in the XB1, both of which are of course expensive.
This is the only part where you're inaccurate. The audio block does not take up much die area (no audio block does), and the SHAPE part was added because they actually had room to spare in the chip layout.

Other than that, good summary. I'd have pushed for eDRAM if I were an MS engineer, though. You could probably get 128MB instead of 32MB.
 

Skeff

Member
This is the only part where you're inaccurate. The audio block does not take up much die area (no audio block does), and the SHAPE part was added because they actually had room to spare with the chip layout.

Other than that, good summary. I'd have pushed for EDRAM if I was a MS engineer though. You could probably get 128MB instead of 32MB.

I wasn't talking about cost in terms of die space there, only the cost of design/production. I doubt the cost will be terribly high, but if the chip is as good as they say it is, it would still make a small impact on the BoM.
 
So, this may be a stupid question, but I thought this would be a good place to ask: how much more complicated is reworking a game to use tile-based rendering? I was re-reading the blog post by Tim Lottes, and he mentions:
On this platform I’d be concerned with memory bandwidth. Only DDR3 for system/GPU memory paired with 32MB of “ESRAM” sounds troubling. 32MB of ESRAM is only really enough to do forward shading with MSAA using only 32-bits/pixel color with 2xMSAA at 1080p or 4xMSAA at 720p. Anything else to ESRAM would require tiling and resolves like on the Xbox360 (which would likely be a DMA copy on 720) or attempting to use the slow DDR3 as a render target.

I’d bet most titles attempting deferred shading will be stuck at 720p with only poor post process AA (like FXAA).
Regarding the bolded, will developers use such techniques to get future XB1 titles to 1080p, or will they just not bother and render at 720p as he suggests?
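For anyone wondering where Lottes' 2xMSAA/1080p vs 4xMSAA/720p cutoff comes from, the framebuffer math works out. A rough sketch (purely illustrative; the 32-bit depth/stencil buffer and per-sample storage for both color and depth are my assumptions about what he's counting):

```python
# Back-of-the-envelope framebuffer math behind the 32 MB ESRAM claim.
ESRAM_BYTES = 32 * 1024 * 1024  # 32 MB of on-chip ESRAM

def framebuffer_bytes(width, height, msaa, bytes_per_pixel=4):
    """Color + depth storage for an MSAA render target.

    Assumes 32-bit color and a 32-bit depth/stencil buffer, with MSAA
    multiplying per-pixel storage for both.
    """
    color = width * height * bytes_per_pixel * msaa
    depth = width * height * 4 * msaa  # 32-bit depth/stencil per sample
    return color + depth

for label, w, h, msaa in [("1080p 2xMSAA", 1920, 1080, 2),
                          ("720p 4xMSAA", 1280, 720, 4),
                          ("1080p 4xMSAA", 1920, 1080, 4)]:
    size = framebuffer_bytes(w, h, msaa)
    fits = "fits" if size <= ESRAM_BYTES else "does NOT fit"
    print(f"{label}: {size / 2**20:.1f} MB -> {fits} in 32 MB ESRAM")
```

1080p with 2xMSAA lands at ~31.6 MB and 720p with 4xMSAA at ~28.1 MB, both just squeezing in; 1080p with 4xMSAA blows past 32 MB, which is exactly the line Lottes draws.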
 

Shahed

Member
....end of the day "it's the games that matter"..as cliched as it sounds, but no matter how powerful a system is, if it hasn't got the games, then it's just glitter, not gold..

While I agree with the statement on its own, there are two underlying mistakes you made.

1) Yes, the games are more important than certain graphical effects. But it's not one or the other. You can have both.

2) The main mistake: this is a spec thread. Subjective takes on games libraries have no place here.
 
So, this may be a stupid question, but I thought this would be a good place to ask: how much more complicated is reworking a game to use tile-based rendering? I was re-reading the blog post by Tim Lottes, and he mentions:

Regarding the bolded, will developers use such techniques to get future XB1 titles to 1080p, or will they just not bother and render at 720p as he suggests?

Huh, that blog post seems to have been taken down, although there are copies elsewhere:

http://www.neogaf.com/forum/showthread.php?p=46918515

Wow, I thought his excitement about the PS4 might be the RAM setup, but instead it's GPU access to the OS (or something along those lines).

But yeah, it turns out his thoughts on MS's ESRAM decision seem to be coming to fruition.
 

viveks86

Member
So, this may be a stupid question, but I thought this would be a good place to ask: how much more complicated is reworking a game to use tile-based rendering? I was re-reading the blog post by Tim Lottes, and he mentions:

Regarding the bolded, will developers use such techniques to get future XB1 titles to 1080p, or will they just not bother and render at 720p as he suggests?

Interesting! I'm really curious to know the rationale behind this design. Pretty sure some smart engineer at MS would have been aware of this. It seems so strange that they would knowingly go with this architecture; the math works against them if they don't use tile-based rendering. Was the assumption that tile-based rendering is the future and all games will use it?

Here's another question: if straightforward rendering techniques don't work for reaching 1080p/60fps, are we expecting indie games to all be 900p or 720p? If so many AAA titles don't use tile-based rendering yet, would indies bother jumping through hoops to get the resolution up?
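To make "tiling and resolves" concrete: the idea is to split the frame into strips small enough to fit in ESRAM, render each strip, then DMA-copy ("resolve") it out to DDR3 before moving on. A hypothetical sketch of the bookkeeping (the 20 bytes/pixel G-buffer layout is an assumed example, not taken from any real XB1 title):

```python
import math

ESRAM_BYTES = 32 * 1024 * 1024  # 32 MB of fast on-chip memory

def tile_plan(width, height, bytes_per_pixel):
    """Split a frame into horizontal strips that each fit in ESRAM.

    Returns (number_of_tiles, rows_per_tile). Every tile beyond the
    first means an extra resolve (DMA copy out to DDR3) and extra draw
    submissions for geometry that straddles tile borders -- that's the
    "reworking" cost developers would have to pay.
    """
    row_bytes = width * bytes_per_pixel
    rows_per_tile = ESRAM_BYTES // row_bytes
    num_tiles = math.ceil(height / rows_per_tile)
    return num_tiles, min(rows_per_tile, height)

# A fat deferred G-buffer: 4 color targets + depth, 4 bytes each = 20 B/px.
tiles_1080, _ = tile_plan(1920, 1080, 20)
tiles_720, _ = tile_plan(1280, 720, 20)
print(f"1080p deferred: {tiles_1080} tile(s)")
print(f"720p deferred:  {tiles_720} tile(s)")
```

Under these assumed sizes, 720p fits in a single pass while 1080p needs two tiles, i.e. two geometry passes plus a resolve — which is roughly why devs might just ship 720p instead of jumping through those hoops.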
 

Figboy79

Aftershock LA
....end of the day "it's the games that matter"..as cliched as it sounds, but no matter how powerful a system is, if it hasn't got the games, then it's just glitter, not gold..

Yeah, but Sony's first-party studios are some of the best in the industry, in my opinion, rivaling Nintendo's.

Microsoft doesn't come close in that regard. There is no uncertainty concerning big guns on the PS4 in the future.
 

kitch9

Banned
So, this may be a stupid question, but I thought this would be a good place to ask: how much more complicated is reworking a game to use tile-based rendering? I was re-reading the blog post by Tim Lottes, and he mentions:

Regarding the bolded, will developers use such techniques to get future XB1 titles to 1080p, or will they just not bother and render at 720p as he suggests?

720p.

Forward rendering has pretty much gone the way of the dodo.
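The footprint math backs this up: with deferred shading (which is what most modern engines use instead of forward rendering), the G-buffer simply doesn't fit in 32 MB at 1080p. A rough estimate, assuming a common 4-target layout rather than any specific engine's:

```python
ESRAM_BYTES = 32 * 1024 * 1024  # 32 MB of ESRAM

def gbuffer_bytes(width, height, color_targets=4, bytes_per_target=4):
    """Deferred G-buffer footprint: N 32-bit color targets plus 32-bit depth."""
    per_pixel = color_targets * bytes_per_target + 4  # +4 for depth/stencil
    return width * height * per_pixel

for label, w, h in [("1080p", 1920, 1080), ("720p", 1280, 720)]:
    size = gbuffer_bytes(w, h)
    verdict = "fits" if size <= ESRAM_BYTES else "does NOT fit"
    print(f"{label} G-buffer: {size / 2**20:.1f} MB -> {verdict} in ESRAM")
```

At these assumed sizes a 1080p G-buffer comes to ~39.6 MB (no fit) while 720p is ~17.6 MB (fits with room to spare) — matching Lottes' bet that deferred titles end up at 720p unless they tile.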
 

Cidd

Member
....end of the day "it's the games that matter"..as cliched as it sounds, but no matter how powerful a system is, if it hasn't got the games, then it's just glitter, not gold..

Interesting you should say that, because now Sony has both..
 