
DF: Xbone Specs/Tech Analysis: GPU 33% less powerful than PS4

Log4Girlz

Member
LOL @ people thinking 50% is nothing to worry about.

Look at the difference in performance between generations of PC graphics cards; successive cards in the same line are usually only 10-20% more powerful, e.g. GTX 480 > 580 > 680 > 780.

50% is a big deal: roughly two generations or more ahead in terms of sheer power.

This will only really be notable when comparing first party offerings IMO. Naughty Dog will create some pretty graphics.
 

Toma

Let me show you through these halls, my friend, where treasures of indie gaming await...
The way you say this makes it sound like the ESRAM can't have any value for the Xbox One if it isn't being used as a cache, which flies in the face of the fact that EDRAM, which helped the Xbox 360 so tremendously throughout its life, also wasn't used as a cache either.

The Haswell 128MB of EDRAM acting as a cache for the CPU as well as the GPU is a side benefit of the way Intel designed it, but historically in consoles EDRAM need not ever be a cache, much less a cache that works for both the CPU and the GPU in order to still provide meaningful benefits to performance. The PS2 had EDRAM, the Xbox 360 had EDRAM, I believe the Nintendo consoles for years have had a combination of EDRAM and 1T-SRAM (not true SRAM like what the Xbox One has, but an EDRAM variant instead), and in all cases -- especially stressing the PS2's EDRAM and the Xbox 360's EDRAM implementation -- the performance benefits have been real and meaningful.

The Xbox One's ESRAM has some pretty nice benefits over the EDRAM on the 360. It has none of that EDRAM's primary drawbacks, and on top of that it has even lower latency: because SRAM doesn't need to be refreshed, its access latency is lower.

http://en.wikipedia.org/wiki/Memory_refresh



A post made on this forum earlier about the possible benefits of low latency ESRAM.

http://www.neogaf.com/forum/showpost.php?p=50467425&postcount=495



I think developers will find interesting ways to use the One's hardware. I for one am dying to see what Rare, 343i, Turn 10, and Remedy do with the machine. I should probably toss Lionhead in there, too, as I think they may return to form on the Xbox One. I don't think they were particularly at their best on the 360.



I fucking wish :D

I think the PS4 really could be cheaper than the X1, because the included Kinect 2.0 pushes the price up considerably.
 

benny_a

extra source of jiggaflops
PS4 has an extra bus that can be used to bypass its GPU's L1/L2 cache and directly access memory, and Cerny said it can do 20 GB/s.
So the CPU and GPU can't both access the memory at the full advertised speed?

I mean it's still a big deal that you don't need to copy to VRAM like on a split architecture, but I assumed everything being on one die meant they both could get the full 176GB/s that the chips are capable of.

Or is that a non-issue?
And where is that 20GB/s figure from?
 
This will only really be notable when comparing first party offerings IMO. Naughty Dog will create some pretty graphics.

First parties won't be the only ones able to do that; there are dedicated third-party developers out there who utilize a console's power as much as they can, for example Kojima Productions or Square Enix with the FF games.

Besides, both consoles have very similar architectures, so it won't be difficult to boost multiplatform games on the PS4 version. It's no longer a case of two very different architectures like the PS3 vs. 360 was.

We won't see a big difference during the launch window or first year of the next-gen consoles, but I'm willing to bet we'll see a very noticeable difference two to four years into the generation, when developers have an easier time squeezing out every bit of power the consoles offer.

As for cloud gaming, don't get too excited by all the hype Microsoft is trying to create. Cloud computing has many obstacles, and you should be very skeptical about what it can do until we get some real-world performance results.
 

Log4Girlz

Member
First parties won't be the only ones able to do that; there are dedicated third-party developers out there who utilize a console's power as much as they can, for example Kojima Productions or Square Enix with the FF games.

Besides, both consoles have very similar architectures, so it won't be difficult to boost multiplatform games on the PS4 version. It's no longer a case of two very different architectures like the PS3 vs. 360 was.

We won't see a big difference during the launch window or first year of the next-gen consoles, but I'm willing to bet we'll see a very noticeable difference two to four years into the generation, when developers have an easier time squeezing out every bit of power the consoles offer.

As for cloud gaming, don't get too excited by all the hype Microsoft is trying to create. Cloud computing has many obstacles, and you should be very skeptical about what it can do until we get some real-world performance results.

I think the majority of cross platform games will target the lowest common denominator.
 

cchum

Member
So the CPU and GPU can't both access the memory at the full advertised speed?

I mean it's still a big deal that you don't need to copy to VRAM like on a split architecture, but I assumed everything being on one die meant they both could get the full 176GB/s that the chips are capable of.

Or is that a non-issue?
And where is that 20GB/s figure from?


"And by small, I just mean small in next-gen terms. We can pass almost 20 gigabytes a second down that bus. That's not very small in today’s terms -- it’s larger than the PCIe on most PCs!"

http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?page=2
 

benny_a

extra source of jiggaflops
"And by small, I just mean small in next-gen terms. We can pass almost 20 gigabytes a second down that bus. That's not very small in today’s terms -- it’s larger than the PCIe on most PCs!"

http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?page=2
Thanks.

Mark Cerny said:
First, we added another bus to the GPU that allows it to read directly from system memory or write directly to system memory, bypassing its own L1 and L2 caches. As a result, if the data that's being passed back and forth between CPU and GPU is small, you don't have issues with synchronization between them anymore. And by small, I just mean small in next-gen terms. We can pass almost 20 gigabytes a second down that bus.
I'm still not 100% clear on it. If the bus is for the GPU, doesn't that mean the 176GB/s figure is worthless and the actual bandwidth is 20GB/s, as the bus doesn't allow for more?

Or is that a second alternative bus for, let's say, compute jobs that the CPU should work on afterwards?

Edit: I guess on a third reading, and this time reading it in conjunction with the other modification, it's an extra bus just to bypass the cache. I still don't fully understand it, but my confusion is no longer about the 176GB/s bandwidth.

It's an internal bus for passing data directly from cpu to gpu. It doesn't really have anything to do with external bandwidth to RAM.
Then 20GB/s seems to be plenty. Thanks again.

This compute stuff seems clever. I wonder how much they'll get out of it a few years down the line.
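To put those bandwidth numbers side by side, here's a quick back-of-the-envelope sketch. The 256-bit / 5.5 GT/s GDDR5 figures are the commonly reported PS4 memory specs, and the PCIe rates are the standard per-direction spec figures, not anything from the Cerny interview:

```python
# Back-of-the-envelope bandwidth comparison. All figures are commonly
# reported specs, not official documentation.

def bus_bandwidth_gbs(width_bits, transfer_rate_gtps):
    """Peak bandwidth in GB/s for a bus of a given width and transfer rate."""
    return width_bits / 8 * transfer_rate_gtps

gddr5 = bus_bandwidth_gbs(256, 5.5)   # PS4 main memory: 256-bit GDDR5 @ 5.5 GT/s -> 176 GB/s
onion = 20.0                          # the bypass bus Cerny quotes at "almost 20 GB/s"
pcie2_x16 = 16 * 0.5                  # PCIe 2.0 x16, per direction -> 8 GB/s
pcie3_x16 = 16 * 0.985                # PCIe 3.0 x16, per direction -> ~15.75 GB/s

print(f"GDDR5 main bus: {gddr5:.0f} GB/s")
print(f"Bypass bus    : {onion:.0f} GB/s")
print(f"PCIe 2.0 x16  : {pcie2_x16:.0f} GB/s")
print(f"PCIe 3.0 x16  : {pcie3_x16:.2f} GB/s")
```

In other words, the ~20GB/s figure describes a separate, much smaller pipe for CPU/GPU coherency traffic; ordinary GPU access to GDDR5 still gets the full 176GB/s.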
 

cchum

Member
Thanks.


I'm still not 100% clear on it. If the bus is for the GPU, doesn't that mean the 176GB/s figure is worthless and the actual bandwidth is 20GB/s, as the bus doesn't allow for more?

Or is that a second alternative bus for, let's say, compute jobs that the CPU should work on afterwards?

It's an internal bus for passing data directly from cpu to gpu. It doesn't really have anything to do with external bandwidth to RAM.
 

Perkel

Banned
The way you say this makes it sound like the ESRAM can't have any value for the Xbox One if it isn't being used as a cache, which flies in the face of the fact that EDRAM


Sorry SenjutsuSage, but most of the things you say are wishful thinking, like the AMD GCN texture tech or the Kameo particle system, which relied on immense eDRAM bandwidth in the first place (bandwidth is super important for particles).

The main point of the eSRAM is to provide higher bandwidth for bandwidth-starved tasks. If you take the eSRAM away from those tasks, what's left is 68GB/s of DDR3. And the framebuffer alone can eat up most of the 32MB in the first place.

The eSRAM would be good for GPU compute tasks IMO.
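For a rough sense of why 32MB is tight and where that 68GB/s comes from, here's a small illustrative calculation. The G-buffer layout below is a hypothetical but typical deferred-rendering setup, not anything confirmed for a specific engine:

```python
# Rough render-target footprint at 1080p versus 32MB of ESRAM, plus the
# DDR3 bandwidth figure quoted above. The G-buffer layout is a hypothetical
# but typical deferred setup, not anything confirmed for a specific engine.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4                               # one 32-bit target (e.g. RGBA8)

def targets_mb(count):
    """Size in MB of `count` full-screen 32-bit render targets."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * count / (1024 ** 2)

# e.g. four colour targets (albedo, normals, specular, lighting) + depth
gbuffer_mb = targets_mb(5)
print(f"1080p, five 32-bit targets: {gbuffer_mb:.1f} MB (ESRAM holds 32 MB)")

ddr3_gbs = 256 / 8 * 2.133                        # 256-bit DDR3-2133 -> ~68 GB/s
print(f"DDR3 bandwidth: {ddr3_gbs:.1f} GB/s")
```

So a fat deferred G-buffer at 1080p can already overflow the 32MB, which is presumably why splitting targets between ESRAM and DDR3 comes up.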
 
It's an internal bus for passing data directly from cpu to gpu. It doesn't really have anything to do with external bandwidth to RAM.

Isn't this standard for AMD APUs? I think I also saw it mentioned for the Xbone on B3D,
when they were trying to figure out Microsoft's creative bandwidth math :p

Sorry SenjutsuSage, but most of the things you say are wishful thinking, like the AMD GCN texture tech or the Kameo particle system, which relied on immense eDRAM bandwidth in the first place (bandwidth is super important for particles).

The main point of the eSRAM is to provide higher bandwidth for bandwidth-starved tasks. If you take the eSRAM away from those tasks, what's left is 68GB/s of DDR3. And the framebuffer alone can eat up most of the 32MB in the first place.

The eSRAM would be good for GPU compute tasks IMO.

I think they're doing it just like Intel and keeping the framebuffer in DDR3 memory, not in the ESRAM.
 
I guess on a third reading, and this time reading it in conjunction with the other modification, it's an extra bus just to bypass the cache. I still don't fully understand it, but my confusion is no longer about the 176GB/s bandwidth.
I think it enables splitting memory access for compute units based on the nature of their work.

Compute units dedicated to graphics tasks will crunch graphical data, accessing it through the cache mechanism. That data is largely laid out sequentially in memory, so as long as the bandwidth is there to keep the cache filled, few cache misses should happen.

Now, if you dedicate some compute units to GPGPU tasks, chances are you'll have to exchange data with the CPU, and that will endanger the coherency of the GPU cache, as you'll have to flush it in order to fill it with scattered CPU data. That cache flush may stall and impact the graphics pipeline, so Sony (or AMD?) allowed the GPU to access system memory directly without touching the cache. While the GPGPU work accesses system memory, the other compute units still run out of cache and are not affected.

That's how I understand it, but I may be wrong; I'm not a specialist at all...
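A crude way to picture that trade-off is a toy cost model; the numbers below are completely made up for illustration and are not a simulation of the real hardware:

```python
# Toy cost model of CPU<->GPU hand-offs with and without a cache-bypass path.
# All numbers are invented for illustration; this is not a model of the
# actual hardware.

FLUSH_COST_US = 50.0     # assumed stall when the GPU cache must be flushed
EXCHANGE_COST_US = 5.0   # assumed cost of moving one small compute packet

def sync_overhead_ms(handoffs_per_frame, bypass_bus):
    """Per-frame synchronisation overhead. Without a bypass bus, every
    hand-off also flushes the cache the graphics work depends on."""
    per_handoff = EXCHANGE_COST_US + (0.0 if bypass_bus else FLUSH_COST_US)
    return handoffs_per_frame * per_handoff / 1000.0

for n in (10, 100, 1000):
    print(f"{n:4d} hand-offs/frame: "
          f"shared cache {sync_overhead_ms(n, False):6.2f} ms, "
          f"bypass bus {sync_overhead_ms(n, True):6.2f} ms")
```

The exact costs don't matter; the point is just that the flush penalty scales with how chatty the CPU/GPU exchange is, which is what the bypass bus avoids.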
 
I imagine the "best of both worlds" design would be to have a big pool of GDDR5, a chunk of ESRAM (or another fast on-die cache), and the PS4's beefier GPU, but that would have pushed costs up even higher, and Sony was already pushing it with their move to 8GB. It makes sense that they chose to leave the ESRAM out, but that doesn't necessarily mean that they wouldn't have used it if they could.

I heard that with their budget Sony chose a GPU with more compute units.
Microsoft had nearly the same budget and opted to add the ESRAM to bypass some of the bandwidth problems, but they had to choose a smaller GPU to compensate.

The GDDR5 RAM is this generation's "MVP"; that thing changed the status quo.
 

Drek

Member
I think the majority of cross platform games will target the lowest common denominator.
It will take almost no effort to optimize for the superior GPU; they're from the same design family and everything. So even if you target the weaker one, the other will simply run better with almost zero extra work.

I'm betting we'll see a lot of Xbox One titles that barely make 30fps with some drops while the PS4 versions run much smoother. Gamers have been putting up with jittery console frame rates for a long time already.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Thanks.


I'm still not 100% clear on it. If the bus is for the GPU, doesn't that mean the 176GB/s figure is worthless and the actual bandwidth is 20GB/s, as the bus doesn't allow for more?

They have a backdoor for compute tasks so they don't dirty the cache that rendering is using. So they can keep cache hits high for textures and still use CUs for other stuff without interference.
 

James Sawyer Ford

Gold Member
If the rumors of downgraded clocks + 10% reservation for non-gaming are true, we are looking at ~.8-1 TFlops, with of course less memory bandwidth than officially stated.

What implications does this now have stacked against the PS4? That would be almost double the performance...the gap was large before, but now it seems genuinely massive.

I'm guessing devs will probably target 1080p for PS4 and then adjust the Xbox One resolution in order to get framerate parity, but we could be looking at resolutions as low as 720p in that case?

I suspect devs want to keep framerate roughly the same, so as to be the most authentic replication of the experience, so adjusting resolution seems more logical.
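For reference, the figures being thrown around fall out of the standard GCN throughput formula (CUs x 64 lanes x 2 ops per clock). The downclock and 10% reservation values below are the rumors under discussion, not confirmed specs:

```python
# GCN throughput arithmetic: peak GFLOPS = CUs * 64 lanes * 2 ops (FMA) * GHz.
# The Xbox One downclock and the 10% OS reservation below are the rumours
# discussed above, not confirmed figures.

def gcn_tflops(compute_units, clock_ghz, reserved=0.0):
    gflops = compute_units * 64 * 2 * clock_ghz
    return gflops * (1.0 - reserved) / 1000.0

print(f"PS4, 18 CUs @ 800 MHz           : {gcn_tflops(18, 0.80):.2f} TFLOPS")
print(f"XB1, 12 CUs @ 800 MHz           : {gcn_tflops(12, 0.80):.2f} TFLOPS")
print(f"XB1, 12 CUs @ 800 MHz, -10% OS  : {gcn_tflops(12, 0.80, 0.10):.2f} TFLOPS")
print(f"XB1, 12 CUs @ 750 MHz, -10% OS  : {gcn_tflops(12, 0.75, 0.10):.2f} TFLOPS")
```

That's roughly where the "almost double" comparison against the PS4's ~1.84 TFLOPS comes from; a steeper downclock would push the Xbox One figure toward the ~0.8 end of the range.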
 

RoboPlato

I'd be in the dick
If the rumors of downgraded clocks + 10% reservation for non-gaming are true, we are looking at ~.8-1 TFlops, with of course less memory bandwidth than officially stated.

What implications does this now have stacked against the PS4? That would be almost double the performance...the gap was large before, but now it seems genuinely massive.

I'm guessing devs will probably target 1080p for PS4 and then adjust the Xbox One resolution in order to get framerate parity, but we could be looking at resolutions as low as 720p in that case?

I suspect devs want to keep framerate roughly the same, so as to be the most authentic replication of the experience, so adjusting resolution seems more logical.
I think it'll be a mix of resolution drops and other settings being lowered: less AA, lower-quality DoF/AO, and fewer particles. I could also see the PS4 possibly getting some extra cosmetic physics if they do it on the GPU.
 

James Sawyer Ford

Gold Member
I think it'll be a mix of resolution drops and other settings being lowered: less AA, lower-quality DoF/AO, and fewer particles. I could also see the PS4 possibly getting some extra cosmetic physics if they do it on the GPU.

I wonder if Microsoft doesn't care about this scenario much either.

I wonder how many people actually have 1080p screens... on this forum, I'd wager a very high percentage, but for the population at large I wouldn't be surprised if it's still quite in the minority.
 
I get the feeling this will be that PS3 style of power that not even first party devs seemed to be able to tap into in a way that showed it's undeniably better than the 360. Killzone only running at 30fps is a sign of that already being the case.
I wonder how many people actually have 1080p screens... on this forum, I'd wager a very high percentage, but for the population at large I wouldn't be surprised if it's still quite in the minority.
The fact that they don't even make SD televisions anymore is a good sign that HD isn't "quite in the minority." I'd also wager that people who don't see the point of HD probably don't see the point of buying the latest greatest consoles either and aren't a consumer MS or Sony will reach until their price point is well below $100. Especially MS. They probably have zero interest in that type of consumer as they're kinda backwards, seemingly anti-technology, probably have shunned the internet along with good picture quality, and will never contribute to the eco-system.
 

artist

Banned
I get the feeling this will be that PS3 style of power that not even first party devs seemed to be able to tap into in a way that showed it's undeniably better than the 360. Killzone only running at 30fps is a sign of that already being the case.
stanley2qez8c.gif
 

~~Hasan~~

Junior Member
I wonder why Epic has released a video of Unreal Engine on PS4 but not Xbox One. I guess MS doesn't want a direct comparison right now :/

It's really hard to believe the PS4 is like 30 to 40% more powerful than the Xbox One. Is this confirmed or based on rumored specs? Because as far as I know, MS didn't talk about the detailed specs of their system.
 
The original Xbox outclassed the PS2; I remember some of the multiplats, and there was a clear difference despite the much smaller install base.

Given how Sony has got its act together in terms of hardware and development tools, I would not be surprised if the majority of multi-format console releases are superior on PS4.
 

Orayn

Member
I get the feeling this will be that PS3 style of power that not even first party devs seemed to be able to tap into in a way that showed it's undeniably better than the 360. Killzone only running at 30fps is a sign of that already being the case.

How? Why? Does Sony have some anti-secret-sauce that makes their hardware secretly worse or harder to use? Everything we know about the PS4 and Xbox One so far puts the PS4 ahead in an apples to apples comparison. Hell, if you scroll up you'll see an explanation of how the One's ESRAM setup has the potential to make it trickier to program for, and thus more "PS3-like" than the PS4. Admittedly, Microsoft probably has a better toolchain for developers, but it's kind of hard to draw too many conclusions about what, if any, effect that's had on the games we've seen previews of so far.

Also, Killzone runs at 30FPS because Guerrilla deliberately targeted that framerate, just like how Forza 5 runs at 60. It really doesn't mean much. The original Playstation had a large number of beautiful 60FPS games, but Final Fantasy VII's battles ran at 15.
 

TheD

The Detective
I get the feeling this will be that PS3 style of power that not even first party devs seemed to be able to tap into in a way that showed it's undeniably better than the 360. Killzone only running at 30fps is a sign of that already being the case.

No, this is nothing like the PS3 at all!

The extra power in the PS3 was in the Cell's SPEs, something that took a large effort to use for graphics (and could do nothing to help the RSX's TMUs and ROPs). The PS4, on the other hand, has all its extra power in the part that normally does the graphics, and thus no extra programming effort is needed to make use of it.
 
If the rumors of downgraded clocks + 10% reservation for non-gaming are true, we are looking at ~.8-1 TFlops, with of course less memory bandwidth than officially stated.

What implications does this now have stacked against the PS4? That would be almost double the performance...the gap was large before, but now it seems genuinely massive.

I'm guessing devs will probably target 1080p for PS4 and then adjust the Xbox One resolution in order to get framerate parity, but we could be looking at resolutions as low as 720p in that case?

I suspect devs want to keep framerate roughly the same, so as to be the most authentic replication of the experience, so adjusting resolution seems more logical.

Resolution alone won't cut it... GPU cost doesn't scale linearly with pixel count, so cutting the pixels doesn't cut the required GPU power proportionally.
720p will help with the shoddy memory bandwidth, but I don't see them porting a 30fps PS4 game to the Xbox at 720p and ending up with anywhere near 30fps.
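The raw pixel counts are simple enough to check (nothing console-specific here):

```python
# Pixel counts for common render resolutions. Fill-rate and bandwidth costs
# scale roughly with pixel count, but vertex, simulation and CPU work largely
# don't, which is why a resolution drop rarely buys back a proportional
# amount of frame time.

resolutions = {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.2f}x the pixels of 1080p)")
```

So 720p is about 2.25x fewer pixels than 1080p, which helps fill rate and bandwidth but does nothing for the resolution-independent parts of the frame.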
 
I wonder why Epic has released a video of Unreal Engine on PS4 but not Xbox One. I guess MS doesn't want a direct comparison right now :/

It's really hard to believe the PS4 is like 30 to 40% more powerful than the Xbox One. Is this confirmed or based on rumored specs? Because as far as I know, MS didn't talk about the detailed specs of their system.

It's so much weaker that the XBOne will likely never have its full specs revealed, a business practice borrowed from Nintendo.

I get the feeling this will be that PS3 style of power that not even first party devs seemed to be able to tap into in a way that showed it's undeniably better than the 360. Killzone only running at 30fps is a sign of that already being the case.

The fact that they don't even make SD televisions anymore is a good sign that HD isn't "quite in the minority." I'd also wager that people who don't see the point of HD probably don't see the point of buying the latest greatest consoles either and aren't a consumer MS or Sony will reach until their price point is well below $100. Especially MS. They probably have zero interest in that type of consumer as they're kinda backwards, seemingly anti-technology, probably have shunned the internet along with good picture quality, and will never contribute to the eco-system.

They aren't comparable in the slightest; this post shows a rather large lack of understanding.
 

Durante

Member
Admittedly, Microsoft probably has a better toolchain for developers
You'd totally think so given historical precedent, right?

But then you get multiple reliable sources indicating that Sony's tools are actually ahead of MS'. It's hard to fathom, but there it is.
 

StuBurns

Banned
It's not that surprising to me. Didn't MS say they've only been working on it for a couple of years? The PS4 started three years before that.
 

i-Lo

Member
I get the feeling this will be that PS3 style of power that not even first party devs seemed to be able to tap into in a way that showed it's undeniably better than the 360. Killzone only running at 30fps is a sign of that already being the case.

I find your lack of knowledge, or bald-faced trolling, disturbing. Let me know when the next Halo is running at 1080p/60fps. Like other members have already stated, it's the same architecture with more power. It's like asserting that, for example, the performance advantage of an AMD HD 7850 over an HD 7790 is just theoretical, which is downright incorrect.
 

RoboPlato

I'd be in the dick
You'd totally think so given historical precedent, right?

But then you get multiple reliable sources indicating that Sony's tools are actually ahead of MS'. It's hard to fathom, but there it is.

This is the most shocking thing to me. I have no idea what's gone on at either company that has caused Sony to improve the tools so much while MS has dropped the ball.
 

Orayn

Member
You'd totally think so given historical precedent, right?

But then you get multiple reliable sources indicating that Sony's tools are actually ahead of MS'. It's hard to fathom, but there it is.

Seriously? Jesus, wow. Microsoft's possible advantages are just evaporating one by one at this point. I'm still cynical and expect them to do outrageously well on brand loyalty and recognition alone, but they have a lot to lose if they don't.
 