
"GPU compute is the new SPU" - Will this be a big advantage for PS4 long-term?

rdrr gnr

Member
I'm sorry if this comes across as offensive, but this is the most ludicrous ass-backwards thing I have ever read on NeoGAF.

Graphics tech is DESIGNED to create a certain aesthetic look for a game. Graphical effects are used together with art to create a game's look. You CANNOT separate a game's graphics engine technology from its art and then go on to say that game "A" is the best LOOKING game solely because of its tech (i.e. it includes certain graphical effects and features, regardless of the quality of said features), when a game's LOOK by definition is a function of both its art AND underlying technology.

I mean let's look at BF4 and Crysis 3 for example. Both are games that look astounding on PC, but more so in part because of their lighting engines and realtime GI implementations. On consoles however, the grossly simplified approximations for GI that both those games use - ostensibly at the expense of other graphical factors like texture res etc - don't look anywhere near as polished or high quality as say games like Halo 4, Uncharted 3 or TLOU. The sacrifices in other areas that BF4 and Crysis 3 make on consoles for the sake of their poor GI implementations render those two games mediocre at best graphically when compared with other games that trade those compromises for a much cheaper, higher quality (albeit less dynamic) baked lighting solution.

And then we get to the art argument, wherein I can agree that art design itself can be subjective; however, the quality of a title's ART DIRECTION cannot be judged as anything but objective. Again we can use one of your examples, Crysis 3, which, along with its predecessor (C2), is among the worst games in terms of art direction this generation. Not only are the games grossly inconsistent in their individual aesthetics, they also do not follow any kind of artistic design consistency with their founding title. Artistically Crysis 3 is a mess, and BF4 is boring photorealism that itself fails at what it tries to achieve because its many compromises are too severe in the console versions.

Neither of the games you mentioned is unanimously held as one of the best looking games on current gen consoles. Sure, you can have your own subjective opinions, but don't try to proclaim them as objective facts based on some poorly reasoned, twisted notion that a game's look and overall graphical presentation can and should be judged solely on its graphical technology, which exists solely as a means to an end and not an end in and of itself.
This entire post is a truth bomb -- especially the bolded.
 

KKRT00

Member
So, how do you value each individual thing? Where does motion blur rank against caustics (which are actually really cool)? Because I can tell you right now that wind interacting with vegetation is meaningless when compared to say texture work or the animations of your avatar. You haven't convinced me that your entire argument isn't an appreciation for the engine somehow definitively meaning better looking. Because I look at Crysis 3 and see a hideously unattractive title.

By the compute cost of each. Texture work depends on many aspects and is also subjective; animation is purely an art production issue - it doesn't cost an engine more to use U3's animations than BF3's animations.

==========

Graphics tech is DESIGNED to create a certain aesthetic look for a game. Graphical effects are used together with art to create a game's look. You CANNOT separate a game's graphics engine technology from its art and then go on to say that game "A" is the best LOOKING game solely because of its tech (i.e. it includes certain graphical effects and features, regardless of the quality of said features), when a game's LOOK by definition is a function of both its art AND underlying technology.

That's so not true. If it were true, then there would be no multi-genre engines like CryEngine, Unreal Engine or Frostbite. Tech features can be used for art purposes, but it's not like people create only specific tech just for specific art in engines.
And yes, You can and You should separate art from tech, because art is totally subjective. It's like saying that Crysis 2 doesn't look amazing because its art isn't that great, or downplaying the importance of its tech because the art isn't up to par, right?

On consoles however, the grossly simplified approximations for GI that both those games use - ostensibly at the expense of other graphical factors like texture res etc - don't look anywhere near as polished or high quality as say games like Halo 4, Uncharted 3 or TLOU.
What does 'as polished or high quality' even mean, when all the games You mentioned look far from excellent, or even great, in most situations? I can post a dozen screenshots from each of those games that look bad or even atrocious.
 
Killzone 2 and 3 do not use HDR, but they do use anisotropic filtering. None of ND's games use anisotropic filtering, and TLoU uses prebaked lighting; the particles are also bad - not only do they not react to wind, they aren't lit by light sources like in every other game I've mentioned. One 1st party game cuts one thing, another 1st party game cuts another, but in all the titles I've mentioned those things are all there. KZ3 even cut object-based motion blur compared to KZ2 just to push other stuff around.

NO game uses every single graphical feature that exists in a single title. We're talking about consoles that are fixed platforms, and so hw performance is fundamentally finite.

ALL game developers are thus forced to make compromises in their titles, and Crysis 3 for example trades rudimentary things like texture resolution for a more dynamic (albeit low quality) lighting solution. Textures in Crysis 3 are horrible, and as a gamer judging the final product I can easily say that it was a mistake for Crytek to elevate the dynamism of their lighting solution over everything else graphically in their game.

You simply cannot judge a game by which engine or rendering features it includes and which it doesn't, because even if a game includes every fancy graphical technique under the sun, but in extremely shitty quality, it will still look shit compared to another that makes suitable compromises in some areas in order to create a desired look and a vastly more polished overall aesthetic.

I can categorically say that Killzone 2 and 3 are better looking games than either of the Crysis games on consoles. So are Halo 4 and many other exclusive and even third party titles.

It almost feels to me as if you've spent too long playing the PC versions of the games you mentioned and are misremembering how they actually look on console.
 

c0de

Member
Can we just leave KKRT00 to his opinion and go on with what this thread is about? There is already a defense-force present... *sigh*
 

KKRT00

Member
ALL game developers are thus forced to make compromises in their titles, and Crysis 3 for example trades rudimentary things like texture resolution for a more dynamic (albeit low quality) lighting solution. Textures in Crysis 3 are horrible, and as a gamer judging the final product I can easily say that it was a mistake for Crytek to elevate the dynamism of their lighting solution over everything else graphically in their game.

Textures are not horrible in Crysis 3, what are You talking about? And the lighting is on par with the prebaked lighting from TLoU in most cases, and that's only counting sunlight, because lighting from dynamic light sources is better.

NO game uses every single graphical feature that exists in a single title. We're talking about consoles that are fixed platforms, and so hw performance is fundamentally finite.

That's the thing - the best engines do almost everything, and that's what I'm talking about.

Why can't people accept that Sony's 1st party studios do not have the best engines? Their games still look amazing, nothing changes, except that they aren't the best, because they aren't.
 
Can we just leave KKRT00 to his opinion and go on with what this thread is about? There is already a defense-force present... *sigh*

Whilst I agree with your first point, the 'defense-force' part wasn't really necessary. Most of the people disagreeing aren't arguing for any particular side.
 

rdrr gnr

Member
Can we just leave KKRT00 to his opinion and go on with what this thread is about? There is already a defense-force present... *sigh*
That's the problem. And defense-force what?
Why can't people accept that Sony's 1st party studios do not have the best engines? Their games still look amazing, nothing changes, except that they aren't the best, because they aren't.
This is why your entire argument is terrible as you have just moved the goalposts. No one is arguing that the company who licenses middleware tech for a living is being outdone by a studio with totally different priorities relegated to an old platform. And since this has gotten off-topic enough, feel free to create a thread titled "Crysis 3 is objectively the best looking console game."
 
That's so not true. If it were true, then there would be no multi-genre engines like CryEngine, Unreal Engine or Frostbite. Tech features can be used for art purposes, but it's not like people create only specific tech just for specific art in engines.
And yes, You can and You should separate art from tech, because art is totally subjective. It's like saying that Crysis 2 doesn't look amazing because its art isn't that great, or downplaying the importance of its tech because the art isn't up to par, right?

What are you even saying here?!? "Tech features can be used for art purposes, but it's not like people create only specific tech just for specific art in engines"... huh?

Graphical features in an engine - texture filtering, anti-aliasing, ambient occlusion, radiosity, subsurface scattering, parallax occlusion mapping etc etc etc - were all invented to take game art and visualise it in a way that makes it look clear, believable and realistic. Graphics technology doesn't exist without art, because it's the art that the technology works on to create the image that you see on screen. Do you even know what you are talking about? Because it certainly doesn't sound like you do.

Also, almost every graphical engine feature can be implemented in a game to varying degrees of quality. Take AA for example: you have MSAA, MLAA, FXAA, SSAA and a whole host of other more exotic implementations, as well as combinations of different techniques. Just saying this game has "X" feature is meaningless, because a game like Crysis can use a more performance-demanding feature like their "realtime" GI approximation, and yet a game with a much cheaper-to-implement baked lighting solution can display vastly more impressive results.
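To put some very rough numbers on that AA example (purely illustrative; assumes a 1280x720 target with 32-bit colour plus 32-bit depth, and ballpark relative shading costs rather than measured figures):

```python
# Rough, illustrative comparison of what the same checkbox feature ("AA")
# can cost at 1280x720. The point is that a feature list says nothing about
# quality or cost - the numbers here are back-of-envelope assumptions.

WIDTH, HEIGHT = 1280, 720
BYTES_PER_PIXEL = 4 + 4        # 32-bit colour + 32-bit depth

def buffer_mb(samples_per_pixel):
    """Approximate framebuffer footprint for a given MSAA/SSAA sample count."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * samples_per_pixel / (1024 ** 2)

print(f"No AA     : {buffer_mb(1):5.1f} MB, shading cost ~1.0x")
print(f"4x MSAA   : {buffer_mb(4):5.1f} MB, shading cost ~1.0x (extra samples stored, shaded once)")
print(f"4x SSAA   : {buffer_mb(4):5.1f} MB, shading cost ~4.0x (every pixel shaded four times)")
print(f"FXAA/MLAA : {buffer_mb(1):5.1f} MB, shading cost ~1.0x plus one post-process pass")
```

Same "AA" tick-box, wildly different memory and shading bills - which is the whole point about feature lists.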

Killzone games for example don't use HDR lighting, and yet the lighting in those games outshines a whole host of other titles that spend copious GPU cycles computing high quality HDR lighting.

Every game on a fixed platform makes compromises in terms of which graphical features it includes and which it doesn't, as well as the quality at which the included features are implemented. So it's just straight up ignorant to compare games solely on the list of graphical techniques they use while completely ignoring everything else.

What does 'as polished or high quality' even mean, when all the games You mentioned look far from excellent, or even great, in most situations? I can post a dozen screenshots from each of those games that look bad or even atrocious.

Well at least you're asking questions now, which is really what you should be doing to learn more about a subject on which you are clearly not an authority, despite the way you try to present yourself in your posts. See above for the answer.

As for the rest of your post, you're very much entitled to hold such an opinion about those games, despite your opinion being an objectively unpopular one amongst the masses. The prevailing mindshare, however, would disagree with you, and so would I as luck would have it.
 
Can we just leave KKRT00 his opinion and go on with what this thread is about? There is already defense-force present... *sigh*

There's nothing more that needs to be said from me on the subject, as long as the rest of GAF can clearly see that he speaks from nothing other than opinion and little else. I simply felt the need to correct him in his thinking and reasoning (or apparent lack thereof), in order to bring a little more education and rationalism to the discussion.

I am happy for the discussion to get back on topic. Apologies to all in here if my posts were seen as anything resembling a thread derailment.
 

Feindflug

Member
That's the problem. And defense-force what?

This is why your entire argument is terrible as you have just moved the goalposts. No one is arguing that the company who licenses middleware tech for a living is being outdone by a studio with totally different priorities relegated to an old platform. And since this has gotten off-topic enough, feel free to create a thread titled "Crysis 3 is objectively the best looking console game."

Best looking console games or not, CryEngine is one of the most advanced engines we've seen on the consoles. Both Crysis games look amazing considering the hardware IMO, well at least the 360 versions that I have played... same with Metro Last Light - when I played it I couldn't believe 4A Games pulled it off on the ancient 360 hardware.

That being said some PS3 exclusives like TLoU, GoW3/Ascension and Uncharted 2/3 look absolutely amazing as well.
 

bj00rn_

Banned
PS3 and 360 were much, much closer and there were still significant differences, ones that we cared about, between many multiplats.

Everything is relative. Meaning arguably nobody in "the real world" cared if, for example, Red Dead Redemption showed a couple fewer bushes in the PS3 version than the 360 version while they were sitting there playing the game. Same with theoretical hardware differences. The reason we care here, relatively speaking, is because it is technically interesting to do direct comparisons (and of course... "system wars", but f that..) and build a discussion around them.

So in the big picture the difference between the PS3 and the 360 was basically nothing. And I suspect the same will be true between the XbOne and the PS4. Also, because PC hardware will be significantly faster at launch and beyond, it will make the difference even less practically relevant.
 
Hopefully this feature extends the lifespan of the console. I thought this gen's length was perfect. Publishers can complain all they want, but I don't want to be asked to buy a new console every five years. That is almost a new console with every new president - no thanks. And what if they can't stay afloat with that kind of cycle? Every 4 years? They simply have to find better ways to make cash & cut costs and quit asking more of me as a consumer, like I'm not doing my part or something. It almost feels like they are blaming us without saying it outright every time they speak of the industry.
 
Well, if you want to use the ESRAM to store your framebuffer you're wasting some amount of bandwidth since the ROPs can't actually saturate the bus and you can't use it as a low latency data source for compute. If you want to use the ESRAM as a data source for compute and/or texturing then you have to put your framebuffer in DDR3 where you will suddenly be fill limited and have lots of contention. In both cases there is bandwidth waste from moving data back and forth between the pools so that the total effective bandwidth is significantly less than the apparent aggregate bandwidth.
From all the info we have, it's not an 'either one of these' scenario. The DMEs can tile pretty much any type of data that makes sense to be tiled, like textures, render targets or even the framebuffer.

One interesting scenario would have part of the framebuffer located in esram, part in the main ram, and similar setups for the shadow buffer, textures, shader source data etc...

In a situation like that, esram could be used as a source of data, even for texturing, and the gpu could do framebuffer writes to both ram pools at the same time.

That's how the system is supposed to have a real aggregate bandwidth of 170 GB/s (or higher according to that DF leak)
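For a sense of why splitting buffers like that gets discussed at all, here's a rough capacity check (purely illustrative - the render-target layout below is a hypothetical deferred-style setup, not any real game's configuration):

```python
# Back-of-envelope check of how much of a hypothetical 1080p deferred-style
# frame setup would fit in 32MB of ESRAM. The buffer list is an assumption
# for illustration, not a real title's memory layout.

ESRAM_MB = 32
WIDTH, HEIGHT = 1920, 1080

def target_mb(bytes_per_pixel, width=WIDTH, height=HEIGHT):
    return width * height * bytes_per_pixel / (1024 ** 2)

buffers = {
    "colour (RGBA8)":              target_mb(4),
    "albedo (RGBA8)":              target_mb(4),
    "normals (RGBA8)":             target_mb(4),
    "depth/stencil (D24S8)":       target_mb(4),
    "shadow map (2048x2048, D32)": 2048 * 2048 * 4 / (1024 ** 2),
}

total = sum(buffers.values())
for name, mb in buffers.items():
    print(f"{name:30s} {mb:5.1f} MB")
print(f"{'total':30s} {total:5.1f} MB vs {ESRAM_MB} MB of ESRAM")
# The total already overshoots 32MB, hence the idea of keeping only the
# hottest targets in ESRAM and letting the rest live in DDR3.
```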
 

ethomaz

Banned
From all the info we have, it's not an 'either one of these' scenario. The DMEs can tile pretty much any type of data that makes sense to be tiled, like textures, render targets or even the framebuffer.

One interesting scenario would have part of the framebuffer located in esram, part in the main ram, and similar setups for the shadow buffer, textures, shader source data etc...

In a situation like that, esram could be used as a source of data, even for texturing, and the gpu could do framebuffer writes to both ram pools at the same time.

That's how the system is supposed to have a real aggregate bandwidth of 170 GB/s (or higher according to that DF leak)
You are describing close to the best-case scenario, and any developer will have trouble making that work... that's what makes the use of eSRAM more complex compared to PC or PS4 development.

And the DF leaks... the eSRAM can't read and write at the same time... it is one task or the other... you can't double the bandwidth like DF told you.
 

c0de

Member
You are describing close to the best-case scenario, and any developer will have trouble making that work... that's what makes the use of eSRAM more complex compared to PC or PS4 development.

And the DF leaks... the eSRAM can't read and write at the same time... it is one task or the other... you can't double the bandwidth like DF told you.

So you are a developer and are currently working with a devkit or where did you get the info from?
 
You are describing close to the best-case scenario, and any developer will have trouble making that work... that's what makes the use of eSRAM more complex compared to PC or PS4 development.
I can imagine it being a nightmare coordinating the DMEs to get all the needed setup in order at the right time to make the magic happen.

And the DF leaks... the eSRAM can't read and write at the same time... it is one task or the other... you can't double the bandwidth like DF told you.
To be fair, they didn't say the bandwidth would double (which even caused some rage, as some took it as confirmation that there was indeed a downclock), they said there were loopholes that could be exploited by some very specific operations...

But yeah, none of that makes much sense.
 
I'm sorry if this comes across as offensive, but this is the most ludicrous ass-backwards thing I have ever read on NeoGAF.

Graphics tech is DESIGNED to create a certain aesthetic look for a game. Graphical effects are used together with art to create a game's look. You CANNOT separate a game's graphics engine technology from its art and then go on to say that game "A" is the best LOOKING game solely because of its tech (i.e. it includes certain graphical effects and features, regardless of the quality of said features), when a game's LOOK by definition is a function of both its art AND underlying technology.

I mean let's look at BF4 and Crysis 3 for example. Both are games that look astounding on PC, but more so in part because of their lighting engines and realtime GI implementations. On consoles however, the grossly simplified approximations for GI that both those games use - ostensibly at the expense of other graphical factors like texture res etc - don't look anywhere near as polished or high quality as say games like Halo 4, Uncharted 3 or TLOU. The sacrifices in other areas that BF4 and Crysis 3 make on consoles for the sake of their poor GI implementations render those two games mediocre at best graphically when compared with other games that trade those compromises for a much cheaper, higher quality (albeit less dynamic) baked lighting solution.

And then we get to the art argument, wherein I can agree that art design itself can be subjective; however, the quality of a title's ART DIRECTION cannot be judged as anything but objective. Again we can use one of your examples, Crysis 3, which, along with its predecessor (C2), is among the worst games in terms of art direction this generation. Not only are the games grossly inconsistent in their individual aesthetics, they also do not follow any kind of artistic design consistency with their founding title. Artistically Crysis 3 is a mess, and BF4 is boring photorealism that itself fails at what it tries to achieve because its many compromises are too severe in the console versions.

Neither of the games you mentioned is unanimously held as one of the best looking games on current gen consoles. Sure, you can have your own subjective opinions, but don't try to proclaim them as objective facts based on some poorly reasoned, twisted notion that a game's look and overall graphical presentation can and should be judged solely on its graphical technology, which exists solely as a means to an end and not an end in and of itself.

Confused how you see Crysis 3 as a confused artistic mess. Most of its art looks rather consistent in my mind's eye.

Did you play the game on PC? Did you beat the whole game? Looked pretty consistent to me...
 

c0de

Member
seems ps4 still has some bugs in it:

[image: py9x.jpg]
 

ethomaz

Banned
So you are a developer and are currently working with a devkit or where did you get the info from?
PC and hardware developer, not related to games... and reading and writing at the same time is physically impossible... you need an electric pulse to read and an electric pulse to write on any type of memory with the tech available.

GDDR memory simulates simultaneous read/write... it divides the memory allocation into a lot of chunks... so you can write to one chunk and read from another one at the same time.

But in the end the bandwidth doesn't change... for example, 50GB/s is 50GB/s whether it's reads across all chunks, writes across all chunks, or reads and writes mixed across all chunks.

The possible ways to increase the eSRAM bandwidth are increasing the clock, increasing the bus width, or adding another bus... the first two I don't need to explain, but the third would be something like the GPU accessing the eSRAM over a 128-bit bus and the CPU over a 32-bit bus... so you can read or write different chunks of data in eSRAM at the same time over the two buses.

Now the issues... 32MB of eSRAM is too small to warrant hooking up another bus... is it possible? Yes... but the numbers leaked by DF can't be true unless you downclock the eSRAM, and MS already confirmed they didn't downclock it.

The best "theory" so far is that the Data Move Engines (or the CPU) access the eSRAM over a different bus than the 128-bit one used by the GPU, but that again makes the DF article wrong, because the numbers are probably fake.

There is another "theory": that I'm just talking bullshit here... I'm a junior and don't know anything about this... so you don't need to take my comments too seriously ;-)... I still have a lot to learn.
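A tiny sketch of the bus arithmetic described above - a minimal illustration with made-up example numbers, not actual Xbox One or eSRAM specs:

```python
# Peak bandwidth is roughly (bus width in bytes) x (data rate). A second,
# narrower bus adds an independent path; it does not let a single bus read
# and write in the same cycle. All numbers here are invented for illustration.

def peak_bandwidth_gbs(bus_bits, data_rate_gtps):
    """Peak bandwidth in GB/s for a bus_bits-wide bus at data_rate_gtps GT/s."""
    return (bus_bits / 8) * data_rate_gtps

single_bus = peak_bandwidth_gbs(128, 3.125)   # 50.0 GB/s - read OR write
extra_path = peak_bandwidth_gbs(32, 3.125)    # 12.5 GB/s hypothetical CPU/DME path

print(single_bus)                 # 50.0
print(single_bus + extra_path)    # 62.5 - the two paths add up, but each one
                                  # on its own is still one direction at a time
```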
 

Dr. Kaos

Banned
Sometimes a game's performance can hover between 30 and 60 fps and it gets locked at 30. The PS4's performance advantage might allow it to reach 60 in some cases.

The framerate differences would easily be enough to where a dev could target 35-40FPS on Xbox One (but lock it to 30 FPS), and have enough horsepower on PS4 to run it close to 60 FPS.

But I think ultimately devs will try and keep framerate consistent, and adjust resolution/effects instead [...] And of course first party titles will truly show some differences.

For the PS4 to run a 30fps xbone game at a locked 60fps, it needs to be twice as powerful, not "just" 25%-70% more powerful. Either that or the game must target the PS4's strengths exclusively and then port to xbone as an afterthought. This is very unlikely. We've already been told that PC is the main target for next-gen (lucky Oculus Rift owners..)

What it will likely do is either run the same game at a solid 30FPS when the xbone has slowdowns/frame drops, or have a sharper image because the xbone will render internally at a lower resolution and then upscale, as they mimicked in the DF article.
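A quick frame-time sanity check of the "twice as powerful" point (illustrative only; it assumes rendering is purely GPU-bound and scales linearly with GPU throughput, which real games rarely do):

```python
# If a frame takes ~33.3ms (30fps) on the slower machine, how much faster
# does the other GPU have to be to hit the 16.7ms needed for 60fps?
# Assumes perfectly GPU-bound, linear scaling - a big simplification.

base_ms = 1000 / 30                 # ~33.3ms per frame at 30fps

for speedup in (1.25, 1.5, 1.7, 2.0):
    t = base_ms / speedup
    print(f"{speedup:.2f}x faster -> {t:4.1f} ms/frame ({1000 / t:4.1f} fps)")

# Only the 2.0x case reaches 16.7ms; the 1.25x-1.7x range lands around
# 19-27ms per frame, i.e. roughly 37-51 fps - hence the talk of locking to 30.
```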
 
For the PS4 to run a 30fps xbone game at a locked 60fps, it needs to be twice as powerful, not "just" 25%-70% more powerful. Either that or the game must target the PS4's strengths exclusively and then port to xbone as an afterthought. This is very unlikely. We've already been told that PC is the main target for next-gen (lucky Oculus Rift owners..)

What it will likely do is either run the same game at a solid 30FPS when the xbone has slowdowns/frame drops, or have a sharper image because the xbone will render internally at a lower resolution and then upscale, as they mimicked in the DF article.
I think you missed their point. They said that if a game is running at 60fps on PS4 and only 45fps on XBOne, it'd be better to simply lock the XBOne version at 30fps because 45fps would be a stuttering mess. They didn't really say the XBOne could only muster 30fps.
 

Dr. Kaos

Banned
I think you missed their point. They said that if a game is running at 60fps on PS4 and only 45fps on XBOne, it'd be better to simply lock the XBOne version at 30fps because 45fps would be a stuttering mess. They didn't really say the XBOne could only muster 30fps.

I agree with you (and them) that it can and probably will happen on a few games. I just think that decreasing the internal rendering resolution for xbone or using smaller textures and targeting the same FPS on both machines will be the path more developers choose. I might be wrong :)

During the game, however, it's very likely that the actual playing experience yields smoother framerates in hectic situations, as BigDug pointedly mentioned:

I'd be perfectly fine with multiplats looking exactly the same between XBO and PS4. That should result in a more stable framerate with less slowdown in high action scenes on PS4 however. And perfectly stable frame rates in a game's most demanding action sequences is just as important if not more important than the actual visuals, so by all means, tune the visuals to be exactly the same between the two systems and let the PS4 have additional hardware muscle waiting to kick in when things get hectic.
 

MercGH

Banned
As someone who owns both consoles and enjoys all games, I think the 360 is better than the PS3 for its OS, games, and online play/multiplayer/community. Xbox Live is fantastic. The PS3 is good as well, with great exclusives, but I think the 360 is overall the better console.

With that said, it is very disheartening that MS went this route with the XB1. Sony looks to have one-upped them in all ways hardware-wise. If MS had something unique/custom in the hardware then surely we would have heard of it by now. The Xbox One actually looks like it was hastily designed and not as planned/thought out as Sony's. With all the success they had with the 360, it is sad MS could not have invested in better hardware than Sony, or even just more than 32MB of ESRAM.

The MS choices in hardware leave me no choice but to buy a PS4.
 

KKRT00

Member
1st stuff

You are completely conflating art and assets. And most graphical techniques are not created to enhance art, but to better represent physical properties.
Assets are part of the tech; art is the way of combining assets and effects into a visually pleasing image, or of achieving a distinct look.
You can take the same assets, colour them differently, and mix one set into an artistic image and the other in a completely random way. Both will cost the same, both will use the same tech, but one will be artistically pleasing and one will not. From a tech perspective, though, they will be equal, because tech can be judged objectively.


2nd stuff
It's nice to take a fragment of a whole sentence and rip it apart - it's also easy. Now do the same with the whole sentence, because I don't see how those games are polished or high quality when their assets and tech distribution are not equal in every department. Like in TLoU: You have brilliant faces and main characters, but others are low poly or poorly textured; You have nice textures on some surfaces, but then everything is messed up by the lack of AF; You have pretty ok vegetation, but it pops in 10 meters away from You; You have good baked lighting, but the real-time shadows are awful, etc.
If that's polish for You, then I don't know what to say.

---
Also, You think that Crysis 3 or BF3 have poor and inconsistent art? So You haven't even played those games?
Go to the bullshot thread and check all the C3/BF3 shots then.
 
Confused how you see Crysis 3 as a confused artistic mess. Most of its art looks rather consistent in my mind's eye.

Did you play the game on PC? Did you beat the whole game? Looked pretty consistent to me...

There was a thread about this right here on GAF where some more eloquent posters made some very poignant points and expressed how Crysis 2 & 3's art direction was far from consistent and in many cases a mess.

I played all the Crysis games on both console and PC, but frankly it matters not which version I played, as that hasn't really got much to do with the argument that the games have rather bad art direction.

What I mean by that is the thematic elements of the game's visual aesthetic, together with how everything is designed graphically, and how consistently it all meshes together. The Crysis games since the first have chased technology implementation over visual cohesiveness, and I think it shows in almost every aspect of the games' aesthetic.

For example, just look at the artistic design of the nanosuit in C2/3 compared to Nomad's in the first game. Nomad was suitably proportioned (no hideously broad shoulders and tank ass) and in keeping with the first game's pseudo-realistic military themes. Everything from the designs of the world, to the vehicles, to the weapons, to the Korean enemies, even as far as the alien enemy designs, was visually striking, but more so consistent, unique and distinctive in its style. The entire colour palette for the game was perfect for the world's believability, in subtle greens and blues with muted tones.

Crysis 2/3 on the other hand are a mishmash mess of garishly bright colours, contrasted horrifically with the torn-up urbanised landscapes, CELL soldiers that suddenly look like something out of a poor man's Mass Effect, and alien enemy designs that look like they might as well belong to a completely different species than the ones presented in the first game; even down to the technology they employ, the designs are grossly dissimilar.

Crysis 2 & 3 look like they were crafted by completely different artists than the first game, and the overall aesthetic of both those games just comes across to me as uncomfortable, messy and overall badly designed, regardless of the level of sophistication in the graphics technology running under the hood.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
seems ps4 still has some bugs in it:

[image: py9x.jpg]

Cort is an ICE team driver guy; of course the low level stuff isn't bug free or optimized yet. This is good news - all the current demos are using these old drivers. AMD probably had next to nothing to start with for BSD.
 
I agree with you (and them) that it can and probably will happen on a few games. I just think that decreasing the internal rendering resolution for xbone or using smaller textures and targeting the same FPS on both machines will be the path more developers choose. I might be wrong :)

During the game, however, it's very likely that the actual playing experience yields smoother framerates in hectic situations, as BigDug pointedly mentioned:

I believe you are most likely correct in that this is how developers will do it (i.e. take advantage of the extra horsepower on PS4).

Going from 60fps down to 30fps would be too great a change, actually altering the fundamental gameplay of a title, and would most likely be met with fierce hostility from the XB1's fanbase. That would be counterproductive considering they still want to sell as many copies of their game on both platforms.

Changing the framerate between the versions would be commercial suicide on the developer, publisher and marketing front, whereas dropping the resolution slightly on the weaker version in order to maintain the same target framerate would be the better choice.

The problem is, though, that I cannot imagine a drop in resolution by a third being a viable option for most devs (assuming linear scaling of the game engine when going from the PS4 to the XB1 with 33% less GPU performance).

So I do wonder, if a dev targeted PS4 first at 60fps, how much would the internal rendering resolution need to be dropped in order to stay at a consistent 60fps on the XB1? I suppose we'll get more of an idea with some of the launch software like Watch Dogs, BF4 and AC4 when they release.

It will certainly be interesting.
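As a rough stab at that resolution question (illustrative only; it assumes the often-quoted ~1.84 vs ~1.31 TFLOPS figures tell the whole story and that frame cost scales linearly with pixels shaded, neither of which is strictly true):

```python
# If the XB1 GPU has roughly 70% of the PS4 GPU's raw throughput and the
# frame cost scales with pixel count, what 16:9 resolution keeps the same
# frame time? Purely pixel-bound scaling - a big simplification.

PS4_TFLOPS, XB1_TFLOPS = 1.84, 1.31      # commonly quoted figures
ratio = XB1_TFLOPS / PS4_TFLOPS          # ~0.71

ps4_pixels = 1920 * 1080
xb1_pixels = ps4_pixels * ratio

width = (xb1_pixels * 16 / 9) ** 0.5     # keep the 16:9 aspect ratio
height = width * 9 / 16
print(f"~{width:.0f} x {height:.0f}")    # lands in the neighbourhood of 1600x900
```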
 

c0de

Member
Cort is an ICE team driver guy; of course the low level stuff isn't bug free or optimized yet. This is good news - all the current demos are using these old drivers. AMD probably had next to nothing to start with for BSD.


I know who both are, that's why I posted.
 
From all the info we have, it's not an 'either one of these' scenario. The DMEs can tile pretty much any type of data that makes sense to be tiled, like textures, render targets or even the framebuffer.

One interesting scenario would have part of the framebuffer located in esram, part in the main ram, and similar setups for the shadow buffer, textures, shader source data etc...

In a situation like that, esram could be used as a source of data, even for texturing, and the gpu could do framebuffer writes to both ram pools at the same time.

That's how the system is supposed to have a real aggregate bandwidth of 170 GB/s (or higher according to that DF leak)

Where is the data in the ESRAM coming from? You had to copy it from somewhere, so you consumed DDR3 bandwidth reading it, Move Engine bandwidth moving it, ESRAM bandwidth writing it and then more ESRAM bandwidth reading it while you render. Like I said, whatever you do, it's a trade off because it either costs you something somewhere else or prevents you from fully exploiting your resources. You will literally never, ever manage to get anywhere close to the full 170GB/s aggregate in practice which is why that figure simply cannot be compared to the 176GB/s of the PS4's GDDR5.
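To make that copy overhead concrete, here's a rough tally under assumed numbers (the 8MB buffer and the idea of staging it through ESRAM are hypothetical, purely to show where the bandwidth goes):

```python
# Counting how much raw traffic one 8MB buffer generates if it has to be
# staged into ESRAM before the GPU reads it. Hypothetical scenario - the
# point is only that every useful byte gets paid for several times.

BUF_MB = 8

traffic = {
    "DDR3 read (source)":           BUF_MB,
    "move engine transfer":         BUF_MB,   # consumes DME throughput
    "ESRAM write (staging)":        BUF_MB,
    "ESRAM read (while rendering)": BUF_MB,
}

for step, mb in traffic.items():
    print(f"{step:30s} {mb} MB")
print(f"total traffic: {sum(traffic.values())} MB moved for {BUF_MB} MB of useful data")
# Roughly 4x the data actually consumed - which is why the 170GB/s
# 'aggregate' can't be read like a single unified pool's bandwidth.
```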
 

ethomaz

Banned
Where is the data in the ESRAM coming from? You had to copy it from somewhere, so you consumed DDR3 bandwidth reading it, Move Engine bandwidth moving it, ESRAM bandwidth writing it and then more ESRAM bandwidth reading it while you render. Like I said, whatever you do, it's a trade off because it either costs you something somewhere else or prevents you from fully exploiting your resources. You will literally never, ever manage to get anywhere close to the full 170GB/s aggregate in practice which is why that figure simply cannot be compared to the 176GB/s of the PS4's GDDR5.
eSRAM is a band-aid... it is not supposed to make the Xbone reach the potential bandwidth of the PS4... it is just there to try to avoid the possible bottlenecks caused by the Xbone's low memory bandwidth.

At full potential the PS4 will always do better in bandwidth terms, but games are not just about bandwidth... there is more going on.

Edit - Just guesswork... I guess MS always had the plan to go with 8GB of memory, but GDDR5 chips weren't available in the 4Gb density needed to make that design viable... so they chose to go with low-bandwidth DDR3 plus eSRAM to give developers an option to avoid the bottlenecks caused by that low bandwidth. Sony's plan was always to go with 4GB of memory, and 2Gb GDDR5 chips always supported that design... both companies worked with what they had at hand in the early days... then 2013 came and 4Gb chips became viable... so Sony just swapped the 2Gb GDDR5 for 4Gb GDDR5 while keeping the same design... and MS couldn't change their design in time to release the console in 2013... so we have the current scenario... the 8GB of GDDR5 really caught MS with their pants down... and Sony just got lucky because the 4Gb chips became ready and viable at the right time.
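A quick sketch of the chip-count arithmetic behind that guess (using the commonly reported 16-chip clamshell layout on a 256-bit bus at 5.5 GT/s - treat these as reported figures, not an official spec sheet):

```python
# How chip density, total capacity and bandwidth relate for a GDDR5 setup
# like the PS4's. Figures are the commonly reported ones, not official.

CHIPS = 16            # clamshell pairs on a 256-bit bus
BUS_BITS = 256
DATA_RATE_GTPS = 5.5

def capacity_gb(chip_density_gbit):
    """Total capacity in GB for a given per-chip density in gigabits."""
    return CHIPS * chip_density_gbit / 8

print(capacity_gb(2))                    # 2Gb chips -> 4 GB total
print(capacity_gb(4))                    # 4Gb chips -> 8 GB total
print(BUS_BITS / 8 * DATA_RATE_GTPS)     # -> 176 GB/s in either case
# Doubling the chip density doubles capacity without touching the bus or
# the bandwidth, which is why a late swap from 4GB to 8GB was feasible.
```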
 

antic604

Banned
Edit - Just guesswork... I guess MS always had the plan to go with 8GB of memory, but GDDR5 chips weren't available in the 4Gb density needed to make that design viable... so they chose to go with low-bandwidth DDR3 plus eSRAM to give developers an option to avoid the bottlenecks caused by that low bandwidth. Sony's plan was always to go with 4GB of memory, and 2Gb GDDR5 chips always supported that design... both companies worked with what they had at hand in the early days... then 2013 came and 4Gb chips became viable... so Sony just swapped the 2Gb GDDR5 for 4Gb GDDR5 while keeping the same design... and MS couldn't change their design in time to release the console in 2013... so we have the current scenario... the 8GB of GDDR5 really caught MS with their pants down... and Sony just got lucky because the 4Gb chips became ready and viable at the right time.

Yes, that's the prevailing interpretation of February's 8GB GDDR5 reveal. Sony got lucky, but they helped their luck by having Mark Cerny on board :)
 

mrklaw

MrArseFace
From all the info we have, it's not an 'either one of these' scenario. The DMEs can tile pretty much any type of data that makes sense to be tiled, like textures, render targets or even the framebuffer.

One interesting scenario would have part of the framebuffer located in esram, part in the main ram, and similar setups for the shadow buffer, textures, shader source data etc...

In a situation like that, esram could be used as a source of data, even for texturing, and the gpu could do framebuffer writes to both ram pools at the same time.

That's how the system is supposed to have a real aggregate bandwidth of 170 GB/s (or higher according to that DF leak)

Don't the DMEs have a limit of 25GB/s?
 