
Project CARS Dev: XB1 ESRAM Mitigates Some of PS4's Unified Memory Advantage

Yep, sounds about right.

If I recall correctly, DDR3 is too slow for deferred rendering to work properly.

Still "mitigates" sounds like a bandaid solution to me
 

TheD

The Detective
Since y'all are talking about PS4 vs. Xbone tech, I thought I'd ask this. I was watching a video from IGN showing a side-by-side comparison of whatever the new Call of Duty game is, and my eyes must suck or something, because the Xbone version looks superior. The PS4 version seems to lack a lot of lighting.

Did the developer purposely gimp the PS4 version or is there really some truth to the secret sauce stuff in the Xbone?

It does not look better: http://www.eurogamer.net/articles/digitalfoundry-call-of-duty-ghosts-next-gen-face-off
You are just seeing the result of video compression plus a sharpening filter on the XB1 version.
 

HORRORSHØW

Member
Since y'all are talking about PS4 vs. Xbone tech, I thought I'd ask this. I was watching a video from IGN showing a side-by-side comparison of whatever the new Call of Duty game is, and my eyes must suck or something, because the Xbone version looks superior. The PS4 version seems to lack a lot of lighting.

Did the developer purposely gimp the PS4 version or is there really some truth to the secret sauce stuff in the Xbone?
Did the Xbone version "pop"? If so, yeah, that's the secret sauce.
 
Games like NFS (1080p and Frostbite 3) and FIFA (1080p and MSAA) indicate the XB1 doesn't have a problem with large framebuffer sizes.

Some cherry-picked examples there, and even then they had to cut effects from NFS. The more demanding games all have reduced resolutions.
 
Developers have been designing for the Xbox 360 for the last 8 years. It has 10 MB of eDRAM; I don't see how 32 MB of ESRAM on the Xbone makes a world of difference. This isn't some mysterious, out-of-the-blue type of design.

Its purpose and functions are different, and how devs use it is up to them.
 

kitch9

Banned
The point is that it's not a large difference. An engine scales settings from 1 TFLOPS to 12 TFLOPS, and the difference between the Xbone and PS4 is up to 500 GFLOPS, so it's less than 5% on the engine's grand scale.
For an engine, both platforms sit at similar performance ratios, and they need to use similarly lower-end settings to run at 30 Hz. So no, it's not that relevant in the grand scheme, and it's definitely not a big difference for a developer.
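
A quick C++ sanity check on the arithmetic behind that "less than 5%" figure. The 1-12 TFLOPS range and the ~500 GFLOPS gap are the post's own numbers; the per-console TFLOPS figures in the second part are the commonly cited specs, not anything from the article:

```cpp
#include <cstdio>

int main() {
    // Numbers quoted in the post above (engine scaling range and gap).
    const double engine_min = 1.0, engine_max = 12.0; // TFLOPS
    const double gap = 0.5;                           // TFLOPS, XB1 vs PS4

    // The "less than 5%" reading: the gap measured over the whole range
    // an engine is expected to scale across.
    printf("Gap vs engine scaling range: %.1f%%\n",
           gap / (engine_max - engine_min) * 100.0);  // ~4.5%

    // The competing reading elsewhere in the thread: the gap relative to
    // the consoles themselves (commonly cited: XB1 ~1.31, PS4 ~1.84 TFLOPS).
    printf("Gap vs XB1 itself:           %.1f%%\n",
           (1.84 - 1.31) / 1.31 * 100.0);             // ~40%
    return 0;
}
```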

----------------


Name one gaming-related algorithm/feature that hUMA enhances or enables, and explain how it will affect a PC port.

It's shit logic.

No one cares about PC, apart from the PC guys, who are too busy fucking around with settings to actually enjoy anything.

(I'm one of them.)
 

KKRT00

Member
You keep ignoring the fact that pixel fillrate is one of the major sources of performance bottlenecks for a GPU. PS4 has twice the pixel fillrate. That's not a small difference.
Eh, neither console has a problem rendering a 1080p framebuffer at 30 Hz; they have a problem rendering advanced rendering features and assets at those resolutions at a stable framerate.
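
For context, a back-of-the-envelope C++ sketch of the fillrate claim, using the widely reported ROP counts and clocks (PS4: 32 ROPs at 800 MHz; XB1: 16 ROPs at 853 MHz). By that math the gap is closer to 1.9x than exactly 2x, and a bare 1080p/30 Hz framebuffer uses only a sliver of either figure, which is the point being made above:

```cpp
#include <cstdio>

int main() {
    // Peak pixel fillrate = ROPs * clock (widely reported specs, in GHz).
    double ps4_gpix = 32 * 0.800;  // 25.6 Gpixels/s
    double xb1_gpix = 16 * 0.853;  // ~13.6 Gpixels/s

    // One full-screen 1080p pass at 30 Hz. Raw fillrate bites when many
    // passes are stacked (deferred lighting, particles, transparencies),
    // not on the base framebuffer itself.
    double pass_1080p30 = 1920.0 * 1080.0 * 30.0 / 1e9;

    printf("PS4 %.1f vs XB1 %.1f Gpix/s (%.2fx)\n",
           ps4_gpix, xb1_gpix, ps4_gpix / xb1_gpix);
    printf("One 1080p@30 full-screen pass: %.3f Gpix/s\n", pass_1080p30);
    return 0;
}
```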

---------

Fixed at a playable resolution: that's what I think future PC gaming will be. I don't think we'll get more fillrate usage in the future. It's just too much time and money for devs.

It's not even true for BF4 and Crysis 3 already. There are also other games with advanced rendering features on PC, and we were talking about titles that will come out in 3-4 years.

--------------

Engines scale from 800x600 low to 8K ultra. So that 5% could make the difference between 900p (720p) low and 1080p (900p) medium.
I was not talking about resolution, but rendering features.
hUMA allows you to do synced GPGPU compute without flushing the CPU/GPU caches.
Why couldn't you do this on the Xbone too? What is the technical reason the CPU and GPU can't use the same pointers? It's a software issue, not a hardware issue, especially when the move engines support both the CPU and GPU, and both the CPU and GPU have access to the ESRAM as well as the DDR3.

=============
I guess the differences between the Wii U and the Xbox One will also start fading as the years roll by. Logic 101: 350 GFLOPS vs 1.1 TFLOPS (dat 10% tax), that's 750 GFLOPS of difference, barely more than the difference between the PS4 and Xbone. Nintendo redeemed! You guys just need to wait for it. The generation is going to be 24/7 denial for some people. That's how it works, right? Same basic concept as the further you sit from the TV, the more all the games look the same. All of them.

And this is the point where I stop arguing with you. You deliberately missed the point of four of my posts, and then, when I explained everything like I would to a kid, you started trolling.
Yeah, the Wii U is the same situation as the Xbone vs. PS4, because those two have the same CPU, similar bandwidth, and GCN-based GPUs...

======
It's shit logic.

It's not shit logic when real-world examples support it.
 

Bundy

Banned
XB1 ESRAM Mitigates Some of PS4's Unified Memory Advantage
So it "mitigates some of the PS4's unified memory advantages."
Too bad it can't "mitigate" some of the PS4's superior GPU advantages ;)
etc. etc.
 

Dorfdepp

Neo Member
It's shit logic.

No one cares about PC, apart from the PC guys, who are too busy fucking around with settings to actually enjoy anything.

(I'm one of them.)



Just stop, it's hard to take anything you say seriously now.
 

Chobel

Member
Why couldn't you do this on the Xbone too? What is the technical reason the CPU and GPU can't use the same pointers? It's a software issue, not a hardware issue, especially when the move engines support both the CPU and GPU, and both the CPU and GPU have access to the ESRAM as well as the DDR3.

You can't do this on Xbox One because the CPU can't access, and can't know, what's in the GPU cache. It has nothing to do with the ESRAM, DDR3, or the move engines: if the GPU changes something (which happens in the GPU cache, as usual), the CPU must wait for it to be written back to main memory before it can access it.
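
A rough C++ illustration of the difference being described here. Every function in it is a hypothetical stand-in (the real console APIs are not public), so treat it as a sketch of the data flow under the claim above, not actual code for either machine:

```cpp
#include <cstdio>

static int shared_buffer[256]; // memory both processors can address

// Hypothetical stand-ins for driver/hardware work. On a non-coherent setup
// these are real, expensive operations; with hUMA-style coherence the
// hardware snoops caches and they simply aren't needed.
void gpu_compute_writes(int* buf) { buf[0] = 42; } // pretend GPU job result
void flush_gpu_caches()      { /* write dirty GPU lines back to RAM */ }
void invalidate_cpu_caches() { /* drop stale CPU lines */ }

int main() {
    // Non-coherent path (the situation described above): the GPU's result
    // may sit in its caches, so the CPU must wait for an explicit
    // write-back before the data in main memory can be trusted.
    gpu_compute_writes(shared_buffer);
    flush_gpu_caches();        // without this, the CPU may read stale data
    invalidate_cpu_caches();
    printf("CPU reads %d after explicit sync\n", shared_buffer[0]);

    // Coherent (hUMA-style) path: same pointer, no flushes. Cache snooping
    // keeps both views consistent, which is what makes fine-grained
    // mid-frame CPU<->GPU handoffs cheap enough to be worth doing.
    gpu_compute_writes(shared_buffer);
    printf("CPU reads %d immediately\n", shared_buffer[0]);
    return 0;
}
```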

Eh, neither console has a problem rendering a 1080p framebuffer at 30 Hz; they have a problem rendering advanced rendering features and assets at those resolutions at a stable framerate.

...
I was not talking about resolution, but rendering features.

OK, I'm lost. What are we arguing about here again?
 

kitch9

Banned


Just stop, it's hard to take anything you say seriously now.

I spend half my time on my PC fucking with settings / drivers / forums finding fixes.

By the way, this conversation is irrelevant to the discussion at hand, much like the chap banging on about PC GPUs that don't even exist and will still have all the same shit problems.
 

mrklaw

MrArseFace
I know the majority of this thread is at each other's throats over hardware.

But I would like to know: is pCARS going to run at 60 fps on next-gen consoles? And if it does, at which resolution?

The rest is gibberish.

Honestly, if neither console supports legacy FFB wheels, it'll be a crappy gen for racing games.
 

nib95

Banned
Seems about right. I wonder if DICE and other first-wave multi-platform devs used the ESRAM efficiently.



It's posts like this that make GAF a horrible place sometimes. Please think before you post.

He's not far off the mark, though. I believe games using deferred rendering have framebuffer sizes exceeding 32 MB at 1080p, so 900p (or less) is more likely.
 
I respect everyone's opinion.

But I can't accept some of the comments made about the XB1 hardware vs. the PS4, especially when (a) they come from console-only users and (b) they keep failing to recognize the obvious difference in raw power.

I'm an avid PC gamer, I actually own a high-end PC, and some comments just don't make sense. I think the main problem is that most console users don't know, or have never owned, a gaming PC, so they fail to know the subject they're talking about.
Yet they still talk about teraflops and bandwidths like it's nothing.
Just don't forget that both new consoles are built on the x86 platform, and both share very, very similar components with PCs.

What makes the PC different as a platform is that we aren't stuck with a single choice of one hexa-core CPU. We have at least several AMD CPUs and several Intels.
We don't have only one model of Radeon 7850; there are countless variants: custom, enhanced, whatever.

When we're down to the point that the discussion is merely about raw power between the consoles, the winner is clear as day, for obvious reasons.

The PS4 has a better GPU, better RAM, a better system as a whole. It's easier to develop on because it hasn't got any ESRAM in it; it's just plain simple, mostly like a PC.

The real thing here is GPU power. A Killzone-like game on the Xbone would be *approximately* similar only towards the end of the console's life cycle. There's no way, ESRAM or not, that it can match the same 1080p with variable framerate and adaptive v-sync natively.
No, just no. ESRAM doesn't do magic. It will be used and exploited to the max, yes, but the system beneath it will be the very same.

What's happening here is what happened with the 360 vs. the PS3 in terms of raw computing: the 360 had superior bandwidth over the PS3, something that didn't make things any easier for the latter. I remember Capcom explaining why SF4 had to take some extensive cuts here and there on the PS3 for that very reason.

Also, IMHO, Project CARS will run at 30 fps/1080p; I'm pretty sure of it. Sadly, I don't see the newer consoles delivering 60 fps/1080p anytime soon.
BF4 attempted to do it, but it's 900p, and it still drops to an embarrassing 35 fps on large maps with a lot of players. Killzone has sections that run at 60 fps, but it's mostly 30, sometimes even lower. Yeah, those are launch titles and there's still a long way to go, but remember: the further we go, the more demanding game engines will become, and the sooner we will hit some bottlenecks. 60 fps at 1080p will be something really hard to achieve.

I would consider hybrid framerates if I were a developer. If, as an example, they had capped KZ at 45 fps, it would have turned into a better experience, more linear and smooth.

I know I am probably wrong about everything I said, but I had to get all this off my chest.
 
I don't understand how this statement is even news.

Came in thinking there'd be some new ESRAM info. Got nothing.

Of course it will "mitigate some of..." Er, this can't be news, can it?

Used for the render targets? Like we've not discussed that to death already. It would be news if they'd given some specifics, such as how many render targets they're using, what resolution they're going for, and how they're working within the 32 MB limitation. That would be thread-worthy.
 
He's not far off the mark, though. I believe games using deferred rendering have framebuffer sizes exceeding 32 MB at 1080p, so 900p (or less) is more likely.

With 4 render targets you're looking at 39 MB. That's without AA, and you may want to use more than 4 render targets. I believe Killzone 2 used 36 MB, and that's not Full HD, obviously.

That said, they've managed 1080p in Need for Speed on the XBone, so you can work around it by shifting render targets to and from main memory and by losing some effects (like the bokeh, which I imagine was one render pass too many, hence not including it).
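
The G-buffer arithmetic behind those figures, as a small C++ sketch. It assumes four 32-bit color targets plus a 32-bit depth/stencil buffer and no MSAA; real engines mix formats, so the exact totals vary, but the post's 39 MB number falls straight out:

```cpp
#include <cstdio>

// Size of a single render target in MiB at the given resolution.
double target_mib(int w, int h, int bytes_per_pixel) {
    return double(w) * h * bytes_per_pixel / (1024.0 * 1024.0);
}

int main() {
    const int bpp = 4; // e.g. RGBA8 colour targets, D24S8 depth

    double rt_1080p = target_mib(1920, 1080, bpp); // ~7.9 MiB each
    printf("1080p: 4 RTs + depth = %.1f MiB (over the 32 MiB of ESRAM)\n",
           5 * rt_1080p);                          // ~39.6 MiB

    double rt_900p = target_mib(1600, 900, bpp);   // ~5.5 MiB each
    printf(" 900p: 4 RTs + depth = %.1f MiB (fits)\n",
           5 * rt_900p);                           // ~27.5 MiB
    return 0;
}
```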
 
It just mitigates some of the advantage? They even say GDDR5 is faster. This isn't really great news for the ESRAM.

Yeah, but it's what we already knew when the ESRAM was announced, and why many of us were against it: 50% less GPU, for something that takes the RAM from horrible to not-as-inferior but more work/complicated.
 
Wrong. The 360 and XB1 have basically the same configuration; the ESRAM is just more flexible than the eDRAM.

I strongly disagree. For me, the X1 is closer to the PS2 memory architecture and does not have a unified memory architecture. It just doesn't.

The 32 MB of ESRAM is technically some fast video RAM, VRAM, like the 4 MB in the PS2.

The eDRAM of the X360 was basically just a framebuffer, a kind of specialized memory cache for the GPU, used to store the framebuffer, and it allowed cheap (but not free) AA. But the X360 had the fastest RAM of its time for its real unified memory: GDDR3. And people tend to forget the X360 had a very powerful GPU, stronger than the PS3's.

The X360 had a stronger GPU and fast unified memory (still roughly twice as fast as the memory of the younger Wii U); those are not the most praised elements of the X1 hardware. The eDRAM just allowed cheap multisampled AA.

I remember developers' criticism of the PS2 architecture, really similar to what the X1 gets now. But it didn't prevent that console from having great games and exclusives.
 

Piggus

Member
This. Remember how at first the PS3 had worse multiplats because it was harder to deal with? GTA V at the end of the generation was the same on both consoles, except the PS3 had less pop-in. So what's probably going to happen in the end is that the X1 starts out at a disadvantage, and as tools mature and devs get more used to it, we'll see both consoles be about the same, with the X1 only slightly worse than the PS4.

No, actually that's not going to happen.

I suppose I'll elaborate for the 800th time. It's getting old honestly.

The PS3 was at an initial disadvantage for having a slightly weaker GPU and a much stronger CPU that was VERY hard to develop for. So it didn't really start to catch up with or surpass the 360 until devs really understood how to program for it. This is a much different situation from what we have now. The PS4 and Bone have essentially the same architecture, but one of them (the PS4) has a much more powerful GPU AND it's easier to develop for. What makes you think Microsoft will improve their tools so much that they'll be able to make up such a huge performance gap? That assumes Sony is full of idiots and their tools will never improve, which is obviously not the case. It's like saying AMD will release drivers that make a Radeon 7770 run as well as or better than a Radeon 7870. It's not. Going. To happen. Ever. If a game is exactly the same on both systems, then it was basically built for the Bone and then ported to the PS4 with no additional improvements or optimizations. That's arguably lazy development (or developers who don't want to offend their masters in Redmond) and has nothing to do with the Xbox "closing the gap."
 

JP

Member
Things will obviously get better in time for teams who are struggling to develop for the Xbox's ESRAM, but at the same time things will also improve for the PS4, and the vast majority of games are going to run a little better on the PS4.
 

markao

Member
I know the majority of this thread is at each other's throats over hardware.

But I would like to know: is pCARS going to run at 60 fps on next-gen consoles? And if it does, at which resolution?

The rest is gibberish.
Yes, that is the target, and it is already running at 60 FPS on one of the two systems; take a wild guess which. And something I think you might also be interested in, Amar: it runs with the same physics tick-rate as the PC version currently does (600 Hz).

Haven't read anything about resolutions or their targets, so no clue.


Oh and Andy loves his XB1 :p
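
For anyone wondering what a 600 Hz physics tick-rate means in practice when a game renders at 60 fps: below is a minimal, generic fixed-timestep accumulator loop in C++. This is the textbook pattern for decoupling simulation rate from render rate, not pCARS code:

```cpp
#include <cstdio>

int main() {
    const double physics_dt = 1.0 / 600.0; // fixed simulation step (600 Hz)
    const double frame_dt   = 1.0 / 60.0;  // rendering at 60 fps
    double accumulator = 0.0;

    for (int frame = 0; frame < 3; ++frame) { // three rendered frames
        accumulator += frame_dt;              // time the frame "owes"
        int steps = 0;
        while (accumulator >= physics_dt) {   // ~10 physics steps per frame
            // step_physics(physics_dt);      // tyre/suspension sim, etc.
            accumulator -= physics_dt;
            ++steps;
        }
        printf("frame %d: %d physics steps\n", frame, steps);
        // render();
    }
    return 0;
}
```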
 

coldfoot

Banned
The PS4 has an advantage, yes. Are you really saying the facts indicate it's a "my god" advantage?
Besides, facts and god don't mix.

The facts are:
6-10 fps advantage in BF4 while rendering at 900p vs 720p.

That's a "my god" advantage in my book. Others might categorize it differently.
 

strata8

Member
The facts are:
6-10 fps advantage in BF4 while rendering at 900p vs 720p.

That's a "my god" advantage in my book. Others might categorize it differently.

A game that's a launch title, plagued with bugs, and runs better on standard PC hardware than it does on the PS4.

BF4 is definitely a "my god" advantage, but others (like NFS: Rivals) are not. Once devs get to grips with the XB1 memory setup, I think we'll see the PS4 at 1080p and the XB1 at 900p with all else being equal.
 

coldfoot

Banned
A game that's a launch title, plagued with bugs, and runs better on standard PC hardware than it does on the PS4.

BF4 is definitely a "my god" advantage, but others (like NFS: Rivals) are not. Once devs get to grips with the XB1 memory setup, I think we'll see the PS4 at 1080p and the XB1 at 900p with all else being equal.

The best-selling shooter franchise, COD: Ghosts, is 1080p versus 720p, and it sometimes judders because the framerate is too damn high (maybe they've patched that by now). That's also a "my god" difference in my book.

We're talking about facts and launch games, not about how something may be in the future; those aren't facts but guesses.
 
Even with a tiny amount of mitigation from the ESRAM, the Xbone won't suddenly sprout more ROPs, and that will hold it back for the entire generation.

The discussion is pretty moot; people should get used to a generation of games at 1080p on the PS4 and 720p/900p on the Xbone.
 

Bradf50

Member
Yes, that is the target, and it is already running at 60 FPS on one of the two systems; take a wild guess which. And something I think you might also be interested in, Amar: it runs with the same physics tick-rate as the PC version currently does (600 Hz).

Haven't read anything about resolutions or their targets, so no clue.


Oh and Andy loves his XB1 :p

That's great to know about 60 fps. Are you one of the devs on the team?
 

iceatcs

Junior Member
The facts are:
6-10 fps advantage in BF4 while rendering at 900p vs 720p.

That's a "my god" advantage in my book. Others might categorize it differently.

I think you're quite over the top on the framerate difference.
"My god" level would be something like 30 fps vs. 60 fps, not a 6-10 fps gap, because versions are always within about 10 fps of each other, every game, every platform.

But I do agree with you about the resolution differences (720p vs. 900p, or 900p vs. 1080p, is quite big compared to 600p vs. 720p), e.g. COD: Ghosts, where the gap looks way bigger than the PS3 vs. 360 differences ever did, due to 720p vs. 1080p.
 

Vooduu

Member
What makes you think Microsoft will improve their tools so much that they'll be able to make up such a huge performance gap? It's not. Going. To happen. Ever.

I'm pretty sure they could achieve 1080p at 30 fps later on. I'm not saying they're going to beat Sony in terms of performance, obviously, but their software department is much better than Sony's.
 

Jburton

Banned
I'm pretty sure they could achieve 1080p at 30 fps later on. I'm not saying they're going to beat Sony in terms of performance, obviously, but their software department is much better than Sony's.

That is not the consensus at the moment; according to most, Sony has the lead in development tools.
 