
How much different will X1 and PS4 multiplats be visually?

Status
Not open for further replies.

Alej

Banned
Yes, you get more fps from a better GPU, but you also get better fps from a better CPU. When I upgraded my CPU from a Core 2 Quad to an i7, my fps doubled with the same GPU. If the X1 does have a significant CPU advantage of up to 30% (big if), then this could place it in the ballpark to compete with the PS4.

That's a 9% advantage in CPU clock if the PS4's CPU is at 1.6GHz (it's rumoured to be, and it was in January).

The fact is, this gen will be about GPGPU doing a lot of tasks in place of the good old CPU. You should count GPGPU in your comparison, and if you do, it's a significant advantage (and not a rumoured one) for the PS4... a very significant one, because it's a lot more efficient with hUMA, and it doesn't require choosing between compute and graphics as much as on the X1, thanks to having 4x the ACEs.

Why am I even answering, though? Someone is wrong on the internet, it seems.
 

Biker19

Banned
After reading the old posts I noticed that the X1 camp went from "MS will go all out this gen, Sony can't compete financially" to "40% is negligible, we have AUDIO" lol

EPIC TRANSFORMATION

Yeah, it's funny. What's also funny is that Xbox fanboys think that the Xbox/Entertainment division has unlimited access to all the money that Microsoft makes from Windows, etc., while also thinking that a 40% difference between the PS4 & Xbox One won't be so big in the near future in terms of exclusives & multiplat games.

I think the X1 could have up to 30% more CPU power than the PS4. Here's why.

The X1 has dedicated servers freeing up to 10% of the CPU for the multiplayer and heavily integrated online work that the PS4's CPU has to deal with. Add the 5-10% CPU savings from the X1 having dedicated audio processors, then the 10% higher clock than the PS4, and you have a significant CPU advantage for the X1 over the PS4.

I also believe the X1's use of DirectX gives it an advantage, simply because it's the industry standard and such a mature toolset.

I mean, anyone who's not a fanboy can see that Killzone, Knack and Driveclub look mediocre, and on top of that run at 30fps. The more I see of the X1, the more balanced it looks.

I've preordered both the PS4 and the X1, so I'll be enjoying all the goodness. From the initial specs I thought the PS4 was going to blow the X1 out of the water, but that doesn't seem to be true. Also, can I just say that dedicated servers on every game are a game-changer. No more host advantage, the reason I fell out with online play last gen. Yay.

I honestly believe the PS4 on paper is more powerful than the X1 overall. Trouble is, Sony's first-party titles at launch are looking very average graphically and are running at low frame rates, so what gives? Yeah, Dead Rising 3 is running at 30fps, but that is on inferior hardware. What I'm trying to say is that the PS4 should be showing a very clear graphical advantage over the X1. Hell, I want it to, because I'm getting one and want the best for my money, but to me it's not showing a clear graphical advantage, so I'm not going to give it a free ride just because it's cool to bash Xbox.

LOL, more delusional posts from an Xbox fanboy.
 

Skenzin

Banned
We still don't know. Devs who have worked on both haven't commented yet.

PS4 looked way better on paper... but we don't know how the eSRAM will feed the GPU in practice. Nobody has ever put that much 6T eSRAM in a console. The One's GPU may end up being more fully utilized. It's too early. Probably the PS4. But the PS3 was crowned a clear monster as well. Sony hype has a way of fading...
 

Tabular

Banned
Yes, you get more fps from a better GPU, but you also get better fps from a better CPU. When I upgraded my CPU from a Core 2 Quad to an i7, my fps doubled with the same GPU. If the X1 does have a significant CPU advantage of up to 30% (big if), then this could place it in the ballpark to compete with the PS4.

Even if it had a 30% faster CPU (there is no way to argue it does, though), it would make no difference to multiplatform games. The PS3 had a much greater than 30% CPU advantage over the Xbox 360, and look how much that helped multiplats. Almost zilch. Expect that to be the case here.
 

nib95

Banned
Another real-time wireframe of the city.

[animated GIF: real-time wireframe of the city]
 

buenoblue

Member
I'm really not trolling. Well, I'm trying not to. Yeah, people have dissected what I said with some valid points; that's what discussions are about. Some good stuff in there. I already have a PC that will no doubt destroy both consoles graphically, and I'm buying both next-gen consoles. I'm just putting an argument for the other side, that's all. I could understand all the PS4 adulation if it were hands-down ripping the Xbox One apart graphically. To my eyes it's not. I guess my biggest disappointment is that after gaming on PC for the last 4 years I have grown accustomed to 60fps; now being faced with a so-called powerhouse next-gen console spitting out 30fps first-party games, and then having people claim it's the second coming, grates my balls.
 

IN&OUT

Banned
We still don't know. Devs who have worked on both haven't commented yet.

PS4 looked way better on paper... but we don't know how the eSRAM will feed the GPU in practice. Nobody has ever put that much 6T eSRAM in a console. The One's GPU may end up being more fully utilized. It's too early. Probably the PS4. But the PS3 was crowned a clear monster as well. Sony hype has a way of fading...

The sole purpose of eSRAM is to increase the bandwidth of the slow DDR3, period. It has no magic, no secrets, nothing at all.

Don't make a big deal of it just because the X1 has it.

Also, don't forget that the PS4 has GDDR5, which is easily the better solution than DDR3+eSRAM.

If eSRAM had any sort of advantage over GDDR5 you would at least see modern GPUs using it. But guess what? They are using GDDR5 because it is THE BEST bandwidth solution available today, period.
 

Alej

Banned
I'm really not trolling. Well, I'm trying not to. Yeah, people have dissected what I said with some valid points; that's what discussions are about. Some good stuff in there. I already have a PC that will no doubt destroy both consoles graphically, and I'm buying both next-gen consoles. I'm just putting an argument for the other side, that's all. I could understand all the PS4 adulation if it were hands-down ripping the Xbox One apart graphically. To my eyes it's not. I guess my biggest disappointment is that after gaming on PC for the last 4 years I have grown accustomed to 60fps; now being faced with a so-called powerhouse next-gen console spitting out 30fps first-party games, and then having people claim it's the second coming, grates my balls.

Okay guys, it's nothing, it's just one of our masters playing with us, poor toy peasants that we are.
 

IN&OUT

Banned
I'm really not trolling. Well, I'm trying not to. Yeah, people have dissected what I said with some valid points; that's what discussions are about. Some good stuff in there. I already have a PC that will no doubt destroy both consoles graphically, and I'm buying both next-gen consoles. I'm just putting an argument for the other side, that's all. I could understand all the PS4 adulation if it were hands-down ripping the Xbox One apart graphically. To my eyes it's not. I guess my biggest disappointment is that after gaming on PC for the last 4 years I have grown accustomed to 60fps; now being faced with a so-called powerhouse next-gen console spitting out 30fps first-party games, and then having people claim it's the second coming, grates my balls.

Can't you at least wait until the consoles release?
There is a gap, and it will show.
 

ekim

Member
The sole purpose of eSRAM is to increase the bandwidth of the slow DDR3, period. It has no magic, no secrets, nothing at all.

Don't make a big deal of it just because the X1 has it.

Also, don't forget that the PS4 has GDDR5, which is easily the better solution than DDR3+eSRAM.

If eSRAM had any sort of advantage over GDDR5 you would at least see modern GPUs using it. But guess what? They are using GDDR5 because it is THE BEST bandwidth solution available today, period.

Next-gen APUs will sport large chunks of embedded RAM on their die, for example Intel's Haswell. The closer it sits to the actual CPU/GPU, the lower the latency, which is a good thing when talking about HSA.
 

Mystery

Member
I'm really not trolling. Well, I'm trying not to. Yeah, people have dissected what I said with some valid points; that's what discussions are about. Some good stuff in there. I already have a PC that will no doubt destroy both consoles graphically, and I'm buying both next-gen consoles. I'm just putting an argument for the other side, that's all. I could understand all the PS4 adulation if it were hands-down ripping the Xbox One apart graphically. To my eyes it's not. I guess my biggest disappointment is that after gaming on PC for the last 4 years I have grown accustomed to 60fps; now being faced with a so-called powerhouse next-gen console spitting out 30fps first-party games, and then having people claim it's the second coming, grates my balls.

You've said you've been gaming for all these years; you should know that console developers will always try to push the graphical envelope rather than push for 60fps.

I'm rocking dual Titans on water alongside my hexacore i7, but I'm still getting a PS4 for the exclusives. Sony first-party developers make amazing games. I know where to set my expectations for console gaming, and I also understand (to a degree) the technical makeup of both the PS4 and Xbone. The PS4 has a significant edge from a power perspective over the Xbone, and there's really no refuting that.
 

stryke

Member
I'm really not trolling. Well, I'm trying not to. Yeah, people have dissected what I said with some valid points; that's what discussions are about. Some good stuff in there. I already have a PC that will no doubt destroy both consoles graphically, and I'm buying both next-gen consoles. I'm just putting an argument for the other side, that's all. I could understand all the PS4 adulation if it were hands-down ripping the Xbox One apart graphically. To my eyes it's not. I guess my biggest disappointment is that after gaming on PC for the last 4 years I have grown accustomed to 60fps; now being faced with a so-called powerhouse next-gen console spitting out 30fps first-party games, and then having people claim it's the second coming, grates my balls.

You talk about how great it is to have discussions, and yet you still haven't addressed what we're telling you about the city scene in Killzone. Then you proceed to go off on some loose tangent about how great PC is compared to consoles when no one here is really talking about PCs.

What a pathetic post.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Next-gen APUs will sport large chunks of embedded RAM on their die, for example Intel's Haswell. The closer it sits to the actual CPU/GPU, the lower the latency, which is a good thing when talking about HSA.

I still don't think that Haswell is comparable to the XB1 setup. In Haswell, the embedded memory is a cache shared by all processors. In the XB1, the embedded memory is a scratchpad that is only accessible by the GPU and the DMEs, but apparently not by the CPU. That does not seem to be a setup to support the HSA paradigm in particular.
 

RoboPlato

I'd be in the dick
I'm really not trolling. Well, I'm trying not to. Yeah, people have dissected what I said with some valid points; that's what discussions are about. Some good stuff in there. I already have a PC that will no doubt destroy both consoles graphically, and I'm buying both next-gen consoles. I'm just putting an argument for the other side, that's all. I could understand all the PS4 adulation if it were hands-down ripping the Xbox One apart graphically. To my eyes it's not. I guess my biggest disappointment is that after gaming on PC for the last 4 years I have grown accustomed to 60fps; now being faced with a so-called powerhouse next-gen console spitting out 30fps first-party games, and then having people claim it's the second coming, grates my balls.
Why does everyone who gets proven wrong about some of this stuff retreat to "well, PC is more powerful than both"? It's so odd how these conversations keep looping around the same assumptions with different people and ending with the same sign-off points. These threads used to be so informative and interesting, but now they're just circling the same conversations with different people. I may have to check out of tech threads for a while.
 
The X1 is NOT hUMA compatible at all, thanks to the eSRAM that goes against the whole idea of hUMA.

Is there actually any confirmation of this?

Having access to secondary eSRAM doesn't mean that it's the GPU's only access to memory and that everything must pass through eSRAM. The eSRAM is most likely just dedicated to the framebuffer while there are bucketloads of GPU operations being performed in main memory long before it's copied to the framebuffer.
 

Piggus

Member
Yes, you get more fps from a better GPU, but you also get better fps from a better CPU. When I upgraded my CPU from a Core 2 Quad to an i7, my fps doubled with the same GPU. If the X1 does have a significant CPU advantage of up to 30% (big if), then this could place it in the ballpark to compete with the PS4.

What in the holy hell are you talking about?

Okay, first of all, a Core 2 Quad and an i7 are not the same CPU just because they have the same number of cores. The i7 shits all over the Core 2 Quad because of its newer and more advanced architecture. It's also capable of much higher clock speeds. So of course you saw a big boost.

Second, the Xbone doesn't have a significant CPU advantage. It has, as of now, a very small (9%) CLOCK SPEED advantage, which doesn't even come CLOSE to making up for the GPU and memory disadvantages.

Take that i7 of yours, overclock it by 9% (which is very easy to do) and then be amazed at how your framerates will go up by a whopping 1-2 fps (wow!)

Then take your video card out, throw it in the trash, and buy one that costs $100 less. See which one of these changes has a bigger impact on your games.
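
To put rough numbers on that (purely illustrative figures, assuming a GPU-bound frame, not profiling data from either console), frame rate is roughly capped by whichever processor takes longer per frame:

```python
# Purely illustrative numbers: frame rate is roughly limited by whichever of
# the CPU or GPU takes longer per frame.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms, gpu_ms = 10.0, 16.0               # a hypothetical GPU-bound frame
print(fps(cpu_ms, gpu_ms))                # ~62.5 fps
print(fps(cpu_ms / 1.09, gpu_ms))         # 9% faster CPU: still ~62.5 fps
print(fps(cpu_ms, gpu_ms / 1.4))          # 40% stronger GPU: ~87.5 fps
```

With hypothetical per-frame costs like these, a 9% CPU clock bump changes nothing while a meaningfully stronger GPU moves the needle a lot, which is the point being made above.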
 

Alej

Banned
Same tactic I've seen so many Xbox guys use. The Kayle does exactly the same thing every time he starts losing a debate.

Enough with The Kayle, it's all because of alexandros.
(wtf am I saying? goodbye guys, have a nice day)
 

buenoblue

Member
The city scene in Killzone is not being rendered in real time, as in you couldn't jump out before you're allowed to and run around. It's a trick to give the impression of scale; there are no other game elements, i.e. AI, real-time shadows, etc. I'd be surprised if it wasn't just an in-game engine cutscene segued into actual gameplay, like what they did on the last two Killzones and the Uncharted games. If they could render a city at such fidelity in real time, why would they have the actual gameplay in such a closed-off, banal, tiny space? Hmm, I wonder.
 

IN&OUT

Banned
Next-gen APUs will sport large chunks of embedded RAM on their die, for example Intel's Haswell. The closer it sits to the actual CPU/GPU, the lower the latency, which is a good thing when talking about HSA.

I have to read more on the subject, but as far as I recall, the main purpose of HSA and hUMA is to have a unified pool of memory. That's why MS said they have something similar to hUMA, which immediately tells us that the X1 doesn't support hUMA.
 

Tabular

Banned
I'm really not trolling. Well, I'm trying not to. Yeah, people have dissected what I said with some valid points; that's what discussions are about. Some good stuff in there. I already have a PC that will no doubt destroy both consoles graphically, and I'm buying both next-gen consoles. I'm just putting an argument for the other side, that's all. I could understand all the PS4 adulation if it were hands-down ripping the Xbox One apart graphically. To my eyes it's not. I guess my biggest disappointment is that after gaming on PC for the last 4 years I have grown accustomed to 60fps; now being faced with a so-called powerhouse next-gen console spitting out 30fps first-party games, and then having people claim it's the second coming, grates my balls.

Saying something is disappointing because it runs at 30fps is disingenuous at best. We've had 60fps and 30fps on consoles for decades. The PS1 had lots of 60fps games. It's a design decision, plain and simple. The PS3 had a number of AAA games running 1080p/60fps. So what? 30fps will always have advantages, so it will continue to be used for certain games. I think Killzone SF beats any released PC title even though it is (only) 30fps, so I think the decision was wise in that case. But critics will always find something to criticize.
 
Okay, first of all, a Core 2 Quad and an i7 are not the same CPU just because they have the same number of cores. The i7 shits all over the Core 2 Quad because of its newer and more advanced architecture. It's also capable of much higher clock speeds. So of course you saw a big boost.

To be fair to him, the i7 also has massively increased memory bandwidth over the Core 2 architecture. The X1 also has similarly greater CPU bandwidth than the PS4 (30GB/s vs. 20GB/s) which may (or may not) be important.

Larger caches and higher IPC also matter, but the higher bandwidth accounts for a large portion of the difference between Core 2 and the later Core iX architectures.
 

avaya

Member
The city scene in Killzone is not being rendered in real time, as in you couldn't jump out before you're allowed to and run around. It's a trick to give the impression of scale; there are no other game elements, i.e. AI, real-time shadows, etc. I'd be surprised if it wasn't just an in-game engine cutscene segued into actual gameplay, like what they did on the last two Killzones and the Uncharted games. If they could render a city at such fidelity in real time, why would they have the actual gameplay in such a closed-off, banal, tiny space? Hmm, I wonder.

Has to be a joke account.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Is there actually any confirmation of this?

Hard to answer, since nobody really knows the details of what AMD understands as a full hUMA feature set, or how much of the conversation an AMD representative had with a journalist, about the PS4 supporting hUMA and the XB1 not, was motivated by technological insight versus product branding.

My conclusion from the leaked documents is that both architectures support a form of cache-coherent access to main memory for CPU and GPU in the sense that the GPU sees the latest data that the CPU sees without any need for synchronization, while the other direction (CPU sees what the GPU sees) needs explicit synchronization. In addition, the PS4 apparently has a more fine-grained control over the cache lines of the GPU while the XB1 has to flush the entire GPU cache to synchronize the contents of the GPU caches with the CPU.
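
A toy model of that asymmetry (nothing here is a real console API, just a sketch of why one direction needs an explicit flush and the other doesn't):

```python
# Toy model only: a write-back GPU cache sitting next to coherent main memory.
main_memory = {"result": 0}
gpu_cache = {}

def cpu_write(key, value):
    main_memory[key] = value        # CPU writes are visible over the coherent bus

def gpu_read(key):
    return main_memory[key]         # GPU sees the latest CPU data without a sync

def gpu_write(key, value):
    gpu_cache[key] = value          # GPU writes land in its own cache first

def flush_gpu_cache():
    main_memory.update(gpu_cache)   # explicit sync step before the CPU may read
    gpu_cache.clear()

cpu_write("result", 1)
assert gpu_read("result") == 1      # CPU -> GPU: no explicit synchronization

gpu_write("result", 2)
assert main_memory["result"] == 1   # CPU would still read stale data here...
flush_gpu_cache()
assert main_memory["result"] == 2   # ...until the GPU caches are flushed
```

Going by the leaked documents described above, the difference between the two machines is the granularity of that flush step: per cache line on the PS4, the whole GPU cache on the XB1.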
 

avaya

Member
To be fair to him, the i7 also has massively increased memory bandwidth over the Core 2 architecture. The X1 also has similarly greater CPU bandwidth than the PS4 (30GB/s vs. 20GB/s) which may (or may not) be important.

Larger caches and higher IPC also matter, but the higher bandwidth accounts for a large portion of the difference between Core 2 and the later Core iX architectures.

The PS4 has a further 10GB/s on a separate bus, so no, it doesn't have a deficit on the CPU side.
 

Norml

Member
DirectX is not an advantage. One of the reasons PS3 first-party titles were ahead of the curve vs 360 ones, despite a GPU deficit, was low-level coding not constrained by such APIs. It should also be mentioned that Sony's new shader language specific to the PS4 is supposedly more feature-rich than both OpenGL and DirectX. That lines up with what some devs have said, which is that Sony's development tools on the PS4 are currently more mature than the Xbox One's.

At QuakeCon, Carmack said OpenGL is 10 times faster.
http://youtu.be/Uooh0Y9fC_M?t=1h15m14s
 
I'm really not trolling. Well, I'm trying not to. Yeah, people have dissected what I said with some valid points; that's what discussions are about. Some good stuff in there. I already have a PC that will no doubt destroy both consoles graphically, and I'm buying both next-gen consoles. I'm just putting an argument for the other side, that's all. I could understand all the PS4 adulation if it were hands-down ripping the Xbox One apart graphically. To my eyes it's not. I guess my biggest disappointment is that after gaming on PC for the last 4 years I have grown accustomed to 60fps; now being faced with a so-called powerhouse next-gen console spitting out 30fps first-party games, and then having people claim it's the second coming, grates my balls.



Lol, that's at least the 3rd time you've said you're getting both systems in this thread alone... It doesn't matter what systems you are getting. The fact that you seem to need to keep saying it shows you know exactly what you are doing...

And it's not a discussion when you throw out outlandish statements based on your pipe dreams, then ignore facts that prove you are greatly mistaken, then keep on doing it.
 

Skenzin

Banned
The sole purpose of eSRAM is to increase the bandwidth of the slow DDR3, period. It has no magic, no secrets, nothing at all.

Don't make a big deal of it just because the X1 has it.

Also, don't forget that the PS4 has GDDR5, which is easily the better solution than DDR3+eSRAM.

If eSRAM had any sort of advantage over GDDR5 you would at least see modern GPUs using it. But guess what? They are using GDDR5 because it is THE BEST bandwidth solution available today, period.


Maybe. But if that was the case, why use 6x more transistors than they needed? They could have implemented standard eDRAM like the Wii U at a fraction of the price. From my research there are some interesting things that can be done... Look past your own prejudices... These multinational corporations won't love you back...
 

buenoblue

Member
If you read my original post you would see that the X1 may have relieved some of the burden on the CPU with custom audio chips and cloud-based server processing for online data. Add that to the 9% CPU overclock and you could have a 20-30% CPU advantage. As you know, this would be significant; couple that with the GPU overclock and you could have a nice boost. I have no source for this, I just thought of it at work today and thought I'd share it. I won't bother in future, lol.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Maybe. But if that was the case, why use 6x more transistors than they needed? They could have implemented standard eDRAM like the Wii U at a fraction of the price. From my research there are some interesting things that can be done... Look past your own prejudices... These multinational corporations won't love you back...

eSRAM needs six times more transistors than eDRAM but is apparently the only type of embedded memory that is possible on a 28nm manufacturing process. At least that's what I read; I have little knowledge about semiconductor manufacturing.
 

kel22harris

Neo Member
The city scene in Killzone is not being rendered in real time, as in you couldn't jump out before you're allowed to and run around. It's a trick to give the impression of scale; there are no other game elements, i.e. AI, real-time shadows, etc. I'd be surprised if it wasn't just an in-game engine cutscene segued into actual gameplay, like what they did on the last two Killzones and the Uncharted games. If they could render a city at such fidelity in real time, why would they have the actual gameplay in such a closed-off, banal, tiny space? Hmm, I wonder.

Whaaaaaaaa?? Not real ti... there's a YouTube vid from a few months back that shows the dev pausing and panning around the cityscape. Someone link, I'm on mobile.
 
The city scene in Killzone is not being rendered in real time, as in you couldn't jump out before you're allowed to and run around. It's a trick to give the impression of scale; there are no other game elements, i.e. AI, real-time shadows, etc. I'd be surprised if it wasn't just an in-game engine cutscene segued into actual gameplay, like what they did on the last two Killzones and the Uncharted games. If they could render a city at such fidelity in real time, why would they have the actual gameplay in such a closed-off, banal, tiny space? Hmm, I wonder.

Begging for a banning at this point. He goes right back to the same ignorance he has already been proven wrong on like 10 times now.

Call the Bish cause he is ruining this thread...
 
I have to read more on the subject, but as far as I recall, the main purpose of HSA and hUMA is to have a unified pool of memory. That's why MS said they have something similar to hUMA, which immediately tells us that the X1 doesn't support hUMA.

The X1 does have a unified pool of memory in the same sense that the 360 has a unified pool of memory. They just have a secondary pool of memory as well (in the form of eSRAM/eDRAM) because the framebuffer needs as much bandwidth as it can get.

Most of the data is still handled in main RAM.
 
The sole purpose of eSRAM is to increase the bandwidth of the slow DDR3, period. It has no magic, no secrets, nothing at all.

Don't make a big deal of it just because the X1 has it.

Also, don't forget that the PS4 has GDDR5, which is easily the better solution than DDR3+eSRAM.

If eSRAM had any sort of advantage over GDDR5 you would at least see modern GPUs using it. But guess what? They are using GDDR5 because it is THE BEST bandwidth solution available today, period.

It not being used on PC has absolutely nothing to do with how good or useful it would be. It isn't used on PC, no doubt because of what it would add to manufacturing complexity, transistor count and cost.

The EDRAM in the Xbox 360, for example, provided more than just extra bandwidth. It allowed some developers to get away with using less external memory bandwidth than would have otherwise been required without it. ESRAM on the Xbox One, thanks to its close proximity to the execution units of the GPU, no matter how many times people try to deny it, will help the Xbox One in a similar way. It will help with more than simply just giving the system some extra memory bandwidth. It will help reduce the bandwidth cost of most post processing algorithms. In fact, with fast enough on chip memory, they can become completely free of external memory bandwidth.

http://beyond3d.com/showpost.php?p=1738762&postcount=3325

Also a fast read/write on chip memory scratchpad (or a big cache) would help a lot with image post processing. Most of the image post process algorithms need no (or just a little) extra memory in addition to the processed backbuffer. With large enough on chip memory (or cache), most post processing algorithms become completely free of external memory bandwidth. Examples: HDR bloom, lens flares/streaks, bokeh/DOF, motion blur (per pixel motion vectors), SSAO/SSDO, post AA filters, color correction, etc, etc. The screen space local reflection (SSLR) algorithm (in Killzone Shadow Fall) would benefit the most from fast on chip local memory, since tracing those secondary rays from the min/max quadtree acceleration structure has quite an incoherent memory access pattern. Incoherent accesses are latency sensitive (lots of cache misses) and the on chip memories tend to have smaller latencies (of course it's implementation specific, but that is usually true, since the memory is closer to the execution units, for example Haswell's 128 MB L4 should be lower latency than the external memory). I would expect to see a lot more post process effects in the future as developers are targeting cinematic rendering with their new engines. Fast on chip memory scratchpad (or a big cache) would reduce bandwidth requirement a lot.

And, for the record, this isn't a stupid versus post. It's simply to showcase, with added evidence from an actual game developer, that the "all you're getting from eSRAM is extra memory bandwidth" argument is false. Who cares what's used on the PC? We are talking about what's used on consoles, and how that may benefit the consoles. The "if it was any good, it would be used on the PC" argument is the weakest possible argument anyone could even attempt to bring into this discussion. That said, are you going to say this game developer has no idea what he's talking about regarding fast on-chip memories such as what's used in the Xbox One as eSRAM?
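
For a sense of scale, here's a rough estimate of how much traffic a chain of full-screen post-process passes generates if every pass has to read and write its target in external memory. The buffer format, pass count and access pattern are assumptions for illustration, not figures from any real engine:

```python
# Rough arithmetic only; all inputs are hypothetical.
width, height = 1920, 1080
bytes_per_pixel = 8                 # e.g. a 64-bit HDR render target
passes = 5                          # bloom, DOF, motion blur, AA, colour grade...
fps = 30

traffic = width * height * bytes_per_pixel * 2 * passes * fps   # one read + one write per pass
print(f"{traffic / 1e9:.1f} GB/s")  # ~5.0 GB/s of bandwidth just for post-processing
```

If those intermediate targets fit in on-chip memory, that traffic never touches the external DDR3 bus, which is exactly the point the quoted developer is making.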
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Whaaaaaaaa?? Not real ti... there's a YouTube vid from a few months back that shows the dev pausing and panning around the cityscape. Someone link, I'm on mobile.

Already happened:


Just to prove that KZ:SF is not pre-rendered: in this tech demo, the guy from Guerrilla paused the game, went into wireframe mode and flew around the city.

[animated GIF: Killzone Shadow Fall wireframe flythrough of the city]
 

IN&OUT

Banned
Maybe. But if that was the case, why use 6x more transistors than they needed? They could have implemented standard eDRAM like the Wii U at a fraction of the price. From my research there are some interesting things that can be done... Look past your own prejudices... These multinational corporations won't love you back...

They didn't use more transistors than they needed at all. eSRAM is big; it took up a large chunk of the APU. That same chunk ended up better utilized in the PS4 by adding more compute units and further customization.

More transistors doesn't mean anything at all.
 

stb

Member
What in the holy hell are you talking about?

Okay, first of all, a Core 2 Quad and an i7 are not the same CPU just because they have the same number of cores. The i7 shits all over the Core 2 Quad because of its newer and more advanced architecture. It's also capable of much higher clock speeds. So of course you saw a big boost.

Second, the Xbone doesn't have a significant CPU advantage. It has, as of now, a very small (9%) CLOCK SPEED advantage, which doesn't even come CLOSE to making up for the GPU and memory disadvantages.

Take that i7 of yours, overclock it by 9% (which is very easy to do) and then be amazed at how your framerates will go up by a whopping 1-2 fps (wow!)

Then take your video card out, throw it in the trash, and buy one that costs $100 less. See which one of these changes has a bigger impact on your games.

One of the most sensible posts in this thread.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
The EDRAM in the Xbox 360, for example, provided more than just extra bandwidth. It allowed some developers to get away with using less external memory bandwidth than would have otherwise been required without it.

I don't see why "getting away with using less external memory bandwidth" is any different from providing "extra bandwidth". Operations on render targets in eDRAM require less external bandwidth because all the necessary information (pixelbuffers) is stored in eDRAM. Hence, all the necessary bandwidth for those operations is provided by the eDRAM. There is no need to access main memory because the necessary data just isn't stored there. That does, however, not change the total amount of memory bandwidth consumed by those operations. The only thing that changes is the distribution of bandwidth consumption on two busses and two memory pools.

That in itself is a substantial benefit compared to a system with a single pool of slow memory, because the total bandwidth is higher, but it is not a benefit compared to a system with a single pool of fast memory, because there the total bandwidth is not higher, yet you have less flexibility, since only small amounts of data fit into the fast, embedded pool of memory.
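
Plugging in the peak bandwidth figures that were being reported at the time (treat them as approximate, not official spec-sheet numbers) shows why that distinction matters:

```python
# Approximate, publicly reported 2013 peak figures; not official spec sheets.
ddr3_gbps = 68.0     # Xbox One main memory (DDR3-2133, 256-bit)
esram_gbps = 102.0   # Xbox One eSRAM (MS later claimed higher combined peaks)
gddr5_gbps = 176.0   # PS4 unified GDDR5

print("XB1 combined:", ddr3_gbps + esram_gbps)   # ~170 GB/s, split across two pools
print("PS4 single pool:", gddr5_gbps)            # ~176 GB/s, usable by anything
```

The totals come out similar; the catch is that only whatever fits in the 32 MB of eSRAM gets the fast share on the XB1, which is the flexibility point made above.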
 

Skenzin

Banned
They didn't use more transistors than they needed at all. eSRAM is big; it took up a large chunk of the APU. That same chunk ended up better utilized in the PS4 by adding more compute units and further customization.

More transistors doesn't mean anything at all.

My understanding is that 6T eSRAM takes 6x more transistors than standard eDRAM. It's an interesting choice. Maybe they had to for redundancy in the manufacturing process. Maybe not.
 
What in the holy hell are you talking about?

Okay, first of all, a Core 2 Quad and an i7 are not the same CPU just because they have the same number of cores. The i7 shits all over the Core 2 Quad because of its newer and more advanced architecture. It's also capable of much higher clock speeds. So of course you saw a big boost.

Second, the Xbone doesn't have a significant CPU advantage. It has, as of now, a very small (9%) CLOCK SPEED advantage, which doesn't even come CLOSE to making up for the GPU and memory disadvantages.

Take that i7 of yours, overclock it by 9% (which is very easy to do) and then be amazed at how your framerates will go up by a whopping 1-2 fps (wow!)

Then take your video card out, throw it in the trash, and buy one that costs $100 less. See which one of these changes has a bigger impact on your games.

Well, people are only focusing on the clock speed increase of the CPU, but if you add the pretty beefy audio block that is said to be capable of offloading an entire CPU core's worth of performance, and then you couple that with things like this that are built onto two of the Xbox One's move engines, then I think you can make a pretty good case that the CPU on the Xbox One is being spared a lot of extra work that it would otherwise have to do without the help of this kind of dedicated hardware.

One move engine out of the four supports generic lossless encoding and one move engine supports generic lossless decoding. These operations act as extensions on top of the standard DMA modes. For instance, a title may decode from main RAM directly into a sub-rectangle of a tiled texture in ESRAM.

The canonical use for the LZ decoder is decompression (or transcoding) of data loaded from off-chip from, for instance, the hard drive or the network. The canonical use for the LZ encoder is compression of data destined for off-chip. Conceivably, LZ compression might also be appropriate for data that will remain in RAM but may not be used again for many frames—for instance, low latency audio clips.

These are things that the CPU would commonly be doing for a game, are they not?
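
To make the offloading point concrete, this is the kind of work the CPU would otherwise be doing when streaming compressed assets. zlib here is only an analogy; the quoted text says the move engines use their own LZ format, not zlib:

```python
# Illustrative only: decompressing streamed data on the CPU, the sort of work a
# dedicated LZ decode engine would absorb. zlib stands in for the console's format.
import zlib

asset = bytes(range(256)) * 4096        # stand-in for data read from disk/network
packed = zlib.compress(asset, 6)

unpacked = zlib.decompress(packed)      # CPU cycles spent here on every load/stream
assert unpacked == asset
print(f"{len(packed)} -> {len(unpacked)} bytes")
```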
 

Chobel

Member
If you read my original post you would see that the X1 may have relieved some of the burden on the CPU with custom audio chips and cloud-based server processing for online data. Add that to the 9% CPU overclock and you could have a 20-30% CPU advantage. As you know, this would be significant; couple that with the GPU overclock and you could have a nice boost. I have no source for this, I just thought of it at work today and thought I'd share it. I won't bother in future, lol.

You keep repeating the same ridiculous things over and over:

Audio: We have no idea what the audio chip in the PS4 can do, but let's assume it does nothing but the basic stuff: compressing and decompressing audio streams. Nobody is forcing developers to do more than the basics in audio, and if they need more advanced stuff, they can use the CPU if there's no hit to performance, or just use GPGPU.

Cloud dedicated servers: No, not every online game will have dedicated servers on Xbox One, and more importantly, dedicated servers are not something exclusive to Xbox One. I almost forgot: where did you get that 10% CPU figure for P2P hosting?
 