
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

There's no promise that multiplats will be better. PS4 is obviously more capable (I can't understand people still trying to fight that fact), but you don't know whether publishers/MS will push for parity, just to get peace from gamers.
As I see it, they'll probably develop and optimize games for XB1, then push the "port to PS4" button. (I'm exaggerating on purpose, please don't bash me, GAF)
That won't create peace. That'll piss off PS4 gamers to appease platform makers. PS3 owners got shit ports for years.
 
I'm hoping that with PS+ there to generate extra revenue, we will see a massive overhaul and expansion of the PSN.


Seeing that Sony is selling the console at a loss, I don't see any additional money flooding in until mid-to-late next year. Hopefully Sony has already spent money on its infrastructure.
 

Cheech

Member
As I see it, they'll probably develop and optimize games for XB1, then push the "port to PS4" button. (I'm exaggerating on purpose, please don't bash me, GAF)

I have seen this thought before, but I don't think it will come to fruition, for a few reasons.

1. There was a similar supposed gulf in power between the PS2 and Xbox. Sony OWNED console gaming in that era, yet the vast majority of multiplatform games ran better on Xbox. The only ones that didn't, IIRC, were extremely shoddy ports such as MGS2. If "downporting for moneyhats" were a thing, it almost certainly would have made itself known then.

2. The bigger reason it's not going to happen is that even though there's a power gulf, the architectures of the XBone and PS4 are very, very close. The reason the PS2 and PS3 had nasty-running multiplatform games was their extremely weird architectures, which really only Sony themselves knew how to crack. And it took them quite a bit of time, given how poorly the first couple of generations of first-party games ran (and stuff like Gran Turismo taking eons to come out).

3. Microsoft would have to moneyhat every publisher for this to become reality. That would be prohibitively expensive, even for them.


You're not buying a PS4 at launch on a leap of faith, are you? You sound like you may be best off waiting and seeing.

Oh, I agree. The smart money would wait until both systems were out for a year. Unfortunately, my money is suicidal and keeps leaping out of my wallet. :(

As far as "horrid network infrastructure", what is it that you don't like?

There were a handful of multiplatform games on PS3 where my friends and I had to use Halo 3 chat lobbies to talk because in-game voice was 100% broken. Later we used Xbox party chat, once they added that.

With no unified matchmaking service, you routinely got put on mismatched teams in numerous games, including plenty of Sony first party stuff.

The PSN itself was never consistently fast like XBL. Plus, the PSN hack caused an incredible amount of downtime.

The PSN has been a pretty consistent piece of shit over my PS3 ownership experience. Ironically, the best online PlayStation game I ever played was the original SOCOM, probably because everybody who played was a hardcore crazy and everyone was pretty consistently at the same skill level. And voice worked, because it was its own peer-to-peer thing and not fighting with EA servers or the PSN.

So yes. I am hoping they hired a new network team and set up a new NOC, that everybody knows what they're doing, and that when I fire up my PS4, I get the same seamless experience I'm used to from Microsoft.

Pretty sure the big ones such as cross game invites, party chat, and other social features are there from day 1.

They will be there, sure. Whether or not they work...

Now that we're all paying for PS+, they had better work. I have my discounted year-of-PS+ card from Best Buy a while back ready to go, picked up once I'd decided to roll the dice on a PS4 instead of giving Microsoft more money.

Abysmal exclusives? I mean, I know opinions and everything, but... abysmal? TLoU, Uncharted, and Gran Turismo are abysmal?

This is really off topic, but I am talking about consistency. Microsoft's exclusives have been consistently good; Sony's are consistently awful. Go back to the beginning of the generation and add 'em up.

The Last of Us was pretty damn good though. That game, Valkyria Chronicles, and Demon's Souls are literally the only 3 PS3 exclusive games that make me not think the PS3 was a total waste of money for me.
 

LiquidMetal14

hide your water-based mammals
Abysmal exclusives? I mean, I know opinions and everything, but... abysmal? TLoU, Uncharted, and Gran Turismo are abysmal?

The types of responses some use to downplay getting a PS4 (or PS3 multiplatform games) range widely. First it's "my friends all play on one platform", then it's "the games just aren't for me", then it's the PSN infrastructure, which is just fine. People seem to want to find reasons to validate ignoring a great platform.

But this thread is more about the PS4, and the games will be coming steadily for it. And all of them should run better, unless devs don't put that PS4 muscle to use.
 
Perhaps that sounds immature, but it's frustrating that a few months ago I kept hearing, "Where are all the 1080p 60 fps games?" Xbox One has plenty of those.

The 1080p/60fps argument has to be one of the most pathetic things I've read this year on GAF.
Some people just kept their mouths shut after we started hearing the real resolutions of these games, and some just started going into "well, it still looks good" territory.
And honestly, this applies to both Microsoft and Sony fans, who just can't seem to wait for the games to hit so we can see what's actually real and what's not.
 

Finalizer

Member
Except if they moneyhat. I don't see why they wouldn't try that to reinforce their story and their truth.
As I said in another thread, it makes far more sense to spend that money on building dev studios or buying [timed] exclusivity on entire games. Better to be able to show an experience unique to their platform than "at least our games look the same lol."
 

Dragon

Banned
Whether they will work? They work on the Vita. Some people have a weird view of the network infrastructure on the Sony side. The PS3 OS isn't the greatest, but I don't understand the people who let that color their perception of the network. And mentioning the hack is like mentioning RRoD at this point.
 

velociraptor

Junior Member
Disregard the term 'balance'. It is meaningless. Its use is just there to seed the doubt that the PS4 is somehow unbalanced. It's not a technical term, just a carefully selected marketing one. And the Xbone is the more complicated design, due to its RAM.

The PS4 trumps the Xbone in all areas. It has a single pool of fast unified RAM. The Xbone has its main slow RAM, plus a small chunk of fast RAM. Devs have to juggle what goes where and, as per the OP, are finding this to be a pain and a big bottleneck.

The PS4 has 50% more CUs (the 'cores' of the GPU). Even with the recent 6% clock increase of the Xbone, this means the peak performance of the PS4 is 40-50% higher than the Xbone's.
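To put numbers on that, here is a back-of-the-envelope sketch using the publicly reported specs (not a benchmark):

```python
# Theoretical peak single-precision throughput for a GCN GPU:
# CUs x 64 ALUs per CU x 2 ops per clock (fused multiply-add) x clock.
def peak_gflops(cus, clock_mhz, alus_per_cu=64, ops_per_alu=2):
    return cus * alus_per_cu * ops_per_alu * clock_mhz / 1000.0

ps4 = peak_gflops(18, 800)   # 18 CUs at 800 MHz -> 1843.2 GFLOPS
xb1 = peak_gflops(12, 853)   # 12 CUs at 853 MHz (post-upclock) -> ~1310 GFLOPS

print(f"PS4 peak advantage: {100 * (ps4 / xb1 - 1):.0f}%")  # -> 41%
```

On paper that lands at ~41%, which is where the 40-50% figure comes from once the bandwidth differences are folded in.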

The PS4 has custom changes to its GPU to give it far more GPGPU capability. The PS4 has 8 ACEs (the units that manage GPGPU requests) and 64 queues (not to mention 18 CUs capable of running GPGPU). The Xbone has the standard AMD count of 2 ACEs and 2 queues, with only 12 CUs to run GPGPU on.

Many folks, from Cerny to devs, have said that making use of GPGPU will be the main thing that drives improvements this gen. The PS4 has significantly more room to grow.

The PS4 also has modifications to make sure it is hUMA compliant. The Xbone does not (nor could it, with having two pools of RAM). hUMA is the big direction that AMD have been heading towards.

Both consoles have a bunch of other DSPs for things like audio, decompression and security.

Multiple sources have said the PS4 is easier to develop for, and not only because it has ~50% more horsepower. Having one unified pool of RAM cuts out the need for devs to micro-manage what goes where, removing a big bottleneck (the same bottleneck the PS3 suffered from with its two pools of RAM this gen).

Once Xbone devs get better at managing eSRAM and main RAM, there will be improvements, and the days of sub-1080p games should be over. But the Xbone has far less customisation than the PS4, with less focus on GPGPU for long-term gains.

Both consoles will see improvements over the coming years, but there is more untapped performance in the PS4 than in the Xbone.



There is a misunderstanding around what MS were saying about CUs. MS were trying to play off the fact that the PS4 has 50% more CUs by saying that power doesn't scale linearly with CU count. So the PS4, with 50% more CUs, might only get 25% more performance in real gaming applications. But that's still 25%! Pretty massive, and likely to be higher.

You don't have 'more problems' with more CUs. Adding more CUs is how you get different-spec AMD cards in PCs. At a basic level, the more CUs, the more power. It just doesn't scale 1:1, but more is always better as far as performance goes. MS claimed that a slight clock bump was more beneficial than enabling the two disabled CUs, but they were probably never in a position to enable them anyway, unless every single Xbone off the production line was tested to have 14 working CUs. Unlikely, as they kept two spare to improve yields (an APU can be made with two faulty CUs and the chip still passes - it only needs 12 working).
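A toy model of that sublinear scaling (the 50% efficiency figure below is purely an illustrative assumption, not a measured number):

```python
# Model the extra CUs as delivering only a fraction of their nominal
# throughput, since bandwidth and occupancy limits bite first.
def effective_speedup(base_cus, extra_cus, extra_efficiency):
    return (base_cus + extra_cus * extra_efficiency) / base_cus

# The PS4 has 6 more active CUs than the XB1's 12. Even if those extra
# CUs delivered only half their nominal throughput in real workloads:
print(f"{100 * (effective_speedup(12, 6, 0.5) - 1):.0f}% faster")  # -> 25%
```

That reproduces the conservative 25% figure quoted above; at full efficiency it is the 50% paper number.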

Superb summary. It brings clarity to the 'Xbox is more balanced' phrase.
 

velociraptor

Junior Member
You know, I really would have almost nothing against the Xbox product itself being weaker than the PS4 if it weren't for some of the rabid fans.

Perhaps that sounds immature, but it's frustrating that a few months ago I kept hearing, "Where are all the 1080p 60 fps games?" Xbox One has plenty of those.

And then we found out KI is not 1080p. TitanFall likely will not be 1080p.
And it really just leaves Forza.

And then their biggest graphical hitter isn't 1080p. Ryse.

And the goalposts for the Xbox One's performance keep getting moved.
Dual GPUs? 12GB of RAM? A GPU based on next-gen technology? It doesn't stop there.

It's one thing to claim that you think the difference is going to be negligible, but there's a vocal crowd out there that continues to say, "Well PS4 couldn't do TitanFall because it won't have cloud computing so PS4 will never see an experience like that. We will never see an experience like Dead Rising 3 on PS4. Forza's AI will never be replicated on PS4. PS4 will be lucky to have a game as good-looking as Ryse."

And I just have to shake my head.
I know it doesn't sound like much, but the difference between 900p and 1080p is staggering.

I've got Far Cry 3 Blood Dragon. 900p gives me 65fps. 1080p gives me 45fps. Despite the lower framerate, I went for the latter. It just looks so much better.

Ryse looks pretty damn good based on what we've seen so far, but I highly doubt the low resolution will look as appealing when you play the game in person. I am really sensitive to lowered resolution in PC games, and I would go as far as saying that native-resolution games with reduced graphical effects always look superior to games with better shadows and effects at a lower res.
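The Blood Dragon numbers above line up almost exactly with simple pixel-count arithmetic, assuming the game is fill-rate/shader bound:

```python
px_900p = 1600 * 900      # 1,440,000 pixels
px_1080p = 1920 * 1080    # 2,073,600 pixels -> 1.44x as many

# Predict the 1080p framerate from the observed 65 fps at 900p:
predicted = 65 * px_900p / px_1080p
print(f"predicted {predicted:.0f} fps at 1080p (observed: 45 fps)")
```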
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I know it doesn't sound like much, but the difference between 900p and 1080p is staggering.

Actually on this I have to heavily disagree.

I was playing Bioshock Infinite the other day on my PC, which is connected to my 40-inch Sony 1080p TV.

I ran the game at 1080p and at 900p, both with AA, at my normal seating distance of 8 or so feet. I couldn't tell the difference, especially when the game is moving.
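There is some optics behind that experience. As a rough sketch (assuming the commonly quoted ~1 arcminute figure for 20/20 acuity), you can check whether a single pixel is even resolvable at that distance:

```python
import math

def pixel_arcmin(diagonal_in, horizontal_px, distance_in, aspect=16/9):
    """Angular size of one pixel, in arcminutes, at a given viewing distance."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # panel width
    pixel_in = width_in / horizontal_px                      # pixel pitch
    return math.degrees(math.atan(pixel_in / distance_in)) * 60

# A 40" 1080p panel from 8 feet (96"): each pixel subtends well under
# the ~1 arcmin acuity threshold, so 900p vs 1080p is hard to spot.
print(f"{pixel_arcmin(40, 1920, 96):.2f} arcmin")
```

Halve the distance and the same pixel climbs above 1 arcminute, which is why monitor users a couple of feet away see the difference immediately.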
 

kitch9

Banned
Actually on this I have to heavily disagree.

I was playing Bioshock Infinite the other day on my PC, which is connected to my 40-inch Sony 1080p TV.

I ran the game at 1080p and at 900p, both with AA, at my normal seating distance of 8 or so feet. I couldn't tell the difference, especially when the game is moving.

As soon as you activate a scaler - any scaler - on a 1080p screen, you lose a lot. So I disagree with you.
 
Actually on this I have to heavily disagree.

I was playing Bioshock Infinite the other day on my PC, which is connected to my 40-inch Sony 1080p TV.

I ran the game at 1080p and at 900p, both with AA, at my normal seating distance of 8 or so feet. I couldn't tell the difference, especially when the game is moving.

I think a good scaler and the HUD rendered at 1080p could help a lot with sub-1080p games on Xbox One. Obviously 1080p is better, but I'm really curious to see the difference first hand with these two factors included.

As soon as you activate a scaler - any scaler - on a 1080p screen, you lose a lot. So I disagree with you.

I would actually like to know what makes a scaler good. When I render games below 1080p on my computer monitor things look awful, but on Xbox 360 even sub-720p games scaled to 1080p still look pretty good.
 

ekim

Member
I know it doesn't sound like much, but the difference between 900p and 1080p is staggering.

I've got Far Cry 3 Blood Dragon. 900p gives me 65fps. 1080p gives me 45fps. Despite the lower framerate, I went for the latter. It just looks so much better.

If you sit directly in front of a monitor, sure. On a TV screen where you sit 4-8 feet away, you won't notice the difference. (Except maybe on a >60" screen.)

Superb summary. It brings clarity to the 'Xbox is more balanced' phrase.

This is no superb summary. It's biased and doesn't acknowledge certain factors that should count in favor of the X1 (much more powerful audio chip, HW scaler, HW display planes, Move Engines, eSRAM latency, coherent busses everywhere - but people keep ignoring that stuff).

If you put things into perspective, a lot of these features really help offload the CPU/GPU and make the gap not as huge as a lot of people here suggest.
 

skdoo

Banned
On the 900p vs 1080p thing, it depends... I'm watching videos on a Retina MacBook Pro, and the difference is definitely staggering. Those with a crappy 720p television are certainly not going to notice the difference, and those with a 1080p television might.

Those with a 4K TV definitely will. Remember, though: the majority of Americans shop at Wal-Mart and probably don't have that great a TV.
 

LiquidMetal14

hide your water-based mammals
This is no superb summary. It's biased and doesn't acknowledge certain factors that should count in favor of the X1 (much more powerful audio chip, HW scaler, HW display planes, Move Engines, eSRAM latency, coherent busses everywhere - but people keep ignoring that stuff).

If you put things into perspective, a lot of these features really help offload the CPU/GPU and make the gap not as huge as a lot of people here suggest.

But it is a fairly good summary. It doesn't cover all aspects of either platform, but there are lots of relevant facts there.

Bear in mind that the PS4 is also built very efficiently; this PR babble about balance is just another angle MSFT is taking.
 
Actually on this I have to heavily disagree.

I was playing Bioshock Infinite the other day on my PC, which is connected to my 40-inch Sony 1080p TV.

I ran the game at 1080p and at 900p, both with AA, at my normal seating distance of 8 or so feet. I couldn't tell the difference, especially when the game is moving.

Your "normal seating distance" is probably just too big. 8 feet is 2.4 meters - and that's your distance to a small 40-inch TV? I sit 1.6 meters from a 42-inch TV; that's ~5.2 feet.
 
Gemüsepizza;83288073 said:
Your "normal seating distance" is probably just too big. 8 feet is 2.4 meters - and that's your distance to a small 40-inch TV? I sit 1.6 meters from a 42-inch TV; that's ~5.2 feet.

That seems far too close to sit to a 42-inch TV, but I could be wrong. It just sounds very close.
 
When I close my eyes, I can't tell the difference between 240p and 4K.

Much like sound, some people are more sensitive to certain visual details than others. In my opinion, a game running at 900p is not the end of the world.
 

ekim

Member
But it is a fairly good summary. It doesn't cover all aspects of either platform, but there are lots of relevant facts there.

Bear in mind that the PS4 is also built very efficiently; this PR babble about balance is just another angle MSFT is taking.

Yeah, but this is a technical thread, so I assume we don't need to talk about the PR babble and can instead concentrate on the pile of information MS gave us.
 

skdoo

Banned
This is no superb summary. It's biased and doesn't acknowledge certain factors that should count in favor of the X1 (much more powerful audio chip, HW scaler, HW display planes, Move Engines, eSRAM latency, coherent busses everywhere - but people keep ignoring that stuff).

If you put things into perspective, a lot of these features really help offload the CPU/GPU and make the gap not as huge as a lot of people here suggest.

Ekim, it sounds like there's a job for you in MS PR... do you really believe what you type? I'm in networking, and you sound an awful lot like some of the irritating salespeople I deal with every day, especially the ones who don't understand "cloud" but have to have it.
 

ekim

Member
Ekim, it sounds like there's a job for you in MS PR... do you really believe what you type? I'm in networking, and you sound an awful lot like some of the irritating salespeople I deal with every day, especially the ones who don't understand "cloud" but have to have it.
No arguments left?
 
I've got a question for the people saying you can't really notice the difference between 1920x1080 and 1600x900 on a native 1920x1080 display: if there is no difference on TV screens, why are developers like Turn 10 targeting 1920x1080 output? Why not decrease the resolution and put the saved power into graphical effects like better lighting?
 
This is no superb summary. It's biased and doesn't acknowledge certain factors that should count in favor of the X1 (much more powerful audio chip, HW scaler, HW display planes, Move Engines, eSRAM latency, coherent busses everywhere - but people keep ignoring that stuff).

If you put things into perspective, a lot of these features really help offload the CPU/GPU and make the gap not as huge as a lot of people here suggest.

Oh please. We don't know the capabilities of the PS4 audio chip, so why do you write "much more powerful"? We know what the X1 chip can do - about 15 GFLOPS, yay. The PS4 also has a HW scaler. And those display planes wouldn't be needed if all games were 1080p. I also thought the Move Engines were already debunked as just another name for DMA engines or something like that, and the importance of latency in game development was also already discussed. And what exactly do you mean by coherent busses when you list them as an advantage the PS4 does not have?

That seems far too close to sit to a 42-inch TV, but I could be wrong. It just sounds very close.

Recommended THX viewing distance for a 42 inch TV is between 4.7 and 6.6 feet:

http://myhometheater.homestead.com/viewingdistancecalculator.html
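For what it's worth, ranges like 4.7-6.6 feet fall out of a viewing-angle calculation. The angle thresholds below are the ones such calculators commonly use (~36 degrees near limit, ~26 degrees far limit); treat them as assumptions rather than official THX figures:

```python
import math

def distance_ft(diagonal_in, viewing_angle_deg, aspect=16/9):
    """Seating distance at which the screen spans the given horizontal angle."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    return (width_in / 2) / math.tan(math.radians(viewing_angle_deg / 2)) / 12

print(f'42" near limit: {distance_ft(42, 36):.1f} ft')  # -> 4.7 ft
print(f'42" far limit:  {distance_ft(42, 26):.1f} ft')  # -> 6.6 ft
```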
 

skdoo

Banned
I have lots of arguments. However, I tend to take the word of a developer over that of a fanboy on a message board. It's kind of like listening to a tech intern with a CCNA try to talk BGP with an SP specialist.

It's vaguely amusing, but sometimes irritating.
 

gofreak

GAF's Bob Woodward
This is no superb summary. It's biased and doesn't acknowledge certain factors that should count in favor of the X1 (much more powerful audio chip, HW scaler, HW display planes, Move Engines, eSRAM latency, coherent busses everywhere - but people keep ignoring that stuff).

Scalers and display planes are not unique to the Xbox One, nor are coherent buses. Move Engines are not an advantage vs the PS4; they're an element of an overall more complex and less flexible memory system. eSRAM is nice from a latency point of view, but the biggest demands on the eSRAM will come from a processor that is set up to be latency tolerant, and the processor that is more latency-sensitive seems to have more abstracted access to it than the former (i.e. it's not really there for the latency benefits but for the bandwidth afforded).
 

TrueGrime

Member
I've got a question for the people saying you can't really notice the difference between 1920x1080 and 1600x900 on a native 1920x1080 display: if there is no difference on TV screens, why are developers like Turn 10 targeting 1920x1080 output? Why not decrease the resolution and put the saved power into graphical effects like better lighting?

Because people are way too fixated on 1920x1080. People see it as some sort of next-gen benchmark, and if a game doesn't meet it then it's hardly up to par.
 

KoopaTheCasual

Junior Member
Ekim, it sounds like there's a job for you in MS PR... do you really believe what you type? I'm in networking, and you sound an awful lot like some of the irritating salespeople I deal with every day, especially the ones who don't understand "cloud" but have to have it.

What? Ekim is an equal-opportunity gamer. Don't accuse people without substantial evidence.
 
I have lots of arguments. However, I tend to take the word of a developer over that of a fanboy on a message board. It's kind of like listening to a tech intern with a CCNA try to talk BGP with an SP specialist.

It's vaguely amusing, but sometimes irritating.

You decided to resort to personal insults instead of making a valid argument. I think it was a fair point for him to make.
 

mrklaw

MrArseFace
If you sit directly in front of a monitor, sure. On a TV screen where you sit 4-8 feet away, you won't notice the difference. (Except maybe on a >60" screen.)



This is no superb summary. It's biased and doesn't acknowledge certain factors that should count in favor of the X1 (much more powerful audio chip, HW scaler, HW display planes, Move Engines, eSRAM latency, coherent busses everywhere - but people keep ignoring that stuff).

If you put things into perspective, a lot of these features really help offload the CPU/GPU and make the gap not as huge as a lot of people here suggest.

- The audio chip being 'much' more powerful is questionable. A lot of it is needed for Kinect audio processing. We do not have enough information to make an informed comparison of game-specific (non-Kinect) audio acceleration on the two platforms. The PS4 also has custom hardware for this.

- HW Scaler - PS4 has one too

- HW display planes. Useful if you're going to be forced to compromise on a lower resolution for the main game and can overlay a native-res HUD. It is a neat feature.

- Move Engines. I've not seen much to show they're that different from the DMA engines built into GCN. There might be more of them, though. They still steal bandwidth from main memory, but they can be useful for doing work in the background while you're chewing on something else. Will take effort to leverage properly.

- eSRAM latency. Maybe. Again, it will take effort to use properly, as it doesn't seem to be integrated as cache, so it will need manual management by each game.

- The PS4 has coherent buses everywhere too. Same overall coherent bus speed as well (30GB/s).

Nobody is ignoring that stuff. Everybody with their ears open is interested in a discussion about how the Xbox One can best leverage its many customisations. But instead MS keeps banging on about balance and taking pointless potshots at the PS4.
 

skdoo

Banned
Hey, maybe it's just me... I hear people spout off with big words, and their credibility goes flying out the window. It would be like me trying to discuss circuit engineering - laughable at best. I see a sentence with at LEAST five or six buzzwords, and I dismiss the person completely.

You don't need buzzwords to have an effective discussion.
 
Because people are way too fixated on 1920x1080. People see it as some sort of next-gen benchmark, and if a game doesn't meet it then it's hardly up to par.

So you're saying Turn 10 is intentionally running their game at 1920x1080 for marketing purposes? Are you being facetious here?

No, the reason Turn 10 picked 1920x1080 is that it is indeed a next-generation benchmark. This is 2013, when 99% of the modern TFT displays you can buy new in a store are 1920x1080. TFT displays do not have the resolution independence of CRTs, so anything that is not native resolution has decreased image quality. No scaler or algorithm can change this. There is no reason to pick 1600x900 unless you are struggling to deliver what you want.

If you want top image quality, 1920x1080 output is the only option. Turn 10 has chosen 1920x1080 for this reason and that is why I have no doubt the game will look incredible in person, especially compared to something like Killer Instinct.
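The fixed-pixel point can be made concrete. 900 -> 1080 is a 1.2x scale, so source pixels can't map evenly onto output pixels; with a naive nearest-neighbour mapping, one in every five source rows gets drawn twice (smarter scalers blend neighbours instead of duplicating, trading that unevenness for blur):

```python
src_rows, dst_rows = 900, 1080  # a 1.2x vertical upscale

# Count how many output rows each source row covers.
counts = [0] * src_rows
for y in range(dst_rows):
    counts[y * src_rows // dst_rows] += 1

print(counts[:12])  # -> [2, 1, 1, 1, 1, 2, 1, 1, 1, 1, 2, 1]
```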
 

stryke

Member
We don't know what the clock of the PS4 CPU is. The current number is just a guess since Sony hasn't said anything about raising the clock.

It's not a guess. GG revealed it back in Feb/March.

You're more than welcome to hold onto the hope that Sony have gone for an upclock. At this stage I'm choosing not to.
 

skdoo

Banned
Ekim -

I'm sorry if I offended you... it's just hard to take your post seriously when you don't even know whether the PS4 supports any of those things. And your post reads like an MS PR slide. That's all. I thought it was Penello talking for a minute.
 

foxbeldin

Member
This is no superb summary. It's biased and doesn't acknowledge certain factors that should count in favor of the X1 (much more powerful audio chip, HW scaler, HW display planes, Move Engines, eSRAM latency, coherent busses everywhere - but people keep ignoring that stuff).

- You know nothing about the PS4 audio chip, so don't pretend one is "much more powerful" than the other. Plus, yeah, audio, Sony, you know... They said audio could be processed on the GPGPU and that devs will probably do that in a couple of years. Until then, they'll stick to the audio chip. So there's one.

- I'm pretty sure the PS4 has a HW scaler, and a good one, just like in, you know, Sony TVs...

- Display planes are nothing new. The PS1 had HW display planes; stop trying to make this revolutionary and an XB1 exception. In the PS4 it's called the DCE (Display ScanOut Engine). The dedicated processors are in fact very similar in both consoles; they are just named differently. MS liked to rename everything in the X1 - even the CUs are called something else (SCs).

- The PS4 doesn't need Move Engines, so that's irrelevant.

- eSRAM is 0.38% of the RAM pool. A split pool, by the way. Everything that has to make the journey through DDR3 is limited to 68GB/s and not-so-good latency, because it's Micron 2133MHz CAS 14. Real-life latency: (14 / 1066) x 1000 = 13.1 ns.
Some GDDR5 chips have better latency than this. Even at CL20 it could be close, because the base clock is higher (1375MHz).
So eSRAM will help, but only for a limited class of operations.
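The latency figure quoted above is just the first-word CAS calculation, generalized below (the CL20/1375MHz GDDR5 part mirrors the poster's hypothetical, not a specific chip):

```python
# First-word latency in ns = CAS cycles / command clock (MHz) * 1000.
def cas_ns(cas_cycles, command_clock_mhz):
    return cas_cycles / command_clock_mhz * 1000

ddr3 = cas_ns(14, 1066)    # DDR3-2133 CL14 -> ~13.1 ns
gddr5 = cas_ns(20, 1375)   # hypothetical GDDR5 CL20 at 1375 MHz -> ~14.5 ns
print(f"DDR3: {ddr3:.1f} ns, GDDR5: {gddr5:.1f} ns")
```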
 

Riky

$MSFT
Why would Sony even consider an upclock when they have a more powerful machine? Surely the risks are not worth the reward?
 

LiquidMetal14

hide your water-based mammals
Yeah, but this is a technical thread, so I assume we don't need to talk about the PR babble and can instead concentrate on the pile of information MS gave us.

It's been exhaustively like that for months - in several respects, and now in this thread as well.

And if you're going to criticize a post like SPE's, then you should also call out Senjetsu, who makes long-winded posts in defense of MSFT. There's a little rationality sprinkled in there, but surely you see both sides objectively.
 

KidBeta

Junior Member
If you sit directly in front of a monitor, sure. On a TV screen where you sit 4-8 feet away, you won't notice the difference. (Except maybe on a >60" screen.)



This is no superb summary. It's biased and doesn't acknowledge certain factors that should count in favor of the X1 (much more powerful audio chip, HW scaler, HW display planes, Move Engines, eSRAM latency, coherent busses everywhere - but people keep ignoring that stuff).

If you put things into perspective, a lot of these features really help offload the CPU/GPU and make the gap not as huge as a lot of people here suggest.

Coherent busses everywhere? The thing is identical to the PS4 in its number of coherent busses. The audio chip I could let slide easily, but the PS4 contains what is probably essentially the same scaler (it looks to be AMD-based tech). The display planes are seriously nothing to write home about; they are pretty basic fixed-function units. The Move Engines aren't necessary in the PS4 because it'll just use the DMA controllers on the GCN card, which do the same thing - and because it doesn't have a split memory pool, its setup is much simpler. The eSRAM latency will probably help with a few workloads, but we don't even know what that latency is.

People aren't ignoring a lot of it (well, not everyone); people keep ignoring that some of those parts are there purely to mitigate a problem the PS4 does not have (the Move Engines, for example).
 
- You know nothing about the PS4 audio chip, so don't pretend one is "much more powerful" than the other. Plus, yeah, audio, Sony, you know... They said audio could be processed on the GPGPU and that devs will probably do that in a couple of years. Until then, they'll stick to the audio chip. So there's one.

- I'm pretty sure the PS4 has a HW scaler, and a good one, just like in, you know, Sony TVs...

- Display planes are nothing new. The PS1 had HW display planes; stop trying to make this revolutionary and an XB1 exception. In the PS4 it's called the DCE (Display ScanOut Engine). The dedicated processors are in fact very similar in both consoles; they are just named differently. MS liked to rename everything in the X1 - even the CUs are called something else (SCs).

- eSRAM is 0.38% of the RAM pool. A split pool, by the way. Everything that has to make the journey through DDR3 is limited to 68GB/s and not-so-good latency, because it's Micron 2133MHz CAS 14. Real-life latency: (14 / 1066) x 1000 = 13.1 ns.
Some GDDR5 chips have better latency than this. Even at CL20 it could be close, because the base clock is higher (1375MHz).

So does this mean the PS4 can render the HUD at 1080p while rendering the 3D image at a lower resolution as well?
 

USC-fan

Banned
If you sit directly in front of a monitor, sure. On a TV screen where you sit 4-8 feet away, you won't notice the difference. (Except maybe on a >60" screen.)



This is no superb summary. It's biased and doesn't acknowledge certain factors that should count in favor of the X1 (much more powerful audio chip, HW scaler, HW display planes, Move Engines, eSRAM latency, coherent busses everywhere - but people keep ignoring that stuff).

If you put things into perspective, a lot of these features really help offload the CPU/GPU and make the gap not as huge as a lot of people here suggest.

This "much more powerful" audio processor is in the console because of Kinect. Both consoles have audio chips. The Xbone may have some advantages there, but that is unknown. It was not added to the console for gaming performance.

Both have a HW scaler, display planes, move engines, and coherent busses everywhere.

The Xbone has an extra set of move engines with added functions, and one extra display plane that runs the UI overlay. Also, the eSRAM being on-die should aid performance.

People just assume the PS4 lacks these parts.
 

TrueGrime

Member
So you're saying Turn 10 is intentionally running their game at 1920x1080 for marketing purposes? Are you being facetious here?

I'm speaking for myself here. I, for one, will not notice the difference between the resolutions you mentioned on my 60-inch LED unless I purposely sit close enough to make out the differences. If a game can reach the 1920x1080 mark, that's great. I don't think a game like Forza needs to sacrifice resolution for any other graphical fidelity. It'll definitely depend on the game and the goals of the developers. Anything less than 1600x900, though, and you're getting into very dangerous territory.
 

Damian.

Banned
Because people are way too fixated on 1920x1080. People see it as some sort of next-gen benchmark, and if a game doesn't meet it then it's hardly up to par.

It's not that; it's people wanting games at the native resolution of the TV they own, so they don't get the extra artifacts/jaggies/blur that come from a non-native resolution.
 