
AMD: PlayStation 4 supports hUMA, Xbox One does not

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
It should be very interesting to see those multiplatform comparisons at launch, but also down the line after launch. The big question is really whether developers will be willing to show up the XB1 by making full use of that PS4 horsepower, because if they do, it will be noticeable.

And if they don't, there's always a competitor that will be willing to push things as hard as possible on a platform to gain an advantage in the marketplace.

Damned if they do, damned if they don't. An interesting predicament to be in.
 

Thrakier

Member
...have you actually seen the games? There are differences in favour of PS4 but there is not a Dreamcast to Xbox type power discrepancy here.

It's hard to say, isn't it, because so far all games look pretty shit. The best-looking PS4 exclusives, however, are already ahead of the best-looking XB1 exclusives, and it seems like multiplatform games are outperforming XB1 games. This gap will only widen from now on. Expect 2nd- or 3rd-generation PS4 games to be much better looking than their XB1 counterparts. Both will pale next to PC, however.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I translated it as literally as possible. I haven't changed the structure of the sentences. Should be accurate. Others are invited to double-check.

I think that part was actually a conclusion from heise, not from the AMD guy.

I think you are right, but going by the way the OP has translated it (which may be down to a badly worded article), the clear implication is that a senior exec from AMD is shitting on MS and the XBO relative to the PS4. That would never happen.
 

orioto

Good Art™
I guess? It seems to me that you can always make these kinds of arguments though: 'oh, that one runs at 60fps, this one runs at 30fps', etc. My point remains: if the gap were really as big as Xbox to DC, there would be no argument; it would be massively obvious.

At the rate the Xbone is currently slipping backwards down the hardware power rankings in these threads I wouldn't be surprised if next time people are pushing an Xbox vs NES power gap though. Seems to me both sides of this debate have a taste for their fave's secret sauce.

No, that's the thing, it won't be obvious. It would be obvious if DC was running on both consoles and couldn't run properly on the One. And even then, DC is certainly not using the PS4 efficiently.

And whether a game looks good, or realistic, or impressive has more to do with art direction and artist budget than with power. The One IS powerful enough to have great-looking games. Even if the PS4 is twice as powerful... you can't compare with different games and different choices. Imagine Journey, with its nice shaders and all, being shown right now for the first time as a PS4 exclusive, running at 1080p. Everyone would use it as proof of PS4 superiority, when it's actually a PS3 game.

My point is you won't find proof of power differences with different games like that.
We'll have to wait for really ambitious first-party titles, or multiplatform titles that use the power difference. And again, when different teams are working on different versions... I'm not even sure it isn't random sometimes.
 
The graphical king on consoles, or on every platform?

What I see is that hUMA will bring performance with some rendering techniques that can't be touched by PCs brute-forcing right now, and that's why it makes the PS4 "a generation ahead of high-end PC" in terms of architecture right now (and until 2014). Am I right?

alexandros coming in 3... 2... 1...
You rang? Tell you what, guys, how about we try a little role reversal? Since most of the time it is me who is asked to specify PC hardware that could theoretically match the PS4, how about you do it this time?

So here's my question: if you had to make an educated guess, what kind of graphics horsepower would I need to match the PS4? A GTX 780? A TITAN? A 7970?

Second question: What kind of differences do you expect to see in multiplatform games between the PS4 and the XBO? 60 fps vs 30 fps? 1080p vs 720p?

Not this thread. No.
 

twobear

sputum-flecked apoplexy
It's hard to say, isn't it, because so far all games look pretty shit. The best-looking PS4 exclusives, however, are already ahead of the best-looking XB1 exclusives, and it seems like multiplatform games are outperforming XB1 games. This gap will only widen from now on. Expect 2nd- or 3rd-generation PS4 games to be much better looking than their XB1 counterparts. Both will pale next to PC, however.
Why will the gap get bigger? Won't developer utilisation of Xbone improve over time too?
 

Alej

Banned
So here's my question: if you had to make an educated guess, what kind of graphics horsepower would I need to match the PS4? A GTX 780? A TITAN? A 7970?

Second question: What kind of differences do you expect to see in multiplatform games between the PS4 and the XBO? 60 fps vs 30 fps? 1080p vs 720p?

First question: We can't know at this time. That's what I've been telling you since the beginning. I'm sure high-end PC hardware will brute-force past it really fast, but we don't know right now what it takes to match PS4 performance in today's PC.

Second question: I think less than a 10fps difference at most, because they will target 30fps and make some sacrifices on the XB1 to get that. I'm not sure about resolution; the ROPs in the two platforms should be enough. So I'm not expecting framerate and IQ differences, but rather what we see on the PS3 between exclusives and multiplatform games: differences in textures, lighting and physics.

No doubt the XBO, as a closed box, will have "state-of-the-art" games done exclusively for the platform that will blow our minds.
 
For the people saying there is no graphical difference from hUMA: you're wrong.

One of the biggest factors for graphics is memory bandwidth. It's why GDDR5 was such a big deal for the PS4 when it was announced. Since a truly shared memory pool means data doesn't have to make a back-and-forth trip, you're saving a TON of bandwidth. Bandwidth that can then go into higher-res textures or higher levels of AA.
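To put it in code terms, here's a toy sketch of the round trip a split design forces (plain C; gpu_filter() is a made-up stand-in, not any real console API):

Code:
/* Toy sketch: bandwidth cost of split pools vs one shared pool. */
#include <stdio.h>
#include <string.h>

#define MB (1024 * 1024)
#define BUF_BYTES (64 * MB)

static char system_ram[BUF_BYTES]; /* CPU-visible pool */
static char video_ram[BUF_BYTES];  /* GPU-visible pool */

/* Invented stand-in for a GPU job. */
static void gpu_filter(char *buf, size_t n) { (void)buf; (void)n; }

int main(void)
{
    /* Split pools: every round trip moves the whole buffer twice. */
    memcpy(video_ram, system_ram, BUF_BYTES); /* upload:   64 MB */
    gpu_filter(video_ram, BUF_BYTES);
    memcpy(system_ram, video_ram, BUF_BYTES); /* readback: 64 MB */
    printf("split pools: %d MB of pure copy overhead\n", 2 * BUF_BYTES / MB);

    /* Shared pool: both processors touch the same bytes, so the
       two copies above (and their bandwidth) simply vanish. */
    gpu_filter(system_ram, BUF_BYTES);
    printf("shared pool: 0 MB of copy overhead\n");
    return 0;
}

That saved bandwidth is exactly what can go back into textures and AA.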
 
I don't think anyone will doubt they will both look awesome; however, I believe people can easily point out that the difference this gen is much greater in regards to architecture than last gen. With the gap growing every day, we should expect to see a difference if the same game is booted up on machines side by side and played.

Growing gap every day? We'll be able to see a difference, sure, but I don't think it'll be that big of a difference, and of course the system architecture is much greater. When you start offloading draw distance/AI to the cloud to release some of that stress from the main unit, you'll see a difference as well. I think Microsoft honestly has the edge, because you will be able to run computations outside the box that would normally take a huge amount of internal processing power. The only limit I currently see is internet bandwidth affecting latency. If Microsoft can push for faster internet, that won't be a problem, and 3, 4, and 5 years down the road it will make the machine extremely flexible. Running AI simulations is a huge effort; I can tell you that from running computational simulations myself. Granted, the PS4 is the more powerful system right now. There's no denying that. Though watch out in a few years: that cloud computing will seriously augment algorithms.
 
No, that's the thing, it won't be obvious. It would be obvious if DC was running on both consoles and couldn't run properly on the One. And even then, DC is certainly not using the PS4 efficiently.

And whether a game looks good, or realistic, or impressive has more to do with art direction and artist budget than with power. The One IS powerful enough to have great-looking games. Even if the PS4 is twice as powerful... you can't compare with different games and different choices. Imagine Journey, with its nice shaders and all, being shown right now for the first time as a PS4 exclusive, running at 1080p. Everyone would use it as proof of PS4 superiority, when it's actually a PS3 game.

My point is you won't find proof of power differences with different games like that.
We'll have to wait for really ambitious first-party titles, or multiplatform titles that use the power difference. And again, when different teams are working on different versions... I'm not even sure it isn't random sometimes.

I'm not sure you got his point.

he's saying that comparing the gap between ps4 and xbone to the one between the original xbox and DREAMCAST (DC) is flat out wrong. no wiggle room, it's wrong.
I'm not sure what DC you're referring to.

there will be differences, but not that big.
 
It's hard to say, isn't it, because so far all games look pretty shit. The best-looking PS4 exclusives, however, are already ahead of the best-looking XB1 exclusives, and it seems like multiplatform games are outperforming XB1 games. This gap will only widen from now on. Expect 2nd- or 3rd-generation PS4 games to be much better looking than their XB1 counterparts. Both will pale next to PC, however.

Both will pale in comparison to a very high-end PC, and even then I don't believe the differences will be major. Developers are only going to make the textures so high-res, etc.

Of course you've always got the PC's ability for higher framerates and downsampling, etc. But that's getting into the niche of the niche, on super-expensive hardware.
 

ZiggyRoXx

Banned
that had to be known from day 1 of the planning stages, if the assumption that the eSRAM is the reason it's not possible on xbone is correct.

But the only reason they had to go down this shitty design route and put a compromised GPU & RAM in there was because, from day one, they had blown too much of the hardware budget on paying for Kinect to go in the box, which now isn't even mandatory.. lol. The whole XO design process and the philosophy behind it has been a total fuck-up.
 
Who gives a shit? Really? And how is it even relevant to the topic at hand?

A "generation ahead" does not equate to "more powerful". But you already know this, as low-end next-gen GPUs are invariably less powerful than previous-gen high-end cards.

Disingenuous much?

If you read the post I quoted you'll see that it's clearly described as an advantage in performance. I'm just curious to see what people's expectations are, that's all.
 

PhatSaqs

Banned
PS4: Smang It Edition.
 
But the only reason they had to go down this shitty design route and put a compromised GPU & RAM in there was because, from day one, they had blown too much of the hardware budget on paying for Kinect to go in the box, which now isn't even mandatory.. lol. The whole XO design process and the philosophy behind it has been a total fuck-up.

of course. Microsoft made the same gamble that nintendo did with the wiiU.

that is, casuals would flock to the system for a hardware gimmick and tv functionality, and would outnumber the core users that might have been turned off.

The backlash they got at E3 proved that assumption wrong. There is no significant demand for kinect or tv services, at least compared to the demand for the ps4.

UNLIKE nintendo, they're backtracking as fast as possible to try and make up for lost ground. The core hardware at least isn't a full generation behind, which allows them to still be competitive, but clearly the original strategy was a fuckup and MS knows it.
 

tinfoilhatman

all of my posts are my avatar
Awesome, let's get the games out there that actually show it, please. If true, I'd be happy to see MS crash and burn this gen just to teach them a lesson.
 

McHuj

Member
Also, for the GPU or CPU to 'see' the updated data in real time, the respective caches still need to complete a read to the main memory pool (I think).
I still don't see why there is an advantage over what the XBO does. I'm missing a piece of the puzzle.

Mainly from a software development point of view: the programmer isn't responsible for synchronizing the data and copying it; the hardware performs the copying by itself. If the GPU or CPU updates data that is shared, that modified data will have to propagate from one processor's L1 cache to the other's before it can be used by the other processor. There's no way around that.
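Rough sketch of what that buys you as a programmer. Every function name here is invented, just to show the shape of the two models:

Code:
/* Invented stand-ins so the sketch compiles; not a real console API. */
typedef struct { float pos[3]; float vel[3]; } Particle;

static void update_on_cpu(Particle *p, int n)         { (void)p; (void)n; }
static void flush_cpu_caches(void *p, int bytes)      { (void)p; (void)bytes; }
static void copy_to_gpu_pool(void *d, void *s, int n) { (void)d; (void)s; (void)n; }
static void dispatch_gpu(Particle *p, int n)          { (void)p; (void)n; }

/* Without hUMA: the programmer owns the whole round trip. */
static void frame_manual(Particle *cpu_buf, Particle *gpu_buf, int n)
{
    update_on_cpu(cpu_buf, n);
    flush_cpu_caches(cpu_buf, n * (int)sizeof(Particle));
    copy_to_gpu_pool(gpu_buf, cpu_buf, n * (int)sizeof(Particle));
    dispatch_gpu(gpu_buf, n);
}

/* With hUMA: one coherent buffer. The modified cache lines still have
   to migrate between the CPU and GPU caches, as said above, but the
   coherence hardware does it instead of your code. */
static void frame_coherent(Particle *shared_buf, int n)
{
    update_on_cpu(shared_buf, n);
    dispatch_gpu(shared_buf, n);
}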
 

strata8

Member
But the only reason they had to go down this shitty design route and put a compromised GPU & RAM in there was because, from day one, they had blown too much of the hardware budget on paying for Kinect to go in the box, which now isn't even mandatory.. lol. The whole XO design process and the philosophy behind it has been a total fuck-up.

No. Microsoft saves absolutely no money by going with DDR3 + eSRAM. If anything, it's costing them more, since the size of the SoC is much larger than the PS4's.

The real reason was a guarantee of 8GB that GDDR5 couldn't provide.
 
This could be a really big deal, because devs will send out copies of the PS4 edition to reviewers, and it will be mentioned as such in reviews. Just like this gen, you'd frequently hear in reviews (or podcasts) that reviewers were sent 360 copies and didn't know how the PS3 version ran. The devs will send out the version that looks and runs the best, and little by little it could add up in the end. Especially with how often it gets posted around which version is best.
 
On a classical system you have a RAM pool and a VRAM pool that are physically separated. Copying data from one pool to the other creates latency. The GPU is very good at hiding latency; what it needs most is high bandwidth. The CPU, on the other hand, is extremely sensitive to latency and needs extremely low latency to work efficiently. Copying data from the RAM (CPU) to the VRAM (GPU) creates latency, but that's okay for the GPU. Copying data from RAM (CPU) to VRAM (GPU) and back to the RAM (CPU) creates even more latency, and that's too much for the CPU. The copying alone takes longer than the computation, which makes this round trip highly ineffective.

The Xbox 360 and older APUs have unified RAM. This means the RAM is no longer physically separated, but even though it's the same RAM chips, the system still distinguishes between memory partitions for the different processors. You still need to copy the data between the CPU partition and the GPU partition. This is much more efficient than copying between physically separated pools, but it's still too much latency for a CPU-GPU-CPU round trip.

The PS4 will have hUMA, which means you no longer need a distinction between a CPU partition and a GPU partition. Both processors can use the same pieces of data at the same time. You don't need to copy stuff, and this allows for completely new algorithms that utilize the CPU and GPU at the same time. This is interesting since a GPU is very strong but extremely dumb, while a CPU is extremely smart but very weak. Since you can utilize both processors at the same time for a single task, you have a system that is extremely smart and extremely strong at the same time.

It will allow for an extreme boost for many, many algorithms and parts of algorithms. On top of that, it will allow for completely new classes of algorithms. This is a game changer.
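To make the "same pieces of data at the same time" part concrete, here is a toy sketch. launch_on_gpu() and wait_for_gpu() are invented stand-ins, not a real API:

Code:
#include <stddef.h>

typedef struct { float score; int needs_review; } Item;

/* Invented stand-ins for GPU dispatch. */
static void launch_on_gpu(Item *items, size_t n) { (void)items; (void)n; }
static void wait_for_gpu(void)                   { }

static void process(Item *shared, size_t n)
{
    /* GPU: strong but dumb -- brute-force scoring of every item. */
    launch_on_gpu(shared, n);
    wait_for_gpu();

    /* CPU: smart but weak -- walks the SAME array through the SAME
       pointer and handles only the branchy edge cases. No copy, no
       second allocation, no partition boundary to cross. */
    for (size_t i = 0; i < n; i++)
        if (shared[i].needs_review)
            shared[i].score *= 0.5f;
}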

Tech is not my field so this post is much appreciated.
 
Way to piss off a major customer by selling their competitor's kit, AMD.

If MS don't leave the industry (and I think they may), then they'll be Nvidia next time.

Re: wicket's post. You can copy a piece of data into the two RAM pools and work on them at the same time in a PC anyway, surely? This does, however, mean you don't need the extra reads, I guess.
 

orioto

Good Art™
I'm not sure you got his point.

he's saying that comparing the gap between ps4 and xbone to the one between the original xbox and DREAMCAST (DC) is flat out wrong. no wiggle room, it's wrong.
I'm not sure what DC you're referring to.

there will be differences, but not that big.

#DriveClub
 
Back against the wall and odds
With the strength of Cerny and a cause
Your specs are called outstanding
You're architecturally complex

Against the grain of fanboyish claims
Not the thoughts your actions entertain
And you
Have proved
To be

A real hUMA thing, and a real hero
Real hUMA thing, and a real hero
Real hUMA thing, and a real hero
Real hUMA thing



Song
 

TheD

The Detective
Growing gap every day? We'll be able to see a difference, sure, but I don't think it'll be that big of a difference, and of course the system architecture is much greater. When you start offloading draw distance/AI to the cloud to release some of that stress from the main unit, you'll see a difference as well. I think Microsoft honestly has the edge, because you will be able to run computations outside the box that would normally take a huge amount of internal processing power. The only limit I currently see is internet bandwidth affecting latency. If Microsoft can push for faster internet, that won't be a problem, and 3, 4, and 5 years down the road it will make the machine extremely flexible. Running AI simulations is a huge effort; I can tell you that from running computational simulations myself. Granted, the PS4 is the more powerful system right now. There's no denying that. Though watch out in a few years: that cloud computing will seriously augment algorithms.

Bandwidth != latency!
 

Myshkin

Member
One big thing people might be missing: not only is processing power being saved with hUMA, but the PS4 effectively has even more memory now, since data isn't duplicated in memory for the GPU and CPU. A single pointer can now be used by both the GPU and CPU.

It might not work out that way. There are reasons why we have concepts like triple buffering.
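Rough sketch of what I mean (invented names; the point is that shared memory doesn't remove the need to buffer across frames):

Code:
/* If the CPU rewrites a buffer while the GPU is still reading it for
   the previous frame, you corrupt the frame or stall waiting. So you
   still rotate buffers, hUMA or not -- hence triple buffering. */
#define NUM_BUFFERS 3

typedef struct { float data[4096]; } FrameState;

static FrameState ring[NUM_BUFFERS];
static int write_idx = 0;

/* Invented stand-ins. */
static void build_frame_on_cpu(FrameState *f) { (void)f; }
static void submit_to_gpu(FrameState *f)      { (void)f; }

static void tick(void)
{
    FrameState *f = &ring[write_idx];
    build_frame_on_cpu(f);  /* CPU writes frame N+1...            */
    submit_to_gpu(f);       /* ...while the GPU may still read N. */
    write_idx = (write_idx + 1) % NUM_BUFFERS;
}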
 

efyu_lemonardo

May I have a cookie?
Can somebody please explain to me why hUMA doesn't pose a serious security risk in a PC environment?

Maybe this has already been addressed in the thread, but it sounds like an attacker could load malicious code disguised as a texture or shader instruction, only to have the CPU execute this code later on..
 
Bandwidth != latency!
I'm using them in the proper terms: when I'm talking about bandwidth, I'm talking about internet speed and how it may affect cloud computation and thus affect gameplay. If you're running internet and cloud and the bandwidth chokes, you might run into latency issues.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
On a classical system you have a RAM pool and a VRAM pool that are physically separated. Copying data from one pool to the other creates latency. The GPU is very good at hiding latency; what it needs most is high bandwidth. The CPU, on the other hand, is extremely sensitive to latency and needs extremely low latency to work efficiently. Copying data from the RAM (CPU) to the VRAM (GPU) creates latency, but that's okay for the GPU. Copying data from RAM (CPU) to VRAM (GPU) and back to the RAM (CPU) creates even more latency, and that's too much for the CPU. The copying alone takes longer than the computation, which makes this round trip highly ineffective.

The Xbox 360 and older APUs have unified RAM. This means the RAM is no longer physically separated, but even though it's the same RAM chips, the system still distinguishes between memory partitions for the different processors. You still need to copy the data between the CPU partition and the GPU partition. This is much more efficient than copying between physically separated pools, but it's still too much latency for a CPU-GPU-CPU round trip.

The PS4 will have hUMA, which means you no longer need a distinction between a CPU partition and a GPU partition. Both processors can use the same pieces of data at the same time. You don't need to copy stuff, and this allows for completely new algorithms that utilize the CPU and GPU at the same time. This is interesting since a GPU is very strong but extremely dumb, while a CPU is extremely smart but very weak. Since you can utilize both processors at the same time for a single task, you have a system that is extremely smart and extremely strong at the same time.

It will allow for an extreme boost for many, many algorithms and parts of algorithms. On top of that, it will allow for completely new classes of algorithms. This is a game changer.

Well summarized. Added to the OP.
 

astraycat

Member
Can somebody please explain to me why hUMA doesn't pose a serious security risk in a PC environment?

Maybe this has already been addressed in the thread, but it sounds like an attacker could load malicious code disguised as a texture, only to have the CPU execute this code later on..

When memory is mapped it's tagged as CPU executable or not. When you allocate and map memory for a texture, you're certainly not going to mark it as CPU executable.

I doubt that userland apps will actually get to mark memory as CPU executable at all, really.
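It's the same page-protection idea you already get from any modern OS. On a POSIX system a texture staging buffer would be mapped roughly like this (the mmap flags are real; the wrapper function itself is just an example):

Code:
#include <stddef.h>
#include <sys/mman.h>

/* Readable and writable, but deliberately NOT executable: the CPU
   faults if it ever tries to jump into this buffer. */
void *alloc_texture_staging(size_t size)
{
    void *p = mmap(NULL, size,
                   PROT_READ | PROT_WRITE,      /* no PROT_EXEC */
                   MAP_PRIVATE | MAP_ANONYMOUS,
                   -1, 0);
    return (p == MAP_FAILED) ? NULL : p;
}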
 
"With the PS4 and HSA, AMD may well be well on its way to dominating the gaming scene in future. Not only will their platform be easier to code for, but Intel and Nvidia are currently not releasing any products with the same benefits. Intel’s Haswell graphics still partition off the GPU memory and Nvidia’s graphics cards won’t be compatible with hUMA. Its a great time to be an AMD fan."

http://mygaming.co.za/news/hardware/53750-amds-plan-for-the-future-huma-fully-detailed.html

Nice. Had to search to get a better understanding. Seems legit.

Glorious AMD overlords.

So I'm just wondering: when will I be able to put a hUMA GPU/CPU AMD board in a PC? I'm getting a PS4, but I also want to upgrade the PC a bit later on. I'd love it if I could just stick a motherboard in there with both GPU and CPU on it, and hUMA up the ass and so forth, which might save a lot of money and just make it easy.
 

Chobel

Member
No. Microsoft saves absolutely no money by going with DDR3 + eSRAM. If anything, it's costing them more, since the size of the SoC is much larger than the PS4's.

The real reason was a guarantee of 8GB that GDDR5 couldn't provide.

Can someone explain this 8GB GDDR5 thing, with Sony getting lucky and MS not?
 

ZiggyRoXx

Banned
No. Microsoft saves absolutely no money by going with DDR3 + eSRAM. If anything, it's costing them more, since the size of the SoC is much larger than the PS4's.

The real reason was a guarantee of 8GB that GDDR5 couldn't provide.

Whatever the real reasons, the outcome is the same.. an underperforming GPU & memory combination compared to the competition.

Clever programming by dev teams to work around the XO's shortcomings will only get them so far.

..In 2 or 3 years, when the PS4 is really beginning to stretch its legs, the XO's deficiencies will become blindingly obvious.
 
First question: We can't know at this time. That's what I've been telling you since the beginning. I'm sure high-end PC hardware will brute-force past it really fast, but we don't know right now what it takes to match PS4 performance in today's PC.

Second question: I think less than a 10fps difference at most, because they will target 30fps and make some sacrifices on the XB1 to get that. I'm not sure about resolution; the ROPs in the two platforms should be enough. So I'm not expecting framerate and IQ differences, but rather what we see on the PS3 between exclusives and multiplatform games: differences in textures, lighting and physics.

No doubt the XBO, as a closed box, will have "state-of-the-art" games done exclusively for the platform that will blow our minds.

Quite how a 1.8 TF GPU system can come close to something with the massively higher processing power of a 780, I have no idea.

It might be elegant, but it isn't in that league.
 
#DriveClub

yeah, you misread him. he was addressing someone else who had compared the xbone to the dreamcast (DC) and the ps4 to the original xbox.

Driveclub looks nice and all, but isn't a definitive 'this console is dramatically more powerful' game.

what he was saying is that if the gap between the two consoles was DC to Xbox big there would be no debate. the xbox launched 3(?) years later and was somewhere around 7 times more powerful.

the ps4 and xbone are much, much closer than this. a performance increase of 50%, not 700%
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Did everyone ignore this post?

AMD said in April that hUMA is compatible with embedded memory like eDRAM or eSRAM. Just an interesting discussion point.

Sure, it's compatible. But apparently they didn't extend the concept to main memory. However, nobody knows what part those data move engines play. One guess of mine would be that they copy data between main memory spaces while GPU and CPU perform computations.
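If that guess is right, the usage pattern would look something like this double-buffered pipeline. Every name below is invented; it just shows the copy/compute overlap:

Code:
/* A move engine streams chunk i+1 into a staging slot while the
   processors work on chunk i. dma_start()/dma_wait() are made up;
   the stub dma_start is synchronous only to keep the sketch runnable. */
typedef struct { char bytes[65536]; } Chunk;

static Chunk staging[2];

static void dma_start(Chunk *dst, const Chunk *src) { *dst = *src; }
static void dma_wait(int slot)                      { (void)slot; }
static void compute(Chunk *c)                       { (void)c; }

static void pipeline(const Chunk *src, int n)
{
    if (n <= 0) return;
    dma_start(&staging[0], &src[0]);              /* prefetch chunk 0 */
    for (int i = 0; i < n; i++) {
        dma_wait(i % 2);                          /* chunk i is ready */
        if (i + 1 < n)
            dma_start(&staging[(i + 1) % 2], &src[i + 1]);
        compute(&staging[i % 2]);                 /* overlaps next copy */
    }
}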
 

TheD

The Detective
"With the PS4 and HSA, AMD may well be well on its way to dominating the gaming scene in future. Not only will their platform be easier to code for, but Intel and Nvidia are currently not releasing any products with the same benefits. Intel’s Haswell graphics still partition off the GPU memory and Nvidia’s graphics cards won’t be compatible with hUMA. Its a great time to be an AMD fan."

http://mygaming.co.za/news/hardware/53750-amds-plan-for-the-future-huma-fully-detailed.html

Nice. Had to search to get a better understanding. Seems legit.

Load of shit.

The majority of PC versions of games are going to be targeted at Intel CPUs and Nvidia GPUs, due to the fact that they rule the market.
 

efyu_lemonardo

May I have a cookie?
When memory is mapped it's tagged as CPU executable or not. When you allocate and map memory for a texture, you're certainly not going to mark it as CPU executable.

I doubt that userland apps will actually get to mark memory as CPU executable at all, really.

yeah, I quickly edited it to say texture or shader instruction, but evidently not quick enough.

so the question this brings up is whether hUMA allows tagging memory as CPU-executable, GPU-executable, both, or neither?

edit: because the idea of having code meant to run on one processor running on a different processor just seems like asking for trouble...
 