
Ubisoft GDC EU Presentation Shows Playstation 4 & Xbox One CPU & GPU Performance- RGT

nullpoynter

Member
I'm fine with what's in my consoles. I already own a gaming PC and don't want to be spending $800-1000 on a console. Just give me fun games with a great MP experience.

Exactly! I'm happy with my consoles too. All I want to do after work is sit down and relax with a fun game, FUN being the keyword. I don't want to worry about teraflops, gigaflops, megabytes, etc. I also don't want to break the bank on a console that costs over $500, either. Just give me the great experience that I've had since the early NES days.
 

Marlenus

Member
An Intel APU wouldn't be Atom level; it'd be Iris Pro with a much stronger CPU (probably mobile i5 level). It would probably be more like $500 rather than $400, though.

Intel CPU w/ Nvidia GPU is basically the OG Xbox, expensive as fuck but insane amounts of power.

Iris Pro is a weak GPU compared to what is in the PS4 or Xbox One. The CPU would not make up the difference, so the overall console would be weaker.

A laptop i5 is essentially a desktop i3. The only non-Atom option would have been a custom quad-core Pentium: faster than an Atom but slower than an i5.

An Intel CPU with an Nvidia GPU would cost more, and Kepler is not as good as GCN at GPU compute, so it would probably end up roughly the same overall, with less need to offload stuff from the CPU to the GPU.

The point is the PS4 is about as well balanced as they could make it within the restrictions they had. If Sony and MS had been willing to subsidise the hardware again, we probably would have gotten Steamroller-based APUs with beefier GPUs, but they were not willing to do that, so this is what we have. Still a decent level of hardware for the money.

Regarding the benchmark, I wonder if it is using the ESRAM on the Xbox One, because the only reason for such a discrepancy is that it is using the low-bandwidth DDR3. In reality the render targets would be in ESRAM, so I suppose it is a realistic overview of what it can do compute-wise, and it is a lot worse than I thought. You would expect a figure closer to 1100 based on the GPU alone, so something is causing it to lose 25% of its performance, and bandwidth is about the only thing I can think of.

The CPU is also showing greater-than-clock-speed scaling, so I wonder if the 30GB/s of bandwidth the Xbox One has apportioned to the CPU is giving it a slight boost beyond the clock speed increase. Either way, it does show how powerful GPU compute can be.
 

SapientWolf

Trucker Sexologist
I don't understand why the actors couldn't behave independently. Yes, each actor has his own set of inputs, but then those inputs are all being processed by a common algorithm, assuming they're the same type of actor, right?

If you see a threat, flee. One guy sees a threat, the other guy doesn't, and they each act accordingly. Isn't that Single Instruction, Multiple Data? Why can't stuff like decision making and path-finding be done on the GPU? Setting actor 75's state to Fleeing shouldn't be any harder than setting pixel 75's color to Blue, it seems to me.
It seems like it would involve a lot of branching code. Each state of the actor might represent a different branch, and the user can affect those states. Branches are a bottleneck for GPGPU programming.

edit: And you're not going to see a big speedup unless you are processing a lot of AI code in parallel. It's one of those things that would benefit from better single threaded performance.
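Roughly what that per-actor update could look like as a GPU kernel, and where the branching cost shows up. This is a hypothetical sketch (made-up Actor struct and kernel, not from any real engine), just to make the SIMD analogy concrete:

```cuda
// One thread per actor: every thread runs the same code, so setting
// actor 75's state really is as mechanical as setting pixel 75's colour.
enum State { IDLE, FLEEING };

struct Actor {
    float px, py;     // position
    float threat;     // perceived threat, filled in by an earlier pass
    State state;
};

__global__ void updateActors(Actor* actors, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    Actor a = actors[i];

    // Divergent branch: threads in the same 32-wide warp that disagree here
    // get serialized, so the hardware effectively runs both paths.
    if (a.threat > 0.5f) {
        a.state = FLEEING;
        a.px += 1.0f;             // placeholder "run away" movement
    } else {
        a.state = IDLE;
    }

    actors[i] = a;
}
```

With two states this is cheap; the more states and transitions you pile on (and the more the player can perturb them), the more of the GPU's width gets wasted on divergence.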
 
I see complaints of the APU being used but what was a realistic alternative?

It was either a Super Cell, APU (AMD) or Intel w/ separate GPU. I don't think AMD had the capability to do anything much stronger than what was released at the time.

If they expected to reach their price target without losing $200+ on each console, I think it was the only option.
 

pixlexic

Banned
I see complaints of the APU being used but what was a realistic alternative?

It was either a Super Cell, APU (AMD) or Intel w/ separate GPU. I don't think AMD had the capability to do anything much stronger than what was released at the time.

If they expected to reach their price target without losing $200+ on each console, I think it was the only option.

An APU mixed with a dedicated GPU and 4 gigs of RAM, minus all the social/media features.
 

Renekton

Member
It seems like it would involve a lot of branching code. Each state of the actor might represent a different branch, and the user can affect those states. Branches are a bottleneck for GPGPU programming.

edit: And you're not going to see a big speedup unless you are processing a lot of AI code in parallel. It's one of those things that would benefit from better single threaded performance.
Pathfinding seems like a good candidate on paper; it can calculate multiple paths independently. Not sure about the implementation though, just thinking out loud.
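Something like this toy kernel is what the "independent paths" idea looks like in practice. It's a hypothetical sketch (one greedy step per agent on an assumed occupancy grid), not a real planner, but it shows that each agent's query needs nothing from the others:

```cuda
// One path query per thread over a small occupancy grid (W x H cells).
#define W 64
#define H 64

__global__ void greedyStep(const unsigned char* blocked,   // 1 = wall
                           int* ax, int* ay,                // agent positions
                           const int* gx, const int* gy,    // agent goals
                           int numAgents)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numAgents) return;

    int x = ax[i], y = ay[i];

    // Step one cell toward the goal on each axis.
    int nx = x + (gx[i] > x) - (gx[i] < x);
    int ny = y + (gy[i] > y) - (gy[i] < y);

    if (nx >= 0 && nx < W && ny >= 0 && ny < H && !blocked[ny * W + nx]) {
        ax[i] = nx;
        ay[i] = ny;
    }
    // else: the agent stalls this frame; a real planner would search around
    // the obstacle, and that search is where the branching cost creeps back in.
}
```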
 

Loakum

Banned
I usually don't find threads talking about techy babble very interesting, but this thread has been a good read thus far.
 
I never said that. But it is a simple fact that moms playing Sims and people buying the odd indie game to play on their laptops are an audience that is neither particularly relevant nor directly comparable to the core gaming market.
Yes, you did say that.
Windows PCs in general are an irrelevant metric when we're talking about PC gaming. Even the Steam hardware survey that some people like to bring up from time to time is massively skewed by the fact that a lot of people who participate in it are from very poor countries
You can't remove a good portion of the data because it doesn't support your idea of what real PC gaming is.

There are plenty of gamers (core or not) that don't have a brand new and/or top of the line gaming PC. That's the reason Tomb Raider, Battlefield 4 or Dragon Age: Inquisition can run on 2006 level hardware or why Alien Isolation runs on a GT430.
 
Yes, you did say that.
You can't remove a good portion of the data because it doesn't support your idea of what real PC gaming is.

There are plenty of gamers (core or not) that don't have a brand new and/or top of the line gaming PC. That's the reason Tomb Raider, Battlefield 4 or Dragon Age: Inquisition can run on 2006 level hardware or why Alien Isolation runs on a GT430.

Those games run on 2006 hardware because they are cross-gen. Do you honestly believe there was any point during the development of the game when someone from the dev team said "people, we need to limit the scope of our game because it won't run on a GT430"?

During the previous console transition there were quite a few companies that released PS2 ports on PC well into the second or third year of the previous gen. Why? Because those two consoles were really powerful compared to gaming PCs at launch so it was very true that a lot of PC gamers didn't have that level of hardware yet. Now, less than one year into the new generation, the only game that is not fully next gen on PC is Pro Evolution Soccer. A simulation of a sport that is extremely popular in countries with low average income. You think that is a coincidence?
 
So, there are people who actually believe that an Intel solution would have been hotter and bigger than the current AMD one.

Not long ago, it was true that Intel parts were prohibitively expensive for third parties to use because of Intel's insane profit margins, even though the actual manufacturing costs of those parts are lower for Intel than for AMD. But times have changed: x86 is in a struggle to survive and Intel's tactics are way more aggressive.

This is an all-in-one HDMI dongle able to run Windows 10 (or any other x86 OS such as Linux or Android) on any TV. Along with 1GB of DDR3, WiFi N, BT, 32GB of storage and some other I/O, it includes a full-fledged Intel Atom Z3735F Bay Trail quad-core @ 1.83GHz. This toy is fabbed at 22nm for a crazy 2W SDP (~4W worst-case TDP). That is a quarter of the power consumption of a similarly clocked 4-core Jaguar, but with +50% to over +100% more IPC.

Best of all? The bulk price is $60.50 for orders above 500 units. Nope, no zero missing.

Throw in 4 to 8GB of RAM, replace the crappy 4EU iGPU with a custom Iris Pro with double the EU count (enough to catch the PS4 GPU, just as AMD did with the current consoles' APUs), and you would have an under-45W TDP chip able to nuke the current consoles in a smaller and better form factor. Include a control pad and you would still have had ~300 bucks of headroom to accomplish this at the current consoles' price tag. The tech was already there.

Sure, the chips would be more expensive than the current AMD giveaways, but just think about the lower manufacturing cost from cheaper cooling solutions and the savings in logistics from smaller boxes to distribute. Imagine buying desktop consoles in Vita-sized boxes.

In these kinds of threads, people tend to confuse what the tech is able to do with what the business models did at one point. That fanless stick CPU is better than the current consoles' CPUs. That is the level of crappiness we are facing with those consoles' tech. And some small Chinese company is selling those for 60 bucks. I just can't believe that Microsoft, with all those Wintel deals that built their core business empire, can't make a better deal with Intel than Shenzhen T.D.S. Electronic Technology Co. I have a hard time believing Sony wouldn't be able to convince Intel to do that for them with their sales and marketing muscle.

From a pure tech standpoint, the current gen is atrocious, saved somewhat only by the merely competent PS4 GPU.

Wall of text aside, I still find it funny how easily uninitiated people are amazed by Cell tricks even nowadays. I think it's not that hard to realize that, SPU synthetic benchmarks aside, its CPU performance is really poor even by 2007 standards. Throw some heterogeneous real-world™ loads at it and performance will plummet. The best value of out-of-order designs is that they can handle more branchy code without begging for mercy and without needing truckloads of programmers hand-crafting every line of code so the CPU doesn't choke.

I thought that the PS4 embracing the current computing paradigm of CPU + GPGPU would open some people's eyes, but I guess I was wrong.
 

vcc

Member
It doesn't matter when virtually all AAA games are built for console specs. All PC versions will get is better IQ and a few effects. The core assets of the games are all built for console specs. No publisher in their right mind would greenlight a AAA game built to fully take advantage of high-end PC hardware... financial suicide.

The studio that rode the high-end PC horse was Crytek. And they seem to be doing fi... oh wait.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
This is an all-in-one HDMI dongle able to run Windows 10 (or any other x86 OS such as Linux or Android) on any TV. Along with 1GB of DDR3, WiFi N, BT, 32GB of storage and some other I/O, it includes a full-fledged Intel Atom Z3735F Bay Trail quad-core @ 1.83GHz. This toy is fabbed at 22nm for a crazy 2W SDP (~4W worst-case TDP). That is a quarter of the power consumption of a similarly clocked 4-core Jaguar, but with +50% to over +100% more IPC.
Do you have any data to back that up?
 
It doesn't matter when virtually all AAA games are built for console specs. All PC versions will get is better IQ and a few effects. The core assets of the games are all built for console specs. No publisher in their right mind would greenlight a AAA game built to fully take advantage of high-end PC hardware... financial suicide.

AAA game production is crap anyway. I'll take 10 Star Citizens for all that Destiny money. It would be a prettier game too :p
 
Yes, you did say that.
You can't remove a good portion of the data because it doesn't support your idea of what real PC gaming is.

There are plenty of gamers (core or not) that don't have a brand new and/or top of the line gaming PC. That's the reason Tomb Raider, Battlefield 4 or Dragon Age: Inquisition can run on 2006 level hardware or why Alien Isolation runs on a GT430.

Alien Isolation runs better on an $80 GPU than on a PS4, so you're lucky they didn't aim any higher.
 

Astral Dog

Member
I don't know much about this, but I'm surprised by this GPGPU thing that has been going on. Having a CPU comparable (in some ways) to what was in the 360/PS3 seems weird to me, and even if it makes sense, it appears to have its disadvantages too, even if the GPU is ten times as powerful.
 
It's too bad that Intel probably would have demanded two arms and a leg for the privilege of using their CPUs so we got stuck with AMD in these consoles.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Any?

Same TDP, lower power consumption, up to triple performance per watt?

What's your point?
I asked you specifically about the bolded part, if that was not clear. Do you know what IPC stands for?
 

vcc

Member
I don't know much about this, but I'm surprised by this GPGPU thing that has been going on. Having a CPU comparable (in some ways) to what was in the 360/PS3 seems weird to me, and even if it makes sense, it appears to have its disadvantages too, even if the GPU is ten times as powerful.

It's the nature of the problem. GPGPU involves floating-point problems that are easily parallelizable, which the PS3's Cell is reasonably good at.

The CPU portion of the PS4/XB1 is more capable than the pure CPU elements of the 360/PS3, but the PS3's Cell has a lot of die area dedicated to GPU-like number-crunching units, which helps it in this benchmark.

In a general sense, the PS3 has a third of its GPU sitting next to its CPU.
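For reference, this is the sort of embarrassingly parallel floating-point work being talked about: the same arithmetic applied independently to every element of a big array, with no branching and no dependencies between elements. A generic example (standard SAXPY, not tied to either machine):

```cuda
// y[i] = a * x[i] + y[i] for every element, each computed independently --
// the kind of workload both Cell SPUs and GPU compute units chew through.
__global__ void saxpy(float a, const float* x, float* y, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}
```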
 

Blanquito

Member
During the previous console transition there were quite a few companies that released PS2 ports on PC well into the second or third year of the previous gen. Why? Because those two consoles were really powerful compared to gaming PCs at launch so it was very true that a lot of PC gamers didn't have that level of hardware yet.

I'm curious: it appears that a major help to cross-platform development is the multitude of engines that compile to different platforms (Unity, UE, etc.). I'm not sure if any large cross-platform tools were widely used back in the PS2 days. Were they?

In addition, you can't ignore the fact that the [edit] current [/edit] consoles' hardware is very similar to a computer's hardware, and that surely helps the porting process to be much quicker.

I guess what I'm saying is, I don't see how you can come to the conclusion that it was the PS2's power that made the ports take longer, when other factors such as cross-platform engines and completely different architectures would be a much more logical reason.

However, I am open to any evidence you can find of the power being the reason.
 
I asked you specifically about the bolded part, if that was not clear. Do you know what IPC stands for?

How silly.

What do you want? A clock-by-clock comparison, when my point revolves around power and silicon usage for better integration? What interest does that have?
 
It's too bad that Intel probably would have demanded two arms and a leg for the privilege of using their CPUs so we got stuck with AMD in these consoles.

You don't need a monster CPU in these consoles. It makes total sense to have a cheaper CPU and spend more of the component cost budget on the GPU.

Two separate teams of engineers (Playstation's Mark Cerny + co & Xbox "technical fellows") went off to design successors to the PS3 and 360 and both came back with exactly the same solution when it came to the CPU in their new products. That's not a coincidence.
 

truth411

Member
You don't need a monster CPU in these consoles. It makes total sense to have a cheaper CPU and spend more of the component cost budget on the GPU.

Two separate teams of engineers (Playstation's Mark Cerny + co & Xbox "technical fellows") went off to design successors to the PS3 and 360 and both came back with exactly the same solution when it came to the CPU in their new products. That's not a coincidence.

But that has more to do with cost than performance.
 
All I know is that Sony's first-party stable will make the PS4's innards sing like no other, and that's all I care about. I don't care about the tech specs that much; I know a home console will never again have a tech advantage over PCs, and that's fine by me.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
How silly.

What do you want? A clock-by-clock comparison, when my point revolves around power and silicon usage for better integration? What interest does that have?
Err, you made a simple quantifying statement: 50% to over 100% higher IPC. I asked you to back those up. You backpedal. How silly indeed.
 
Err, you made a simple quantifying statement: 50% to 100% higher IPC. I asked you to back those up. You backpedal. How silly indeed.

A <4W budget gives you a 1230 Cinebench single-thread score on Intel, and only 753 on AMD.

Number of cores doesn't matter on a single thread benchmark.

I will ask you again:

What is your point?
 

Teremap

Banned
But that has more to do with cost than performance.
Bingo.

The PS3 was double the cost of the PS4 for its BoM. There was simply no way the PS4 would live up to its pedigree with that tiny budget.

Hence, underpowered, partially-outdated cheap-ass machines.
 
Atom Z3770 has a base clock of 1.46GHz. With Turbo, it goes up to 2.4GHz.
A4-1200's CPU is fixed at 1GHz.

Intel can pack 4 cores at higher clocks and better performance on the same budget that AMD needs for only 2.

I will quote myself:

Me said:
So, there are people who actually believe that an Intel solution would have been hotter and bigger than the current AMD one.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
A <4W budget gives you a 1230 Cinebench single-thread score on Intel, and only 753 on AMD.

Number of cores doesn't matter on a single thread benchmark.

I will ask you again:

What is your point?
Single-threaded performance means the core will turbo boost. A Z3770 turbo-boosts to 2.39GHz. An A4-1200 turbo-boosts to 1.4GHz (ed: if it does at all. AlStrong suggests it might not).

1.4 / 2.39 = .5857

which means the clock drop alone is 1 - .5857 = .4143, or 41%. But the difference on the test you're citing is -39% (http://www.notebookcheck.net/Review-HP-Omni-10-5600eg-F4W59EA-Tablet.108702.0.html - Cinebench R10, Rendering Single-threaded, 32bit). So according to that test, the Jaguar has higher IPC than the Silvermont.

Again, do you know what IPC stands for? A simple yes or no would suffice.
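Spelling that normalization out with the quoted numbers (and assuming the 2.39GHz and 1.4GHz clocks actually apply), per-clock throughput is just score divided by clock:

```latex
\frac{\mathrm{IPC}_{\text{Jaguar}}}{\mathrm{IPC}_{\text{Silvermont}}}
  \approx \frac{753 / 1.4\,\mathrm{GHz}}{1230 / 2.39\,\mathrm{GHz}}
  \approx \frac{538}{515}
  \approx 1.05
```

i.e. roughly 5% in Jaguar's favour on that test, nowhere near 50-100% behind.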
 
Alien Isolation runs better on an $80 GPU than on a PS4, so you're lucky they didn't aim any higher.

You really think that game runs like it does on PS4 because they aimed really high?

But that has more to do with cost than performance.

They obviously had a budget in mind, but the point he was making was that both design teams for the new consoles made the same choice, suggesting there wasn't really a better option at a reasonable price. They got the balance right by all accounts, and delivered machines that aren't going to bankrupt them.
 

AlStrong

Member
Intel can pack 4 cores at higher clocks and better performance on the same budget that AMD needs for only 2.

I will quote myself:

That may well be, but the issue is that the IPC isn't +50-100% for an Atom vs. a Jaguar core.

(ed: if it does at all. AlStrong suggests it might not).
Was going off this chart: http://en.wikipedia.org/wiki/List_o..._microprocessors#Temash.2C_Elite_Mobility_APU (can't stand all these model #s :p)

Seems like it's just the A6-1450 with the turbo. The notebookcheck.net link to the Toshiba tablet mentions no turbo as well I guess.
 
Single-threaded performance means the core will turbo boost. A Z3770 turbo-boosts to 2.39GHz. An A4-1200 turbo-boosts to 1.4GHz (ed: if it does at all. AlStrong suggests it might not).

1.4 / 2.39 = .5857

which means the clock drop alone is 1 - .5857 = .4143, or 41%. But the difference on the test you're citing is -39% (http://www.notebookcheck.net/Review-HP-Omni-10-5600eg-F4W59EA-Tablet.108702.0.html - Cinebench R10, Rendering Single-threaded, 32bit). So according to that test, the Jaguar has higher IPC than the Silvermont.

Again, do you know what IPC stands for? A simple yes or no would suffice.

No. That CPU doesn't have any turbo feature according to AMD, so your numbers are even more favorable to you.

But, once again, I don't care about clock speed. They are different architectures, so it is meaningless. Yes, I can admit I was loose in using IPC to refer to performance rather than instructions per clock. Maybe you wouldn't have jumped on it if I had used the NeoGAF™ Unit for CPU Power, AKA GFLOPS, instead. But my point is still fair and clear.

You have no point at all.

The same TDP budget gives me 63.35% more cinebenchies on that Intel CPU.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
No. That CPU doesn't have any turbo feature according to AMD, so your numbers are even more favorable to you.

But, once again, I don't care about clock speed. They are different architectures, so it is meaningless. Yes, I can admit I was loose in using IPC to refer to performance rather than instructions per clock. Maybe you wouldn't have jumped on it if I had used the NeoGAF™ Unit for CPU Power, AKA GFLOPS, instead. But my point is still fair and clear.

You have no point at all.

The same TDP budget gives me 63.35% more cinebenchies on that Intel CPU.
I'll just leave that quote here.

; )
 

truth411

Member
Bingo.

The PS3 was double the cost of the PS4 for its BoM. There was simply no way the PS4 would live up to its pedigree with that tiny budget.

Hence, underpowered, partially-outdated cheap-ass machines.

To be fair, most of the PS3's cost was the Blu-ray drive. At the time of release, Blu-ray players cost $1000 or more.
 
Wrong, at least for Cinebench, as can be seen here.

:D
 

Marlenus

Member

Ultimately, Bay Trail was only released in September 2013, so it may not even have been available in time for a console release in November.

A custom quad-core Pentium 2020M or an ultra-low-power quad-core i5 would have been significantly faster, but Iris Pro, even beefed up, is a bit weak, and Intel has little experience with high-end graphics. Using multiple vendors would have been too costly for the gain to be worth it.

At the time of design, an AMD APU was about the best they could get in terms of performance per dollar. If the PS3 had been profitable and the 360 had not had the RROD issue, perhaps they would have been willing to push the boat out a bit more.
 
I guess what I'm saying is, I don't see how you can come to the conclusion that it was the PS2's power that made the ports take longer, when other factors such as cross-platform engines and completely different architectures would be a much more logical reason.

Not the PS2's power, the 360's. Even though the 360 was easier to develop for and PC ports were really easy from that machine, quite a few devs kept releasing PC versions based on the PS2 game. It would make no sense to do that unless they believed that the audience didn't have the hardware to run a 360 port.
 