
So, it seems that AMD's CPU is a bottleneck for PS4 and XB1

I don't believe so. AI-heavy and physics-heavy games require a lot of CPU time. It's why MS is pushing the Azure offload: their CPU is a bottleneck, and they want to free the local CPU up from those calculations. I've tried to make posts about this before but I just get shat on.

It can definitely help for physics (PhysX, as an example) at the cost of some GPU power, but with regards to AI, I'm not entirely sure. However, I'm inclined to say no because AI isn't really about heavy number-crunching, but I may be completely wrong about this. Someone more informed, please chime in on this :)

It is a CPU designed for netbooks. What were you expecting?

It is well known that both the PS4 and Xbone have incredibly weak CPUs.

I'm surprised a lot of people seemed to overlook this aspect once the consoles hit the market last year. Prior to launch, the flimsy CPUs were at the center of attention.
 
I've no doubt that burgeoning GPGPU technology will alleviate any and all downsides of duct-taping two ageing netbook CPUs together, in ways that I can't begin to explain.

I'm like 99% sure this is sarcasm...

My 99% comment is 100% sarcasm
 
I think all of computing is CPU-bound at the moment. Battery-powered devices have made performance-per-watt the R&D focus over raw performance.

GPU progress in this regard has also slowed, but I don't perceive it as much.
 
Intel is utter garbage to work with though.



If the CPU is the bottleneck, then that wouldn't be the case.
It might be due to increased costs on Xbone that aren't there on PS4. Like virtualization, drivers with huge overhead, the current inability to multithread draw calls, etc...
 
Well, CPU vs. GPU bounding is something that depends on what your game actually is.

It's not as much a state of a machine as it is a state of the needs of the program. You're always going to run into CPU, GPU, or memory bounds first, and that will be what your game is bound by, performance-wise.

Now, there are some systems where one of the parts is way out of whack and almost every title is bound by that part, but that doesn't really seem to be the case here given we've seen a lot of GPU bound titles this generation.
 
This gen consoles are pathetic. Crappy CPUs with weak GPUs. One year in and I am ready for nextgen consoles already.

Happens when both systems launch with shitty specs that were beaten before they even launched. Last gen it seemed like the 360/PS3 were ahead of PC for a good while..
 
It can definitely help for physics (PhysX, as an example), but with regards to AI, I'm not entirely sure. However, I'm inclined to say no because AI isn't really about heavy number-crunching, but I may be completely wrong about this. Someone more informed, please chime in on this :)
The GPU can help with physics, however the question is how much of the GPU the dev is willing to sacrifice for that. The new consoles aren't exactly powerhouses like we used to get.
 
The GPU can help with physics, however the question is how much of the GPU the dev is willing to sacrifice for that. The new consoles aren't exactly powerhouses like we used to get.

Yeah, I left the cost of performance issue out before I edited :P
 
So how about that whole cloud thing that was supposed to be the magical panacea to offload this AI issue? Oh right, yeah... it was a load of bullshit.
 
It's only a bottleneck for Ubisoft in AC:Unity.

Others will tackle problems differently and not have the same bottleneck.
 
As far as I've heard though, spreading your code to many threads is not easy at all.
It is the way everything is going. The PS3 was a spearhead for that mindset on consoles and in games.

Companies like SSM, Guerrilla and ND (of course), amongst others, dedicate effort to this exact kind of process.

I think GPU-driven rendering pipelines using GPU compute will be a big step; I feel we're about a year or so out from seeing them.

They are weak in comparison to decent PC CPUs, but this latest example from Ubisoft hardly proves that, as we have yet to see the PC version running!
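To make the "spreading your code to many threads" point above concrete, here's a minimal fork/join sketch in Python (the function names and the ThreadPoolExecutor choice are mine for illustration, not any engine's actual job system). The key property is the implicit join: the frame can't be presented until every slice finishes, so the slowest worker sets the frame time.

```python
from concurrent.futures import ThreadPoolExecutor

def update_entities(entities, workers=4):
    """Split a per-frame update over several threads and wait for all
    of them. Real engines use persistent job systems rather than a
    pool per call, but the fork/join shape is the same."""
    def update_slice(chunk):
        # stand-in for per-entity AI/physics work
        return [e + 1 for e in chunk]

    size = max(1, len(entities) // workers)
    chunks = [entities[i:i + size] for i in range(0, len(entities), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # leaving the `with` block joins: all slices must complete
        results = list(pool.map(update_slice, chunks))
    return [e for chunk in results for e in chunk]
```

The hard part the post alludes to isn't this happy path, it's what happens when slices share mutable state; that's where the "not easy at all" comes in.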
 
This PC gravedancing is way, way, way too premature. We still have metal coding, hUMA and GPGPU and Carmack's 2x console performance to take advantage of before we can say that PS4 is GPU limited. It's not Sony's fault that a few lazy devs have run into walls before even trying to take advantage of them.
 
A few Ubisoft games have been terribly CPU-heavy for me with an i5-3570K clocked at 4.5GHz, so I don't doubt they are having issues with the CPU in both those consoles. Other devs seem to be getting by. Sounds like an Ubisoft problem to me.
 
They use mobile CPUs. The only people who didn't realize they had relatively weak CPUs are the hardcore fanboys, and I think that's more a case of not wanting to admit it than not realizing it

That said, it is Ubisoft though
 
Well, CPU vs. GPU bounding is something that depends on what your game actually is.

It's not as much a state of a machine as it is a state of the needs of the program. You're always going to run into CPU, GPU, or memory bounds first, and that will be what your game is bound by, performance-wise.

Now, there are some systems where one of the parts is way out of whack and almost every title is bound by that part, but that doesn't really seem to be the case here given we've seen a lot of GPU bound titles this generation.

It also sort of depends on what kind of game you're making and how it's taxing your CPU. VR games, going forward, will utilize as many cores as possible. That's obviously not a huge concern of the Xbox One, but it's something the Playstation 4 will have to worry about. It's not inconceivable to have your headset polling on one core, your positional tracking tech for your hands on another, binarual audio processing on a third, then all your normal game stuff on the rest.
 
Wow, I knew they were tablet/netbook CPUs, but I didn't realize they were that compromised.

I sure hope more games like Shadow of Mordor aren't crippled due to that though (obviously, they will be).
 
How is the AI in their Assassin's Creed games? I know it was shit in Splinter Cell. I refuse to believe their programmers are good enough to be hitting the hardware's ceiling this early into the generation. And working on multiple consoles guarantees their work gets rushed.
 
Anyone who's seen the benchmarks of AMD's mobile CPUs could've seen this a mile away.

Spoiler: They are horrible.
 
For a game that is attempting to break the record for biggest interactive crowds that actually behave like people, this isn't that surprising.
 
The simple truth is that both consoles are severely constrained by having to hit their price point without selling at a major loss like previous generation consoles did. This is the first generation of "fiscally responsible" consoles, which I think in the long run is a good thing for the industry as next time they can just take the best AMD SOC they can afford now that the integration work has been done.

However if you look at it from a design budget perspective focusing on the GPU was a good move, but it came at a price. Just take a look at how much the PS3 cost to make at release - this is a different world now.

The only thing MS and Sony really disagreed on was how to compensate for the weak CPU: Sony added more GPU compute while MS is trying to leverage their existing cloud computing platform. Time will tell who was right; it's still way too early to call.
 
The CPU on PS4 and Xbox One is a mobile AMD Jaguar chip, designed for low-energy situations. On PS4 it's clocked at 1.6GHz; on Xbox One it's clocked at 1.75GHz via a firmware bump, as they have more thermal capacity (see the large case and big heat sink). The CPUs have 8 cores, although two are reserved by the OS/Hyper-V.

Both consoles are significantly underpowered in CPU terms. Seriously: game journalists have barely covered it (as they don't understand), but if you go out and buy a piss-poor processor for your gaming PC, you have a much better CPU than in the new-gen consoles.

Why is the CPU important? Because you have to have frames ready to render. You can multithread, but every 16ms (for 60fps) each thread has to be complete for an image to be rendered by the GPU. Example: you can't draw NPC locations on screen if the AI threads can't keep up.
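The 16ms point above can be sketched as a simple budget check (subsystem names and times are made up for illustration): if CPU-side work runs in parallel, the frame is gated by the slowest subsystem, and that one is your bottleneck.

```python
def frame_blown(subsystem_ms, budget_ms=16.6):
    """At 60 fps, every parallel subsystem must finish inside ~16.6 ms.
    Returns (limiting subsystem, whether it missed the budget)."""
    worst = max(subsystem_ms, key=subsystem_ms.get)  # slowest gates the frame
    return worst, subsystem_ms[worst] > budget_ms
```

So per the post: if the AI threads take 22ms while rendering takes 9ms, the whole frame slips to 22ms (~45fps) regardless of how fast the GPU side is.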
 
Wow, great find!
lol

We've known that the CPU is the weakest link since the specs were announced.
Obviously it will impact games like AC more because of all the AI routines.
 
How can a strong GPU bottleneck a weak CPU? Isn't it the other way around?
The actual CPU needs of games vary. Running a pathing algorithm for 5000 NPCs is a serious concern for Assassin's Creed. A game like Civilization also has a gigantic CPU workload, or, perhaps the ultimate example, a chess simulation is basically 100% CPU load that can bring supercomputers to their knees.

In Call of Duty on the other hand, you have like eight NPCs and most of them are simple scripts, so when looking at the AI concerns you're never going to put a significant load on the CPU. Similarly, most of the physics (major explosions and other setpieces where lots of independent objects move) are pre-rendered animations, so that's something that doesn't put much load on the CPU either. You can go through each of the CPU based components of the game and find that they're unlikely to be taxing the system very heavily.

As such, the framerate runs into GPU bottlenecks way before CPU bottlenecks, because CoD tries to run a whole bunch of advanced graphical effects that can actually strain the GPU, and tries to hit 60 fps to boot. This causes it to be a GPU bound game instead, even if the GPU is more advanced than the CPU.

This is actually likely to be the more common scenario in games, since not many take ambitious CPU approaches compared to their graphical ambitions.
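A back-of-envelope version of the CoD vs. Assassin's Creed comparison in the post above (the per-NPC cost figure is invented purely for illustration): per-frame AI cost grows linearly with crowd size, so a cost that's invisible at eight NPCs can eat the entire frame budget at five thousand.

```python
def ai_budget_ms(npc_count, cost_us_per_npc):
    """Per-frame AI cost in milliseconds, assuming a fixed (hypothetical)
    cost in microseconds per NPC per frame."""
    return npc_count * cost_us_per_npc / 1000.0
```

With a made-up 10µs of AI per NPC per frame: a CoD-style scene with 8 NPCs costs 0.08ms, a negligible slice of a 16.6ms frame, while an AC-style crowd of 5000 costs 50ms, three whole frames' worth of CPU on its own.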
 
Happens when both systems launch with shitty specs that were beaten before they even launched. Last gen it seemed like the 360/PS3 were ahead of PC for a good while..

While both consoles were so expensive that Microsoft and Sony were forced to delay a new gen for several years.
 
AMD suck at making CPUs so this is to be expected. Such a shame that both companies cheaped out and went for APUs instead of Intel + any GPU. The difference in power requirements is negligible (look at the Haswell line) and they perform so much better than the competition.

Didn't take long for someone to come along spewing hyperbole

Why don't you look up the name of the X64 instruction set?

Hint:
AMD is in the name
 
alexandros said:
So I'd like some thoughts on what happens next.
For at least the last two generations, console games were CPU-limited in the vast majority of cases; you can extrapolate how developers dealt with it from there.
 
AMD suck at making CPUs so this is to be expected. Such a shame that both companies cheaped out and went for APUs instead of Intel + any GPU. The difference in power requirements is negligible (look at the Haswell line) and they perform so much better than the competition.

As an Intel user for the last 2 decades, I can tell you that this is a garbage-tier fanboy reply and you should be ashamed of yourself.

AMD makes fine CPUs. Your problem is that you know little to nothing about how different utilization can bottleneck a CPU or GPU. Huge swarms of independently calculated entities, especially AIs, will tax any system. Look at Planetside 2, a game that can struggle to hit 60 FPS on even monstrous systems due to CPU bottlenecking. It's also a unique development skill, so developers who aren't used to those kinds of optimizations can suck at them.

As others have said, we won't know for sure until the game hits PC. At that point we can tell if:

A) The game is built in such a way that it is hugely taxing on CPUs
B) The game is poorly optimized
C) The AMD CPU in X1 and PS4 has some specific limitation.

Considering I have a dozen Ubisoft games in my Steam library, I would bet anything that B plays a huge part in this.
 
I have to agree. I'm really underwhelmed with all three of the current gen consoles so far.
Yeah, this is a big part of why I haven't even bothered buying one yet. The other reason, obviously, being their lackluster libraries.

Last generation we got $600-800 value hardware with $400-500 price tags. This generation they're selling their machines basically at cost. It's worthless from a price-performance perspective.
 
I don't think anyone that has played a recent Ubisoft PC port is surprised that they are CPU bound.
How can you end up with CPU bound algorithms on consoles? That's beyond me. Ubisoft has probably known what's inside PS4 and XB1 for quite some time now so it's not like console CPU performances are a surprise. Knowing that, how do you develop CPU bound algorithms when both architectures scream GPGPU? Do the guys at Ubisoft really understand their targets? I'm really wondering. Are they recycling last gen algorithms that were designed for powerful CPUs?
 
I would have loved it if they'd done this; I'm just afraid of how much the consoles would cost then.
I'm sure that since Intel would have been the exclusive CPU maker for both consoles, they'd charge less per unit, since they'd produce them in bulk.
 
Didn't take long for someone to come along spewing hyperbole

Why don't you look up the name of the X64 instruction set?

Hint:
AMD is in the name

This kind of comment just comes from people who see some benchmarks and see Intel has more things with bigger bars than AMD. Therefore their CPUs must be bad.
 
I don't believe so. AI-heavy and physics-heavy games can require significant CPU time. It's why MS is pushing the Azure offload: their CPU is a bottleneck, and they want to free the local CPU up from those calculations. I've tried to make posts about this before but I just get shat on.

Woah! How did I miss this info?
 
In what way a bottleneck?
No matter what you put in a console, you will always get CPU-bound games, because devs will always try to push whatever they're given.

That said, of course these CPUs are incredibly weak, but that is nothing new.
 
The CPU is not the reason they are 900P on PS4 if that is what you're hinting at. The guy outright admitted they gimped the PS4 to avoid debates [about the PS4 resolution hitting 30 pages].
 