
[Eurogamer/DF] Orbis Unmasked: what to expect from the next-gen PlayStation.

Durante

Member
So you basically want all the people who made the right decision in purchasing 4 core Sandy Bridges instead of AMD processors to get screwed over by consoles? Again?
A 4 core Sandy Bridge is faster than an 8 core 1.6 GHz Jaguar for pretty much everything.
 

Nachtmaer

Member
Another point is that this isn't a Jaguar with 8 cores, it's two Jaguars with 4 cores each, right?

Supposedly, yes. I do wonder how easily these two groups can communicate with each other and how it will work out with having certain core(s) dedicated to the OS.
 

Proxy

Member
In general, so would I. Personally, I'd be forced to upgrade again and that's bad news for me.

Plus, AMD is just not a robust enough company to compete with Intel. At best, the situation would just force Intel to make 8 core processors the standard. This would shake up the market for the consumer while still keeping Intel on top. AMD is going to diminish.

True. AMD was massively screwed over by Intel a long time ago and is unlikely to ever recover the market share they once had.
 

Bombadil

Banned
A 4 core Sandy Bridge is faster than an 8 core 1.6 GHz Jaguar for pretty much everything.

Maybe, but if developers who make games first for consoles and then for PC follow the same route in the future, games optimized for 8 core CPUs are going to run like shit on 4 core CPUs.
 

ghst

thanks for the laugh
A 4 core Sandy Bridge is faster than an 8 core 1.6 GHz Jaguar for pretty much everything.

oh it's 1.6 ghz? i missed that part. i wonder what we can expect if a dual core bobcat at 1.8 ghz gets a passmark of 843?

i thought one of the big advantages of jaguar (as well as being able to fit more cores on a single chip) was the clockspeed increase?
 

Vol5

Member
Developers had a much larger say in what they wanted in the next consoles. And I imagine a lot of them said that if they had to choose between a familiar homogeneous architecture and an esoteric heterogeneous architecture, they'd always pick the former. It really doesn't matter how much money Sony put into Cell if developers end up relegating it to 2nd-tier status for development.

Perhaps for coding games, but is it too much of a stretch to imagine the Cell running an autonomous OS separate from the GPU+APU?
 
First new console generation since I've really been into games... Exciting! E3 hype is much, much higher than in the last few years, because they have to show it now.
 
if this thing is revealed before durango, i can see them going for a less-power-but-earlier-to-market strategy, i.e. ps1 & ps2.

the last one out the gate is always the beast.
 

Nachtmaer

Member
oh it's 1.6 ghz? i missed that part. i wonder what we can expect if a dual core bobcat at 1.8 ghz gets a passmark of 843?

I bet they would be able to make them run at a slightly higher frequency than that; let's say about 2GHz. Even though Jaguar is designed for low-power machines and consoles have their own TDP restrictions, upping the frequency at least a bit probably won't break the bank on cooling solutions.
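
For rough intuition on why a small clock bump is cheap (this is just the textbook CMOS dynamic-power approximation, nothing Jaguar-specific):

$$P_{dyn} \approx C \cdot V^2 \cdot f$$

So a 1.6GHz to 2.0GHz bump at the same voltage costs roughly 25% more dynamic power in the CPU's slice of the budget; it's only when the higher clock also needs a voltage increase that power climbs much faster, since it scales with $V^2$.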
 

Ryoku

Member
Maybe, but if developers who make games first for consoles and then for PC follow the same route in the future, games optimized for 8 core CPUs are going to run like shit on 4 core CPUs.

Nope. Intel owns the high performance CPU market on PC. AMD's performance-grade 6-core CPU still struggles to hold its own against Intel's last two generations of CPUs.
Even at 8 cores, a Jaguar CPU is still a Jaguar CPU. There has been talk of modifications, however; whether that means more than just the higher core count, we will see.
 

ghst

thanks for the laugh
That's the popular rumour. I hope it's higher.

It would be nice to have a reason to upgrade my CPU after 4 years!

Nah, Jaguar is designed for low power, no high clocks.

"In order to boost clock-speed by 10% compared to today's Bobcat-powered chips, Jaguar micro-architecture features longer pipeline"

http://www.xbitlabs.com/news/cpu/display/20120904201534_AMD_Discloses_Peculiarities_of_Next_Generation_Jaguar_Micro_Architecture.html

has something happened since september?

Maybe, but if developers who make games first for consoles and then for PC follow the same route in the future, games optimized for 8 core CPUs are going to run like shit on 4 core CPUs.

as i explained on the previous page, that would take some masterfully terrible programming given that it would take all 8 cores perfectly optimized at 100% load to even scrape half of what a mid-range i5 can do. i'm convinced there will be a couple of pieces of custom silicon to ease some of the heavy lifting for specific gaming-related tasks, but all told it should be comfortably within an i5's remit.
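
For anyone who wants the rough arithmetic behind that claim, a very crude back-of-envelope using the Passmark figure quoted above, assuming the score scales linearly with core count and clock (it doesn't exactly, and this ignores any IPC gains Jaguar has over Bobcat):

$$843 \times \tfrac{8}{2} \times \tfrac{1.6}{1.8} \approx 3000$$

If that really is about half of what a mid-range i5 manages, the i5 lands somewhere around 6000 on the same scale. Order-of-magnitude illustration only, not a benchmark.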
 
Perhaps for coding games, but is it too much of a stretch to imagine the Cell running an autonomous OS separate from the GPU+APU?

I think that if a Cell chip is there, it will be used to run the PS4 OS and for audio processing and BC, and the SPUs could also be used to assist the main CPU with vector-intensive code. It would be really cool to have a Cell chip inside the PS4, but I don't know if it's feasible in terms of cost, though.
 

Bombadil

Banned
Nope. Intel owns the high performance CPU market on PC. AMD's performance-grade 6-core CPU still struggles to hold its own against Intel's last two generations of CPUs.
Even at 8 cores, a Jaguar CPU is still a Jaguar CPU. There has been talk of modifications, however; whether that means more than just the higher core count, we will see.

I'm not a techie, so I'm hoping I'm wrong. But the way I think about it has to do with how the game's code is split up across the cores. If the code is specifically split across 8 cores, then how easy will it be for the developer to re-assign the work to a smaller number of cores for a PC build? Is it a simple process?
 

Ryoku

Member
I'm not a techie, so I'm hoping I'm wrong. But the way I think about it has to do with how the game's code is split up across the cores. If the code is specifically split across 8 cores, then how easy will it be for the developer to re-assign the work to a smaller number of cores for a PC build? Is it a simple process?

To put things into perspective, Xbox 360 games will run fine on a system with one of Intel's Core 2 Duo CPUs (assuming there's a decent GPU to go along with it). The gap between the rumored CPUs and today's gaming-grade PC CPUs is even bigger. Intel's two-generation-old CPUs still have much higher IPC (instructions per clock) than Jaguar could even dream of. Same goes for AMD's gaming-grade CPUs, but to a lesser degree.

Not to discredit the consoles entirely, though; I'm sure the GPU and other dedicated silicon will handle things that the CPU did in the PS360, freeing up the CPU for other tasks.

The thing people need to realize is that PS360 used a brute force approach with their CPUs. This generation will see more efficient handling of data with "weaker" CPUs and dedicated silicon. This is already apparent with Wii U.
 
I'm not a techie, so I'm hoping I'm wrong. But the way I think about it has to do with how the game's code is split up across the cores. If the code is specifically split across 8 cores, then how easy will it be for the developer to re-assign the work to a smaller number of cores for a PC build? Is it a simple process?

First you have to find a way to split it up across 8 cores, and not everything can be parallelized enough to fully use the potential of more cores. So you might get a certain speedup from more cores, but certainly not 8x. On the other hand, 4 Sandy Bridge cores with higher clock rates and better branch prediction, pipelines, etc. will work better.

a + b + c + d + e + f + g + h = x on an 8 core machine has the following parallel steps:

a+b, c+d, e+f, g+h - so 4 cores are utilized and the other 4 have to wait; in the next step you add up the previous results (ab+cd, ef+gh - so only 2 cores are needed), and so on. The 4-core Sandy Bridge has the advantage because it can do each "+" faster than a Jaguar core. For more complex functions it heavily depends on what you do and how you do it. Try to speed up a sorting algorithm with more cores and you will find out that it is very, very difficult, and you'd rather have 1 fast core instead of 10 slower ones. Multicore systems are an invention because clock frequencies stalled and people need an incentive to buy new PCs. I'm not saying they are useless, but I would be careful what you wish for.
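
A minimal sketch of that split-then-combine idea in C++ (illustrative only, using std::async; not how any console SDK actually exposes threading). The question about retargeting core counts from a few posts up also falls out of it: the number of chunks follows whatever std::thread::hardware_concurrency() reports, so the same code divides the work across 8 slow cores or 4 fast ones without being rewritten.

```cpp
#include <algorithm>
#include <future>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

// Sum a vector by giving each available hardware thread one chunk
// (the "a+b, c+d, ..." step), then combining the partial results.
double parallel_sum(const std::vector<double>& data) {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (data.size() + workers - 1) / workers;

    std::vector<std::future<double>> partials;
    for (unsigned w = 0; w < workers; ++w) {
        const std::size_t begin = w * chunk;
        if (begin >= data.size()) break;
        const std::size_t end = std::min(begin + chunk, data.size());
        // Each chunk is summed in its own task, which the runtime can
        // schedule on a separate core.
        partials.push_back(std::async(std::launch::async, [&data, begin, end] {
            return std::accumulate(data.begin() + begin, data.begin() + end, 0.0);
        }));
    }

    // Combine step: only `workers` partial sums remain, so this part is
    // cheap and done serially here.
    double total = 0.0;
    for (auto& p : partials) total += p.get();
    return total;
}

int main() {
    std::vector<double> v(1000000, 1.0);
    std::cout << parallel_sum(v) << "\n";  // prints 1e+06
}
```

The post's caveat still applies: the combine step halves the usable parallelism each round, and every individual add is still faster on a 3GHz+ Sandy Bridge core than on a 1.6GHz Jaguar core.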
 
Maybe, but if developers who make games first for consoles and then for PC follow the same route in the future, games optimized for 8 core CPUs are going to run like shit on 4 core CPUs.

Not true at all. Sandy Bridge has so much higher IPC that it's not even really worth bringing up.
 

i-Lo

Member
Sounds like Jaguar is almost a POS even before the performance of its customized form in a closed box has been gauged.

To put things into perspective, Xbox 360 games will run fine on a system with one of Intel's Core 2 Duo CPUs (assuming there's a decent GPU to go along with it). The gap between the rumored CPUs and today's gaming-grade PC CPUs is even bigger.

Not to discredit the consoles entirely, though, as I'm sure the GPU and other dedicated silicon will handle things that the CPU did in the PS360, freeing up the CPU for other tasks.

The thing people need to realize is that PS360 used a brute force approach with their CPUs. This generation will see more efficient handling of data with "weaker" CPUs and dedicated silicon. This is already apparent with Wii U.

Weaker, lol. No sir, thank you very much. I shall stick with my PS3 instead. At least that one has perfect Cell.
 

Nachtmaer

Member
Yep, and Haswell will be another 20-25% step up from that, clock for clock, it seems.

I wouldn't get my hopes up if I were you. Sandy Bridge brought about a 15% increase in IPC. Just like Ivy Bridge, Haswell will probably be more focused on GPU improvements. So overall it might be about 20-25% faster including higher frequencies.

Where is that said? I read the DF article as saying the other way around: 8 cores on one processor.

Jaguar's cores are grouped into Compute Units (CUs) of up to four cores. So my guess is that they're using two Jaguar CUs.
[Image: AMD Jaguar compute unit diagram]
 

Ryoku

Member
Sounds like Jaguar is almost a POS even before it has been customized specifically for each console.



Weaker, lol. No sir, thank you very much. I shall stick with my PS3 instead. At least that one has perfect Cell.

You forgot to bold the other parts of the comment, like the mention of custom silicon to free up CPU resources. Besides, the GPU is no slouch, and this generation will showcase GPU-centric development for games, compared to last gen's more CPU-centric approach. Don't let my PC-sided mind put you off PS4's power. It's powerful, but not in the way people had first imagined.
 

Ashes

Banned
Weaker, lol. No sir, thank you very much. I shall stick with my PS3 instead. At least that one has perfect Cell.

A Jaguar APU would dominate the Cell. The rumours are suggesting it's 10x better, and they would be correct [for most things]. Heck, I know I said PlanetSide 2 was CPU intensive, but 9 out of 10 games wouldn't have MMFPS situations. :p

High-end CPUs aren't really the bottleneck for games, in most cases. GPUs are a different matter.
 
Jaguar's cores are grouped into Compute Units (CUs) of up to four cores. So my guess is that they're using two Jaguar CUs.
[Image: AMD Jaguar compute unit diagram]

Well, according to the DF article they're being customized into 8 cores for the consoles. It's still all speculation and rumors at this point, but we're going off the DF article as legit, and all things point to it being pretty credible.

The PC Jaguar products are set to ship later this year in a quad-core configuration - next-gen consoles see the core count double with some customisations added to the overall design.
 

Ashes

Banned
Netbook sales fell off a cliff. PC sales are falling. Intel could have done with 200m over ten years. I mean, they could have sold their previous generation ones.

Shame really. To add insult to injury, they are a step ahead on the fab process too. They really could have given a good leg up.
 

Nachtmaer

Member
Well, according to the DF article they're being customized into 8 cores for the consoles. It's still all speculation and rumors at this point, but we're going off the DF article as legit, and all things point to it being pretty credible.

The PC Jaguar products are set to ship later this year in a quad-core configuration - next-gen consoles see the core count double with some customisations added to the overall design.

Yeah, I guess they'll be adding the logic needed to let those two CUs communicate with each other, or just customize the whole thing as if it were one CU.

How does a quad-core Yorkfield at 3.5 GHz compare to these Jaguars?

I remember AMD saying that Bobcat has about 80% (if this figure is right) of an Athlon II's performance. Since Jaguar is an improved Bobcat, you can somewhat guess where Jaguar ends up.
 

kharma45

Member
I wouldn't get my hopes up if I were you. Sandy Bridge brought about a 15% increase in IPC. Just like Ivy Bridge, Haswell will probably be more focused on GPU improvements. So overall it might be about 20-25% faster including higher frequencies.

Well, if Sandy Bridge to Ivy Bridge was about 10% clock for clock, and Haswell is 10% faster again than Ivy Bridge, it's not an unreasonable assumption to add the two together.

Haswell will have better OCing potential than those two apparently, and it could well be the end of locked and unlocked processors.
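
Minor nitpick on the arithmetic above: the two uplifts compound rather than add, though at these sizes it hardly matters:

$$1.10 \times 1.10 = 1.21$$

i.e. roughly 21% clock for clock over Sandy Bridge before any frequency gains, which is how you end up in that 20-25% range either way.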
 

i-Lo

Member
You forgot to bold the other parts of the comment, like the mention of custom silicon to free up CPU resources. Besides, the GPU is no slouch, and this generation will showcase GPU-centric development for games, compared to last gen's more CPU-centric approach. Don't let my PC-sided mind put you off PS4's power. It's powerful, but not in the way people had first imagined.

Meh, if the PS4 fails to impress (which I hope it won't), then I'll simply get an XB3 for next gen. After all, it's apparently blessed by "secret sauce" aka "wizard's jizz".

It just boggles my mind that they didn't go for a more powerful CPU. This article guesstimates that power consumption for the entire system would be around 150W. I remember people being afraid of Sony and MS crossing their old wattage limits, but this is quite a bit below that. Are they trying to mimic the Wii U in terms of saving on power? Or are they trying to keep the Californian governor happy? Are they trying to keep women/girlfriends happy by being quiet during gameplay (only a handful will get this reference, XD)? And pertaining to the GPU, it looks like they could have increased the core clock to at least 850MHz (if not more) and the entire system's consumption would still be below 180W, let alone the first PS3's 210W at peak. Would reaching 200W have meant a substantial number of YLODs and Red Rings (or their next-gen equivalents)?

Feels like this shrinkage is a slap in the face of the opportunity to increase power without doing anything utterly revolutionary. I am hoping to be proven wrong, but what good is infallible reliability in machines whose specs may be perceived as underwhelming?
 
Meh, if the PS4 fails to impress (which I hope it won't), then I'll simply get an XB3 for next gen. After all, it's apparently blessed by "secret sauce" aka "wizard's jizz".

It just boggles my mind that they didn't go for a more powerful CPU. This article guesstimates that power consumption for the entire system would be around 150W. I remember people being afraid of Sony and MS crossing their old wattage limits, but this is quite a bit below that. Are they trying to mimic the Wii U in terms of saving on power? Or are they trying to keep the Californian governor happy? Are they trying to keep women/girlfriends happy by being quiet during gameplay (only a handful will get this reference, XD)? And pertaining to the GPU, it looks like they could have increased the core clock to at least 850MHz (if not more) and the entire system's consumption would still be below 180W, let alone the first PS3's 210W at peak. Would reaching 200W have meant a substantial number of YLODs and Red Rings (or their next-gen equivalents)?

Feels like this shrinkage is a slap in the face of the opportunity to increase power without doing anything utterly revolutionary. I am hoping to be proven wrong, but what good is infallible reliability in machines whose specs may be perceived as underwhelming?

Unlike with the PS3 and 360, die shrinks in later revisions are not going to be easy, since 22nm and below offer terrible yields, so they have to play it safe early.
 
Meh, if the PS4 fails to impress (which I hope it won't), then I'll simply get an XB3 for next gen. After all, it's apparently blessed by "secret sauce" aka "wizard's jizz".

It just boggles my mind that they didn't go for a more powerful CPU. This article guesstimates that power consumption for the entire system would be around 150W. I remember people being afraid of Sony and MS crossing their old wattage limits, but this is quite a bit below that. Are they trying to mimic the Wii U in terms of saving on power? Or are they trying to keep the Californian governor happy? Are they trying to keep women/girlfriends happy by being quiet during gameplay (only a handful will get this reference, XD)? And pertaining to the GPU, it looks like they could have increased the core clock to at least 850MHz (if not more) and the entire system's consumption would still be below 180W, let alone the first PS3's 210W at peak. Would reaching 200W have meant a substantial number of YLODs and Red Rings (or their next-gen equivalents)?

Feels like this shrinkage is a slap in the face of the opportunity to increase power without doing anything utterly revolutionary. I am hoping to be proven wrong, but what good is infallible reliability in machines whose specs may be perceived as underwhelming?


We can do this man, don't give up on me now.
 

Maximilian E.

AKA MS-Evangelist
One question..

How does/would an 8-core Jaguar (which is supposedly in both the PS4 and Xbox 3) compare to a Phenom II X6?

Some years ago, I speculated over at B3D that the next-gen Xbox would have the power of a Phenom II X6 and a dual GPU from the 5000 series (power equivalent).

Perhaps I was not so far off :)
 
A Jaguar APU would dominate the Cell. The rumours are suggesting it's 10x better, and they would be correct [for most things]. Heck, I know I said PlanetSide 2 was CPU intensive, but 9 out of 10 games wouldn't have MMFPS situations. :p

High-end CPUs aren't really the bottleneck for games, in most cases. GPUs are a different matter.

No it won't; the 10x rumor is talking about the PPU only.
It's such a shame these systems are going to be using Jaguar. I hope the power and space they save goes into the GPU.
 

Teletraan1

Banned
Meh, if the PS4 fails to impress (which I hope it won't), then I'll simply get an XB3 for next gen. After all, it's apparently blessed by "secret sauce" aka "wizard's jizz".

It just boggles my mind that they didn't go for a more powerful CPU. This article guesstimates that power consumption for the entire system would be around 150W. I remember people being afraid of Sony and MS crossing their old wattage limits, but this is quite a bit below that. Are they trying to mimic the Wii U in terms of saving on power? Or are they trying to keep the Californian governor happy? Are they trying to keep women/girlfriends happy by being quiet during gameplay (only a handful will get this reference, XD)? And pertaining to the GPU, it looks like they could have increased the core clock to at least 850MHz (if not more) and the entire system's consumption would still be below 180W, let alone the first PS3's 210W at peak. Would reaching 200W have meant a substantial number of YLODs and Red Rings (or their next-gen equivalents)?

Feels like this shrinkage is a slap in the face of the opportunity to increase power without doing anything utterly revolutionary. I am hoping to be proven wrong, but what good is infallible reliability in machines whose specs may be perceived as underwhelming?

They are building these consoles to be wine cooler friendly. It is the only logical assumption.
 
Hey folks. Just got approved and this is my first post!

I hate to be that "guy" who brings more "insider information" to the table (especially since it's my first post and nobody knows if I'm trustworthy), but I will be for right now. I have a friend who is very, very close with Sony. I won't say what he does or who he is, but I can relay what he has told me.

I haven't kept up on this page completely, so hopefully I'm bringing some new information. I showed him the rumored leaks from yesterday and he said the only thing they really got right was the RAM. According to him (I don't know tech talk very well, so you might understand this better than I do), the APU has 4 Steamroller cores in it, and bandwidth to the GDDR5 is high. Cell runs the show (OS, security, IO, etc.) and also acts as a satellite/accelerator processor for the Steamroller cores. The APU has access to the XDR in a similar way that RSX does. There is an SCC (Super Companion Chip) inside as well.

The RAM is only 3.5GB and is in 3 pools: 2GB GDDR5 for the APU (which is not an A10), 1GB DDR3 for the SCC, and 512MB XDR (for Cell). He said the APU has 256GB/s to the GDDR5, in devkits though. There is no Cell in the devkits yet, maybe in 1st party ones; Cell integration needs final silicon.

It will be backwards compatible with PS3 AND PS2. As far as the new controller rumor goes, he doesn't know much as far as design, but he did say that there is no screen on the controller; however, you can use another device for that. I assume he's hinting at the Vita?

Anyways, that's the info I received. No need to believe it and I'm not saying it is 100% accurate, just going by what I've been told.

Welcome, man.

But this doesn't make sense at all. Integrating the Jaguar cores, Cell, XDR, DDR3, GDDR5? 3 different memory pools? This is absolutely contrary to what Sony is interested in, which is getting back on track with easier dev tools and lower production costs. Your friend is lying. Or Sony has gone completely crazy!
 
Ow, I knew there was a catch somewhere. I know the new post about the PS4 having three types of memory and Cell sounds batshit insane, but that's what was missing really.

I think the memory thing was a placeholder, but the Cell part could be the secret sauce people are talking about.

This is just too Sony. There's got to be a weird choice in it, a WTF moment, otherwise it wouldn't be Sony. PS4 was almost the perfect friendly console everybody hoped for. I still hope that if a Cell or something like it gets in the system, it doesn't harm the system.

The leaked/confirmed stuff was great, we were just waiting on the special sauce to see yet another bump in specs.

I really think they won't drop Cell forever like this... maybe it's not in the PS4, but I think they won't retire it like many think they will.
 

Waaghals

Member
Throwing the Cell into the mix with the AMD processor would be a mess. Those are two different processor architectures.

It would needlessly complicate the design and massively increase the power consumption of the system. I'm not sure the Cell was ever really such a good fit for a gaming console, no matter how many FLOPS it churned out.
 
Ow, I knew there was a catch somewhere. I know the new post about the PS4 having three types of memory and Cell sounds batshit insane, but that's what was missing really.

I think the memory thing was a placeholder, but the Cell part could be the secret sauce people are talking about.

This is just too Sony. There's got to be a weird choice in it, a WTF moment, otherwise it wouldn't be Sony. PS4 was almost the perfect friendly console everybody hoped for. I still hope that if a Cell or something like it gets in the system, it doesn't harm the system.

The leaked/confirmed stuff was great, we were just waiting on the special sauce to see yet another bump in specs.

I really think they won't drop Cell forever like this... maybe it's not in the PS4, but I think they won't retire it like many think they will.

Well, Sony didn't go "WTF" in any aspect with the PSX (it just wasn't a powerful machine, but it was very developer-friendly); the PS2 and PS3 are what made people believe Sony was mad, lol.

And honestly, putting the Cell there just to execute these functions sounds like overkill. And sounds pricey.
 