
IGN rumour: PS4 to have '2 GPUs' - one APU based + one discrete

Yeah, Arthur Gies even said on the most recent Rebel FM that he had heard independently the next Xbox was planned to launch this fall but got pushed to next year. That fits well with both the rumor that MS went back to the drawing board when developers like Epic and Crytek balked at the original specs, and the conflicting reports that seemed to suggest MS was literally building two different machines. It would even support the "insider info" from this thread that AMD is scrambling to shift more engineering resources towards Microsoft's system, and the SemiAccurate rumor that MS ordered a large run of "Oban" chips a few months ago for the next console, in quantities that seemed appropriate for a 2013 system.

We still don't know very much about the relative targets for Sony and MS, though. MS may have raised their sights significantly over their original design, but that doesn't necessarily mean they're overshooting the PS4 meaningfully. And scrambling at the last minute to build a high-end system may not bode well for the elegance and efficiency of the final product. It may also lend credence to the idea that they've abandoned PowerPC, as our "insider" suggested, in favor of a more off-the-shelf AMD APU design, even if it's two of them duct-taped together!

Wow - this is kind of a worry if true. Every time a console manufacturer has made a kneejerk spec revision, the result has caused headaches for developers and hasn't actually worked out too well for the format in question.

Thinking of: Saturn - added a second SH-2 CPU and a second video processor to beef up 3D performance in the face of the PlayStation.

PS3 - dual-Cell setup abandoned in favour of an Nvidia GPU.


I'm sure there are other examples.

Personally though, I would be quite happy if the 360-2 was slightly less powerful than the PS4 if it retained back-compat (yes, I do know the two things aren't 100% connected), as Sony's desire to change architecture leaves me wondering how worthwhile it is to continue to buy digitally from Sony if my entire PSN library could become defunct....
 

StevieP

Banned

They're still taking a hit on the 3DS, but what I meant was that they were only making a few dollars on the Wii at launch. That really puts into perspective the kind of money sink the console market was and is.

For those asking for a ton of power under the hood - just remember that the Xbox 360 was moving a lot of software, but not so much hardware, for the first couple of years. It didn't really go into "beast mode" until Kinect.

So... what's the general consensus about the PS4's power?
Is Sony in beast mode again, or are they taking the modest route this time around?

What was suggested a few pages ago is actually more modest than the PS4's original target specs. Jaguar will have a lot lower per-thread performance than Steamroller (edit: i.e. it's like Bobcat vs Bulldozer)

I'll probably pick up my first Xbox console ever if the PS4 turns out to be significantly weaker. Power alone doesn't win console wars, but I want my console titles to look as purdy as possible, and I've developed a slight bias toward owning the most powerful console in a generation after owning a PS3 (yes, I know most PS3 ports don't look as good because it is harder to develop for).

Perhaps you should save money and buy a PC. It is now, and will always be the "purdiest". Online is also free, and games are generally cheaper. But that's a story for another thread.
 

i-Lo

Member
What was suggested a few pages ago is actually more modest than the PS4's original target specs. Jaguar will have a lot lower per-thread performance than Steamroller (edit: i.e. it's like Bobcat vs Bulldozer)

This, if true, depresses me greatly.

Perhaps you should save money and buy a PC. It is now, and will always be the "purdiest". Online is also free, and games are generally cheaper. But that's a story for another thread.

Except, Sony exclusives that aren't developed by SOE (and even then not always) would not be available on PC.

Gah, looks like Sony, like a certain other company, are going to push their alleged and rumoured weaksauce hardware by leveraging exclusive software.
 

StevieP

Banned
Except, Sony exclusives that aren't developed by SOE (and even then not always) would not be available on PC.

Gah, looks like Sony, like a certain other company, are going to push their alleged and rumoured weaksauce hardware by leveraging exclusive software.

Hardware has never, ever ever sold consoles/handhelds/etc etc. It's always the software.
 

test_account

XP-39C²
Hardware has never, ever ever sold consoles/handhelds/etc etc. It's always the software.
True, but hardware and software often go hand in hand. The hardware gives the option to create software, so by having the "right" hardware, you can create the "right" software. A good recent example is the Wii. It sold a lot because of the motion controls (hardware) in combination with interesting software for it. If Wii Sports had been a Gamecube game with a standard controller, the game would most likely have sold only a small fraction in comparison.
 

StevieP

Banned
True, but hardware and software often go hand in hand. The hardware gives the option to create software, so by having the "right" hardware, you can create the "right" software. A good recent example is the Wii. It sold a lot because of the motion controls (hardware) in combination with interesting software for it. If Wii Sports had been a Gamecube game with a standard controller, the game would most likely have sold only a small fraction in comparison.

When I say hardware I'm referring to the system internals - the guts - the CPU/GPU/etc.
That has not, and likely never will, sell systems. It's always software: providing the market with compelling titles the mass market wants is what drives hardware purchases. That Wiis were selling for the same price as PS3s on eBay a year after launch should drive that point home.

In regards to controller/external hardware you'd better believe Sony's not going to sit back and ship their console with another DualShock and call it a day. The Orbis art diagrams that were leaked and subsequently pulled shortly thereafter are a good indicator of that.
 

test_account

XP-39C²
When I say hardware I'm referring to the system internals - the guts - the CPU/GPU/etc.
That has not, and likely never will, sell systems. It's always software: providing the market with compelling titles the mass market wants is what drives hardware purchases. That Wiis were selling for the same price as PS3s on eBay a year after launch should drive that point home.

In regards to controller/external hardware you'd better believe Sony's not going to sit back and ship their console with another DualShock and call it a day. The Orbis art diagrams that were leaked and subsequently pulled shortly thereafter are a good indicator of that.
I mean that hardware power alone can be a factor too, since it allows for better graphics. Obviously better graphics alone isn't enough to sell a game, but it can be a huge factor that pulls people to the game, and having great graphics can add to the experience/atmosphere in the game. But I understand what you mean, and I agree that software is the most important thing.

What was the leaked Orbis stuff?
 

i-Lo

Member
Hardware has never, ever ever sold consoles/handhelds/etc etc. It's always the software.

As aforementioned, your argument is in line with mine. That's why it'll be sad to see if they "cheap out" a generation later, because yes, I (and I assume many more) want to see a generational leap for dem purdy gfx, better physics (of all kinds) and improved AI. The problem is that blockbuster HD game development is now so tied to the console market that the PC versions also suffer stagnation for it (yes, I know the obvious benefits of uprezzing and better AA and AF). I still remember the hate for Crysis 2 because it came to consoles. I just wonder whether, with the next generation of consoles, the evolution of gfx tech on PC will slow down. Then again, my fears are my own and are not backed by statistical data as of yet.

Given that PlayStation is one of the few sectors of Sony that currently yields a profit, I am curious whether trying to ape their one competitor to gain access to the mass (casual) market will pay off for next gen.

Oh well, I guess like many here, I'm free to buy Xbawkz tres if PuSS Phour fails.
 

THE:MILKMAN

Member
RE: Backward compatibility via a plug-in adaptor, which would need parts of the PS3 (Cell at least) and a die shrink to use less power:

From B3D http://forum.beyond3d.com/showpost.php?p=1639930&postcount=1 which cites this article:

[Image: Cell + RSX combined package]


The Xbox360S was not fanless @ 32nm

The XCGPU of the 360S is 45nm, not 32nm, isn't it?
 

StevieP

Banned
As aforementioned, your argument is in line with mine. That's why it'll be sad to see if they "cheap out" a generation later, because yes, I (and I assume many more) want to see a generational leap for dem purdy gfx, better physics (of all kinds) and improved AI. The problem is that blockbuster HD game development is now so tied to the console market that the PC versions also suffer stagnation for it (yes, I know the obvious benefits of uprezzing and better AA and AF). I still remember the hate for Crysis 2 because it came to consoles. I just wonder whether, with the next generation of consoles, the evolution of gfx tech on PC will slow down. Then again, my fears are my own and are not backed by statistical data as of yet.

Crysis 2 was "hated on" because it was a significantly worse game than the first one, not because it was on consoles. However, Crytek, in targeting the broader market (hence "console gamers"), made a lot of game design decisions to reflect that. It wasn't "CoD-ified" as many hyperbole-ridden posts indicate, but it was definitely "dumbed down" - if there is a way to quantify that better without using words like that. I felt it was a waste of my $50, unlike the first game, and I still haven't bothered to finish it. Crysis 3 is still in the same setting, so I do not have high hopes of purchasing it at full price.

Anyway, Crytek aside - even a more modest leap in hardware doesn't mean you're left with crap, even with diminishing returns in effect. Whether you get a console-equivalent to Pitcairn or Turks or whatever, you still have a lot more to work with than the incredibly outdated RSX. Think about what the more artistically-talented houses were able to achieve with RSX and then imagine what they could do on more modern hardware.

test_account said:
I mean that hardware power alone can be a factor too, since it allows for better graphics. Obviously better graphics alone isn't enough to sell a game, but it can be a huge factor that pulls people to the game, and having great graphics can add to the experience/atmosphere in the game.

But that view is very much our enthusiast myopia. Software sales this generation indicate the complete opposite for the most part. Most of the games that have sold over 20 million do not rely on visuals at all, and that includes Call of Duty - which has looked largely the same (i.e. pretty crap and sub-HD, but at 60fps) this entire generation on consoles.
 

onQ123

Member
I think I'd rather wait until '14 or '15 for a new console.

That's what I've been saying, because I'm really not seeing any time in the next year or two where Uncharted 3 / The Last of Us graphics aren't good enough for the masses.

Wii-U is the only console that needs to come out in the next year or so, and unless the PS4 & Xbox Next are doing something really innovative, I don't think they will get most people to upgrade from the PS3 & Xbox 360 in the next few years. Last time around it was HD graphics & online gaming that pushed the masses to jump in; this time around it's going to be a harder sell unless they have something that really separates the new consoles from the Xbox 360 & PS3. They actually made the PS3 & Xbox 360 too good with the updateable FW, so even 7 years later it's hard to think of new stuff that can't be done on the old consoles.
 
Perhaps you should save money and buy a PC. It is now, and will always be the "purdiest". Online is also free, and games are generally cheaper. But that's a story for another thread.

That is why I specifically stated 'console titles'. I could not care less about PC gaming until there is a box I can easily plug into my HDTV that lets me conveniently enjoy all my PC games from my couch (hello Steambox :p). Still, my favourite genres usually are console-centric (fighters, action/adventures... well, Uncharted, etc.).
 
What was suggested a few pages ago is actually more modest than the PS4's original target specs. Jaguar will have a lot lower per-thread performance than Steamroller (edit: i.e. it's like Bobcat vs Bulldozer)

Do you think it is likely that they actually go with Jaguar? Would this put it roughly equal to Wii U's rumored specs?
 

StevieP

Banned
That is why I specifically stated 'console titles'. I could not care less about PC gaming until there is a box I can easily plug into my HDTV that lets me conveniently enjoy all my PC games from my couch (hello Steambox :p). Still, my favourite genres usually are console-centric (fighters, action/adventures... well, Uncharted, etc.).

Aside from console-exclusives such as Uncharted, most software is multiplatform now (including PC) and plugs into a TV the same way a console does. This would make it the superior platform for those who care about graphics now, and the same would be true after all 3 new consoles release. It's just as easy to plug in all kinds of controllers and a single HDMI cable allows for a similarly plug-and-play experience.

For example, my setup (which has the i5 2500k+crossfire 5850s) sits neatly in my amp rack:


proelite said:
Pretty sure Jaguar cpu will be weaker at a significant amount of stuff compared to Cell.

Sure, but maybe this console will be more focused on GPGPU. Jeff Rigby always posts about the importance of stuff like OpenCL to Sony. Maybe they're serious. lol

Giant Panda said:
Do you think it is likely that they actually go with Jaguar? Would this put it roughly equal to Wii U's rumored specs?

We don't know enough about either console to make that call. The original PS4 target specs mentioned in another thread a few months back by brain_stew would certainly make the PS4 beefier, however Jaguar is a netbook/tablet-oriented CPU (like Bobcat) and I think that would have a lower per-thread performance than whatever Power derivative is in the Wii U. bgassasin noted in another thread, however, they could be more focused on GPGPU to make up for a lower-powered CPU.
 

Ashes

Banned
It does suggest that they are opting for cheaper hardware. Now, if the rumours are true, does this give an indication of the entire console's budget, or are they actually pushing for a greater discrete graphics chip budget?
 

deadlast

Member
It does suggest that they are opting for cheaper hardware. Now, if the rumours are true, does this give an indication of the entire console's budget, or are they actually pushing for a greater discrete graphics chip budget?

Or could this cheaper hardware suggest that they are going to make some type of Cell+RSX BC add-on?

I would pay extra for BC.
 

iceatcs

Junior Member
Nah, everyone now realises that going for exotic or powerful tech will fail or be bad business (late release, losing money, angry devs, and likely losing the war).

Looks like Sony learned the lesson. Now all they need is a perfect SKU library and licence management that makes it easy to develop for.
 
Rumour is that Epic and other major third-party engine devs convinced MS to revise its design to achieve greater power. As a result, it is being said that its launch has been delayed.

Sorry to go back to this but can I see a link for this rumor?
 

i-Lo

Member
Sorry to go back to this but can I see a link for this rumor?

There is no link for this rumour. IIRC it was most likely StevieP who mentioned it as a possibility. Given MS's history of accepting Epic's hardware spec proposals, I don't think it is an impossibility.
 
Sure, but maybe this console will be more focused on GPGPU. Jeff Rigby always posts about the importance of stuff like OpenCL to Sony. Maybe they're serious. lol

We don't know enough about either console to make that call. The original PS4 target specs mentioned in another thread a few months back by brain_stew would certainly make the PS4 beefier, however Jaguar is a netbook/tablet-oriented CPU (like Bobcat) and I think that would have a lower per-thread performance than whatever Power derivative is in the Wii U. bgassasin noted in another thread, however, they could be more focused on GPGPU to make up for a lower-powered CPU.

[Image: Moore's Law vs the power wall]


This just to outline where I'm coming from:

1) Different CPU types are more efficient for different use cases. This is the idea behind heterogeneous computing. Choosing an assortment of CPU types for a game console is different from the choices for a general-purpose PC.

2) The heat/power envelope is the limiting factor in a game console. This makes 1) above very important. The OS or programmer must use the hardware efficiently and must choose the best processor to fit the use.

A) Many CPUs clocked slower are more energy efficient than one clocked faster. To use a many-CPU model efficiently you must change how the OS works and how you write programs.
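The energy argument in A) follows from the usual CMOS dynamic-power relation P ≈ C·V²·f: lowering the clock lets you lower the supply voltage, and voltage enters squared. A toy sketch of that arithmetic (the capacitance and voltage figures are made up for illustration, not real silicon numbers):

```python
# Illustrative sketch of the dynamic-power argument: P ~ C * V^2 * f.
# All numbers below are invented for illustration, not real silicon figures.

def dynamic_power(c_eff, voltage, freq_hz):
    """Classic dynamic-power approximation for CMOS logic."""
    return c_eff * voltage ** 2 * freq_hz

C = 1e-9  # assumed effective switched capacitance (arbitrary units)

# One core at 3.2 GHz needs a higher voltage to sustain that clock...
one_fast = dynamic_power(C, voltage=1.2, freq_hz=3.2e9)

# ...while two cores at 1.6 GHz can each run at a lower voltage,
# delivering the same 3.2e9 cycles/sec in aggregate.
two_slow = 2 * dynamic_power(C, voltage=0.9, freq_hz=1.6e9)

print(one_fast, two_slow)  # 4.608 2.592 - slower-and-wider wins
```

Same nominal throughput at roughly half the power, which is why the many-core route only pays off if the OS and the software can actually keep all the cores busy.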

Rumors have Sony choosing a laptop Jaguar CPU for the PS4. Laptop CPUs should be more energy efficient than desktop CPUs. The Jaguar CPU does not have a dedicated FPU; it is designed that way because the GPGPU is more efficient at that task. Using a laptop Jaguar CPU rather than a desktop Piledriver CPU might make sense: it can always be clocked higher than it would be in a laptop, while a Piledriver CPU, being a less efficient design, is still inefficient at a lower clock speed.

Software run on an AMD HSA design MUST follow the HSA rules. If it does, it will run fast enough and cool enough for a game console; don't follow the rules and it will run hot and slow. Brute force, as in Bulldozer or Piledriver (why do you think AMD engineers are using these names?), creates waste heat. Jaguar, a lean fast cat, is more efficient, using more efficient HSA processes and producing less heat. Do not think of Jaguar like an ARM CPU; think of it as a more energy-efficient x86 CPU designed to fit in a fabric computing model.


Remember AMD is holding off using Jaguar until 2013 with 2013 designs and according to their roadmap the 2013-2014 design requires even more HSA efficiencies and is the start of Fabric computing (multiple CPUs designed to use a part hardware, part software fabric on and in which all CPUs can concurrently operate).

The rumored change in CPU from Piledriver to Jaguar means to me that Sony is having a SOC built to order. It could include an FPGA, as mentioned by the Sony CTO. If so, the FPGA would be orders of magnitude better at speech recognition than a CPU or even Cell SPUs.

OpenCL provides a common language for using different types of processors, as well as being an upper-level language that allows for forward compatibility. Khronos and the W3C are developing multiple open-source standards, and a WebCL version of WebKit is coming where the JavaScript engine can use OpenCL. So yes, I think Sony is serious about OpenCL (see interests at bottom). From the AMD PDF:
MAKE GPUs EASIER TO PROGRAM: PRIMARY PROGRAMMING MODELS
Khronos OpenCL™
- Premier programming environment for heterogeneous computing today
- AMD is a key contributor to OpenCL™ at Khronos
- HSA features and architecture make OpenCL™ more efficient
- May initially enable some HSA features with extensions
Microsoft® C++ AMP
- Integrated in Visual Studio and Windows® 8 Metro
- Addresses the huge population of Visual Studio developers
- Elegant extension of C++ through two new keywords: “restrict”, “array_view”
- HSA provides a natural roadmap for relaxing “restrict” and using all of C++

OPENCL™ – WHAT IS IT?
- OpenCL™ – Open Compute Language
- Industry standard programming language for parallel computing
- Specification by Khronos (open, royalty-free like OpenGL™)
- Provides a unified programming model for CPUs, GPUs, smart phones, tablets, servers (cloud)…
- Allows software devs to write software once, run it cross-platform
- Supported by all major hardware & software vendors
  - AMD, Intel, Nvidia, Apple, ARM, Imagination Technologies, etc.

OPENCL™ AND HSA
- HSA is an optimized platform architecture for OpenCL™
- Not an alternative to OpenCL™
- OpenCL™ on HSA will benefit from
  - Avoidance of wasteful copies
  - Low latency dispatch
  - Improved memory model
  - Pointers shared between CPU and GPU
- HSA also exposes a lower level programming interface, for those that want the ultimate in control and performance
  - Optimized libraries may choose the lower level interface
And the above applies to an FPGA or any other CPU added to the SOC.
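For anyone unfamiliar with the OpenCL model the PDF is describing: you write a kernel that computes one output element, and the runtime launches one "work item" per element on whatever device (CPU, GPU, or other accelerator) is available. A plain-Python sketch that merely emulates that work-item model (no OpenCL runtime involved; `enqueue` here is a made-up stand-in for the real dispatch API):

```python
# Emulates OpenCL's data-parallel kernel model in plain Python.
# In real OpenCL the kernel would be written in OpenCL C and dispatched
# to a device by the runtime; this is only an illustration of the model.

def vec_add_kernel(gid, a, b, out):
    """One 'work item': computes a single output element, indexed by its
    global id - the analogue of get_global_id(0) in an OpenCL kernel."""
    out[gid] = a[gid] + b[gid]

def enqueue(kernel, global_size, *args):
    """Hypothetical stand-in for kernel dispatch: launch one work item
    per index. A real runtime would run these in parallel on the device."""
    for gid in range(global_size):
        kernel(gid, *args)

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * 4
enqueue(vec_add_kernel, len(out), a, b, out)
print(out)  # [11.0, 22.0, 33.0, 44.0]
```

The point of the model is that the same kernel source can be dispatched to a CPU, a GPU, or, as speculated above, other accelerators on the SOC, which is what makes it attractive for a heterogeneous console design.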
 

Ashes

Banned
Thanks for correcting me. Any issue with the rest of the post?

If it is a jaguar cpu based console, might this not be leading to something obvious being left on the table... Somebody has to say it, so I might as well. :p

What if it isn't a traditional console? :p

Nah. That doesn't make sense. It has to be. Forget I said anything.
 
If it is a jaguar cpu based console, might this not be leading to something obvious being left on the table... Somebody has to say it, so I might as well. :p

What if it isn't a traditional console? :p

Nah. That doesn't make sense. It has to be. Forget I said anything.
What's required is more information on what features a Jaguar CPU has so we can understand why, if the rumor is correct, Sony chose it for the PS4. It may not have been chosen because it's cheaper; for sure it is a more energy-efficient and LATER design, which might fit with more HSA efficiencies and the "fabric" model.

Does anyone have information on Jaguar? All I have found is no FPU and:
“Jaguar” is the evolution of AMD’s “Bobcat” core architecture for low-power APUs.

The only thing known: all the APUs with "Jaguar" cores seem to use 2013 HSA.
The Jaguar CPU does not have a dedicated FPU; it is designed that way because the GPGPU is more efficient at that task. I.e., the Jaguar CPU is designed to be paired with a GPGPU, whereas Bobcat and Piledriver can operate standalone without one.

The PS4 is not going to be designed like an older PC or Xbox360. I think that's what you are saying?
 

Ashes

Banned
Right. One correction to what I thought earlier: as far as I can see from the AMD business briefing, Jaguar was definitely in the low-power segment; however, it could still be a desktop part as opposed to merely a mobile part.

Source:

http://www.extremetech.com/computin...-confirms-apu-cancellations-trims-core-counts


----


All of those APUs ^^^ feature Graphics Core Next, whilst Kaveri features some HSA application support and, in AMD's own words, is their first teraflop APU.

Source:
http://blogs.amd.com/work/2012/02/02/your-new-amd-decoder-key/

The “Steamroller” CPU core will be in our third-generation APU codenamed “Kaveri.” “Kaveri” will also be our first Teraflop-class APU
The “Jaguar” CPU core will be in our low-power and ultra-low power APUs codenamed, “Kabini” and “Temash.” “Kabini” and “Temash” are AMD’s first SOC processors.

And to turn it back to favouring Jaguar, Let's recall this Sony guy:

http://mandetech.com/2012/01/10/sony-masaaki-tsuruta-interview/

Who talks about SoC design.
 
Right. One correction to what I thought earlier: as far as I can see from the AMD business briefing, Jaguar was definitely in the low-power segment; however, it could still be a desktop part as opposed to merely a mobile part.

Source:

http://www.extremetech.com/computin...-confirms-apu-cancellations-trims-core-counts


----


All of those APUs ^^^ feature Graphics Core Next, whilst Kaveri features some HSA application support and, in AMD's own words, is their first teraflop APU.

Source:
http://blogs.amd.com/work/2012/02/02/your-new-amd-decoder-key/

The “Jaguar” CPU core will be in our low-power and ultra-low power APUs codenamed, “Kabini” and “Temash.” “Kabini” and “Temash” are AMD’s first SOC processors.

And to turn it back to favouring Jaguar, Let's recall this Sony guy:

http://mandetech.com/2012/01/10/sony-masaaki-tsuruta-interview/

Who talks about SoC design.
So: a SOC (including IO and support chips) with 2 Jaguar cores + a GCN GPGPU with the latest HSA efficiencies + memory + the fabric computing model, AND other processors, possibly an FPGA, all mounted on an interposer. A second GPU in a separate package, plus ?more memory? outside the SOC. This fits planned uses and is a best guess. An FPGA gives Sony the advantage it needs to support medical imaging (FPGAs can be used to process imaging arrays, database searches, protein sequencing, voice recognition).

FPGAs were extremely expensive, hard to program, and run at only about 600MHz, but in some use cases they can be 50 times faster than a CPU at 3GHz. OpenCL makes an FPGA easier to program, and 3D wafer stacking is going to make them less expensive.
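How a 600MHz part can beat a 3GHz one: an FPGA wins through spatial parallelism and pipelining, not clock speed, since the whole computation is laid out in hardware and each replicated pipeline can retire one result per clock. Back-of-envelope arithmetic with assumed numbers (the lane count and per-item cycle cost below are illustrative, not from any datasheet):

```python
# Illustrative throughput comparison: serial CPU vs pipelined FPGA fabric.
# All figures are assumptions for the sake of the arithmetic.

cpu_hz = 3.0e9
cpu_cycles_per_result = 12       # assumed serial cost per item on the CPU

fpga_hz = 600e6
fpga_lanes = 32                  # assumed number of replicated pipelines

cpu_throughput = cpu_hz / cpu_cycles_per_result   # results per second
fpga_throughput = fpga_hz * fpga_lanes            # one result/clock/lane

speedup = fpga_throughput / cpu_throughput
print(speedup)  # 76.8
```

With those assumptions the slower-clocked fabric comes out ~77x ahead, which is the same shape of argument behind the 50x (and 1000x) figures quoted from the white papers: it all depends on how much of the workload can be unrolled into parallel pipelines.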

The latest PDF includes a roadmap for gaming 2.5D & 3D SOCs in 2013-2014, with memory in the SOC package! First come 3D-stacked FPGAs and DDR memory, then game console SOCs with 2.5D and wide IO + logic with TSVs (which must mean 3D wafer stacking). Both 2.5D and 3D are mentioned for game consoles in 2013-2014.

(Same AMD PDF excerpt as quoted above.)

And the above applies to an FPGA or any other CPU added to the SOC, but only with the AMD 2014 design SOC and fabric computing.
 

deadlast

Member
I don't think we will see any PS4 information this year. I believe we will see price drops and a hardware redesign at E3.

An FPGA gives Sony the advantage it needs to support medical imaging (FPGAs can be used to process imaging arrays, database searches, protein sequencing, voice recognition).

So the FPGA is the death knell for Cell? From a Sony perspective, that is.
 
I don't think we will see any PS4 information this year. I believe we will see price drops and a hardware redesign at E3.



So the FPGA is the death knell for Cell? From a Sony perspective, that is.
Might be, but I think GPGPU would have had the same impact.

FPGAs were extremely expensive, hard to program, and run at only about 600MHz, but in some use cases they can be 50 times faster than a CPU at 3GHz. OpenCL makes an FPGA easier to program, and 3D wafer stacking is going to make them less expensive.

The latest PDF includes a roadmap for gaming 2.5D & 3D SOCs in 2013-2014, with memory in the SOC package! First come 3D-stacked FPGAs and DDR memory, then game console SOCs with 2.5D and wide IO + logic with TSVs (which must mean 3D wafer stacking). Both 2.5D and 3D are mentioned for game consoles in 2013-2014.
 

onQ123

Member
Might be, but I think GPGPU would have had the same impact.

FPGAs were extremely expensive, hard to program, and run at only about 600MHz, but in some use cases they can be 50 times faster than a CPU at 3GHz. OpenCL makes an FPGA easier to program, and 3D wafer stacking is going to make them less expensive.

The latest PDF includes a roadmap for gaming 2.5D & 3D SOCs in 2013-2014, with memory in the SOC package!



I haven't read too deeply into FPGAs, but I would like to know what types of tasks they're 50X faster than CPUs at, because I'm kinda getting excited at the thought of what one could do with video processing for the next PlayStation Eye. If it's 50x faster at video processing and things like that, it would leave the GPGPU and CPU free while it handles the motion tracking and other tasks like voice control. That's something I also read about being part of the PS4 hardware: on-board voice control.
 
I haven't read too deeply into FPGAs, but I would like to know what types of tasks they're 50X faster than CPUs at, because I'm kinda getting excited at the thought of what one could do with video processing for the next PlayStation Eye. If it's 50x faster at video processing and things like that, it would leave the GPGPU and CPU free while it handles the motion tracking and other tasks like voice control. That's something I also read about being part of the PS4 hardware: on-board voice control.
At 600MHz, and given the length of time it takes to program an FPGA, there are some limitations on their use. 50X was from one white paper; 1000X was mentioned in another. I'm just getting up to speed on this too.

Look at the PDF in the previous message: 3D-stacked memory and FPGAs are mentioned as the first users of 3D stacking, ready for mass production this year. Game console chip production using 2.5D and 3D is scheduled for 2013-2014, with high-end servers in 2015. It's what I was saying (guessing at), though contrary to one article I read.

An FPGA is not a lock, as the Sony CTO's interview was about future tech, and some of what he was talking about might be in a PS5. FPGAs look like they're going to be affordable and fit a use case for Sony, so a good maybe.
 
Might be, but I think GPGPU would have had the same impact.

FPGAs were extremely expensive, hard to program, and run at only about 600MHz, but in some use cases they can be 50 times faster than a CPU at 3GHz. OpenCL makes an FPGA easier to program, and 3D wafer stacking is going to make them less expensive.

The latest PDF includes a roadmap for gaming 2.5D & 3D SOCs in 2013-2014, with memory in the SOC package! First come 3D-stacked FPGAs and DDR memory, then game console SOCs with 2.5D.
That's a hell of a find, rigby. Thanks.
 
Does anyone have information on Jaguar? All I have found is no FPU and: The Jaguar CPU does not have a dedicated FPU, it is designed that way because the GPGPU is more efficient at this task. I.E. The Jaguar CPU is designed to be paired with a GPGPU, Bobcat and Piledriver can operate stand alone without a GPGPU.

If the change is true, then not only were they probably looking at more reliance on a GPGPU, but also the DSPs mentioned by the CTO. They may have felt they could reduce some costs by "cutting back" on the CPU and shifting more workload onto the DSPs and GPGPU.
 

RoboPlato

I'd be in the dick
Does that mean production could start in 2013-2014? Or that hardware developers could begin designing future systems come 2013-2014?

It means that it can start being manufactured soon for late 2013/early 2014 release.

Also, can someone explain to me what a GPGPU is exactly? Every time I look it up I can only find the features of it and not an explanation of what it actually is.
 
Does that mean that production could start in 2013-2014? Or that hardware developers could begin designing future systems come 2013-2014?
According to Global Foundries and AMD, production testing begins soon; by October (after E3) they will know if they can launch in late 2013 (this is the reason there shouldn't be an announcement at E3 this year). This is speculation putting together multiple rumors and the PDF I just posted, which is the first confirmation of game console chips being manufactured in 2013-2014. With both 2.5D and wide IO + logic using TSVs (I've heard 512 bits wide, but the widest GPU bus in 2012 is 256), which should be a 3D wafer stacked process, it confirms either WiiU + PS4 or WiiU + Durango, or all three.
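To put the bus-width talk in context, peak memory bandwidth is just width times transfer rate. A hedged back-of-the-envelope comparison of a wide, slow TSV-stacked interface against a narrow, fast conventional one (the clock figures below are illustrative assumptions, not leaked specs):

```python
# Peak bandwidth = bus width in bytes * effective transfer rate.
# All rates here are made-up round numbers for illustration only.

def bandwidth_gb_s(bus_bits, effective_mts):
    """Peak bandwidth in GB/s for a bus of bus_bits width running at
    effective_mts mega-transfers per second."""
    return (bus_bits / 8) * effective_mts * 1e6 / 1e9

wide_io = bandwidth_gb_s(512, 800)    # assumed 512-bit stacked bus at 800 MT/s
gddr5 = bandwidth_gb_s(256, 4000)     # assumed 256-bit GDDR5 at 4000 MT/s

print(wide_io, gddr5)  # 51.2 GB/s vs 128.0 GB/s with these assumptions
```

So a 512-bit wide IO stack doesn't automatically beat a 256-bit GDDR5 bus; it depends entirely on the clocks, which is why the rumored widths alone don't settle anything.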

bgassassin said:
If the change is true, then not only were they probably looking at more reliance on a GPGPU, but also the DSPs mentioned by the CTO. They may have felt they could reduce some costs by "cutting back" on the CPU and shifting more workload onto the DSPs and GPGPU.
Well, that's also true, but I tend to think of Jaguar as the first CPU designed with a different memory model and a 2 MB cache for fabric computing. That's unconfirmed at this time, and I can't find any information to determine what the changes are going from Bobcat to Jaguar.
 

Clear

CliffyB's Cock Holster
Fascinating as the tech is, I'm mainly interested in how MS and Sony are going to deal with software continuity, i.e. backwards compatibility, especially in relation to online store features and products.

There needs to be a compelling reason to burn their bridges, as rebuilding what they currently offer represents a really significant potential loss of future earnings and, worse, basically gives away hard-won market advantages in the console gaming space over established competitors like Nintendo and any future opposition (Apple) joining the fray.

Anyone have a ballpark for Cell/RSX manufacturing costs these days? I can't imagine they are too expensive given they are 45nm parts with good yields at this point. I believe a 22nm version was projected as the next (and final) fab shrink (a PS3 SoC?), the existence of which naturally must be predicated on cost reduction.
 

onQ123

Member
It means that it can start being manufactured soon for late 2013/early 2014 release.

Also, can someone explain to me what a GPGPU is exactly? Every time I look it up I can only find the features of it and not an explanation of what it actually is.

It's a GPU that can handle general-purpose processing that's usually done by the CPU.


Just think of it like the Cell in reverse: the Cell was a CPU that was able to handle GPU tasks because it was fast/strong enough.


Now look at a GPGPU as a GPU that's smart enough to be a CPU.
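To make that concrete, here's a toy sketch of the GPU execution model (plain Python with made-up names, not real console code): the same tiny "kernel" runs once per data element, and because every invocation is independent, a GPGPU can run them all in parallel across its shader cores.

```python
# Hypothetical illustration of the GPGPU programming model.
# saxpy_kernel plays the role of one GPU "thread": it computes a
# single output element and touches no shared state.

def saxpy_kernel(i, a, x, y):
    """One kernel invocation: out[i] = a * x[i] + y[i]."""
    return a * x[i] + y[i]

def gpgpu_dispatch(kernel, n, *args):
    """Stand-in for a GPU launch. Here we loop sequentially, but on
    real hardware all n invocations could execute concurrently
    because none of them depends on another."""
    return [kernel(i, *args) for i in range(n)]

x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 20.0, 30.0, 40.0]
result = gpgpu_dispatch(saxpy_kernel, len(x), 2.0, x, y)
print(result)  # [12.0, 24.0, 36.0, 48.0]
```

That independence is the whole trick: work that fits this per-element shape flies on a GPGPU, while branchy, serial work stays better suited to the CPU.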


I guess that makes sense.
 

deadlast

Member
Fascinating as the tech is, I'm mainly interested in how MS and Sony are going to deal with software continuity, i.e. backwards compatibility, especially in relation to online store features and products.

There needs to be a compelling reason to burn their bridges, as rebuilding what they currently offer represents a really significant potential loss of future earnings and, worse, basically gives away hard-won market advantages in the console gaming space over established competitors like Nintendo and any future opposition (Apple) joining the fray.

This is one of the issues with having games digitally distributed: if you have no BC, you have to convince people that what they bought is worthless and they should move on. With retail games, users can trade in games and recoup some of their losses. But a lack of BC for digital games could either increase the time for new product adoption or stop people from adopting the new technology altogether.
 
Fascinating as the tech is, I'm mainly interested in how MS and Sony are going to deal with software continuity, i.e. backwards compatibility, especially in relation to online store features and products.

There needs to be a compelling reason to burn their bridges, as rebuilding what they currently offer represents a really significant potential loss of future earnings and, worse, basically gives away hard-won market advantages in the console gaming space over established competitors like Nintendo and any future opposition (Apple) joining the fray.

Anyone have a ballpark for Cell/RSX manufacturing costs these days? I can't imagine they are too expensive given they are 45nm parts with good yields at this point. I believe a 22nm version was projected as the next (and final) fab shrink (a PS3 SoC?), the existence of which naturally must be predicated on cost reduction.
Where did you hear that? If the next shrink is at 32nm, it can be done this year and announced at E3 for sale this season; if at 28nm, it's less likely this year, though it could be announced at E3 while missing the 2012 season. If at 22nm, it would come after the launch of the PS4 or Durango, which would be unacceptable. See the problem?

A PS3 Cell/RSX would need some airflow cooling if used in an adaptor for BC on a PS4, but it could be fanless with a large enough heat sink and a case redesign in a PS3.

Tons of accessories are coming later this year.
 

Clear

CliffyB's Cock Holster
jeff_rigby said:
Where did you hear that? If the next shrink is at 32nm it can be done this year and announced at E3 for sale this season, if at 28nm it's less likely this year but announced at E3 but missing the 2012 season. If at 22nm then after the launch of the PS4 or Durango which would be unacceptable. See the problem.

I can't remember where I read it, but it came out of a report that a 32nm re-factor was not even tabled due to issues related to board interconnects (an XDR issue?), and that they anticipated waiting for a smaller fab because of this.

It was fairly widely reported around the time of the last PS3 revision, when expectations were that Sony might be moving down to 32nm.

EDIT:

LOL it looks like I might have gotten it from one of your posts on B3D Jeff!
 