
IGN rumour: PS4 to have '2 GPUs' - one APU based + one discrete

Donnie

Member
Is SPU the same as ALU? Not that I know what the latter is...

ALU stands for Arithmetic Logic Unit. At its most basic, it's a circuit that does lots of adds, multiplies, etc. A Stream Processing Unit is an ALU, but it's just one implementation of an ALU, a bit like how the i5 is one implementation of a CPU.
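To picture the relationship, here's a toy sketch (an illustration only, nothing like real hardware): a scalar ALU does one multiply-add at a time, while a stream processor is in effect many such ALUs applying the same operation across a whole stream of data.

```python
# Toy illustration: a scalar ALU does one multiply-add at a time; a stream
# processor is effectively many such ALUs running the same operation in parallel.

def alu_madd(a, b, c):
    """One ALU operation: a fused multiply-add, a * b + c."""
    return a * b + c

def stream_madd(xs, ys, zs):
    """Many 'stream processors' applying the same madd across whole streams."""
    return [alu_madd(x, y, z) for x, y, z in zip(xs, ys, zs)]

print(alu_madd(2, 3, 4))                    # 10
print(stream_madd([1, 2], [3, 4], [5, 6]))  # [8, 14]
```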
 

slider

Member
ALU stands for Arithmetic Logic Unit. At its most basic, it's a circuit that does lots of adds, multiplies, etc. A Stream Processing Unit is an ALU, but it's just one implementation of an ALU, a bit like how the i5 is one implementation of a CPU.

Thanks Donnie, that makes it much clearer.
 

Donnie

Member
...I... wow... I didn't even know there was a card named 7790! Is this OEM?

I don't think it's even out yet. But I remember it being announced for release sometime in Q2 2012; I seem to remember the GPU codename being Pitcairn LE and it having 18 compute units.
 

CLEEK

Member
Gemüsepizza;37645130 said:
If they go for split RAM with DDR3, the number of chips used for DDR3 is IMO not really important - they could use slots like the ones you find on a normal PC mainboard; that way they can put any DDR3 module of the right size in there. This would a) save some space, b) give them more flexibility and c) they can get some nice discount from Newegg. And yes, it would be cheaper: 2GB GDDR5 RAM ~ $26-31, 4GB DDR3 RAM ~ $15-20 (consumer prices!).

"Hello, welcome to Newegg phone sales. How may I help you?"
"Hey Newegg, Sony here."
"Hello Sony, what can I do for you today?"
"Hmm, I just wondered if you have any DDR3 in stock?"
"One moment please... Yes, we do have this item in stock".
"Oh cool. I'd like to buy two hundred million GB please!"
 

Donnie

Member
No, I'm assuming it was a typo...

Having 1000+ SPUs? Jesus Christ...

1.1 million teraflops...

Or something.

What do you mean? Plenty of AMD GPUs have over 1000 SPs now, with the top one topping out at 2048 stream processors and 3.8 teraflops. No idea where you're getting 1.1 million teraflops from?!
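That 3.8-teraflop figure checks out from the SP count: peak FLOPS is stream processors times two operations per clock (a fused multiply-add) times clock speed. A quick sketch, assuming the unnamed top card is the Radeon HD 7970 (2048 SPs at 925 MHz):

```python
# Peak shader throughput = stream processors x 2 FLOPs per clock (one fused
# multiply-add) x clock speed. Figures assume the HD 7970: 2048 SPs at 925 MHz.
shaders = 2048
flops_per_clock = 2
clock_ghz = 0.925

teraflops = shaders * flops_per_clock * clock_ghz / 1000
print(round(teraflops, 2))  # 3.79, i.e. the ~3.8 teraflops quoted above
```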
 
What do you mean? Plenty of AMD GPUs have over 1000 SPs now, with the top one topping out at 2048 stream processors and 3.8 teraflops. No idea where you're getting 1.1 million teraflops from?!

I thought you were talking about those "Synergistic Processing Units" that were in the cell.

Didn't faze me that you could have possibly been talking about "stream processing units."

Derrrrrp on me.
 

Chittagong

Gold Member
4-core AMD CPU (originally Steamroller, though the other poster is saying that changed to Jaguar)
2GB GDDR5
18CU GPU (1152 ALUs)

Hmm... My pimped-up Alienware X51 could be beefier than this (efficiencies of a closed box, yes, so not apples to apples).

Not sure if that makes me happy or sad. Happy for my X51 purchase, but sad because the games industry won't be at the bleeding edge anymore. From the sound of it, Sony is, in fact, building their own X51.
 
Hmm... My pimped-up Alienware X51 could be beefier than this (efficiencies of a closed box, yes, so not apples to apples).

Not sure if that makes me happy or sad. Happy for my X51 purchase, but sad because the games industry won't be at the bleeding edge anymore. From the sound of it, Sony is, in fact, building their own X51.

I am happy, as the cost risk for developers becomes limited; the increase in costs will not be as big compared to last gen. We might also get more variety for middle-budget titles.
 
Hmm... My pimped-up Alienware X51 could be beefier than this (efficiencies of a closed box, yes, so not apples to apples).

Not sure if that makes me happy or sad. Happy for my X51 purchase, but sad because the games industry won't be at the bleeding edge anymore. From the sound of it, Sony is, in fact, building their own X51.

Well, to be fair, consoles haven't been at the bleeding edge for graphical tech for decades, if ever.

Consoles will always have compromises due to mainstream pricing levels.

But, the benefit to all of this is that developers will now transition towards next-gen development. PC games, today, are built with consoles in mind and are just ported over with better framerates and settings.

Will be amazing to see what games look like built ground up for these hardware specs.
 

i-Lo

Member
I am happy, as the cost risk for developers becomes limited; the increase in costs will not be as big compared to last gen. We might also get more variety for middle-budget titles.

Technically, we will never see another rise in dev cost between generations as substantial as this one's, which came from the jump not only from SD to HD but also to the various real-time effects that are now mainstream. Of course, in time, if the paradigm of game design shifts to adapt to completely new concepts, then it'll be a different story.
 

StevieP

Banned
XDR2 was introduced seven years ago in 2005. I think it is safe to assume Rambus has an improved memory design along with other things; just look at the Terabyte Initiative with improved signaling. Everything continues to evolve.

I'm a bit skeptical that stacked memory will be "omg awesome" compared to what Rambus may have up its sleeve (XDR3 or whatever).

But then again maybe Rambus is concentrating its efforts on signaling for 3d stacked chips.

I wouldn't blink an eye in Rambus' general direction.

Technically, we will never see another rise in dev cost between generations as substantial as this one's, which came from the jump not only from SD to HD but also to the various real-time effects that are now mainstream. Of course, in time, if the paradigm of game design shifts to adapt to completely new concepts, then it'll be a different story.

The costs will still rise, however. Everyone loves to use Samaritan as an example, so if you wanted games to look like that through-and-through they would rise quite a bit and the (financial and creative) stress on the industry will heighten.
 

Ashes

Banned
I wouldn't blink an eye in Rambus' general direction.



The costs will still rise, however. Everyone loves to use Samaritan as an example, so if you wanted games to look like that through-and-through they would rise quite a bit and the (financial and creative) stress on the industry will heighten.

How would that affect the development pipeline? And budgetary expenses?
 
So it seems, going by the last few posts, Sony is doing the opposite of the PS3.
With the PS3 they had a monster CPU and a weak GPU; for the PS4 they're going to have a good GPU and a weak CPU.
Well, I guess they think it's better to spend their thermal budget on GPGPU, which can help out the CPU with certain tasks. Looking forward to seeing how this works out.
You've got some of the idea. StevieP is overly concerned about the move to a more efficient, lower-clocked Jaguar X86 CPU.

In an AMD Fusion APU, both the X86 CPU (in our rumored PS4, 2 CPU cores and 400 GPU cores) and the GPGPU can be used as a CPU. Combined they exceed the performance of 24-30 Cell SPUs, and with other efficiencies discussed may be 113% more efficient at some tasks... that's equal to 60 SPUs, or at some tasks 1200 times faster than an X86 processor alone. Far from weak, this supports another model (CPU-bound UE4) for games. (Example: bundled ray tracing for lighting.)

This is why I think rumors of Xbox 720 having 2 GPUs suggest Microsoft is also going to have a Fusion CPU-GPU plus second GPU. 4 X86 or 16 PPC or even 24 SPUs pale in comparison to a Fusion CPU-GPU.

Early leaks only mentioned that one or both might go with AMD X86 processors and AMD GPUs. Without the Fusion of the two it does not make sense; PPUs + GPGPU or Cell + GPGPU would make more sense. My bad for not being up on AMD Fusion (HSA) & fabric computing and what it brings to the table. Nvidia is combining an ARM CPU core with a GPGPU to offer the same Fusion and resulting efficiencies.

If Microsoft is using an AMD Fusion + second GPU similar to Sony, and AMD supports Windows 8 with rumors the next Xbox might use Windows 8, what will Sony use as an OS, considering Microsoft registered the domain names Microsoft Sony.com and Sony Microsoft.com? Consider also that the HSA Fusion & fabric computing model scales from handheld (Sony just bought out Sony Ericsson's phone business) to supercomputer, with Windows 8 working on phones, tablets, laptops, desktops and game consoles. Edit: It's possible that IBM is providing a PPU + AMD GPU fusion chip; AMD has said this is possible for an ARM + AMD fusion or any CPU.


[Image: xbox8.jpg]


Mark Rein of Epic Games talked up the Unreal Engine as he announced at the DICE Summit that the soon-to-be-seen Unreal Engine 4 is “Already Running” on products he claimed could not be named at this time. Heavy speculation is pointing to the Next Generation Xbox, called the Xbox 720 by many as the name formed out of necessity and fan-base creativity. New updates on longstanding rumors point out a new potential naming scheme, Xbox 8.

This new development in combination with the notion that Windows 8 will allow gamers to play Xbox 360 (and assumedly ‘Xbox 8′) games on custom-built gaming PC’s paints a very interesting potential for Microsoft’s next-gen. Many critics of their decisions this generation have often called out for Microsoft to become a software entity in the gaming world and license out Xbox Live as a service rather than sell consoles.
Best guess, the PC must use a Fusion CPU-GPU and HSA which is open source. But how does it support Xbox 360 games? 3 PPUs, software emulation or is this stretching speculation to the breaking point?

Rumor: Xbox 8 Has Unreal Engine 4 and Windows 8 in its Arsenal

“Recently we reported that alpha forms of the next-generation Xbox console which could be named “XBOX 8″ – not 8 because it’s the eighth Xbox, but because it’s the eighth generation home console and heavily shares the same software as upcoming Windows 8 and Windows Phone 8.

It was reported to us that the next-gen console was on display by invite only during CES 2012, running a version of Battlefield 3 in next-gen visuals – most likely to be up on par with current high-spec gaming PC’s.

Microsoft previously reported that there will be no next-gen Xbox announcement at E3 2012, but then again we also remember the amount of times they denied an Xbox 360 Slim console before that was the surprise of E3 2010!” [ThisIsXbox]

The hype for a next-generation Xbox console is well and truly hotting up; more-so since the news hit that Microsoft are rolling out production of developer units that feature an Oban chipset manufactured by IBM.

Whilst the next-gen Xbox is commonly known and referred to as an Xbox 720 – it was previously rumoured due to an MS Nerd source as codename ‘Loop’, whilst other sources point to the direction of ‘Xbox Infinity’. Would it shock you so much that Microsoft are pondering over naming the third generation Xbox console, which just so happens to be an eighth generation home console, “XBOX 8″ in line with Windows 8, Windows Phone 8 that will fluently all interconnect with each other.

So, how did we come by this news – well, an employee for a well known combined Publisher and Developer known to thisisxbox.com (who left his employment just a mere few days ago) stated that despite a current mass run of developer units ready to go into production, themselves and other developers of top titles have already received dev units of hardware, but in an alpha non-console form. Also confirming that newer versions of that unit in a non-console form was shown behind closed doors at the recent CES 2012 event by invite only running a fully playable version of Battlefield 3.

Next up, codename ‘Loop’ and codename ‘Infinity’ are meant to relate to services accessible within the online functions of XBOX 8 and not actually be names of the console itself. Infinity is an enhanced form of what we know now as Xbox LIVE, and will be the platform used for future Xbox 8 digital downloads, Windows 8 digital downloads, and Apps for all three systems. Infinity (note not Xbox Infinity) is an online platform that could also be implemented within future TV sets to bring approved apps to the home without the need for an Xbox console attached to the TV.

Xbox Loop is a codename for a service that will bring all current-gen digital stock into a next-gen era on the new console – it is the backwards compatible functions, but since it is likely to launch from within Infinity on the Xbox 8, we will never hear or see it referred to as Loop outside of the developer units unless it is used within the retail version of Windows 8 later this year.

http://www.windowsmobile8.com/ said:
Windows 8 (October 2012) is designed to be the first Windows client to support systems on a chip (SoC) architectures, including ARM, and since it will be pre-installed on a range of next generation devices, it will also feature Metro, a NUI + GUI interface on the surface combined with a new application platform under-the-hood designed to enable the creation of immersive experiences.

The Wikipedia article on SoC is a must read. Slightly out of date, as it doesn't take into account 3D wafer stacking and its impact on SOCs.
The Wikipedia article on FPGA is also worth reading (glue logic and configuration after testing a wafer, security and more).
Applications of FPGAs include digital signal processing, software-defined radio, aerospace and defense systems, ASIC prototyping, medical imaging, computer vision, speech recognition, cryptography, bioinformatics, computer hardware emulation, radio astronomy, metal detection and a growing range of other areas.
FPGA is the programmable logic array mentioned by the Sony CTO for coming PlayStation tech. The smaller the array, the faster it can run (heat again), and it also benefits from an HSA design where a CPU can load pre-configured designs into the FPGA dynamically, as well as (in the same model with an HSA GPU) pre-fetch data using the branch prediction abilities of the CPU.

http://www.neogaf.com/forum/showthread.php?t=458527&highlight=fpga+ps4 said:
- 'the company is working on a system-on-chip (SoC) to underpin the product for "seven to 10 years".'
- 'He describes the architecture in broad terms: "You are talking about powerful CPU and GPU with extra DSP and programmable logic."' (Alternative quote in another article: ' “We are looking at an architecture where the bulk of processing will still sit on the main board, with CPU and graphics added to by more digital signal processing and some configurable logic.”)
- 'Tsuruta-san picked out emerging ‘through silicon via’ designs. These stack chips with interconnects running vertically through them to reduce length, raise performance and reduce power consumption.'
- 'Tsuruta-san has noted the difficulties in achieving viable yields at 28nm, though he believes that these problems are now moving towards a resolution.'
- Tsuruta: "We are confident that we can now see a way and that we can use some of these advanced methods to create a new kind of system-on-chip. We think that there are the technologies today that can be taken to this project.”
Hirai mentioned that medical imaging is a second use for the PS4 SOC.
- Tsuruta: "We understand that for this, we will need to offer a very strong SDK. We will retain our own OS for the main games and support that with a development environment that is viable. For online and other features, we are also thinking of a simpler approach to a Linux-type environment than on the PlayStation 3,"
Not Windows 8; possibly eLinux, or FreeBSD with GNOME Mobile + GtkWebKit, with Wayland instead of full X Windows?



It's possible that IBM is providing a PPU + AMD GPU fusion chip; AMD has said this is possible for an ARM + AMD fusion or any CPU.

The OBAN mentioned as being used by IBM to produce the Xbox 720 can be what it was named for: a blank that is written on, or rather a LARGE substrate with bumps, upon which 3D stacked chips and 3D wafers are attached 2.5D-style. With proper software design tools and standards (IBM, Global Foundries and Samsung) for wafer sub-assemblies, it should be very easy to design or rework a custom SOC without large lead times. AMD has been working on this for 5 years.


According to the data gleaned from presentations by Samsung, Toshiba, AMD, and others, 3D IC assembly gives you the equivalent performance boost of 2 IC generations (assuming Dennard scaling wasn't dead). Garrou then quoted AMD's CTO Bryan Black, who spoke at the Global Interposer Technology 2011 Workshop last month. AMD has been working on 3D IC assembly for more than five years but has intentionally not been talking about it. AMD's 22nm Southbridge chips will probably be the last ones to be "impacted by scaling," said Black. AMD's future belongs to partitioning of functions among chips that are process-optimized for the function (CPU, cache, DRAM, GPU, analog, SSD) and then assembled as 3D or 2.5D stacks.
This is starting in 2012 with full production scheduled for 2013. Given the standardized building blocks mentioned in the quote above, it makes sense to have a design tool in place to make a blank substrate (Oban) with bumps and traces that allow the building blocks to be attached. This can reduce the time to market and allow for tweaking the design, which must be the case, as there are rumors of the Oban 720 chip being produced two months ago but redesign rumors last month. This is not possible any other way.

OBAN Japanese Coin


[Image: hist_coin13.jpg]



The idea of the OBAN, a large blank substrate, is to produce a large SOC. It can be custom configured and could be used in the PS4, Xbox 720 and Wii U. This, plus standardized building blocks produced by the consortium, makes sense, and it makes sense of the various rumors. Arguments that this would be ready for the 2013-2014 cycle have supporting cites. Old design and assembly methods for SOCs, or discrete components with their associated lead times, no longer apply.
 

deadlast

Member
“...It was reported to us that the next-gen console was on display by invite only during CES 2012, running a version of Battlefield 3 in next-gen visuals – most likely to be up on par with current high-spec gaming PC’s...” [ThisIsXbox]
If MS is intending for the Xbox720 to make Xbox360 games look better, that would be a fantastic strategy. Imagine having a true transition period of time where you could buy a game that would work on the new console and the old console. The game would just look a lot better on the new console. Then you could phase out the older console by making more games that only run on the newer system.
 
Accelerate HPC Applications with FPGA Coprocessors—Sponsored by AMD and Altera

[Image: hpc-amd-net-seminar.gif]



Join Altera’s Senior Vice President of R&D and AMD’s Division Manager of Acceleration Strategies to learn:

Why your current HPC solution is not meeting your performance needs
How to accelerate algorithms and applications over 100x
Why porting applications to FPGAs speeds up your entire system, saving time and money
Who Should View

Financial, medical, and insurance industry IT managers

Accelerate applications up to 100X with FPGAs. FPGAs are most likely part of the AMD inventory of pre-manufactured, tested, built-to-standard building blocks for AMD-IBM-Samsung SOCs.
 
Hmm... My pimped up Alienware X51 could be beefier than this (efficiencies of a closed box yes so not apples to apples).

Not sure if that makes me happy or sad. Happy for my X51 purchase but sad because the games industry won't be at the bleeding edge anymore. For all it sounds it seems, in fact, that Sony is doing their X51.

1000+ SPs will still be pretty mind-blowing in a closed box.

The 360 has 240 SPs running at 500 MHz, for comparison. So imagine 1000+ running at 800-1000 MHz.

That would mean devs would have ~8X the shaders to play with as now.

Trust me, it would be a suitable generation jump, though I'd hope for even more.
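As a rough sanity check on that ~8X figure, treating SP count times clock as a crude throughput proxy (a toy comparison that ignores architectural differences):

```python
# Very rough throughput proxy: stream processor count x clock speed.
x360_sps, x360_mhz = 240, 500     # Xbox 360 Xenos figures from the post
next_sps, next_mhz = 1000, 1000   # optimistic end of the 800-1000 MHz guess

ratio = (next_sps * next_mhz) / (x360_sps * x360_mhz)
print(round(ratio, 1))  # 8.3, close to the ~8X quoted
```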
 

HyperionX

Member
1000+ SPs will still be pretty mind-blowing in a closed box.

The 360 has 240 SPs running at 500 MHz, for comparison. So imagine 1000+ running at 800-1000 MHz.

That would mean devs would have ~8X the shaders to play with as now.

Trust me, it would be a suitable generation jump, though I'd hope for even more.

Not if it's stuck with a puny 128-bit memory bus. If that's the case, next-gen will be very underpowered compared to PCs.

On the other hand, if they bit the bullet and used a 256-bit bus, there's absolutely no way they'll hit $300 at launch. Even $400 will be a stretch. If the rumors are true, and Sony and MS have abandoned their original low-power designs, then I fully expect $500-600 at launch.
 

androvsky

Member
Kaz,

An FPGA in a game console? Are you nuts?

Kutaragi


P.S. Call me if you need a modest, cost-conscious design. Seriously. These Bandai guys won't quit talking about Gundam.
 

Proelite

Member
Not if it's stuck with a puny 128-bit memory bus. If that's the case, next-gen will be very underpowered compared to PCs.

On the other hand, if they bit the bullet and used a 256-bit bus, there's absolutely no way they'll hit $300 at launch. Even $400 will be a stretch. If the rumors are true, and Sony and MS have abandoned their original low-power designs, then I fully expect $500-600 at launch.

That's where the subsidized consoles come in.
 

teiresias

Member
Accelerate HPC Applications with FPGA Coprocessors—Sponsored by AMD and Altera

[Image: hpc-amd-net-seminar.gif]



Join Altera’s Senior Vice President of R&D and AMD’s Division Manager of Acceleration Strategies to learn:

Why your current HPC solution is not meeting your performance needs
How to accelerate algorithms and applications over 100x
Why porting applications to FPGAs speeds up your entire system, saving time and money
Who Should View

Financial, medical, and insurance industry IT managers

Accelerate applications up to 100X with FPGAs. FPGAs are most likely part of the AMD inventory of pre-manufactured, tested, built-to-standard building blocks for AMD-IBM-Samsung SOCs.

I'm failing to see what the application of FPGAs to game consoles is going to buy anyone at this point. The processes needed in game applications and graphics rendering are already well known and the hardware is already designed completely around and to accelerate the relevant algorithms and operations - hence the development of the GPU industry in the first place.

These kinds of seminars are mainly to educate people who use general-purpose computing and software-based processing algorithms on the advantages of using FPGAs to synthesize targeted hardware processing units for a specific application, rather than running something on an x86 in software, and even then only when the application is so specific that it's not cost-effective to move from an FPGA implementation to an actual static ASIC.

However, if you want FPGAs in your game consoles, bring it on. I look forward to the added level of system fanboys once you get the Altera and Xilinx fanboys duking it out (Xilinx all the way BABY!!!!!)
 

Ashes

Banned
Talking about R&D: in light of recent corporate shuffles (since Hirai took over), the gaming division is getting a bigger share of the pie.

Within this, imaging, gaming and mobile will be the core areas of focus, taking 70% of its R&D spending going forward, with the aim of yielding 85% of operating profit from these sectors by 2014.
Source

http://www.warc.com/LatestNews/News/Sony_focuses_on_brand_DNA.news?ID=29713

Can't hurt, can it?

Edit: and this is the clearest article I've found on recent results:
http://www.eurogamer.net/articles/2012-05-10-annual-ps3-sales-down-vita-sales-withheld
 

i-Lo

Member
Talking about R&D: in light of recent results (since Hirai took over), the gaming division is getting a bigger share of the pie.


Source

http://www.warc.com/LatestNews/News/Sony_focuses_on_brand_DNA.news?ID=29713

Can't hurt, can it?

I don't think it would, given that they should be displacing some of the cost to AMD. Unlike with the PS3, this time around there is no one-off product like Cell (so no more hedging almost all of the R&D bets on one thing).

I think this time around, a good amount of R&D might be based more on control interface and evolution of PSN.
 

Ashes

Banned
I don't think it would, given that they should be displacing some of the cost to AMD. Unlike with the PS3, this time around there is no one-off product like Cell (so no more hedging almost all of the R&D bets on one thing).

I think this time around, a good amount of R&D might be based more on control interface and evolution of PSN.

I'm thinking the same. But honestly, if I were in Sony's position, I would effectively cheat and create two, maybe three, console specs. Then see what the X8 has, and sell a similar spec but undercut it by $50 or something.
 
I'm failing to see what the application of FPGAs to game consoles is going to buy anyone at this point. The processes needed in game applications and graphics rendering are already well known and the hardware is already designed completely around and to accelerate the relevant algorithms and operations - hence the development of the GPU industry in the first place.

These kinds of seminars are mainly to educate people who use general-purpose computing and software-based processing algorithms on the advantages of using FPGAs to synthesize targeted hardware processing units for a specific application, rather than running something on an x86 in software, and even then only when the application is so specific that it's not cost-effective to move from an FPGA implementation to an actual static ASIC.

However, if you want FPGAs in your game consoles, bring it on. I look forward to the added level of system fanboys once you get the Altera and Xilinx fanboys duking it out (Xilinx all the way BABY!!!!!)
FPGAs were mentioned by the Sony CTO; there are many use cases that would apply to a game console (speech and video recognition), and they're more efficient in both power and performance than a CPU or GPU. Cost has come down, and we may be talking about a lower-cost, smaller array. If it gives an advantage, reduces heat and is economical, then why not?

You do understand that AMD and IBM are designing process-optimized building blocks, with standards that allow a SOC to be built using "building blocks" 2.5D-attached to a SOC substrate. My point was that a line of FPGAs is also part of the building blocks that Sony and Microsoft can use to build next-generation game console SOCs.
 

i-Lo

Member
It's not.

Which one do you reckon? 256 or 384 bit?

EDIT: One question: all things remaining exactly the same, would real-world performance improve for a graphics card by increasing the bus width, and if so, by how much (say, in a card like the AMD R7770)?
 

onQ123

Member
I'm failing to see what the application of FPGAs to game consoles is going to buy anyone at this point. The processes needed in game applications and graphics rendering are already well known and the hardware is already designed completely around and to accelerate the relevant algorithms and operations - hence the development of the GPU industry in the first place.

These kinds of seminars are mainly to educate people who use general-purpose computing and software-based processing algorithms on the advantages of using FPGAs to synthesize targeted hardware processing units for a specific application, rather than running something on an x86 in software, and even then only when the application is so specific that it's not cost-effective to move from an FPGA implementation to an actual static ASIC.

However, if you want FPGAs in your game consoles, bring it on. I look forward to the added level of system fanboys once you get the Altera and Xilinx fanboys duking it out (Xilinx all the way BABY!!!!!)

Video/sound processing for motion tracking & voice controls that won't take away from the GPU & CPU, plus video streaming for Remote Play, all standard, without the GPU/CPU taking a hit.
 

TTP

Have a fun! Enjoy!
I remember the times when all I needed to know about electronics was that 32bit was better than 16bit and all that simple stuff.

Now, data discussed in threads like this one just fly over my head :D

Sorry for the interruption. Keep going but please someone sum it up in layman terms from time to time :D

Regards,
An old fart
 

Ashes

Banned
I remember the times when all I needed to know about electronics was that 32bit was better than 16bit and all that simple stuff.

Now, data discussed in threads like this one just fly over my head :D

Sorry for the interruption. Keep going but please someone sum it up in layman terms from time to time :D

Regards,
An old fart

See the thread title, that's what we think it'll be. :p
 
Which one do you reckon? 256 or 384 bit?

EDIT: One question- All things remaining exactly the same, would the performance improve for a graphics card in real world performance by increasing the bus width and if so then by how much (say in a card like AMD R7770)?

Based on the amount of memory I would assume 256-bit.

I can't answer the latter part, but the increased BW would allow more data to be moved faster.
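For anyone following along, peak bandwidth scales directly with bus width, which is why the 128-bit vs. 256-bit question matters. A minimal sketch, assuming a hypothetical GDDR5 effective rate of 5.5 Gbps per pin (a figure not from the thread):

```python
# Peak memory bandwidth = (bus width in bytes) x effective data rate per pin.
# The 5.5 Gbps GDDR5 rate is a hypothetical figure for illustration.

def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Peak bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(128, 5.5))  # 88.0 GB/s
print(bandwidth_gbs(256, 5.5))  # 176.0 GB/s: double the bus, double the bandwidth
```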
 

i-Lo

Member
Based on the amount of memory I would assume 256-bit.

I can't answer the latter part, but the increased BW would allow more data to be moved faster.

What's the assumed amount of memory?

With regards to the latter, I thought it was only a matter of transferring more data per clock cycle. Eh, I may be wrong.

Btw, thanks for the replies so far BG
 

Ashes

Banned
Well, this is what I think (and look like) when I read the APU part :D

;)

APU is AMD's new thing: a chip with graphics and CPU capabilities. Don't shoot me people, I'm trying to simplify this shit, gimme a break!

Right now, they're winging it - putting a CPU and GPU in one chip.

But going forward, they plan to do it proper, actually integrate it and share out the load.

And stacking is basically that - stacking stuff.
 
What's the assumed amount of memory?

With regards to the latter, I thought it was only a matter of transferring more data per clock cycle. Eh, I may be wrong.

Btw, thanks for the replies so far BG

YW.

I assumed you meant overall performance; that's why I said I couldn't answer that. And the assumed amount is 2GB of GDDR5.
 

Ashes

Banned
So I take it the ps4 won't be much stronger than the wii u, if at all? Sorry, I don't know much about graphics tech.

Who knows really?

But going off last gen, and last gen alone, I'd say PS4 is going high-end console tech to compete with the X8, and Wii U is getting an HD upgrade.

Right now, rumours suggest a biggish difference, but honestly, we don't know. Consoles are power-limited (think energy, not performance), but these embedded-solution chips seem too low-ball for my cynicism not to kick in.
 

TTP

Have a fun! Enjoy!
;)

APU is AMD's new thing: a chip with graphics and CPU capabilities. Don't shoot me people, I'm trying to simplify this shit, gimme a break!

Right now, they're winging it - putting a CPU and GPU in one chip.

But going forward, they plan to do it proper, actually integrate it and share out the load.

And stacking is basically that - stacking stuff.

Thanks! :D

Stacking is the 3D thing right? Putting stuff on top of each other to save space on the mb and speed up data transfer.
 

Proelite

Member
I'm thinking the same. But honestly, if I were in Sony's position, I would effectively cheat and create two, maybe three, console specs. Then see what the X8 has, and sell a similar spec but undercut it by $50 or something.

Price war with MS? GL. :p

Not to mention MS will cheat by offering subsidized plans.

Sony should focus on creating a unique gaming experience, and offer a FREE PSN that rivals / overtakes the paid XBL.
 

StevieP

Banned
So I take it the ps4 won't be much stronger than the wii u, if at all? Sorry, I don't know much about graphics tech.
It will be more powerful (certainly the GPU, no idea about the CPU) but I think it will be the GameCube of the generation if Microsoft has gone the "batshit insane" route.
 

i-Lo

Member
So I take it the ps4 won't be much stronger than the wii u, if at all? Sorry, I don't know much about graphics tech.

You are right. It'll not be any more powerful than WiiU despite coming over a year later.

/trollface

On a more serious note, if what the devs are saying about WiiU is true then both XB3 and PS4 would be head and shoulders above WiiU.

Personally, I am still a bit sceptical about the 2GB RAM limit. It's entirely plausible purely from a standpoint of technological and economic viability at this time (2012), and if PS4 went with GDDR3 instead of GDDR5 then 4GB is definitely an option. Still, it comes down to what aims Sony has for PS4.

It will be more powerful (certainly the GPU, no idea about the CPU) but I think it will be the GameCube of the generation if Microsoft has gone the "batshit insane" route.

If that comparison stands true in sales too... a big sigh from me for Sony.
 
So I take it the ps4 won't be much stronger than the wii u, if at all? Sorry, I don't know much about graphics tech.

Wii U will not be on par with PS4 or Xbox 3, but it should be able to get ports from those consoles.

The way I see it so far, it's shaping up like last gen, power-wise:

Wii U = PS2

PS4 = GC

Xbox 3 = Xbox
 