IGN rumour: PS4 to have '2 GPUs' - one APU based + one discrete

Use the APU for OpenCL/physics and the 7670 for the heavy lifting?
Looking at just the specs, it's like having 4+ SPUs' worth of compute (400 Radeon cores at 600 MHz) for physics, codecs and OpenCL, and it could work like SPUs for graphics. Is it possible that it's fast enough to emulate SPUs, with a little work?
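For what it's worth, the kind of work named there (physics, codecs, OpenCL) is data-parallel: the same operation applied independently to many elements. A toy Python sketch, with made-up numbers, of a per-particle physics step that maps naturally onto either SPU loop slices or GPU shader cores:

```python
import math

# Toy illustration (hypothetical particle data, not real console code):
# one Euler integration step applied independently to every particle.
DT = 0.016  # one 60 fps frame, in seconds

positions  = [0.0, 1.0, 2.0, 3.0]
velocities = [1.0, 1.0, -0.5, 2.0]

# Each element is independent of the others, so a GPU could run one
# work-item per particle (or an SPU could take a slice of the array)
# with no synchronization between elements.
positions = [p + v * DT for p, v in zip(positions, velocities)]
print(positions)  # roughly [0.016, 1.016, 1.992, 3.032]
```

The point of the sketch is only that this style of workload has no cross-element dependencies, which is what makes it a good fit for either SPUs or shader cores.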

AMD A8-3850 Summary

Quote:
Model: A8-3850
Radeon Brand: HD 6550D
CPU Clock Speed: 2.9GHz
CPU Cores on Die: 4 cores
TDP (thermal design power): 100W
Total L2 Cache: 4MB
Radeon Cores on Die: 400
GPU Clock Speed: 600 MHz
DirectX Version: 11
Unified Video Decoder: UVD3
DDR3 Speed: 1866
 
Yes, it has to, and I think we will get it with Durango. MS is asking and listening to their third-party developers, and those have made it more than clear that 2 GB of RAM won't get the job done for what they are targeting. Epic wants it, Crytek wants it, DICE wants it... and if you consider that those guys will provide the engines for most next-gen games, I'm confident MS will make sure that their console can run those engines the best again.
But of course we get the "lol you don't know what you're talking about" bs if someone wants to suggest more than 2GB of RAM.

I'll listen to Crytek before anyone on a message board who assumes I'm dumb enough to really think Sony should order their RAM from newegg.
 

Lord Error

Insane For Sony
The design behind the Vita is more about best-bang-for-buck, not cutting edge graphics.
Not really. If they wanted best bang for the buck, they could have used a dual-core ARM + dual-core GPU, iPhone 4S style, and definitely wouldn't have used an OLED screen but some cheaper iPod touch-style non-IPS LCD instead. They clearly wanted the machine to be packed with tech that was ahead of anything released in the mobile space at the time of its launch. Don't sell it short.
 
This specification sounds like something reasonable for WiiU.
For PS4? It's a joke.
If these rumored devkit specs are true, even then they don't mean much. The X360 had a ton of different hardware in its devkits; the majority of launch games were made on G4 Macs with SLI GPUs...

I'm more wondering how devkits will look one year from now for both PS4 and X720.
 
If these rumored devkit specs are true, even then they don't mean much. The X360 had a ton of different hardware in its devkits; the majority of launch games were made on G4 Macs with SLI GPUs...

I'm more wondering how devkits will look one year from now for both PS4 and X720.
yup this
 
Far too soon to say how things might turn out, but it'd be a huge risk if either Sony or MS is the sole company to go all-out with specs for their next console. They'd end up with all the disadvantages of high-end specs (higher launch price leading to lower adoption rate, bigger losses) with advantages (better-looking first-party titles, multiplatform games with additional graphical bells and whistles) that seem unlikely to outweigh those disadvantages.
 
They can still do that with a ~2 teraflop GPU.
I'm not so sure. 28nm might be too risky (PS3 didn't use 65nm, either) and Southern Islands might be too new for a 2013 system. Planning and developing a console takes years after all. I also don't expect any next generation system to exceed 150W. But we'll see.
 
I'm not so sure. 28nm might be too risky (PS3 didn't use 65nm, either) and Southern Islands might be too new for a 2013 system. Planning and developing a console takes years after all. I also don't expect any next generation system to exceed 150W. But we'll see.
IIRC The 90nm process used by both MS and Sony this gen was about as new as 28nm will be next year. If they weren't planning on using 28nm, there is little reason to wait until 2013 to launch.
 
This wouldn't be a new console generation. It'd be a pathetic leap for a 7 year gap.
Going by benchmarks of the 6670, this card can't even run Crysis smoothly on high settings (it averages 25 fps at 1080p with 4xAA). If this is true, then Sony is really bottlenecking their system (again) with a shitty GPU.
 
Going by benchmarks of the 6670, this card can't even run Crysis smoothly on high settings (it averages 25 fps at 1080p with 4xAA). If this is true, then Sony is really bottlenecking their system (again) with a shitty GPU.
And PS3 GPU can run that same Crysis how much slower? :)
 
But of course we get the "lol you don't know what you're talking about" bs if someone wants to suggest more than 2GB of RAM.

I'll listen to Crytek before anyone on a message board who assumes I'm dumb enough to really think Sony should order their RAM from newegg.
16 gigs of RAM would have zero impact on gaming. It doesn't even speed up general use; it's only utilized for specific purposes unrelated to gaming, like VMs, none of which consoles are going to do.

That's one reason why people are lol'ing at your suggestions.
 
APU+GPU... might be less powerful than the full SLI setup rumored for the Xbox, and possibly a little more complicated to take advantage of both. But maybe cheaper, with lower power consumption/heat.
It's not hard to imagine the "two GPU" Durango rumor might just be a garbled misinterpretation, and could very well be an APU+GPU combo similar to whatever the PS4 is getting.
 
It's not hard to imagine the "two GPU" Durango rumor might just be a garbled misinterpretation, and could very well be an APU+GPU combo similar to whatever the PS4 is getting.
vg247 said:
The GPUs aren’t structured as they are in a normal dual PC set-up, in which the two chips take it in turns to draw lines of the same object: Xbox 720′s graphics units will be able to work independently, drawing separate items simultaneously
Definitely not APU + discrete if the above description is true. They'd pretty much have to be two of the same cards.
 
Just read the specs?

It's a quad core with an integrated GPU + a discrete one.

What's so hard to get?
Care to guess, or say whether this is as good an option as a modded Pitcairn?

These specs could mean the console generation may get shorter this time, and that it's one step away from dedicated OnLive-like services.

The more I read this (and more importantly, if it's true), the more that Pastebin writer's wrath and berating of the PS4 seem justified.
 
IIRC the 90nm process used by both MS and Sony this gen was about as new as 28nm is now. If they weren't planning on using 28nm, there would be little reason to wait until 2013 to launch.
90nm is 2002 tech. IBM used it for mass market products in early 2004, PS3 launched in late 2006 (and reportedly still had massive yield issues). 28nm on the other hand had tons of delays, tons of problems, and ended up being far more expensive than anticipated. Which is... not good for a console. Sony and Microsoft have to make a decision very soon - if the decision wasn't already made - and it's hard to tell how the 28nm yields will be in 2013. 32nm should be an option, though.
 
At this point I just want them both to come out and announce a joint console.
Would love for this to happen, sadly it never will.

These specs could mean the console generation may get shorter this time, and that it's one step away from dedicated OnLive-like services.

The more I read this (and more importantly, if it's true), the more that Pastebin writer's wrath and berating of the PS4 seem justified.
Console generations are likely to become longer, not shorter. It's also likely that we'll see some type of OnLive service introduced for these next gen consoles, not the following systems.

Funny thing about that pastebin rumor is that it matches some details of more recent rumors such as the 24/7 connectivity of Durango or how most developers will receive PS4 dev kits by the end of this year. For a rumor that sounds like such rubbish, it's odd how he's gotten at least 4 specific details correct.

Edit:


90nm is 2002 tech. IBM used it for mass market products in early 2004, PS3 launched in late 2006 (and reportedly still had massive yield issues). 28nm on the other hand had tons of delays, tons of problems, and ended up being far more expensive than anticipated. Which is... not good for a console. Sony and Microsoft have to make a decision very soon - if the decision wasn't already made - and it's hard to tell how the 28nm yields will be in 2013. 32nm should be an option, though.
Was it 2002? Sorry, I thought it was introduced in 2004.

I still expect them to use 28nm, otherwise these systems will be even more disappointing.
 
btw, I wonder if this design, if true, is a consequence of Sony trying to break into the market before MS.
This is what I have been thinking also.
If Sony cannot match the nextbox in specs, then they had better launch first. There would be zero point in launching after the nextbox with weaker hardware.
 
A big part of it would be the ability to run OpenCL well on it, which is the kind of stuff Cell SPUs are used for anyway, but the AMD approach there would be far friendlier and easier. For general computing, the four-core 3850 should be far better than the crummy main core in Cell.
I wouldn't be surprised if the main core and all its SPUs have more horsepower than a Llano CPU.

Each generation we've seen a remarkable upgrade in both CPU and GPU (PS1 > PS2 > PS3). It feels weird to me that Sony would go the cheap route with only a small, barely noticeable upgrade. It's possible to pack a punch and still sell their console below $500. Now it seems it's all about "less for more", aka the Nintendo route, which imo will not work in the long run.

I probably worry too much, but I don't like what we're hearing about next gen. It seems like every console will be made of cheap tech, instead of cutting edge like before.
 
With the current rumors for the 720 and PS4, it looks like they are almost the same machine with different external casings. Multi-plat games might be exactly 1:1 next gen. And I'm guessing Sony's 1st party games won't be a showcase for the PS4 like they have been for the PS3, the graphics kings of this gen. In a way I'm sad to see Sony shift away from Ken's vision of pushing the boundaries and go mainstream, but I guess it makes business sense to use off-the-shelf components.
 
I myself am hoping for a more powerful card for the GPU though; not enthusiast-level like a GTX 680 or AMD 7990, but a high-end card would be nice. What happened to the days when consoles launched more powerful than PCs for some months before being overtaken again? When did we become satisfied with getting technology that is not only already on the market but isn't even high end? I'd rather they wait and postpone next gen, sort out all these RAM issues and wait for GPUs that will bring us a truly next-gen experience.
Power consumption happened.

PC gamers are willing to stick 650-1000 watt power supplies into their PCs.

Try dissipating 650 watts of heat from a box the size of an Xbox 360.

For the absolute top tier of what you think performance could be like in the new console, you should be looking at gaming LAPTOP parts, not desktop parts.

New consoles aren't going to use more than 250 watts of power, if even that.

I wouldn't be surprised if the main core and all its SPUs have more horsepower than a Llano CPU.

Each generation we've seen a remarkable upgrade in both CPU and GPU (PS1 > PS2 > PS3). It feels weird to me that Sony would go the cheap route with only a small, barely noticeable upgrade. It's possible to pack a punch and still sell their console below $500. Now it seems it's all about "less for more", aka the Nintendo route, which imo will not work in the long run.

I probably worry too much, but I don't like what we're hearing about next gen. It seems like every console will be made of cheap tech, instead of cutting edge like before.
The GPU in the Llano APU alone is about as powerful as the graphics card in the PS3. The consoles are OLD.
 
This is what I have been thinking also.
If Sony cannot match the nextbox in specs, then they had better launch first. There would be zero point in launching after the nextbox with weaker hardware.
Being a PS fan for a long time, I'd drop them like a stone if they can't match the Xbox 3's specs, given how they said it was going to be somewhat more powerful. I don't care about having more power, but at least give us fucking parity. This is massively disappointing.

I just want to vent, and I'm certain I'm not the only one. If these rumours are true, and if the performance isn't close to what a modified Pitcairn would have offered (which user BrainStew may have hinted at), then here's a big "Fuck You".

PS: I'll probably end up buying PS4 near the end of its life cycle anyways if not sooner, lol.
 

Lord Error

Insane For Sony
Now it seems it's all about "less for more", aka the Nintendo route, which imo will not work in the long run.
Especially not for Sony.
Super dumbed-down expectations of the majority of people buying a Nintendo console: to have Mario and other Nintendo games looking clean, sharp and colorful.
Super dumbed-down expectations of the majority of people buying a Sony/MS console: games have to look "real"!
 
This is what I have been thinking also.
If Sony cannot match the nextbox in specs, then they had better launch first. There would be zero point in launching after the nextbox with weaker hardware.
Not if Xbox 3 is priced higher and Sony's software lineup is good enough. It's not going to be at a significant disadvantage to Microsoft's next console when it comes to third-party support, regardless of how large or small the performance delta is.
 
I remember the 360's GPU being on par with my Radeon X1900 XTX, which was high end. These rumors of it using a 7670 are so full of shit. No way can Sony get good results if they plan on going up to 4K res (lol) or 1080p 3D. The 720 and PS4 seem like they'll be 3 generations behind PCs.

Mark my words, it won't be anything less than a 6850.
 
"Up to" 1GB of VRAM is disappointing (to say the least) for a product that's being developed in 2012...

If the VRAM is that low, what's the main memory going to be? 3GB? LOL. Come on Sony, RAM is cheap... if last gen taught MS and Sony anything it should be that you can never have enough memory, and unlike a leet CPU (which Orbis may not even have, thus making it all the more inexpensive), it really does not cost much to add more. Go into next gen with what seems like a reasonable amount (i.e. 8-16GB main, 2-4GB VRAM), because by the end of its lifespan, it won't be anything.

Going with 2 or 3GB of main memory, for example, would be about as bad as giving PS3, say, 96MB of main memory last gen... it's just not kosher.
WTF? 1GB of VRAM would be good. There's not going to be more than 2-4GB of memory in total. 10-20GB is just absurd, and it's not going to happen.


Don't worry. I already work 2 jobs.
That joke was old five years ago.
 
Not if Xbox 3 is priced higher and Sony's software lineup is good enough. It's not going to be at a significant disadvantage to Microsoft's next console when it comes to third-party support, regardless of how large or small the performance delta is.
Yes, but how long will it be before MS pays enough dollars to ensure that 3rd parties use that extra oomph? Imagine DLC that wasn't possible on PS4, or extra special effects, better frame rates, higher-res textures, etc.
How long would it take for word to get out that the Xbox versions ran a lot better than they do now?
 
And PS3 GPU can run that same Crysis how much slower? :)
It's not the same. It ran at 720p and no AA, with settings closer to medium (with a much shorter draw distance and less vegetation). It was a pretty decent port overall but the hardware demands are nothing like the PC version with the settings I described.
 
It's not the same. It's 720p and no AA, with settings closer to medium (with a much shorter draw distance and less vegetation). It was a pretty decent port overall but the hardware demands are nothing like the PC version with the settings I described.
That, and the fact it's running on a different engine. I'd love to see the CE3 Crysis running on PC.
 
Looking at just the specs, it's like having 4+ SPUs' worth of compute (400 Radeon cores at 600 MHz) for physics, codecs and OpenCL, and it could work like SPUs for graphics. Is it possible that it's fast enough to emulate SPUs, with a little work?

Radeon Cores on Die: 400
Parallelized emulation is incredibly difficult, if not impossible. You need as many cores in the host system as there are discrete threads in the emulated system. If you have fewer, then one emulated thread has to sit and wait for the core it's running on to feed it results from another, related thread. If you have more cores, the extra cores can't do anything until they get results from the ongoing thread, so the work might as well just keep running on the same core.

Extra cores come in handy when you have to emulate things like sound processors and vector units, but in the case of going from 6 SPUs to 400 shader units... that just doesn't work. I can't think of any way to make it work; even if devs went back to the source code, they'd probably have to completely rewrite everything unless they had used something like OpenCL.
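The serial-dependency point above can be sketched in Python (a toy illustration with made-up work, not real emulator code): two "emulated threads" with a producer/consumer dependency are forced into lockstep by the hand-off, so any host core beyond the two would have nothing useful to run.

```python
import threading
import queue

# Hypothetical dependency: emulated thread B consumes results produced
# by emulated thread A each "cycle". The blocking queue hand-off
# serializes them, so a third host core could not speed this up.
results = queue.Queue()
output = []

def emulated_thread_a():
    for step in range(5):
        results.put(step * 2)  # A produces one value per cycle
    results.put(None)          # sentinel: A is finished

def emulated_thread_b():
    while True:
        value = results.get()  # B must block until A's result arrives
        if value is None:
            break
        output.append(value + 1)  # B's work depends on A's output

a = threading.Thread(target=emulated_thread_a)
b = threading.Thread(target=emulated_thread_b)
a.start(); b.start()
a.join(); b.join()
print(output)  # [1, 3, 5, 7, 9]
```

However many host threads you add, the chain of `get` calls keeps B one step behind A, which is the same reason extra host cores don't help emulate tightly coupled SPU threads.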