
IGN rumour: PS4 to have '2 GPUs' - one APU based + one discrete

onQ123

Member
I have faith that the PS4 will be able to Emulate The Cell with the GPU in the APU with a few added parts from SONY to play PS3 games.
 

theBishop

Banned
Just an echo of what others have been saying here - that GPU is the 'new' processing pack-horse. The CPU in next gen consoles will take a smaller role, be a facilitator and for the traditional branchy stuff that doesn't fit so well on a GPU. GPGPU is such now that the heavy (fp) computation is probably better put on a relatively big gpu than a cpu that will eat into your gpu budget. Last gen and before there was a case for that kind of CPU, but today, given the type of processor GPUs are now, GPU is 'the new Cell'.

And then he just goes on to make the point that a closed box, an exemplary implementation of AMD HSA, will yield software specialisation and experimentation that you might be less likely to get in another context like PC, but that will benefit other contexts too. If PS4 is AMD HSA and is reasonably powerful, I've no doubt AMD will hold it up as an example of what to-the-metal coding can do on that kind of architecture.

So the CPU is basically the PPE to the GPU's SPUs?
 

KageMaru

Member
He alternately mentions the next generation in 2012 and 2012-2020 for a second generation. He is talking 2012 when he mentions Larrabee and Nvidia CUDA. He mentions OpenCL in passing. Bypassing DirectX and OpenGL entirely would not apply if waiting for DirectX 12.

It looks like he was fully aware of the projected hardware issues in 2009, but he couldn't predict how OpenGL and DirectX would evolve. In one section there was no mention of OpenCL but CUDA was mentioned, and no knowledge of AMD Fusion APUs or HSA, although he mentions "A unified architecture for computing and graphics Hardware Model": CPU-GPU combinations as well as a common memory pool and cache coherence.

This could be interesting if true. It would allow us to confirm some of the rumors as probable. Why, if the rumors are true, 16 PPUs for the Durango CPU, and Sony with the first RUMORED choice of 24 SPUs? Is Sweeney's vision the answer? Is future hardware here now with AMD Fusion and HSA? Are IBM and AMD going to provide something similar for Durango (common 80 meg eDRAM cache, common memory pool and controller)?

You're right that DX12 becomes irrelevant if they plan to bypass the API entirely. However my point is that plenty has changed from the time that presentation was made and I wouldn't expect it to apply to next gen.

It's highly unlikely that developers will be able to bypass the API entirely, especially since that didn't happen this gen.

Your last paragraph pretty much proves that you're reading too much into all of this and will only end up disappointed.

I have faith that the PS4 will be able to Emulate The Cell with the GPU in the APU with a few added parts from SONY to play PS3 games.

Yeah, blind faith. ;p
 

i-Lo

Member
Isn't that what makes faith, faith: not having any real proof?

Precisely. I think it comes down to the magnitude of uncertainty that determines how blind the faith needs to be.

All this talk of ingenious new tech is balanced by the counter-argument of time and cost, so we are always going round in that loop. Unless new and more credible information comes to hand, this trend will continue.
 

i-Lo

Member
I had a question pertaining to power supply. If a console's power supply is rated at say, 250W, then can it be run stably, safely and consistently at the aforementioned limit?
 
You're right that DX12 becomes irrelevant if they plan to bypass the API entirely. However my point is that plenty has changed from the time that presentation was made and I wouldn't expect it to apply to next gen.

It's highly unlikely that developers will be able to bypass the API entirely, especially since that didn't happen this gen.

Your last paragraph pretty much proves that you're reading too much into all of this and will only end up disappointed.

You are reading your viewpoints into my posts again. I cited a Sweeney post only 5 months old and it again points to "TONS of CPU cores":

Originally Posted by http://www.joystiq.com/2011/09/28/epic-games-tim-sweeney-talks-unreal-engine-4-be-patient-until/:

Sept 28 2011 Sweeney said: "I spend about 60 percent of my time every day doing research work that's aimed at our next generation engine and the next generation of consoles," Sweeney told IGN, adding that this "technology that won't see the light of day until probably around 2014."

There are two primary technical challenges facing video games today, Sweeney said. The first, and most addressable, is the need to scale up "to tons of CPU cores." While UE3 can divide discrete processes across a handful of cores, "once you have 20 cores" it isn't that simple "because all these parameters change dynamically as different things come on screen and load as you shift from scene to scene." These advancements will help achieve "movie quality graphics" since that outcome has been limited primarily by horsepower. "We just haven't been able to do it because we don't have enough teraflops or petaflops of computer power to make it so," Sweeney said. Less likely to be conquered in the next 10 years: the "simulation of human aspects of the game experience," Sweeney explained. "We've seen very, very little progress in these areas over the past few decades so it leaves me very skeptical about our prospects for breakthroughs in the immediate future."

More than 20 cores (tons of CPU cores) can't be properly supported by UE3... UE4 may support tons of CPU cores and realize Sweeney's 2009 vision. In the 2009 slide show he mentions 5 years to develop a game engine; starting in 2009, that would be 2014.

Also, I mentioned two posts ago that there would likely be two models for game developers:

1) Unreal Engine 3: a traditional extension of last generation, with OpenGL and a more GPU-bound model, and
2) Unreal Engine 4: limited ray tracing (CPU bound) and more CPU use, resulting in less of the, as Tim Sweeney said, "all game engines work the same so the products all look the same as they are using the same APIs (OpenGL-DirectX)". This is what I got from the Epic presentation: he wanted to differentiate his games and engine, and the only way to do that is with the CPU (provided next generation has the CPU power, which rumors might support).

Model 1: the PS4 APU can be used for GPU work in combination with the second GPU.
Model 2: the AMD APU is used 100% as a CPU, with the second GPU handling graphics only. In this model the APU is slightly more powerful than 24 SPUs (roughly 1 SPU = 13 GPU elements); this roughly assumes a new Cell 2 gets over scaling issues with memory and more. Also, OpenCL efficiencies for GPUs were nearly 100% while Cell's were 90%. Branch prediction would be nice to have if more CPU bound, which SPUs and GPUs don't really support well; PPUs and X86 cores do support branching... this might be another reason for an AMD APU.

What does this tell us? It supports some of the speculation in this thread... 2014, a 7XXX GPU or greater, most likely 2 GPUs (Sony can use the custom APU for medical imaging and, with a second GPU, for a game console, which supports the two models above)... gives us an idea of the difference between UE3 and UE4... supports a larger number of CPU cores for Xbox Durango (16 PPUs, or it is also using an AMD APU). The target is movie-quality graphics, which requires 2.5 TFLOPS at 1080p and some lesser amount at 720p, which is more likely. Movie-quality graphics even at 720p would be hard to fit into a game console power budget unless it takes advantage of HSA efficiencies, both hardware and software, also mentioned by Sweeney.
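The "lesser amount at 720p" can be estimated by scaling the quoted 2.5 TFLOPS figure by pixel count (a back-of-the-envelope sketch; it assumes rendering cost scales linearly with resolution, which real renderers only approximate):

```python
# Scale the quoted 2.5 TFLOPS @ 1080p target by pixel count to estimate
# the 720p figure. Assumes cost scales linearly with resolution.
def pixels(width, height):
    return width * height

TARGET_1080P_TFLOPS = 2.5
scale = pixels(1280, 720) / pixels(1920, 1080)      # 921600 / 2073600 = 0.444...
target_720p_tflops = TARGET_1080P_TFLOPS * scale    # ~1.1 TFLOPS
```

So by raw pixel count the 720p target lands around 1.1 TFLOPS, less than half the 1080p figure.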

Apologies for the duplication of a previous post to those who are paying attention.
 
I'm not sure you quite understand how emulation works if this is the case. lol
The Sony CTO mentioned Field Programmable Gate Arrays being in the PS4... what could they be used for? I mentioned their use in stacking to program a subunit, or to turn off duplicate defective parts of subunits or wafers, but that can be done by fusible links and simpler circuits.

An FPGA can emulate an SPU (an "actual hardware clone" using an FPGA was built, but wasn't fast enough) as well as provide other functions. Is it practical now? Stacking is being used to produce memory and FPGAs, with cost reductions coming. PowerPC emulation, and then a paper on Virtualized Full-System Emulation of Multiprocessors using FPGAs, would support it as a possibility, speed being the issue (remember the BSC rumor late last year).

Also, if emulation only has to happen between frames, and we have GPUs that are more than 10 times faster than SPUs, and X86 processors with branch prediction that can run JIT code efficiently to emulate an SPU, with only a small amount of emulation needed and the majority API simulation (which is likely to be 10 times faster also), it would seem to me to be possible, though very, very difficult. How important will this be to Sony?

Edit: The SPUs would have to be emulated by X86 processors, which might have nearly the same clock speed. The X86 vector units with 256-bit-wide registers (twice the width of the SPUs' registers), even though more efficient in vector mode because of the wider registers, cannot sustain a 100% duty cycle emulating SPUs: they would overheat, due in part to the branch prediction hardware that was stripped out of SPUs to allow them to run faster and cooler. As has been mentioned, SPUs were primarily used by developers as part of a shader pipeline to supplement the RSX. Recognizing an engine's use of the SPUs for such a function would allow that duty to be handed to the more efficient GPUs.

So only if X86 is used sparingly to JIT-emulate the Cell processor, and most of the processes are simulated using the more powerful GPU hardware, MIGHT it be possible to emulate/simulate the PS3 in real time. We would probably again have some games break this emulation if a developer or game engine used the SPUs for too many unpredictable uses.

http://en.wikipedia.org/wiki/Emulator said:
However, the speed penalty inherent in interpretation can be a problem when emulating computers whose processor speed is on the same order of magnitude as the host machine. Until not many years ago, emulation in such situations was considered completely impractical by many.

What allowed breaking through this restriction were the advances in dynamic recompilation techniques.
Simple a priori translation of emulated program code into code runnable on the host architecture is usually impossible because of several reasons:

- code may be modified while in RAM, even if it is modified only by the emulated operating system when loading the code (for example from disk);
- there may not be a way to reliably distinguish data (which should not be translated) from executable code.

Various forms of dynamic recompilation, including the popular just-in-time (JIT) compiler technique, try to circumvent these problems by waiting until the processor control flow jumps into a location containing untranslated code, and only then ("just in time") translating a block of the code into host code that can be executed. The translated code is kept in a code cache, and the original code is not lost or affected; this way, even data segments can be (meaninglessly) translated by the recompiler, resulting in no more than a waste of translation time.
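The JIT technique described in this quote can be sketched in miniature (purely illustrative code, not any real emulator's API): guest code is translated only when control flow first reaches it, and the translated block is kept in a code cache:

```python
# Toy dynamic recompiler: translate a guest block only when control
# flow first reaches it ("just in time"), and cache the translation.
# Opcodes and the one-instruction-per-block shape are illustrative.

def translate_block(guest_code, pc):
    """Turn the guest instruction at pc into host code (a callable)."""
    op, operand = guest_code[pc]
    if op == "add":
        return lambda state: state + operand
    if op == "mul":
        return lambda state: state * operand
    raise ValueError(f"unknown opcode {op!r}")

def run(guest_code):
    code_cache = {}                 # pc -> already-translated host code
    state, pc = 0, 0
    while pc < len(guest_code):
        if pc not in code_cache:    # first visit: translate now
            code_cache[pc] = translate_block(guest_code, pc)
        state = code_cache[pc](state)
        pc += 1
    return state, len(code_cache)
```

Re-running a block reuses its cached translation; this is also why real emulators must invalidate cache entries when they detect self-modifying code, per the Wikipedia passage above.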

Edit: Durante's posts below are accurate, and I'm not trying to disagree with anything other than statements that it's absolutely not possible.
 
I have faith that the PS4 will be able to Emulate The Cell with the GPU in the APU with a few added parts from SONY to play PS3 games.

I'd rather see them not wasting time, money and resources on BC. Just buy a $100 PS3 in two years, or plug in the one you already have. Who has time to play old games that much to make BC even the slightest bit important?
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
I'd rather see them not wasting time, money and resources on BC. Just buy a $100 PS3 in two years, or plug in the one you already have. Who has time to play old games that much to make BC even the slightest bit important?

For most consumers BC is rather important when a console has just been launched. You can sell your old console, use the money to buy the new console and it will boost the new console's library enormously when there's only a handful of PS4 games on store shelves. Even if most people don't use it, it's something that adds value to the purchase and it makes the expensive upgrade easier to swallow.

I certainly wouldn't mind if the PS4 were able to run PS1, PS2 and PS3 games flawlessly, since I'd be able to sell both the PS2 and PS3 and get rid of some of the clutter beside my TV.
 

Gravijah

Member
I'd rather see them not wasting time, money and resources on BC. Just buy a $100 PS3 in two years, or plug in the one you already have. Who has time to play old games that much to make BC even the slightest bit important?

games are games. they don't suddenly start sucking just because a new console is out.
 
Yes, the 6670 is a perfect card for emulating a GPU that will be part of the Kaveri package.

I think that later down the line PS4 devkits will use Kaveri.
Remember the chipset is to be used for Medical Imaging too.

Look at "Go faster - Preprocessing Using FPGA, CPU, GPU": CPU, GPU and FPGA each have areas where they excel. The video explains possible uses for an FPGA in a PS4 or in medical imaging.

A custom chip containing most of an AMD APU and an FPGA could be used for medical imaging; a much more powerful game console GPU would not be needed. Having a second, more powerful, separate GPU for a game console makes sense for heat dissipation also. I guess it's eventually going to happen (everything in a few SoCs or stacked ICs) and will be an economics decision when a smaller die size allows.
 
So, that's what I was wondering: even though it had a 380W PSU (first generation), could it have handled 380W of peak power?
But then, could the heat generated in the CPU and GPU be dissipated?

From what I have been reading, if a programmer does not use some of the software HSA efficiencies, like passing CPU pointers rather than moving blocks of memory, the hardware can be run harder than designed and overheat. Built-in temp sensors detect overheating and reduce clock speeds to protect the chip. If a programmer uses all the HSA software efficiencies, then the hardware should not overheat and theoretical max performance can be achieved.
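The pointer-vs-copy distinction can be shown in miniature (Python's `memoryview` stands in for the shared-address-space idea; this is only an analogy to HSA's shared memory model, not its actual API):

```python
# Miniature analogy for the HSA point above: handing over a "pointer"
# (a zero-copy view) vs. moving a whole block of memory (a copy).
data = bytearray(range(100))   # pretend this is a large buffer another unit needs

view = memoryview(data)        # "pass a pointer": no bytes are copied
copy = bytes(data)             # "move a block of memory": full snapshot copy

data[0] = 255                  # producer updates the buffer afterwards
assert view[0] == 255          # the view sees the update (shared memory)
assert copy[0] == 0            # the copy is stale: it froze the old contents
```

The view costs nothing regardless of buffer size, while the copy costs time and memory proportional to the block, which is the overhead the HSA advice is about avoiding.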

Tim Sweeney was mentioning 64x AND 1024x improvements (hardware + software), but the latter might be PS5.
 

thuway

Member
But then, could the heat generated in the CPU and GPU be dissipated?

From what I have been reading, if a programmer does not use some of the software HSA efficiencies, like passing CPU pointers rather than moving blocks of memory, the hardware can be run harder than designed and overheat. Built-in temp sensors detect overheating and reduce clock speeds to protect the chip. If a programmer uses all the HSA software efficiencies, then the hardware should not overheat and theoretical max performance can be achieved.

Tim Sweeney was mentioning 64x and 1024x improvements.

What type of theoretical performance are we looking at then?
 

KageMaru

Member
I have a pretty good understanding of it.

& I think an APU with fast enough memory & special modifications can emulate the Cell.

Sorry but I agree with StevieP, you don't understand how complicated emulation is.

You are reading your viewpoints into my posts again. I cited a Sweeney post only 5 months old and it again points to "TONS of CPU cores":

No I'm not reading from my viewpoint, I'm using common sense and logic. You really don't need to type out another wall of text to repeat your opinion.

I understand you spend a lot of time finding these articles and links, but you should think about how they do and don't apply to next Gen before trying to make such assumptions.
 
For most consumers BC is rather important when a console has just been launched. You can sell your old console, use the money to buy the new console and it will boost the new console's library enormously when there's only a handful of PS4 games on store shelves. Even if most people don't use it, it's something that adds value to the purchase and it makes the expensive upgrade easier to swallow.

I certainly wouldn't mind if the PS4 were able to run PS1, PS2 and PS3-games flawlessly, since I'd be able to sell both the PS2 and PS3 and get rid of some of the clutter besides my TV.

Sure, but this option will be used by only a few people.
 
Let me preface by saying, I would love the next gen PS to have 4GB of RAM.

Now let's for one moment assume that Sony is going to limit the VRAM to 1GB GDDR5 for PS4. Let's also make a few other assumptions (based on what we've been hearing):

1. Most games will still run at 720p
2. Texture streaming will still be implemented
3. Tessellation will be used extensively (and improved in efficiency as the gen wears on)
4. 30 fps will still be the base
5. DX11/OpenGL 4.0 will be used with the bells n whistles that come with it
6. Some form of AA (FXAA or Temporal AA, MLAA etc) will be implemented in all games (I hope this one is true)

With these points, how limiting is 1GB VRAM, given what we have seen achieved on PS3 so far with a quarter of what is being proposed for PS4?

PS: Does anyone know how much memory on average is dedicated for sound effects and soundtracks?

Fuck no
 
Next gen everyone is going to do something better than the other guys. PS4 may have a better GPU and Xbox 720 more RAM, etc. I think it will be hilarious to see everyone trying as hard as hell to notice a pixel or shader that is better on one than the other. I still say whoever has more RAM will have the better platform, so we shall see.
 

joshwaan

Member
You are reading your viewpoints into my posts again. I cited a Sweeney post only 5 months old and it again points to "TONS of CPU cores":

Originally Posted by http://www.joystiq.com/2011/09/28/epic-games-tim-sweeney-talks-unreal-engine-4-be-patient-until/:



More than 20 cores (tons of CPU cores) can't be properly supported by UE3... UE4 may support tons of CPU cores and realize Sweeney's 2009 vision. In the 2009 slide show he mentions 5 years to develop a game engine; starting in 2009, that would be 2014.

Also, I mentioned two posts ago that there would likely be two models for game developers:

1) Unreal Engine 3: a traditional extension of last generation, with OpenGL and a more GPU-bound model, and
2) Unreal Engine 4: limited ray tracing (CPU bound) and more CPU use, resulting in less of the, as Tim Sweeney said, "all game engines work the same so the products all look the same as they are using the same APIs (OpenGL-DirectX)". This is what I got from the Epic presentation: he wanted to differentiate his games and engine, and the only way to do that is with the CPU (provided next generation has the CPU power, which rumors might support).

Model 1: the PS4 APU can be used for GPU work in combination with the second GPU.
Model 2: the AMD APU is used 100% as a CPU, with the second GPU handling graphics only. In this model the APU is slightly more powerful than 24 SPUs (roughly 1 SPU = 13 GPU elements); this roughly assumes a new Cell 2 gets over scaling issues with memory and more. Also, OpenCL efficiencies for GPUs were nearly 100% while Cell's were 90%. Branch prediction would be nice to have if more CPU bound, which SPUs and GPUs don't really support well; PPUs and X86 cores do support branching... this might be another reason for an AMD APU.

What does this tell us? It supports some of the speculation in this thread... 2014, a 7XXX GPU or greater, most likely 2 GPUs (Sony can use the custom APU for medical imaging and, with a second GPU, for a game console, which supports the two models above)... gives us an idea of the difference between UE3 and UE4... supports a larger number of CPU cores for Xbox Durango (16 PPUs, or it is also using an AMD APU). The target is movie-quality graphics, which requires 2.5 TFLOPS at 1080p and some lesser amount at 720p, which is more likely. Movie-quality graphics even at 720p would be hard to fit into a game console power budget unless it takes advantage of HSA efficiencies, both hardware and software, also mentioned by Sweeney.

Apologies for the duplication of a previous post to those who are paying attention.

Hey Jeff, thanks for posting this interesting read.

My thoughts are that Microsoft is listening to Epic, which is why we are hearing about 16 CPU cores (I know, sounds crazy) and 4GB memory. God I hope it's true though, it will kick ass. I just hope they don't rush the console out like the 360 with its RROD issues.

I hope the rumors about 4GB are true for Sony's machine also; it would be great for developers to give us better textures and bigger game worlds.
 

Durante

Member
I have a pretty good understanding of it.

& I think an APU with fast enough memory & special modifications can emulate the Cell.
So as he said, you have no idea what you're talking about.

I'll try to make it as simple as possible:
- it's not feasible to parallelize realtime emulation of a single core across multiple cores of lower frequency
- Cell SPEs are clocked at 3.2 GHz, the GPU "cores" in the highest end APUs currently available run at less than 1 GHz

That's not going into integer performance, instruction sets, local store or the ring bus.
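Durante's first bullet can be illustrated with a toy interpreter (hypothetical code, not real SPE emulation): each emulated instruction consumes the state produced by the previous one, so the loop forms a serial dependency chain that extra host cores cannot shorten.

```python
# Toy interpreter loop: every "instruction" reads the state written by
# the previous one, so step N cannot begin before step N-1 finishes.
# Adding more host cores does not shorten this chain.
def emulate(instructions, state=0):
    for op in instructions:
        state = op(state)   # serial dependency: the next op needs this result
    return state

program = [lambda s: s + 1, lambda s: s * 3, lambda s: s - 2]
result = emulate(program)   # ((0 + 1) * 3) - 2 = 1
```

To keep up with a 3.2 GHz guest core, the single host core running this loop has to retire emulated instructions at least as fast as the guest issues them, which is why per-core clock speed matters more here than core count.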
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
So as he said, you have no idea what you're talking about.

I'll try to make it as simple as possible:
- it's not feasible to parallelize realtime emulation of a single core across multiple cores of lower frequency
- Cell SPEs are clocked at 3.2 GHz, the GPU "cores" in the highest end APUs currently available run at less than 1 GHz

That's not going into integer performance, instruction sets, local store or the ring bus.

Well you can certainly re-write common Cell versions of functions (vector math, etc.) for the GPU. Frequency is not really important, you can throw 100s of stream processors at an operation and whoop the Cell in many instances.
 

Durante

Member
Well you can certainly re-write common Cell versions of functions (vector math, etc.) for the GPU. Frequency is not really important, you can throw 100s of stream processors at an operation and whoop the Cell in many instances.
Of course you can rewrite the program to run on an APU, at least for a lot of use cases. It would be a very sad state of affairs if this was not the case for a platform that's 7 years newer.

However, this has nothing to do with emulation - to emulate an architecture to the extent required for the proposed scenario (running PS3 games on PS4) you need to cover all the code that can potentially be thrown at the SPEs. Unless you somehow expect the emulator to analyse the SPE code in realtime, understand its semantics and replace it with APU code doing the same thing, but optimized for the GPU cores. That sounds like a fascinating research topic, but since it's pretty close to my area of expertise I'd be extremely surprised to see it happening. (And I'd implore whoever managed to do it to share their methods with the broader scientific community)
 

onQ123

Member
Sorry but I agree with StevieP, you don't understand how complicated emulation is.



No I'm not reading from my viewpoint, I'm using common sense and logic. You really don't need to type out another wall of text to repeat your opinion.

I understand you spend a lot of time finding these articles and links, but you should think about how they do and don't apply to next Gen before trying to make such assumptions.

So as he said, you have no idea what you're talking about.

I'll try to make it as simple as possible:
- it's not feasible to parallelize realtime emulation of a single core across multiple cores of lower frequency
- Cell SPEs are clocked at 3.2 GHz, the GPU "cores" in the highest end APUs currently available run at less than 1 GHz

That's not going into integer performance, instruction sets, local store or the ring bus.


I said with special modifications, meaning Sony knows what's needed to emulate the Cell with its SoC & they add the needed parts that would make it possible.


The same people doing all this talking about what can't be done are the same people who said that PS2 games could not be emulated on the PS3 without the PS2 parts.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Of course you can rewrite the program to run on an APU, at least for a lot of use cases. It would be a very sad state of affairs if this was not the case for a platform that's 7 years newer.

However, this has nothing to do with emulation - to emulate an architecture to the extent required for the proposed scenario (running PS3 games on PS4) you need to cover all the code that can potentially be thrown at the SPEs. Unless you somehow expect the emulator to analyse the SPE code in realtime, understand its semantics and replace it with APU code doing the same thing, but optimized for the GPU cores. That sounds like a fascinating research topic, but since it's pretty close to my area of expertise I'd be extremely surprised to see it happening. (And I'd implore whoever managed to do it to share their methods with the broader scientific community)

They had Cell CPU emulation before the PS3 came out. The question is whether they can emulate it at nearly the same speed for most games. I don't know the answer, but I expect many games don't use the SPUs very often and would be much easier than others. It is the same issue they have emulating the PS2 on the PS3: some games are easier than others due to the eDRAM and how it is used.
 
I said with special modifications, meaning Sony knows what's needed to emulate the Cell with its SoC & they add the needed parts that would make it possible.


The same people doing all this talking about what can't be done are the same people who said that PS2 games could not be emulated on the PS3 without the PS2 parts.

I also remember the same people saying it wouldn't be possible to emulate psp games on vita.
 

onQ123

Member
Let's say the APU in the PS4 is the AMD Trinity with 4 Piledriver cores clocked at 4 GHz & a pretty good integrated GPU that can help out with the processing. Who is to say there is no way this could emulate the Cell?


You already have 3 extra cores over the Cell's 1 PPU, + an integrated GPU.


I'm pretty sure the 3 extra cores & the GPU can take on the task of emulating the 6 SPUs that are used in PS3 games.


& there could be some special parts in the PS4 to help achieve the PS3 emulation without having The Cell.
 
I never said anything about the PSP, and besides, PSP emulation doesn't really apply here since PS3 games are far more complex and harder to emulate.

I know a certain other poster who likes to strut around in threads like these said it was impossible. Same kind of arguments. Games are too complex and the hardware isn't fast enough to emulate. I don't think ps3 support is a given, I just think Sony knows a hell of a lot more about what can and cannot be done with regards to emulation compared to the people posting here, since they know the ins and outs of the ps3 and its software and they're the ones designing the ps4.
 

onQ123

Member
I know a certain other poster who likes to strut around in threads like these said it was impossible. Same kind of arguments. Games are too complex and the hardware isn't fast enough to emulate. I don't think ps3 support is a given, I just think Sony knows a hell of a lot more about what can and cannot be done with regards to emulation compared to the people posting here, since they know the ins and outs of the ps3 and its software and they're the ones designing the ps4.

My Thoughts exactly
 
So technically, if PS4 came equipped with a 380W PSU once more, it could handle hardware rated for around 250W at peak power with ease.

Yes, but you wouldn't match it with that kind of hardware. A PSU is at its best when drawing about 50% of its peak, so for a 380W PSU you would ideally use it with a 190W-210W system.
 

androvsky

Member
I know a certain other poster who likes to strut around in threads like these said it was impossible. Same kind of arguments. Games are too complex and the hardware isn't fast enough to emulate. I don't think ps3 support is a given, I just think Sony knows a hell of a lot more about what can and cannot be done with regards to emulation compared to the people posting here, since they know the ins and outs of the ps3 and its software and they're the ones designing the ps4.

You're going to have to name names here, because I've always felt the PS3 should be able to handle most PS2 games just fine, and I don't think there was ever much doubt about the Vita emulating PSP games.

I do recall a Sony exec saying they added a few custom instructions to the ARM cores in the Vita to make PSP emulation easier, and there's no telling what Sony might do to ease emulation of PS3 games on the PS4. But when the discussion is framed as "Can the Cell's SPUs be emulated using a mid-range GPU with a huge number of relatively slow stream processors", the answer is going to be no, probably not.
 

i-Lo

Member
Yes, but you wouldn't match it with that kind of hardware. A PSU is at its best when drawing about 50% of its peak, so for a 380W PSU you would ideally use it with a 190W-210W system.

You say that, yet until July 2007 the PSU of the 360 was rated at 203W, and the power it drew on average while playing games was around 185W. That works out to around 90% of the limit.

I think Sony can design a next console with a max power draw of somewhere between 225 and 250W.
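The arithmetic behind these PSU figures is easy to check (a quick sketch; the wattages are the quoted numbers from the posts above, not verified specs):

```python
# Rough PSU headroom math for the figures discussed above.
def load_fraction(draw_watts, rated_watts):
    return draw_watts / rated_watts

# 360 launch era: ~185 W average draw on a 203 W-rated supply
assert round(load_fraction(185, 203), 2) == 0.91    # roughly 90% of the limit

# the ~50% "sweet spot" rule of thumb applied to a 380 W PSU
assert 380 * 0.5 == 190.0                           # ideal system draw in watts
```

So the launch 360 really did sit near 90% of its PSU rating, well above the 50% sweet spot being argued for here.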

I am just wondering how close to THIS can we get with the next gen hardware (minus the IQ i.e. anti-aliasing and etc)?
 

onQ123

Member
You're going to have to name names here, because I've always felt the PS3 should be able to handle most PS2 games just fine, and I don't think there was ever much doubt about the Vita emulating PSP games.

I do recall a Sony exec saying they added a few custom instructions to the ARM cores in the Vita to make PSP emulation easier, and there's no telling what Sony might do to ease emulation of PS3 games on the PS4. But when the discussion is framed as "Can the Cell's SPUs be emulated using a mid-range GPU with a huge number of relatively slow stream processors", the answer is going to be no, probably not.

that was never the question.
 

Durante

Member
I also remember the same people saying it wouldn't be possible to emulate psp games on vita.
I explicitly said that it would be possible to emulate PSP games on Vita. I also said that it's not impossible to do pure software emulation of some PS2 games on PS3, unlike the people you are referring to. Emulating Cell on an APU (with fewer than 8 CPU cores) is an entirely different matter. It's not possible.

I'm pretty sure the 3 extra cores & the GPU can take on the task of emulating the 6 SPUs that are used in PS3 games.
There is no way to meaningfully parallelize the realtime emulation of a single core.
 

onQ123

Member
I explicitly said that it would be possible to emulate PSP games on Vita. I also said that it's not impossible to do pure software emulation of some PS2 games on PS3, unlike the people you are referring to. Emulating Cell on an APU (with fewer than 8 CPU cores) is an entirely different matter. It's not possible.

There is no way to meaningfully parallelize the realtime emulation of a single core.

What if 8 SPUs are part of the SoC?
 