
vg247-PS4: new kits shipping now, AMD A10 used as base, final version next summer

So far, it is believed that PS4 will have an AMD APU [4 cores/8 threads, GPU with around 2 TFLOPS]. The amount of RAM is undecided. No dedicated GPU, no split RAM pools, no multiple HDMI ports. :D

But... we don't know anything specific. The same goes for the X720, which will have a more complicated structure [low-powered ARM SoC, high-powered gaming APU, maybe even a dedicated GPU, a shitton of RAM dedicated to the OS].
Geez, I thought that you would now believe sweetvar26 when he said both Orbis and Durango were using Jaguar CPUs. Two CPU packages were mentioned for Orbis, for a total of 8 Jaguar CPUs. I don't remember if he mentioned an amount for Durango.

4 cores/8 threads (if Jaguar, that's only one CPU package, and a package usually has 4 cores) and a high-powered gaming APU are inconsistent with Jaguar CPUs running at 1.6 GHz.

Durango is supposed to have HDMI in and out, and AMD always has Eyefinity, which requires a DisplayPort interface; that can directly drive LCD head-mounted glasses and multiple HDMI monitors with HDMI adaptors. A DisplayPort hub/adaptor is supposed to be released in 2013.

I can't imagine Sony not supporting the same as Durango (HDMI in and out).
 

z0m3le

Banned
Geez, I thought that you would now believe sweetvar26 when he said both Orbis and Durango were using Jaguar CPUs. Two CPU packages were mentioned for Orbis, for a total of 8 Jaguar CPUs. I don't remember if he mentioned an amount for Durango.

4 cores/8 threads (if Jaguar, that's only one CPU package, and a package usually has 4 cores) and a high-powered gaming APU are inconsistent with Jaguar CPUs running at 1.6 GHz.

Durango is supposed to have HDMI in and out, and AMD always has Eyefinity, which requires a DisplayPort interface; that can directly drive LCD head-mounted glasses and multiple HDMI monitors with HDMI adaptors. A DisplayPort hub/adaptor is supposed to be released in 2013.

I can't imagine Sony not supporting the same as Durango (HDMI in and out).
You should probably correct them about multiple threads on a core. No AMD processor has this technology; in fact, SMT is something AMD mocks, calling Bulldozer's design "hyperthreading done right". Jaguar is 1 thread per core, just like every other AMD CPU in existence, as they have no multithreading technology, especially on the short pipelines of Jaguar.
 

DieH@rd

Banned
You should probably correct them about multiple threads on a core. No AMD processor has this technology; in fact, SMT is something AMD mocks, calling Bulldozer's design "hyperthreading done right". Jaguar is 1 thread per core, just like every other AMD CPU in existence, as they have no multithreading technology, especially on the short pipelines of Jaguar.

Yes, AMD has CPU modules, each with two hardware-accelerated threads.
 

z0m3le

Banned
Yes, AMD has CPU modules, each with two hardware-accelerated threads.
An AMD module consists of 2 conjoined cores, so 4 cores is 2 modules and 8 cores is 4 modules; if you buy a quad-core Bulldozer right now, you'll be buying a 2-module processor. This technically leaves them with fewer resources than a quad core without CMT, as those cores share no resources.
A Bulldozer module has two CPU cores that share an FPU. That has nothing to do with SMT. It could technically even be combined with SMT.
Exactly, except AMD has no SMT technology, so it's a 1:1 ratio of cores to threads.
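To see the distinction in practice, here's a minimal sketch using Python with the third-party psutil library (my own illustration, nothing console-specific):

import psutil  # third-party: pip install psutil

physical = psutil.cpu_count(logical=False)  # physical cores
logical = psutil.cpu_count(logical=True)    # hardware threads the OS schedules

# On an SMT chip (e.g. Intel Hyper-Threading) logical is 2x physical;
# on a Jaguar-style AMD design the ratio would be 1:1.
print(f"{physical} cores, {logical} threads")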
 

Grim1ock

Banned
I will truly laugh in despair if Sony goes from having a 3.2 GHz processor to one that is less than half that.

I hope the team at Polyphony and Kaz somehow have a say on the specs of the system instead of the clowns who run SCEI.
 

RoboPlato

I'd be in the dick
We had heard rumors that the PS4 will have a dedicated GPU.
I think those rumors pertain to dev kits. My bet is that the dev kits have a discrete GPU added in to simulate the target specs of the final hardware. I bet the APU will be the only GPU in the final system.

I will truly laugh in despair if Sony goes from having a 3.2 GHz processor to one that is less than half that.

I hope the team at Polyphony and Kaz somehow have a say on the specs of the system instead of the clowns who run SCEI.
Even if the clock speed is half that, they'll still wipe the floor with an old processor in game performance.
 

Grim1ock

Banned
Even if the clock speed is half that, they'll still wipe the floor with an old processor in game performance.

I doubt that. One of the reasons the SPUs were so multifunctional and effective was their clock speed. You think things like MLAA and deferred rendering would have been effective if Cell was at 1.5 GHz?

Clock speed does matter.
 

RoboPlato

I'd be in the dick
I doubt that. One of the reasons the SPUs were so multifunctional and effective was their clock speed. You think things like MLAA and deferred rendering would have been effective if Cell was at 1.5 GHz?

Clock speed does matter.
Yeah, but that was an older architecture, and Cell's a different beast. Modern architectures are WAY more efficient than a 2005 CPU.
 

Pranay

Member
I think those rumors pertain to dev kits. My bet is that the dev kits have a discrete GPU added in to simulate the target specs of the final hardware. I bet the APU will be the only GPU in the final system.


Even if the clock speed is half that, they'll still wipe the floor with an old processor in game performance.

I really hope not
 

i-Lo

Member
I think those rumors pertain to dev kits. My bet is that the dev kits have a discrete GPU added in to simulate the target specs of the final hardware. I bet the APU will be the only GPU in the final system.


Even if the clock speed is half that, they'll still wipe the floor with an old processor in game performance.

Actually, I was looking into this, and having an integrated GPU as capable as a dedicated GPU like Pitcairn currently seems to be out of the realm of possibility. Remember that an APU is a single chip, not a CPU and GPU on a larger die.

Expect the dedicated GPU to be used only during gaming sessions. That means great power savings during all other activities. It's going to be in line with the regulations imposed by the state of California.
 
Actually, I was looking into this, and having an integrated GPU as capable as a dedicated GPU like Pitcairn currently seems to be out of the realm of possibility. Remember that an APU is a single chip, not a CPU and GPU on a larger die.

Expect the dedicated GPU to be used only during gaming sessions. That means great power savings during all other activities. It's going to be in line with the regulations imposed by the state of California.
You might not need a second discrete GPU to get power savings.

http://blogs.amd.com/play/2012/01/30/power-efficiency-is-making-a-difference/

AMD ZeroCore Power technology is a game-changer which cuts long-idle power down to a small fraction of what was previously achievable, which is something we know our enthusiast audience will truly appreciate.

So how does it work?

When GPUs aren’t tasked with heavy graphics or compute workloads, they are intelligently managed to power down to their minimum working states through a number of techniques such as engine/memory clock reductions, reduced voltages and power gating. This works to minimize power during static screen and other very low workloads where graphics cards can idle at less than their peak power.

However, in the long idle state – where the display is blanked but the rest of the system remains in an active power state – AMD ZeroCore Power technology takes GPU efficiency to a whole new level. When systems enter the long idle state and applications are not using background GPU resources, AMD ZeroCore power technology powers down all major functional blocks of the discrete GPU and reduces overall chip power consumption to well under one Watt. And while core GPU blocks (such as the compute units, display blocks, video engines and memory interfaces) are shut down and consuming zero power, the rest of the system is still available to do what it needs to do. So if you leave your PC on to share/stream/serve content, to simplify remote access or to simply make sure it’s ready to go at the flick of a mouse, AMD ZeroCore Power technology makes it happen with near-zero graphics power.
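As a rough mental model of the states described above (my own sketch with made-up wattages, not AMD's actual firmware logic):

# Rough model of the GPU power states described in AMD's post.
# All wattage figures are illustrative guesses, not measured values.
POWER_STATES = {
    "active":    ("full clocks and voltage under load",                 120.0),
    "idle":      ("clocks/voltage reduced, partial power gating",        15.0),
    "long_idle": ("ZeroCore: display blanked, major blocks powered off",  0.9),
}

def gpu_state(load: float, display_on: bool) -> str:
    """Pick a power state from GPU load (0.0-1.0) and display status."""
    if load > 0.05:
        return "active"
    return "idle" if display_on else "long_idle"

for load, disp in [(0.9, True), (0.0, True), (0.0, False)]:
    state = gpu_state(load, disp)
    desc, watts = POWER_STATES[state]
    print(f"load={load:.0%}, display={'on' if disp else 'off'}: {state} (~{watts} W)")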
 

RoboPlato

I'd be in the dick
Actually, I was looking into this, and having an integrated GPU as capable as a dedicated GPU like Pitcairn currently seems to be out of the realm of possibility. Remember that an APU is a single chip, not a CPU and GPU on a larger die.

Expect the dedicated GPU to be used only during gaming sessions. That means great power savings during all other activities. It's going to be in line with the regulations imposed by the state of California.

Really? I was under the impression that a Kabini-line APU could probably fit a Pitcairn or something very similar into it.
 

leroidys

Member
I don't think he realizes the IR spot system of Kinect 1 will not work well at high resolution. Microsoft has to start all over again. For example, eye tracking requires high-resolution IR video and detection, not movement against IR spots. Lots of things were not possible because of the low-bandwidth USB 2 and lack of memory in the Xbox 360 and PS3. Both are starting over with high-resolution IR depth cameras. What I read is that Microsoft is very advanced in voice recognition.

They are currently opening a 1000-engineer department specifically for voice recognition and speech software, so yes, but I doubt anything beyond what they got from their 2007 Tellme acquisition would make it into the 720 in time.



You're right, I don't know much about the tech employed for Kinect, but Microsoft has been working with it for much longer than Sony. Microsoft will not have to "start over" for an HD Kinect. Eye tracking is trivial compared to tracking a human body in 3D space. If you have any papers on Kinect's challenges moving to HD, I would love to read them :)

An engineer-driven tech company like Sony can't compete?

Software engineering - and no, I don't think they can compete. Microsoft has an R&D budget of nine billion dollars.
 

i-Lo

Member
Really? I was under the impression that a Kabini-line APU could probably fit a Pitcairn or something very similar into it.

Well, if a drastic change is possible and Pitcairn-like performance can be integrated with a CPU to create an APU, then I'd be pleasantly surprised. It's just that I haven't seen anything this year to suggest it's possible. Then again, we are talking about an exclusive product line with its own unique needs and an inevitable custom design.

So, if they can pull it off, I'll be wide-eyed and excited.
 

Ashes

Banned
They are currently opening a 1000-engineer department specifically for voice recognition and speech software, so yes, but I doubt anything beyond what they got from their 2007 Tellme acquisition would make it into the 720 in time.



You're right, I don't know much about the tech employed for Kinect, but Microsoft has been working with it for much longer than Sony. Microsoft will not have to "start over" for an HD Kinect. Eye tracking is trivial compared to tracking a human body in 3D space. If you have any papers on Kinect's challenges moving to HD, I would love to read them :)



Software engineering - and no, I don't think they can compete. Microsoft has an R&D budget of nine billion dollars.

Microsoft loses in plenty of markets. You must know this. Surely.
 

z0m3le

Banned
Well, if a drastic change is possible and Pitcairn-like performance can be integrated with a CPU to create an APU, then I'd be pleasantly surprised. It's just that I haven't seen anything this year to suggest it's possible. Then again, we are talking about an exclusive product line with its own unique needs and an inevitable custom design.

So, if they can pull it off, I'll be wide-eyed and excited.

This is likely why they are using Jaguar cores: a 4-core Jaguar block would be about 12.4 mm^2 (not including interconnects). Adding that to Pitcairn (212 mm^2) and allowing some overhead for Jaguar, let's just say 240 mm^2, which is a reasonable die size.

Pitcairn running @ 800 MHz (2 TFLOPS) would use ~140-150 watts; add Jaguar and you are looking at ~160 watts for just the APU here. This configuration, btw, is more powerful than a lesser APU and a separate GPU, because it's cheaper, uses less energy, and loses no performance to communication between the two GPUs.

Pitcairn seems to be the top you could expect, and it would be a ~200-watt console when you combine all the other components, which means another big PS3/360-sized console. Considering the 720's recent rumor of a 1.6 GHz CPU, it wouldn't surprise me if they are designing a ~100-watt console to sit next to the TV and still give a nice boost in graphics; think HD 7770, which is 80 watts and gives 1.4 TFLOPS.
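For reference, the "2 TFLOPS" figure falls out of the standard peak-throughput formula; a quick Python sketch (using Pitcairn's public desktop shader count, any console part would differ):

# Peak single-precision throughput for a GCN GPU:
# shaders x 2 ops per clock (fused multiply-add) x clock in GHz = GFLOPS
def peak_gflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz

print(peak_gflops(1280, 0.8))  # full Pitcairn @ 800 MHz -> 2048.0 (~2 TFLOPS)
print(peak_gflops(1280, 1.0))  # HD 7870 @ 1 GHz         -> 2560.0 (~2.56 TFLOPS)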
 
I don't think so. For all of its flaws, Kinect was a massive leap forward in terms of commercial 3D motion-tracking software. I don't think that Sony's software engineers can compete, which is why they will double down on something like Move + VR-type enhancements like eye tracking rather than go MS's route of controllerless gaming.

You're vastly overrating Microsoft's Kinect technology. I was far more impressed with Move than I was with Kinect.
 

leroidys

Member
You're vastly overrating Microsoft's Kinect technology. I was far more impressed with Move than I was with Kinect.

You misunderstand me. Move is my favorite motion-control method this gen and seriously impressive. I'm saying that Sony can't/shouldn't compete with Microsoft on the 3D motion-tracking tech they have plowed resources into.

In what way am I overrating Kinect?

Microsoft loses in plenty of markets. You must know this. Surely.

Yes, with Xbox and Kinect being (one of) their greatest recent success(es).

I'm just saying that Sony is not going to be the company that beats them at Kinect-like technology.
 

RaijinFY

Member
This is likely why they are using Jaguar cores: a 4-core Jaguar block would be about 12.4 mm^2 (not including interconnects). Adding that to Pitcairn (212 mm^2) and allowing some overhead for Jaguar, let's just say 240 mm^2, which is a reasonable die size.

Pitcairn running @ 800 MHz (2 TFLOPS) would use ~140-150 watts; add Jaguar and you are looking at ~160 watts for just the APU here. This configuration, btw, is more powerful than a lesser APU and a separate GPU, because it's cheaper, uses less energy, and loses no performance to communication between the two GPUs.

Pitcairn seems to be the top you could expect, and it would be a ~200-watt console when you combine all the other components, which means another big PS3/360-sized console. Considering the 720's recent rumor of a 1.6 GHz CPU, it wouldn't surprise me if they are designing a ~100-watt console to sit next to the TV and still give a nice boost in graphics; think HD 7770, which is 80 watts and gives 1.4 TFLOPS.

No. The 7870 (which is a full-fledged Pitcairn at 1 GHz) doesn't go over 120 W. So, assuming they go with an 18-CU version at 800 MHz, it would probably stay just under 100 W at peak usage.
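For what it's worth, the usual first-order estimate scales dynamic power linearly with clock and quadratically with voltage; a sketch with hypothetical voltage figures (nothing here is a measured number):

# First-order dynamic power scaling: P ~ f * V^2
# (switched capacitance assumed constant; real chips add static leakage)
def scaled_power(p0_watts: float, f0_ghz: float, f_ghz: float,
                 v0: float, v: float) -> float:
    return p0_watts * (f_ghz / f0_ghz) * (v / v0) ** 2

# A ~120 W Pitcairn at 1.0 GHz / 1.21 V, downclocked to 800 MHz
# with a modest voltage drop to 1.10 V:
print(round(scaled_power(120, 1.0, 0.8, 1.21, 1.10), 1))  # -> 79.3 W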
 

Tripolygon

Banned
You misunderstand me. Move is my favorite motion-control method this gen and seriously impressive. I'm saying that Sony can't/shouldn't compete with Microsoft on the 3D motion-tracking tech they have plowed resources into.
I'm just saying that Sony is not going to be the company that beats them at Kinect-like technology.

You are underestimating Sony

2000 - 2004 Sony R&D

1. Head Tracking http://www.youtube.com/watch?v=pfbMxcJHJAw
2. 3D Object Mapping http://www.youtube.com/watch?v=i9AViYmfWR4
3. 3D Skeletal Tracking ala Kinect http://www.youtube.com/watch?v=jYHr0I-iFHE

2009

1. PlayStation Vision Library https://www.youtube.com/watch?v=gOtPVof2K94
2. PlayStation Voice Recognition https://www.youtube.com/watch?v=5TaUsSy-f0Y

2011 - 2012

1. Sony Smart AR
https://www.youtube.com/watch?v=XCEp7udJ2n4
https://www.youtube.com/watch?v=MTPnAlzdHzE
https://www.youtube.com/watch?v=TLtHIodo_-0

2. Head and Facial Feature Tracking
https://www.youtube.com/watch?v=NRo2z6LtyJo

They have many patents on these things from their different departments, and I'm sure they can create something as good as or better than Kinect, just like they did with the Wii Remote vs. PS Move. Whether it will catch on and sell well is something totally different.
 
This is likely why they are using Jaguar cores: a 4-core Jaguar block would be about 12.4 mm^2 (not including interconnects). Adding that to Pitcairn (212 mm^2) and allowing some overhead for Jaguar, let's just say 240 mm^2, which is a reasonable die size.

Pitcairn running @ 800 MHz (2 TFLOPS) would use ~140-150 watts; add Jaguar and you are looking at ~160 watts for just the APU here. This configuration, btw, is more powerful than a lesser APU and a separate GPU, because it's cheaper, uses less energy, and loses no performance to communication between the two GPUs.

Pitcairn seems to be the top you could expect, and it would be a ~200-watt console when you combine all the other components, which means another big PS3/360-sized console. Considering the 720's recent rumor of a 1.6 GHz CPU, it wouldn't surprise me if they are designing a ~100-watt console to sit next to the TV and still give a nice boost in graphics; think HD 7770, which is 80 watts and gives 1.4 TFLOPS.

You are forgetting something: you can't just add up the parts and say they're going to draw that many watts.
For example, the APU and GPU both have RAM adding to the watt count, plus console makers are going to cut or add certain things they want.
They can also lower the clock speeds if they want to.
 
You misunderstand me. Move is my favorite motion-control method this gen and seriously impressive. I'm saying that Sony can't/shouldn't compete with Microsoft on the 3D motion-tracking tech they have plowed resources into.

In what way am I overrating Kinect?



Yes, with Xbox and Kinect being (one of) their greatest recent success(es).

I'm just saying that Sony is not going to be the company that beats them at Kinect-like technology.

Oh, I see what you're saying.

I thought you were suggesting that Sony can't come out with a competitive motion controller.

I agree - Microsoft has invested a lot of money into the technology that goes into Kinect, and it would be difficult for Sony to try to compete in that space.

That being said, Sony has also established their own method of motion control, and they should simply expand on and refine that. Move's biggest problem is that it wasn't there from Day 1, packed in with the system.

With it packed in with the system, it will receive a lot more support, and that will hopefully yield exciting core AAA games that make use of the tech.
 

z0m3le

Banned
No. The 7870 (which is a full-fledged Pitcairn at 1 GHz) doesn't go over 120 W. So, assuming they go with an 18-CU version at 800 MHz, it would probably stay just under 100 W at peak usage.

Don't know where you are getting that from; here: http://en.wikipedia.org/wiki/Compar...g_units#Southern_Islands_.28HD_7xxx.29_series

HD 7870 max TDP is 175 watts, which goes higher with stuff like FurMark, but of course this is just the TDP you would look at when adding it as a component.

You are forgetting something: you can't just add up the parts and say they're going to draw that many watts.
For example, the APU and GPU both have RAM adding to the watt count, plus console makers are going to cut or add certain things they want.
They can also lower the clock speeds if they want to.

As for my estimates of the power usage, that is all they are; you have about ~160 watts there between the HD 7870 + 4-core Jaguar @ 2 GHz. Not saying it's exact, but you will hit around 200 watts for a system with those components.

Again, APU + GPU is a ridiculous design; 1 much faster GPU + CPU using an MCM, or 1 APU with that big GPU on die, would outperform it in every way.
 

RaijinFY

Member
Don't know where you are getting that from; here: http://en.wikipedia.org/wiki/Compar...g_units#Southern_Islands_.28HD_7xxx.29_series

HD 7870 max TDP is 175 watts, which goes higher with stuff like FurMark, but of course this is just the TDP you would look at when adding it as a component.



As for my estimates of the power usage, that is all they are; you have about ~160 watts there between the HD 7870 + 4-core Jaguar @ 2 GHz. Not saying it's exact, but you will hit around 200 watts for a system with those components.

Again, APU + GPU is a ridiculous design; 1 much faster GPU + CPU using an MCM, or 1 APU with that big GPU on die, would outperform it in every way.


TDP ≠ max power usage!

Just look here:

http://www.techpowerup.com/reviews/AMD/HD_7850_HD_7870/24.html
 

longdi

Banned
You mean the Wii mote? :p

Believe it or not, I recently played with Move and Sports Champions. Sony's table tennis game is way more impressive than Wii Sports; the sensor is very sharp and fast, almost 1:1. I wouldn't mind if PS4 came with an improved Move controller as standard. It is what I call motion sensing done right.
 
That being said, Sony has also established their own method of motion control, and they should simply expand on and refine that. Move's biggest problem is that it wasn't there from Day 1, packed in with the system.

With it packed in with the system, it will receive a lot more support, and that will hopefully yield exciting core AAA games that make use of the tech.

I think Sony needs to be really careful when it comes to motion controls next gen. They need to go around to developers and talk to them about the tech. Don't ask them if they think it's cool; ask them if they actually want it. Because if they don't want it, then they obviously aren't going to bother to implement it in their games, so it wouldn't matter whether it's included or not. Get honest answers, not the usual bullshit shown at conferences of developers saying "Oh, this is amazing" and then never releasing a single game for it.

I'll be perfectly honest: I think the whole movement toward motion controls is done. The Wii dried up, and Kinect shot out of the gate and then dropped like a rock. Move never really did much of anything. Yeah, it apparently sold well, but the support is abysmal and you rarely see anyone talk about it now (much like other motion controls). So Sony needs to make sure they don't force it if developers don't want it, because it'll only harm them by unnecessarily increasing the price of what will likely be an already expensive console.
 

Vagabundo

Member
I've been debating building an HTPC versus getting any of the next-gen consoles, and I came across this.

http://www.gamersnexus.net/pc-builds/46-pcbuildupg/977-cheap-gaming-htpc-black-friday

Kind of similar to this rumor, no? Found it pretty cheap for being off-the-shelf parts. Figure you can knock a few bucks off the price since it is off-the-shelf parts, throw in the discrete GPU, and you've got the rumored PS4, or somewhat close to it?

I really don't think that will be close in performance. Although come back in a year or two and you'll have a kick-ass HTPC system, methinks.
 
Sweetvar26 said months ago that both Durango and Orbis would have 2 Jaguar CPU packages = 8 CPUs, and a recent tweet leak says the CPU in Durango is 1.6 GHz, which is Jaguar's CPU speed in Kabini.

https://twitter.com/marcan42 said:
If you want more evidence that MHz isn't everything, a little birdie points out that Durango (Xbox 720) is specc'ed to have a 1.6GHz CPU.

Kabini is a notebook design on 28 nm, and I believe low-performance/low-power bulk silicon @ about 35 watts max, not 350 watts. IF everything is in a custom monolithic chip for game consoles, then it has a max power of 65 watts. It also can't be made by anyone except IBM, as at 28 nm GloFo is still gate-first.

There are two pluses for going Kabini: the base design has a redesigned, wider memory interface, and it already has Jaguar CPUs. There are several problems, however, as it is a low-power design on low-power bulk; the internal power gating as well as the silicon will have problems above 65 watts, and I think a next-generation console will have a max TDP of about 90-130 watts.

OK, how about Kaveri at 28 nm? Same issues with gate-first, the memory interface sucks, and about the only thing going for it is that it's on high-performance bulk silicon with higher-power gating. So take Kaveri, replace the CPUs with Jaguar, replace the memory interface with a newer one like Kabini's, and increase the CUs in the GPU. Gee, it looks like a Kabini on high-performance bulk silicon.

Any way you look at it, there is nothing off the shelf that fits with only minor modification. In my opinion, next generation is going to be more custom than the investor conference would imply.
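To make that 90-130 watt ceiling concrete, here's a toy component budget in Python; every number below is a placeholder guess of mine, not a leaked figure:

# Toy next-gen console power budget (all numbers are placeholder guesses).
BUDGET_WATTS = 130
components = {
    "APU (CPU + GPU)": 80,
    "RAM": 10,
    "optical drive + HDD": 10,
    "I/O, WiFi, USB, fans": 10,
    "PSU conversion losses": 15,
}

total = sum(components.values())
for name, watts in components.items():
    print(f"{name:<24} {watts:>3} W")
verdict = "within" if total <= BUDGET_WATTS else "over"
print(f"{'total':<24} {total:>3} W ({verdict} the {BUDGET_WATTS} W budget)")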

We also have this leak:

http://www.neogaf.com/forum/showpost.php?p=44885833&postcount=1229 said:
The next Xbox and PlayStation are not "GPU-centric." There are pretty significant things happening with their processor architectures
Jaguar isn't significant, but GCN + Jaguar with wide IO could be; or a recent post by Seronx could explain it:

Same Seronx who speculated previously on PS4 and Xbox 720. He is not an insider, and these are not leaks, just speculation by someone who follows rumors and is very informed about AMD products.

http://forums.anandtech.com/showpost.php?p=34282049&postcount=191 said:
Playstation 4 is using Thebe-J, which hasn't been finalized yet, nor is it related to the Jaguar, Trinity, or Kaveri architectures. The only one showing any signs of finalization is Xbox's Kryptos, which is a >450 mm² chip. To get back to Thebe-J: it was delayed from side-by-side development with Trinity to side-by-side development with Kaveri.

I assume that if they are going to use Jaguar, it is going to be in a big.LITTLE formation: the Jaguar portion controls all of the system/OS stuff that generally isn't compute-intensive, while the Thebe portion handles all of the gaming/HPC stuff that generally is compute-intensive. Since the performance part of the PlayStation Orbis was upgraded each year, it is safe to assume that they are going for an APU with these specs.


First:
A8-3850 + HD 7670
400 VLIW5 + 480 VLIW5 => 880 VLIW5 -> VLIW4 => 704

Second:
A10-5700 + HD 7670
384 VLIW4 + 480 VLIW5 => 864 VLIW4/5 -> VLIW4 => 768

I have heard that the third generation of the test Orbis uses an APU with GCN2.
Unknown APU + HD 8770
384 GCN2 + 768 GCN2 -> 1152 GCN2

It is assumed that the APU only has four cores because AMD doesn't plan to increase the core count, other than the GPU cores, from now on.
If I'm reading this correctly, then 1-2 Jaguar CPU packages (4-8 x86 Jaguar CPUs) and 3-2 higher-performance CPU packages (6-4 CPUs); the "side-by-side development with Kaveri" might mean the same CPUs as in Kaveri. Jaguar (lower-power, more efficient) CPUs control the OS (LITTLE), and Kaveri-type CPUs (big) would be used where needed. 2014 AMD designs choose the best CPU for a task; the example given by AMD is choosing to use the GPU or CPU for compute tasks based on which would do the job better. This can be extended to use more power-efficient CPUs for OS tasks (LITTLE) and desktop CPUs for games (big). Power usage is going to be a big issue due to mandated game-console power-usage laws.
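As a sanity check, the shader-count conversions in the quoted post work out if you assume the usual 4/5 scaling from VLIW5 to VLIW4 (my reading of the math, not Seronx's stated method):

# VLIW5 shaders counted at 4/5 of their number when normalized to VLIW4
# (the fifth slot was rarely filled in practice).
def vliw4_equivalent(vliw4: int, vliw5: int) -> int:
    return vliw4 + vliw5 * 4 // 5

print(vliw4_equivalent(0, 880))    # first kit:  880 VLIW5 -> 704
print(vliw4_equivalent(384, 480))  # second kit: 384 VLIW4 + 480 VLIW5 -> 768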

Side-by-side development: I assume test chips on a few wafers at a time have been made every 2-3 months or so for more than a year. These wafers probably contain multiple designs at the same time, as long as they are compatible with the same process. For example, Thebe-J chips would be included with Kaveri and other AMD projects, like discrete GPUs, that all use the same process (low-power or high-power silicon @ 28 nm).

I don't know how accurate Seronx might be, but his speculation does allow for a wider view of next-generation designs.

The above is speculation, and I can see where he is getting this: Thebe-Jaguar = big.LITTLE.

I don't think he has any information other than speculation based on the hyphen between Thebe and Jaguar. It disagrees with sweetvar26, in that he said 2 Jaguar packages with cache, which leaves room for 2 more CPU packages; I earlier speculated those might be two 1PPU+3SPU CPU packages, which, if more modern versions, might be better at game code than Kaveri CPUs. Kaveri CPUs & Jaguar CPUs is the other possibility: a mix of low-power and high-power x86 CPUs for different jobs.

Edit: Are these "x86" CPUs to use ONLY AMD's 64-bit extension to x86? Is there any need to support older Intel-copyrighted 16-bit x86?

HSA IL would require x86 CPUs for AMD libraries for the virtual engine, but game code is more to-the-metal. The die sizes of all these CPUs & CPU packages are similar, i.e. 4 Jaguar CPUs in a CPU package = 1 Kabini x86 CPU package = 1PPU+3SPU MSA CPU package.

Rumors of Sony not having the ARM A5 for TrustZone would support Sony still using SPEs with an encryption key buried in hardware, similar to the PS3. Remember developers commenting on the PS4 being harder to develop for than the Xbox 720.

PS4 being harder to develop for than the Xbox 720 does not make sense if the PS4 is to be the simpler, less powerful design. I'd discount most rumors stating that one is more powerful than the other just on principle... it could be true, but I think we can see what's generating a few of these "leaks" that have outrageous claims of 350 watts.
 
You are underestimating Sony

2000 - 2004 Sony R&D

1. Head Tracking http://www.youtube.com/watch?v=pfbMxcJHJAw
2. 3D Object Mapping http://www.youtube.com/watch?v=i9AViYmfWR4
3. 3D Skeletal Tracking ala Kinect http://www.youtube.com/watch?v=jYHr0I-iFHE

2009

1. PlayStation Vision Library https://www.youtube.com/watch?v=gOtPVof2K94
2. PlayStation Voice Recognition https://www.youtube.com/watch?v=5TaUsSy-f0Y

2011 - 2012

1. Sony Smart AR
https://www.youtube.com/watch?v=XCEp7udJ2n4
https://www.youtube.com/watch?v=MTPnAlzdHzE
https://www.youtube.com/watch?v=TLtHIodo_-0

2. Head and Facial Feature Tracking
https://www.youtube.com/watch?v=NRo2z6LtyJo

They have many patents on these things from their different departments, and I'm sure they can create something as good as or better than Kinect, just like they did with the Wii Remote vs. PS Move. Whether it will catch on and sell well is something totally different.

I hope whatever Sony is working on also tracks individual fingers, like Leap Motion. Their new motion-control tech shouldn't limit itself to gaming.
 

mrklaw

MrArseFace
I think Sony needs to be really careful when it comes to motion controls next gen. They need to go around to developers and talk to them about the tech. Don't ask them if they think it's cool; ask them if they actually want it. Because if they don't want it, then they obviously aren't going to bother to implement it in their games, so it wouldn't matter whether it's included or not. Get honest answers, not the usual bullshit shown at conferences of developers saying "Oh, this is amazing" and then never releasing a single game for it.

I'll be perfectly honest: I think the whole movement toward motion controls is done. The Wii dried up, and Kinect shot out of the gate and then dropped like a rock. Move never really did much of anything. Yeah, it apparently sold well, but the support is abysmal and you rarely see anyone talk about it now (much like other motion controls). So Sony needs to make sure they don't force it if developers don't want it, because it'll only harm them by unnecessarily increasing the price of what will likely be an already expensive console.


It is true (and also true for Kinect, IMO). Even Nintendo is backing off a bit - the GamePad is basically an advanced Sixaxis, motion controls but not really gestures - and they don't include a Wiimote in the box, so you can't design a game to rely on it.
 

gofreak

GAF's Bob Woodward
Stumbled across a 2012 paper on the application of eye tracking to video games (aiming, specifically), co-authored by one of SCEJ's R&D team.

https://www.waset.org/journals/waset/v65/v65-212.pdf

For context, in the last 18 months or so SCEA R&D has filed 6+ eye-tracking-related patents.

It seems, though, like it would be a very difficult thing to do at living-room distances? I don't know. I don't know if it can be done for PS4. But it certainly seems like it's been an interesting candidate technology for Sony, something they've been evaluating quite heavily.
 
Eye tracking is interesting for several reasons.

One of them being 'ad views'. Imagine there is an advertisement. Sony will actually be able to say 'this many people have seen your advertisement, and xx% of them maintained their gaze there for x seconds. You owe us xx monies, please.'

It could also provide some interesting data for game enhancement/improvement. If no one is actually looking at a certain part of the UI all that much (e.g. the minimap), then in the next installment it could be switched to off by default.
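A minimal sketch of how that dwell-time metric might be computed from a gaze stream (the region, sample rate, and function names are all my own assumptions):

# Count how long a gaze point dwells inside an on-screen region,
# assuming gaze samples arrive at a fixed rate (30 Hz here).
SAMPLE_HZ = 30

def dwell_seconds(gaze_points, region):
    """gaze_points: iterable of (x, y); region: (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = region
    hits = sum(1 for x, y in gaze_points if x0 <= x <= x1 and y0 <= y <= y1)
    return hits / SAMPLE_HZ

ad_region = (100, 50, 300, 150)                 # hypothetical billboard rectangle
samples = [(150, 90)] * 60 + [(500, 400)] * 30  # 2 s on the ad, 1 s elsewhere
print(dwell_seconds(samples, ad_region))        # -> 2.0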
 

gofreak

GAF's Bob Woodward
I think it's interesting as a component of a broader system, for actual game interaction.

Like, with voice input, characters in a game can know when you are talking to them by who you are looking at.

Or with finer-grained finger or hand interaction with virtual objects. Being able to direct your interaction to specific objects by looking at them.

Or, even, for aiming.

I don't see it as one thing on its own though, I think it's another way for the game to know what you are trying to do, in concert with other things.

This is all dependent on it actually working well, of course. I don't know what state the tech is in, though, particularly for living-room distances.
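As a toy example of the 'who you're looking at' idea (an entirely hypothetical routine, nothing from Sony's R&D):

import math

# Route a voice command to the NPC nearest the player's gaze point.
def gaze_target(gaze_xy, npcs, max_dist=80.0):
    """npcs: dict of name -> (x, y) screen position; returns a name or None."""
    gx, gy = gaze_xy
    best, best_dist = None, max_dist
    for name, (x, y) in npcs.items():
        dist = math.hypot(x - gx, y - gy)
        if dist < best_dist:
            best, best_dist = name, dist
    return best

npcs = {"blacksmith": (200, 140), "guard": (520, 300)}
print(gaze_target((210, 150), npcs))  # -> blacksmith; the command goes to them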
 