
PS4's GPU customization revealed (paging Jeff)

gofreak

GAF's Bob Woodward
As far as I can tell, there's nothing really special in that info. If anything it reveals the Onion/Garlic memory buses are a standard AMD APU feature rather than any kind of PS4 customization.

Onion+ is new (the GPU cache bypass), and in fairness is the only thing Cerny pointed out as a customisation they did on the bus side.

Obviously in the GPU more generally there were the other ACE changes, although I'm not entirely sure if that was Sony-cooked, or is something AMD is prepping for future chips too...
 

i-Lo

Member

Relevant:

On the one hand, seeing a year-old PC demo scaled down a tad to work on PlayStation 4 hardware probably isn't what console gamers would expect, and doesn't quite tally when other developers are talking about PS4 out-powering most PCs for years to come. But it is important to put all of this into context. The DirectX 11 API is very mature while the PS4 tools and APIs are still in their initial stages of development - it's going to take time for devs to fully get to grips with the new hardware. Over and above that, assuming this is the same demo that was shown at the PlayStation 4 reveal, we know for a fact that most studios only received final dev kits in the weeks beforehand, the suggestion being that most of the UE4 work will have been produced on unfinished hardware.
 
All speculation, but it fits. It's also totally ass-backwards from what I was expecting for the PS4, based on the Sony CTO, the Sony SVP of Technology Platform, Charlie at SemiAccurate and the Yole PDF. It likely follows that GF is producing Kryptos, that it's being packaged by AMKOR, and that the now-rumored Xbox delays are due to shortages of the faster Wide IO memory.

What are these delay rumours for the new Xbox? Presumably these are different from the low chipset yield delays rumoured at the end of last year?
 
What are these delay rumours for the new Xbox? Presumably these are different from the low chipset yield delays rumoured at the end of last year?
Low chipset yield delays were Charlie at SemiAccurate trying to explain Oban, which he discovered was being forged by IBM and GF starting Dec 2011 for Microsoft in production quantities. It's too soon by a year for a 2013 holiday release game console.

I speculated that oban (Japanese blank gold coin with bumps) was a blank interposer with bumps. It would be made first since it's more than 20 times larger than Kryptos which means 20 times as many interposer wafers need to be made. I also assumed incorrectly from the Japanese name that Microsoft and Sony would be sharing the interposer design/production.
 
Low chipset yield delays were Charlie at SemiAccurate trying to explain Oban, which he discovered was being forged by IBM and GF starting Dec 2011 for Microsoft in production quantities. It's too soon by a year for a 2013 holiday release game console.

I speculated that oban (Japanese blank gold coin with bumps) was a blank interposer with bumps. It would be made first since it's more than 20 times larger than Kryptos which means 20 times as many interposer wafers need to be made. I also assumed incorrectly from the Japanese name that Microsoft and Sony would be sharing the interposer design/production.

Oh ok, thanks. I wasn't familiar with this rumour.

There were also similar rumours for PS4 chipset manufacturing problems, but those were before this Oban chipset rumour iirc.

I still believe MS will be hell bent on getting the new Xbox released this year, even if it means releasing in just one territory.
 

daveo42

Banned
Unless I'm completely wrong on this, the GNB basically allows the entire APU to function more efficiently by avoiding additional caching between the GPU and CPU, right? And the GNB is fast enough to serve up data without any loss in throughput?

I really like the technical aspect of this and find it quite interesting. Most of it just goes over my head.
 
We did get an upgraded one. It was pretty significant too. The thing is, it started off as shit; you can make it better, but it's still going to be at least somewhat shit.
I know there was an HTML5 update, but it's still crashing more than ever (on average every 10 to 15 minutes) and is barely any faster than it was 3 or 4 years ago. The only reason I still use it is because it's convenient to be able to quickly browse on the TV without having to boot up another device other than the PS3.

Even if the crashing were greatly reduced without any improvement in functionality, I would be happy with that as a real upgrade, but there doesn't appear to be any sign of that happening anytime soon, or maybe ever.
 
Unless I'm completely wrong on this, the GNB basically allows the entire APU to function more efficiently by avoiding additional caching between the GPU and CPU, right? And the GNB is fast enough to serve up data without any loss in throughput?

I really like the technical aspect of this and find it quite interesting. Most of it just goes over my head.

Seems to be. We still have 380 ALUs unaccounted for.
 
I know there was an HTML5 update, but it's still crashing more than ever (on average every 10 to 15 minutes) and is barely any faster than it was 3 or 4 years ago. The only reason I still use it is because it's convenient to be able to quickly browse on the TV without having to boot up another device other than the PS3.

Even if the crashing were greatly reduced without any improvement in functionality, I would be happy with that as a real upgrade, but there doesn't appear to be any sign of that happening anytime soon, or maybe ever.

I don't see how that invalidates what jeff_rigby said, though. The browser was improved quite substantially with HTML5 support, as mentioned by jeff_rigby:

http://neogaf.com/forum/showpost.php?p=34939364&postcount=175
 

Raoh

Member
Is this a running gag on GAF? Why don't people listen to this Jeff guy?

I actually enjoy his posts. They make you think and want to research what he posts; he lays out a map, and if you're interested you can go on a journey to learn/find out for yourself.

I also like that he engages with your questions, unlike other posters/sites/blogs.

Right or wrong, they're great geek threads.
 

sangreal

Member
Low chipset yield delays were Charlie at SemiAccurate trying to explain Oban, which he discovered was being forged by IBM and GF starting Dec 2011 for Microsoft in production quantities. It's too soon by a year for a 2013 holiday release game console.

I speculated that oban (Japanese blank gold coin with bumps) was a blank interposer with bumps. It would be made first since it's more than 20 times larger than Kryptos which means 20 times as many interposer wafers need to be made. I also assumed incorrectly from the Japanese name that Microsoft and Sony would be sharing the interposer design/production.

Oban is the 32nm Xbox 360 SoC (which afaik hasn't made it into any products yet). I've posted about this before, but you can find a few references to it on LinkedIn, e.g.:

• BE22 – migration of PS3 Cell processor to 22nm
Responsible for top-level large-block synthesis of the entire SPU (Synergistic Processing Unit). Was able to achieve route-ability and timing closure through sophisticated synthesis techniques involving detailed floor-planning, custom pre-wires and latch placement, custom routing algorithms, and soft/hard placement boundaries.
• Oban – migration of XBOX PowerPC chip to 32nm
Helped to develop the flow for schematic and layout migration of key circuits, and led efforts to trouble-shoot the verification-flow (in addition to passing all checks ahead of schedule). Also led analog migration of the RNG and played a key role in fixing a critical component of the circuit.

http://www.linkedin.com/in/chenguo
 

onQ123

Member
So do you think it has its own ALUs? Or are those ALUs something extra outside of the GNB?


All I can say is put the pieces of the puzzle together the best you can until E3 or whenever we can get some concrete info.




Additional hardware: GPU-like Compute module, some resources reserved by the OS

However, there's a fair amount of "secret sauce" in Orbis and we can disclose details on one of the more interesting additions. Paired up with the eight AMD cores, we find a bespoke GPU-like "Compute" module, designed to ease the burden on certain operations - physics calculations are a good example of traditional CPU work that are often hived off to GPU cores. We're assured that this is bespoke hardware that is not a part of the main graphics pipeline but we remain rather mystified by its standalone inclusion, bearing in mind Compute functions could be run off the main graphics cores and that devs could have the option to utilise that power for additional graphical grunt, if they so chose.






What was intriguing was new data on how the PlayStation 4's 18-compute-unit AMD graphics core is utilised. Norden talked about "extremely carefully balanced" Compute architecture that allows GPU processing for tasks that usually run on the CPU. Sometimes, employing the massive parallelisation of the graphics hardware better suits specific processing tasks.

"The point of Compute is to be able to take non-graphics code, run it on the GPU and get that data back," he said. "So DSP algorithms... post-processing, anything that's not necessarily graphics-based you can really accelerate with Compute. Compute also has access to the full amount of unified memory."

"The cool thing about Compute on PlayStation 4 is that it runs completely simultaneous with graphics," Norden enthused. "So traditionally with OpenCL or other languages you have to suspend graphics to get good Compute performance. On PS4 you don't, it runs simultaneous with graphics. We've architected the system to take full advantage of Compute at the same time as graphics because we know that everyone wants maximum graphics performance."

Leaked developer documentation suggests that 14 of the PS4's compute units are dedicated to rendering, with four allocated to Compute functions. The reveal of the hardware last month suggested otherwise, with all 18 operating in an apparently "unified" manner. However, running Compute and rendering simultaneously does suggest that each area has its own bespoke resources. It'll be interesting to see what solution Sony eventually takes here.
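To make the "runs completely simultaneous with graphics" idea concrete, here is a toy sketch in plain C++ threads. Every name in it is invented and it has nothing to do with real PS4 APIs; it just shows the shape of the claim: compute jobs draining from their own queue while rendering carries on, with both sides touching the same unified memory pool.

```cpp
// Toy sketch of concurrent graphics + compute (invented names, not PS4 code).
#include <atomic>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

std::vector<float> unified_memory(1024, 1.0f); // one pool both "engines" can see
std::mutex q_mutex;
std::queue<int> compute_jobs;                  // indices into unified_memory
std::atomic<bool> gfx_done{false};

void graphics_engine() {
    for (int frame = 0; frame < 3; ++frame)
        std::printf("render frame %d\n", frame); // stand-in for the graphics pipeline
    gfx_done = true;
}

void compute_engine() {
    for (;;) {
        int job = -1;
        {
            std::lock_guard<std::mutex> lock(q_mutex);
            if (!compute_jobs.empty()) { job = compute_jobs.front(); compute_jobs.pop(); }
            else if (gfx_done) return;           // queue drained and rendering is over
        }
        if (job >= 0)
            unified_memory[job] *= 2.0f;         // e.g. a physics or DSP step, not a draw call
    }
}

int main() {
    for (int i = 0; i < 8; ++i) compute_jobs.push(i); // CPU streams compute work in
    std::thread gfx(graphics_engine), cmp(compute_engine);
    gfx.join();
    cmp.join();
    std::printf("compute result[0] = %.1f\n", unified_memory[0]); // prints 2.0
}
```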




http://arstechnica.com/gaming/2013/...4s-hardware-power-controller-features-at-gdc/

The system is also set up to run graphics and computational code synchronously, without suspending one to run the other. Norden says that Sony has worked to carefully balance the two processors to provide maximum graphics power of 1.843 teraFLOPS at an 800Mhz clock speed while still leaving enough room for computational tasks. The GPU will also be able to run arbitrary code, allowing developers to run hundreds or thousands of parallelized tasks with full access to the system's 8GB of unified memory.
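(For what it's worth, that 1.843 teraFLOPS figure is just the standard GCN arithmetic: 18 compute units × 64 ALUs per CU × 2 floating-point ops per clock (fused multiply-add) × 0.8 GHz = 1,843.2 GFLOPS.)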






http://www.neogaf.com/forum/showthread.php?t=532077

This function allows for harmonization of graphics processing and computing, and allows for efficient function of both. Essentially “Harmony” in Japanese. We’re trying to replicate the SPU Runtime System (SPURS) of the PS3 by heavily customizing the cache and bus. SPURS is designed to virtualize and independently manage SPU resources. For the PS4 hardware, the GPU can also be used in an analogous manner as x86-64 to use resources at various levels. This idea has 8 pipes and each pipe(?) has 8 computation queues. Each queue can execute things such as physics computation middleware, and other proprietarily designed workflows. This, while simultaneously handling graphics processing.

This type of functionality isn’t used widely in the launch titles. However, I expect this to be used widely in many games throughout the life of the console and see this becoming an extremely important feature.
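If that description is accurate, the layout is easy to picture: 8 pipes × 8 queues = 64 compute queues in flight at once. A toy model in C++ follows; the 8×8 structure is taken from the interview wording above, while the job names, priorities and scheduling policy are all invented for illustration.

```cpp
// Toy model of 8 pipes x 8 queues = 64 compute queues (structure assumed
// from the interview above; everything else is made up).
#include <cstdio>
#include <deque>
#include <string>

struct ComputeJob { std::string name; int priority; };    // smaller = more urgent

struct ComputePipe { std::deque<ComputeJob> queues[8]; }; // 8 queues per pipe

int main() {
    ComputePipe pipes[8]; // 8 pipes x 8 queues = 64 queues total
    pipes[0].queues[0].push_back({"physics step", 0});
    pipes[3].queues[5].push_back({"audio DSP", 2});

    // Drain pass: always dispatch the most urgent job across all 64 queues
    // (on the real chip this would happen alongside graphics, not after it).
    for (;;) {
        std::deque<ComputeJob>* best = nullptr;
        for (auto& pipe : pipes)
            for (auto& q : pipe.queues)
                if (!q.empty() && (!best || q.front().priority < best->front().priority))
                    best = &q;
        if (!best) break;
        std::printf("dispatch: %s\n", best->front().name.c_str());
        best->pop_front();
    }
}
```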


Graphic North Bridge(GNB) Highlights
Fusion 1.9 support
DCE 7.0
UVD 4.0
VCE
IOMMU
ACP
5x8 GPP PCIE cores
SCLK 800MHz/LCLK 800MHz



http://arstechnica.com/civis/viewtopic.php?f=22&t=1193497&start=440
Poster Blacken00100
"So, a couple of random things I've learned:

-It's not stock x86; there are eight very wide vector engines and some other changes. It's not going to be completely trivial to retarget to it, but it should shut up the morons who were hyperventilating at "OMG! 1.6 JIGGAHURTZ!".

-The memory structure is unified, but weird; it's not like the GPU can just grab arbitrary memory like some people were thinking (rather, it can, but it's slow). They're incorporating another type of shader that can basically read from a ring buffer (supplied in a streaming fashion by the CPU) and write to an output buffer. I don't have all the details, but it seems interesting.
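The arrangement described there is, in shape, a classic single-producer/single-consumer ring buffer: the CPU streams items in, the "shader" consumes them and writes results to an output buffer. A generic C++ sketch follows; nothing here is actual PS4 API code, and the halving step is just a stand-in for the shader work.

```cpp
// Generic SPSC ring buffer: CPU produces, a "shader" consumes and writes out.
#include <atomic>
#include <cstdio>

constexpr int N = 16;      // power of two so the index wraps cheaply
float ring[N];             // CPU -> shader stream
float output[N];           // shader results
std::atomic<int> head{0};  // next slot the CPU writes
std::atomic<int> tail{0};  // next slot the consumer reads

bool cpu_push(float v) {                   // producer side (the CPU)
    int h = head.load(std::memory_order_relaxed);
    if (h - tail.load(std::memory_order_acquire) == N) return false; // ring full
    ring[h % N] = v;
    head.store(h + 1, std::memory_order_release);
    return true;
}

bool shader_consume() {                    // consumer side (the "shader")
    int t = tail.load(std::memory_order_relaxed);
    if (t == head.load(std::memory_order_acquire)) return false;     // ring empty
    output[t % N] = ring[t % N] * 0.5f;    // stand-in for the real shader work
    tail.store(t + 1, std::memory_order_release);
    return true;
}

int main() {
    for (int i = 0; i < 4; ++i) cpu_push(float(i));
    while (shader_consume()) {}
    std::printf("output[3] = %.1f\n", output[3]); // prints 1.5
}
```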


http://forum.beyond3d.com/showpost.php?p=1714789&postcount=828


"Originally Posted by patsu
What do you know about the secondary custom chip?"

The PS4 specter vector? I'm not 100% sure what it is at this stage... I'm 45% leaning towards physics offloading or helping out within that department. The other 55% is screaming a modified component for helping PS3 games work within the G/Cloud environment.
 
Oban is the 32nm Xbox 360 SoC (which afaik hasn't made it into any products yet). I've posted about this before, but you can find a few references to it on LinkedIn, e.g.:

http://www.linkedin.com/in/chenguo

• BE22 – migration of PS3 Cell processor to 22nm
Responsible for top-level large-block synthesis of the entire SPU (Synergistic Processing Unit). Was able to achieve route-ability and timing closure through sophisticated synthesis techniques involving detailed floor-planning, custom pre-wires and latch placement, custom routing algorithms, and soft/hard placement boundaries.
• Oban – migration of XBOX PowerPC chip to 32nm
Helped to develop the flow for schematic and layout migration of key circuits, and led efforts to trouble-shoot the verification-flow (in addition to passing all checks ahead of schedule). Also led analog migration of the RNG and played a key role in fixing a critical component of the circuit.

Wow, really good find; it lays to rest several lines of speculation and opens others. So Oban is not an interposer, but if the rest of the SemiAccurate posts are accurate about volume, then an Xbox 361/Xbox Next with ARM for TrustZone and low power seems likely.

ARM chips to rival PS3, Xbox 360 in 18 months? July 2011

ARM has been beating the performance drum again, this time telling the Inquirer that a new Mali GPU design due out in 18 months will make its chips the equal of current-gen gaming consoles like the Xbox 360 and the Playstation 3.

So yes, given a process shrink to 28nm or thereabouts, it seems quite possible that ARM will at the very least be able to pack as much hardware as the Xbox 360 does into an application processor.
We are now seeing 28nm in the PS4 and Xbox 720, nearly 10X the GPU performance of the PS3 with less power than the first versions of the PS3, and speculation that power use will be less than 30% greater than the latest PS3 revision. So PS4-like hardware at PS3 performance should be in the 20-watt range, and ARM designs should be more efficient (approx. 5 watts). Temash at 15 watts has demo games that look similar to/better than PS3 games.

ARM-BASED XBOX ‘LITE’ COMING IN 2013, XBOX 360 SUCCESSOR LATER, INSIDER CLAIMS March 2012

“My understanding is that we’ll see a Xbox device in late 2013 which does Arcade-style games & all the current & future media apps with Kinect (with near-mode),” MS Nerd wrote on Reddit while fielding user-submitted questions. “It will be an ARM-based platform price-competitive with the Apple TV (if you own a Kinect already).”
This was recently seconded by another reporter: an Xbox 360 that runs Windows RT (ARM) and is supposed to be Xbox 360 compatible (March 2013). I'm guessing this takes the place of the Xbox 361 (1080P & XTV support) that was projected for the end of 2012 in the 4/2010 leaked Xbox 720 PowerPoint.

ARM Mali 600-series GPUs are what the first cite above is about: "Xbox 360" performance, and they can even support 4K OS UIs.




No idea if the 22nm PS3 Cell processor has been sent to a forge yet or if it's even happening.
 

Interfectum

Member
Maybe I'm tired, maybe I'm just too busy at work to read this in detail, but... I have no fucking clue what's going on in this thread.

Bottom line: will this 'news' make a PS4 fanboy happy?
 
Maybe I'm tired, maybe I'm just too busy at work to read this in detail, but... I have no fucking clue what's going on in this thread.

Bottom line: will this 'news' make a PS4 fanboy happy?

I don't think a fanboy would know wtf it means. But it is good news.

All I can say is put the pieces of the puzzle together the best you can until E3 or whenever we can get some concrete info.

http://forum.beyond3d.com/showpost.php?p=1714789&postcount=828
What about that ACE increase to 8? What if that was just their final solution?
 

spwolf

Member
Don't see why ARM would need a GPU, as it is used for bg tasks, not for outputting any kind of graphics. Why would it output a UI when it is "off"? To show off some "secure" logo? It is not an ATM or kiosk or whatever ARM TrustZone is meant to be used for.

The thing is off but works in the bg... when you want the OS to show up, it turns on the APU and just scales down the MHz, like a laptop or any other low-power device.
 
Jeff here just got your page

Oh wrong Jeff? Nobody cares about this Jeff

 
Don't see why ARM would need a GPU, as it is used for bg tasks, not for outputting any kind of graphics. Why would it output a UI when it is "off"? To show off some "secure" logo? It is not an ATM or kiosk or whatever ARM TrustZone is meant to be used for.

The thing is off but works in the bg... when you want the OS to show up, it turns on the APU and just scales down the MHz, like a laptop or any other low-power device.
It looks like I was more than half right in the thread I started: EPA Energy Star third Tier requirements and impact on Game Consoles. The change to more modern hardware is from a 2CU AMD GPU to an ARM GPU, and that's just for the UI regulations.

Trustzone AND UI => ARM for XTV
 

mrgreen

Banned
Fascinating reading (@ Jeff). The more I read about the PS4, the more I think it may be the first time that I buy from Sony over Xbox (or the N64, back in the day).
 
Final Draft Version 1.0 EPA Game Console Performance Requirements and Test Method Comment Summary and Response

These are the comments from game console manufacturers on next-generation consoles meeting the power requirements set forth by the EPA. PS4 and Durango could not meet those levels; the Wii U could, but just barely. The comments also give an idea of what they had to do to meet the requirements.

Stakeholder = Game console Manufacturer

One stakeholder noted that Final Draft power limits are not achievable for two of the three console makers and, under some circumstances, only partially achievable by the Nintendo Wii U, since each game title customizes its navigation menus slightly differently, with some opting for a simple, utilitarian appearance while others choose to incorporate splashy graphics, sound, and background animations, which require more power, over 40W.

Another stakeholder in support of the Final Draft power limits reported that the new Wii U launch model (8 GB model) purchased in November 2012, draws 28-29 watts in Video Stream Play mode, and 31-32 watts in Navigation Menu function per their own tests.

One stakeholder commented that power scaling does not have infinite elasticity and that in order for the manufacturer to “hit” a certain downscaled number for media streaming, the manufacturer may have to opt for a chip that has an energy ceiling below that which is optimal for other, non-scaled functions, like game play. Alternatively, there may exist a subset of chips that could handle both extremes (e.g., a chip designed for high-end ultra books) but at an exorbitant cost relative to what is an affordable option for a device priced at several hundred dollars. It also noted that redesigning consoles’ motherboards to accommodate scalable architecture is an incredibly complicated and expensive process and even if the next generation was updated it may require more processing power.

The stakeholder further stated that console manufacturers can only meet the Streaming Media power limit by embedding into the console a separate chipset and associated circuitry optimized for video streaming, which is technically complex, prohibitively expensive, and could introduce latency issues when switching between systems. The stakeholder cited an unofficial estimate of an Apple TV (2nd Generation) cost of $64.5, compared to the launch-year prices for the most economical versions of the current-generation game consoles: $299 (Xbox 360), $499 (PS3), and $299 (Wii U). In light of the industry’s business model, adding an additional $64 in parts to devices at these price points, when they are often sold at a loss, is not sustainable financially. The stakeholder also commented that it does not make sense to compare dedicated media streaming boxes to game consoles because the consoles are optimized for different functions.
The EPA responded:
State-of-the-art game play, what game consoles have always been about, is, in essence, not covered by power requirements in this program. Instead, game play is being allowed to continue uninhibited. However, game consoles that are increasingly dedicating themselves to providing non-gaming services such as media play should be held to similar standards as devices providing these same services. Devices such as set-top boxes can use as little as 4W (though more typically 10-20W) in Active Streaming Media. For these reasons, a requirement of 50W is achievable. The game console recognition program recognizes those manufacturers that are able to produce a console that pushes the limits of current efficiency within the industry.
Notice that the Wii U can barely meet the power mode requirements, and while it is slightly older technology, Durango and PS4 have vastly more powerful GPUs.

My guess is that a second, separate chipset with associated circuitry optimized for video streaming is in Durango and PS4, and that it includes a small ARM GPU. The UI and streaming run via the second chip's GPU. When you press the game suspend button, you switch from the game GPU to the second, smaller GPU. The OS UI and overlay in game mode run on the second GPU as well.

Note: does this now satisfy the critics of my posts? This was a real issue and a second smaller GPU is in the PS4 and Durango.

A 2-GPU context switching patent by Sony starts as early as 2009, with a final filing date of June 2012.
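For illustration only, the switching that patent describes boils down to a mode-driven power plan. Here is a toy state machine; the mode names and wattage numbers are invented for the sketch, and nothing in it is confirmed hardware.

```cpp
// Toy dual-GPU power plan (all names/wattages invented; not confirmed hardware).
#include <cstdio>
#include <initializer_list>

enum class Mode { GamePlay, GameSuspended, MediaStream, Standby };

struct PowerPlan { bool big_gpu_on; bool small_gpu_on; int approx_watts; };

PowerPlan plan_for(Mode m) {
    switch (m) {
        case Mode::GamePlay:      return {true,  true,  140}; // small GPU still drives overlays
        case Mode::GameSuspended: return {false, true,   35}; // UI on the small GPU only
        case Mode::MediaStream:   return {false, true,   30}; // streaming chipset + small GPU
        case Mode::Standby:       return {false, false,   1}; // southbridge handles bg tasks
    }
    return {false, false, 0};
}

int main() {
    for (Mode m : {Mode::GamePlay, Mode::GameSuspended, Mode::MediaStream, Mode::Standby}) {
        PowerPlan p = plan_for(m);
        std::printf("big_gpu=%d small_gpu=%d ~%dW\n", p.big_gpu_on, p.small_gpu_on, p.approx_watts);
    }
}
```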
 
Jeff, you bolded the bit about the separate streaming chipset, then stopped the bolding mid-sentence, where it says it would be prohibitively expensive, and could introduce latency issues when switching between systems.
In light of the industry’s business model, adding an additional $64 in parts to devices at these price points when they are often sold at a loss is not sustainable financially. The stakeholder also commented that it does not make sense to compare dedicated media streaming boxes to game consoles because the consoles are optimized for different functions

In other words, 'no, we're not gonna do that'

One stakeholder commented that the game console industry supports reasonable energy efficiency policy, and to that end all three console makers have agreed to take several substantial steps to make consoles do more with less energy, including: a robust auto-power-down regime, a commitment to reduce the energy expenditure for secondary functions, power caps for next generation consoles that are approximately half of what the current generation Xbox 360 and PlayStation 3 used at their launch, and a commitment to explore power-scaling technologies.

That part is interesting.
 

Toski

Member

So in essence, a GameCube 2.0 could use as much power as it wanted to play games, but the second Netflix is on the box, it has to meet EPA Energy Star requirements? How do PCs get around this, or do they just forgo the Energy Star requirements because those customers don't care about it?
 

spwolf

Member
Jeff, you bolded the bit about the separate streaming chipset, then stopped the bolding mid-sentence, where it says it would be prohibitively expensive, and could introduce latency issues when switching between systems.
In light of the industry’s business model, adding an additional $64 in parts to devices at these price points when they are often sold at a loss is not sustainable financially. The stakeholder also commented that it does not make sense to compare dedicated media streaming boxes to game consoles because the consoles are optimized for different functions

In other words, 'no, we're not gonna do that'



That part is interesting.

Indeed... it will use the APU to stream, not ARM... so quite likely the ARM chip will be low power and not have a GPU.
 

QaaQer

Member
How do PCs get around this, or do they just forgo the Energy Star requirements because those customers don't care about it?

General-purpose computing devices can be used for whatever the end user requires, from 3D protein modelling for cancer research to browsing for fap material. They are not appliances like consoles or toasters, with specific functions. PCs are not media consumption devices by nature.
 

onQ123

Member
Final Draft Version 1.0 EPA Game Console Performance Requirements and Test Method Comment Summary and Response



My guess is that a second, separate chipset with associated circuitry optimized for video streaming is in Durango and PS4, and that it includes a small ARM GPU. The UI and streaming run via the second chip's GPU. When you press the game suspend button, you switch from the game GPU to the second, smaller GPU. The OS UI and overlay in game mode run on the second GPU as well.


Did we ever find out why Sony licensed more PowerVR SGX Series5XT?

http://www.imgtec.com/corporate/newsdetail.asp?NewsID=643

06 September 2011

Sony licenses Imagination Technologies' PowerVR graphics IP technologies

Imagination Technologies Group plc (LSE: IMG; "Imagination"), a leader in System-on-Chip Intellectual Property ("SoC IP"), has signed a further license agreement with Sony Corporation, (“Sony”), a leading consumer electronics company, for IP cores from Imagination’s PowerVR SGX Series5XT graphics family.

Sony will deploy Imagination’s technologies in SoCs targeting consumer markets.

Under the terms of the licence agreement Imagination receives licence fees and royalty revenues based on shipments of semiconductor products incorporating Imagination's IP.
 
Jeff, you bolded the bit about the separate streaming chipset, then stopped the bolding mid-sentence, where it says it would be prohibitively expensive, and could introduce latency issues when switching between systems.
In light of the industry’s business model, adding an additional $64 in parts to devices at these price points when they are often sold at a loss is not sustainable financially. The stakeholder also commented that it does not make sense to compare dedicated media streaming boxes to game consoles because the consoles are optimized for different functions

In other words, 'no, we're not gonna do that'

That part is interesting.
1) The EPA Energy Star specs are voluntary, but they will be mandated by multiple countries all over the world (this is in the AMD proxy statement).

2) PS4 and Durango MUST comply; they have no choice. The verbiage in the stakeholder letters to the EPA is exaggerating the price and difficulty. A USB stick with a complete Android Google TV SoC system and memory retails for $48, and that includes a controller. Remove the controller and we are down to less than $40; remove markup and we are under $30.

They have two issues (active UI interface power of 40 watts and streaming power of 50 watts), and prior to the current specs there was a set-top-box power spec for streaming RVU and OTT IPTV of less than 20 watts; that was removed, but likely only after the next-generation designs were implemented.

Plans, for instance, include Skype support while RVU streaming, as well as XTV browser overlay support while streaming. RVU mode is an STB mode, but the additional features mentioned are likely to exceed the STB power regs, so those regs were lifted.

There are EU power regulations for an always-on standby mode, with exceptions for "special features". Standby is 500mW, but special exceptions are allowed, and I have not been able to find the power that is authorized. See the EU paper on standby power mode and exceptions; it applies to the PS4 and Xbox 720.

Cerny did state that the CPU in the second chip (the so-called southbridge) is there to handle background tasks because of restrictive EU power regulations. The southbridge is on while the APU is mostly turned off; that should include the GDDR5 controller and memory. This depends on what the EU regulations will allow, as well as on GPU and GDDR5 standby power requirements.

My opinion is that the second chip for IPTV streaming is for the foreground and can also be used for the background. Cerny stating that the second chip is there to meet EU regs for standby special exceptions is probably bogus, as I can find no specs for the special-exception power regs. The EPA is concerned with power use that occurs for long periods, not with a firmware update that may happen once a month. They are concerned with STB power use for always-on operation when the TV is on; RVU streaming while implementing XTV at the same time requires a GPU. So: a second chip for standby mode, yes, but not because of the special exceptions that are allowed.

You also missed the Sony patent filed July 2012 on using 2 GPUs, low power and high power, and the context switching between them. Hopefully Sony followed/implemented the patent. The cheaper method is to have fixed duties for the two GPUs to comply with the Energy Star regs.


onQ123 said:
http://www.imgtec.com/corporate/news...asp?NewsID=643


06 September 2011

Sony licenses Imagination Technologies' PowerVR graphics IP technologies

Imagination Technologies Group plc (LSE: IMG; "Imagination"), a leader in System-on-Chip Intellectual Property ("SoC IP"), has signed a further license agreement with Sony Corporation, (“Sony”), a leading consumer electronics company, for IP cores from Imagination’s PowerVR SGX Series5XT graphics family.

Sony will deploy Imagination’s technologies in SoCs targeting consumer markets.

Under the terms of the licence agreement Imagination receives licence fees and royalty revenues based on shipments of semiconductor products incorporating Imagination's IP.
Good find; the timing is right, and it's an ARM-derived GPU, so it's a possibility in the PS4, Sony TVs, Blu-ray players, or a 22nm refreshed PS3.

An FCC rule in 2010 mandating RVU in 2014 is going to drive STB features and home networking in all CE products. FCC rules on ATSC 2.0 in 2013 are going to add 1080P and S3D, as well as XTV.
 
I give up, I'll just wait for the teardown.

But don't you think they'd mention this? Or mention this supposed second GPU?
You also missed the Sony patent filed July 2012 on using 2 GPUs, low power and high power, and the context switching between them. Hopefully Sony followed/implemented the patent. The cheaper method is to have fixed duties for the two GPUs to comply with the Energy Star regs.

Why would Cerny play down this second CPU as an I/O and file management handler if it's part of this amazing media processor you're describing?
 

i-Lo

Member
When the stakeholders state "half" the power consumption compared to launch models of current-gen systems, do they mean while the machine performs non-gaming functions, or overall?

If it's the latter, then it stifles technological advancement in power.
 
When the stakeholders state "half" the power consumption compared to launch models of current-gen systems, do they mean while the machine performs non-gaming functions, or overall?

If it's the latter, then it stifles technological advancement in power.

As they had already mentioned what they were doing for secondary functions, I assumed it meant overall, but it might not. But not wanting a box as big as the original PS3 also stifles technological advancement in power.
 

Shikoro

Member
As they had already mentioned what they were doing for secondary functions, I assumed it meant overall, but it might not. But not wanting a box as big as the original PS3 also stifles technological advancement in power.

Not wanting a $599 console stifles technological advancement in power, but it also stifles the demand for a year or two. :p

They went with the best of both worlds and I'm already happy with the results.
 

i-Lo

Member
Not wanting a $599 console stifles technological advancement in power, but it also stifles the demand for a year or two. :p

They went with the best of both worlds and I'm already happy with the results.

Given the lack of a timestamp: did this failure to live up to the EPA requirements come before or after the announcement of the current specs?
 