
vg247-PS4: new kits shipping now, AMD A10 used as base, final version next summer

tapedeck

Do I win a prize for talking about my penis on the Internet???
I'm assuming this was posted somewhere?

Looks similar to that Blu-ray player people were posting. I highly doubt this is the actual PS4 design, probably just a placeholder. Still kinda interesting that the official fb page would just post a pic of the system (even if it's just a mock-up) out of the blue.
 

tha_devil

Member
I'm assuming this was posted somewhere?

https://www.facebook.com/pages/Playstation-Network/226127317526943

421851_226128087526866_584641072_n.png


Looks similar to that Blu-ray player people were posting. I highly doubt this is the actual PS4 design, probably just a placeholder. Still kinda interesting that the official fb page would just post a pic of the system (even if it's just a mock-up) out of the blue.

LMAO, that fb is a fake account, I assume.
 
While researching what might be in the low-power, Google TV-like ARM part of the PS4 (if my speculation is correct), I found the following on the ARM Mali GPU. Previous posts showed an ARM SoC with a Cortex-A5 and a Mali GPU that required 3.5 watts. I'm assuming that a low-power ARM SoC could provide full GPU-accelerated HTML5 support as well as a UI for the PS4. This is a wild guess I'm following to see where it leads; it's based only on Sony and Microsoft wanting to support a Google TV-like XTV feature with very low-power hardware. The ARM background processor for TrustZone and the background-streaming-to-handheld features mentioned in the leaked Xbox PowerPoint and the Feb 20th Sony PS4 press conference account for most of the hardware needed to do this, except the Mali GPU.

ARM Mali GPUs for User Interfaces

A large number of devices are benefiting from the inclusion of GPU acceleration using APIs such as OpenGL ES 2.0 and OpenVG. Typical operations that are accelerated by GPUs are the rendering of UI elements such as text and icons, 2D and 3D transitions and the visually stunning effects that make some UIs stand out from others, such as window compositions and video textures.

As resolutions continue to increase, higher resolutions are expected to appear in tablets first, as shown by the resolution-leading Nexus 10 tablet. Mali GPUs enable everything from basic 2D UIs at low resolutions up to 3D effects in 4K devices with multiple video streams. Mali GPUs open up a further range of features that can be applied to UIs, such as overlay and alpha blending, complex transitions, animated icons and videos as textures, as well as enabling new interaction options such as facial recognition and gesture-driven UIs without the need for separate equipment. UI demand is expected to keep growing across all consumer products.
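The "overlay and alpha blending" being accelerated here is, at the pixel level, just the Porter-Duff "over" operator. A minimal Python sketch of the math a GPU compositor applies per pixel (values are illustrative, not from any Mali documentation):

```python
def blend_over(src, dst, alpha):
    """Porter-Duff 'over': composite a UI element (src RGB) onto a
    background (dst RGB) with straight, non-premultiplied alpha."""
    return tuple(alpha * s + (1.0 - alpha) * d for s, d in zip(src, dst))

# A 50%-opaque white icon over a black background gives mid grey.
print(blend_over((255, 255, 255), (0, 0, 0), 0.5))
```

A GPU does this per pixel, per layer, every frame, which is why window composition is one of the first things handed off the CPU.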

ARM Mali for HTML5 acceleration

Technologies such as HTML5 bring new elements to the browsing experience by accelerating parts of the rendering engine. The HTML5 elements that can be accelerated by Mali GPUs include image scaling, composition and font rendering. Mali GPUs can benefit a wide range of browsers and applications, including support for Flash, whilst integrating OpenGL ES 2.0 for accelerated rendering as well as supporting improved WebGL and JavaScript engines.

ARM Mali-T624/Mali-T628
The Mali-T624 GPU offers scalability from one to four cores, whilst the Mali-T628 from one to eight cores provides up to twice the graphics and GPU compute performance of the Mali-T624, extending the graphics potential for smartphones and smart-TVs. These products provide breathtaking graphical displays for advanced consumer applications, such as 3D graphics, visual computing and real time photo editing for smartphones and smart-TVs.

ARM Mali-T678
The ARM Mali-T678 GPU offers the highest GPU compute performance available in the Mali-T600 Series of products, delivering a four-fold increase compared with the Mali-T624 GPU through features such as increased ALU support. This brings a wide range of performance points to address the vibrant tablet market. The Mali-T678 offers energy-efficient, high-end visual computing applications such as computational photography, multi-perspective views and augmented reality.

What is ASTC?
ASTC supports a very wide range of pixel formats and bit rates, and enables significantly higher quality than most other formats currently in use. This allows the designer to use texture compression throughout the application, and to choose the optimal format and bit rate for each use case. This highly efficient texture compression standard reduces the already market-leading Mali GPU memory bandwidth and memory footprint even further, while extending mobile battery life.
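To see why ASTC matters for memory footprint, here's a quick Python sketch of its fixed 128-bit-per-block storage (the block footprints are from the ASTC format; the 2048x2048 texture is just an example I picked):

```python
import math

def astc_bytes(width, height, block_w, block_h):
    """ASTC stores every block_w x block_h texel block in a fixed
    128-bit (16-byte) word, whatever the block footprint."""
    return math.ceil(width / block_w) * math.ceil(height / block_h) * 16

w = h = 2048
raw = w * h * 4  # uncompressed RGBA8, 4 bytes per texel
for bw, bh in [(4, 4), (6, 6), (8, 8), (12, 12)]:
    size = astc_bytes(w, h, bw, bh)
    print(f"{bw}x{bh}: {128 / (bw * bh):.2f} bpp, "
          f"{raw / size:.1f}x smaller than RGBA8")
```

Bigger blocks mean fewer bits per texel (down to under 1 bpp at 12x12), which is exactly the bandwidth and footprint saving the quote is talking about.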

All products are designed to support the following APIs: OpenGL® ES 1.1, OpenGL ES 2.0, OpenGL ES 3.0, DirectX 11 FL 9_3, DirectX® 11, OpenCL™ 1.1 Full Profile and Google Renderscript compute.

Mali GPUs support the same features touted by AMD.
 

mrklaw

MrArseFace
I'd expect any ARM chip to just be handling background processes like downloads and also standby activities. I don't expect it to need to render any UI or any user interaction at all - the main processors will do that.
 

stryke

Member
I'd expect any ARM chip to just be handling background processes like downloads and also standby activities. I don't expect it to need to render any UI or any user interaction at all - the main processors will do that.


I'm paraphrasing, but Mark Cerny did say they "added" dedicated hardware for compression and decompression of video when he talked about stream uploads and recordings. What that hardware entails is not clear.

http://youtu.be/RiNGZMx2vhY?t=24m20s
 

LiquidMetal14

hide your water-based mammals
It's probably just the stuff that AMD provides on all their modern video cards; there appears to be a fixed-function H.264 encoder.

It's easy to guess that, but it could be radically different, to the point where you couldn't make comparisons. There are some "off the shelf" parts for sure, but there are also some tweaks which are specific to consoles like the PS4 or 720.
 
I'd expect any ARM chip to just be handling background processes like downloads and also standby activities. I don't expect it to need to render any UI or any user interaction at all - the main processors will do that.
One standby activity is polling the PS button, and if it's polling the PlayStation button then Bluetooth control is on the ARM bus. Network access too, both WiFi and wired, if it's going to be always on to serve handhelds. Add to this the codec accelerators and you have everything needed for HTML5 and a UI except a GPU.

Not mentioned yet is how the PS4 is going to serve handhelds in the background. Is the ARM hardware also going to generate RVU HTML5 UI menus so that handhelds can be served a menu to choose what they want streamed? It seems to me that we are now 90% of the way to the same ARM also creating the UI and the HTML5 browser desktop for the PS4. What about when a game is running and you want a chat window or HTML5 browser in-game: is this using the ARM system, or the Jaguar and GCN GPU, which would reduce game quality/performance? And what's the need for the third reserved overlay in the AMD GPU?

So for a Google TV-like feature, in the case of HDMI In: all video streamed from HDMI In to Out is checked for target data, and that data is acted upon or saved. If acted upon, it's by a Java or JavaScript engine, or both. For RVU: a DLNA server streams IPTV, either clear or DTCP-IP-encrypted video/menus, and again the IPTV video stream is checked for target data, and that data is acted upon or saved until the customer indicates he wants to act on it.

In the second case above, with RVU, the menu can be picture-based or HTML5-based. Everyone is getting ready to support XTV with gesture, voice and of course keyboards and air mice (Move controller).

There are hints from Sony that the PS4 game side will be similar to the PS3, with a FreeBSD OS, and that the application side of the PS4 will use a "simpler version of Linux". Don't two different POSIX OS flavors suggest that two different ISA families are possible?

KidBeta said:
It's probably just the stuff that AMD provides on all their modern video cards; there appears to be a fixed-function H.264 encoder.
To this point we have not seen flow-chart diagrams of AMD SoCs that have an ARM A5 TrustZone processor. What we have seen are AMD GPUs with codec accelerators and NO background-processor ability. This is new ground, and AMD has not yet disclosed how it's going to work, which weakly supports another NDA date agreed upon by Sony and Microsoft.
 
Continuation of:

ARM trustzone on PS4
Search for Google TV and ARM Cortex A5
Serving handhelds from PS4 and Xbox 720 requires RVU server & HTML5 UI
ARM Mali GPU information, low power and can provide the UI for the PS4 and the UI in serving handhelds


Remember this Sony Patent: Oct 7 2010

Graphics processing in a computer graphics apparatus having architecturally dissimilar first and second graphics processing units (GPU) is disclosed. Graphics input is produced in a format having an architecture-neutral display list. One or more instructions in the architecture neutral display list are translated into GPU instructions in an architecture specific format for an active GPU of the first and second GPU.

An ARM Mali GPU and an AMD GCN GPU are "architecturally dissimilar". See the Sheet 2/5 drawing, which shows the flow and decision tree: the active GPU's power is monitored, and if the switch criteria are met, a switch is made to the other GPU. Read the Background of the Invention on page 1.

It's about a low power GPU and a high power/performance GPU and the switching between them to save power required by an "Always on" server or Google TV like feature.
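If I'm reading the patent right, the Sheet 2/5 decision loop boils down to something like this hypothetical sketch (the GPU names and thresholds are my illustration, not anything Sony disclosed):

```python
# Monitor the active GPU and hand the architecture-neutral display
# list to the other GPU when a switch criterion is met.
LOW, HIGH = "ARM Mali (low power)", "AMD GCN (high performance)"

def pick_gpu(active, load, up=0.8, down=0.2):
    """Choose which GPU should execute the display list next,
    given the active GPU's utilization (0..1)."""
    if active == LOW and load > up:
        return HIGH   # UI/browser work outgrew the small GPU
    if active == HIGH and load < down:
        return LOW    # near idle: fall back to save power
    return active     # no switch criterion met

assert pick_gpu(LOW, 0.95) == HIGH
assert pick_gpu(HIGH, 0.05) == LOW
assert pick_gpu(HIGH, 0.50) == HIGH
```

The point of the architecture-neutral display list in the patent is exactly that this hand-off can happen without the application caring which GPU executes it.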


I assumed this was APU (low-power XTV/Google TV-like) + GPU (game performance) context switching, but it's NOT. In hindsight it can possibly apply to the PS4 with an ARM Mali GPU + AMD GCN GPU. So now there is the patent, and a guess based on Sony wanting to support something like Google TV/XTV/RVU with low-power ARM, as it will be on when the TV is on and, for some features (serving handhelds), on all the time.

All this to save what, 30 watts? But it's always on whenever the TV is on, so those 30 watts add up!
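To put that in perspective, a quick back-of-envelope in Python (the hours per day and electricity price are my assumptions, not data):

```python
# What 30 W of "on whenever the TV is on" draw costs per year,
# assuming 5 h/day of TV time at $0.12/kWh (illustrative figures).
watts, hours_per_day, usd_per_kwh = 30, 5, 0.12
kwh_per_year = watts * hours_per_day * 365 / 1000
print(f"{kwh_per_year} kWh/yr, about ${kwh_per_year * usd_per_kwh:.2f}/yr")
```

Small per household, but multiplied by tens of millions of consoles it's the kind of number regulators and Sony's power budget both care about.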

Why, in the Yukon slide, are there two GPUs, with the smaller GPU belonging to the "always on" system? The Xbox 720 in the leaked PowerPoint was to have an HDMI pass-through to support something like Google TV, and it too would need a low-power GPU to support an always-on, GPU-accelerated HTML5 desktop/dashboard.

Notice that ARM/X86 appears in both the Application and System blocks. Notice in the always-on System block: 48 ALUs @ 500 MHz. A 500 MHz GPU clock signals LOW POWER for the always-on System GPU more than anything else!

Slide9.jpg


Speculation was QoS: that the System GPU would be used to serve Xbox 720 games to handhelds in the background at the same time a game was playing on the Application GPU. In hindsight this does not make sense. Does the System GPU also have 32 MB of eDRAM to make up for the rumored slow DDR3 memory (compared to the PS4's GDDR5)? The System GPU at 300 MHz compared to the Application GPU @ 1 GHz? Consider DDR3 being shared between two GPUs while playing two games at the same time... stupid in hindsight.

Michael Pachter picks the Xbox to win the next generation because it includes a TV tuner and Skype (NeoGAF thread). We've already discussed this, and the TV tuner is a non-starter for anything but clear OTA. As a subsidized cable TV box it would need 6 tuners with CableCARD support, and that would only work with a special SKU.

0gQps.jpg
 
Stacked memory on interposer, predicted by Yole for the PS4 and shown in the first Amkor product intercept (2013), then moved to 2014 in an Amkor product intercept published the same month, is now explained. I started this thread using the first Amkor product intercept, and then started the thread "SemiA: Console using Amkor potentially delayed 6 months till mid 2014" when a SemiAccurate post pointed out the second Amkor product-intercept PDF. There is also the GSA Memory and 3DIC conference, where 3D stacked memory 2.5D-attached to an interposer was predicted for game consoles in 2013-2014.

(Found by Stuckey on SemiAccurate) This makes all my posts semi-accurate <grin>.

http://www.electroiq.com/articles/sst/print/volume-56/issue-2/columns/packaging/3d-ic-with-tsv-status-and-developments.html said:
Many companies show a silicon interposer or 2.5D solution on their packaging roadmaps where a logic device is mounted next to a stack of memory and the through silicon vias are in the substrate. The problem is that this assumes stacked memory with TSV is commercially available at a cost/performance ratio that matches the requirements. With the stacked memory unavailable this pushes out the adoption of even 2.5D. Some companies also indicate that the cost of the silicon interposer is too high and they would like to consider a glass interposer or even a high-density organic substrate. At this time, glass interposers with TSVs are not commercially available and organic substrates with fine features are still in development.

1303SSTpack.jpg
So Sony's plan was/is to use stacked memory on interposer with what will be cheaper DDR3 components, but because that memory was not available they had to use GDDR5, which gives the same bandwidth but is more expensive and uses more power.

I can only guess an early refresh will occur to transition to stacked memory on interposer sometime in (early?) 2014.
 

onQ123

Member
I haven't seen anything about the PS4 using HDMI 2.0, but it says so on the HDMI Wikipedia page. I wonder where this info came from.

http://en.wikipedia.org/wiki/HDMI

Version 2.0

The HDMI Forum is working on the HDMI 2.0 specification.[151][152] In a 2012 CES press release HDMI Licensing, LLC stated that the expected release date for the next version of HDMI was the second half of 2012 and that important improvements needed for HDMI include increased bandwidth to allow for higher resolutions and broader video timing support.[153] Longer term goals for HDMI include better support for mobile devices and improved control functions.[153]
On January 8, 2013, HDMI Licensing, LLC announced that the next HDMI version is being worked on by the 83 members of the HDMI Forum and that it is expected to be released in the first half of 2013.[12][13][14]
Based on HDMI Forum meetings it is expected that HDMI 2.0 will increase the maximum TMDS per channel throughput from 3.4 Gbit/s to 6 Gbit/s which would allow a maximum total TMDS throughput of 18 Gbit/s.[154][155] This will allow HDMI 2.0 to support 4K resolution at 60 frames per second (fps).[154] Other features that are expected for HDMI 2.0 include support for 4:2:0 chroma subsampling, support for 25 fps 3D formats, improved 3D capability, support for more than 8 channels of audio, support for the HE-AAC and DRA audio standards, dynamic auto lip-sync, and additional CEC functions.[154] The Sony PlayStation 4 will utilize this standard.
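The bandwidth arithmetic in that quote is easy to check. A quick Python sketch, using the nominal 24 bit/px and TMDS 8b/10b coding overhead (real HDMI timing also includes blanking intervals, which this ignores):

```python
def tmds_total_gbps(per_channel_gbps, channels=3):
    """HDMI carries video over 3 TMDS data channels."""
    return per_channel_gbps * channels

def wire_gbps(w, h, fps, bpp=24, overhead=10 / 8):
    """Pixel data rate including TMDS 8b/10b coding overhead.
    Blanking intervals are ignored, so this is a lower bound."""
    return w * h * fps * bpp * overhead / 1e9

assert tmds_total_gbps(6) == 18          # HDMI 2.0 target
print(f"HDMI 1.4 total: {tmds_total_gbps(3.4):.1f} Gbit/s")
print(f"4K60 payload:   {wire_gbps(3840, 2160, 60):.1f} Gbit/s")
```

So 4K60 can't fit in HDMI 1.4's 10.2 Gbit/s but sits comfortably under HDMI 2.0's 18 Gbit/s, which is exactly why the per-channel bump from 3.4 to 6 Gbit/s matters.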
 

stryke

Member
Facts:

News: GStreamer OpenMAX IL wrapper plugin 1.0.0 released 3/22/2013, and we got a PS3 firmware 4.40 update on the 21st, plus the PS Store app updated to version 1.04.

The GStreamer team is pleased to announce the first GStreamer OpenMAX IL wrapper plugin release for the new API and ABI-stable 1.x series of the GStreamer multimedia framework.

2/22/2012 video on GStreamer 1.0:
1) Sony uses GStreamer to play TV video on their Google TV (GStreamer is very integrated into the TV) @ 2:30 min.
2) "We are about to release GStreamer 1.0." @ 3:45.
3) Started talking about GStreamer 1.0 in 2007 @ 5:50. (In 2007 Collabora proposed GStreamer for the HTML5 <video> element with Cairo bindings in GTKWebKit; two months later, in late 2007, Sony sent Collabora a PS3 developer kit. In 2007 OpenMAX 1.2 was projected for 2008, but it slipped from 2008 until Nov 2011. The big issue for GST-OpenMAX was memory management, which OpenMAX 1.2 addressed. In another talk, a Collabora/GStreamer 1.0 employee commented on the extreme amount of time it takes to get standards approved through Khronos (OpenMAX).)
4) From 2006 on, Texas Instruments, Nokia and others were doing research on using GStreamer via the OpenMAX framework, which resulted in GST-OpenMAX and exposed the issues that needed to be addressed in both GStreamer and OpenMAX.
5) In 2009 a GStreamer DASH UltraViolet DRM player was released.
6) From 2009, maybe earlier, Sony has been using OpenVG & Pixman for the XMB, and AVM+ (a Flash video player that is open source for non-commercial use and uses OpenVG) for non-commercial video and DASH IPTV.
7) Sept 29th, 2010: Sony published its open-source WebKit JavaScript engine, as required. Inside the disclosures it's found to be a GTKWebKit version.
8) Feb 2011: Khronos published the WebGL 1.0 specs, and Sony published more WebKit updates and Cairo (Cairo versions are tied to the WebGL specs, to support WebGL in WebKit). Inside the new disclosures is GTK chrome edited into POSIX chrome. News outlets picked this up, and "Chrome coming to the PS3" raced around the world in error; it's the UI chrome, not the Chrome browser.
9) What front-end player is Sony to use for commercial DASH and AR? (On the Vita and PS3, the back end is OpenMAX IL.)

As mentioned, the Vita got a GPU-accelerated browser, and both the Vita and PS3 are using GTKWebKit2 APIs. To this point the PS3 browser is CPU-only, not GPU-accelerated, and has no OpenGL ES/Cairo support (it uses OpenVG, which by design is the 2D portion of Cairo with a couple of minor changes).

GStreamer now has official GST-OpenMAX support, and the PS3 and Vita use OpenMAX IL as their video player back ends.

A few days ago Google's Chrome OS got the new HTML5 DRM proposed by Netflix/Google/Microsoft, and Netflix is now available on Chrome OS.

Speculation: the PS3 uses OpenMAX IL as the back end for video media and has been using Flash AVM+ for the front end. It may switch to GStreamer-OpenMAX, because GStreamer appears to be a standard front end for video players in Opera, Firefox, Sony TVs, GTKWebKit and more.

Several key components of GTKWebKit's version of HTML5 video are now mature enough to be used in the PS3 and Vita. HTML5 video player DRM is now being implemented. It appears all the pieces needed for Sony to have a full-blown media ecosystem are now in place. Firmware 4.40 may have those pieces, and when Sony is ready the PS3 will get a MAJOR update to the HTML5 software stack. When, though?
 
Curiously enough, Sony has not specified the HDMI standard it will be using in its official spec sheet, while all the other components have their standards addressed.

http://www.scei.co.jp/corporate/release/pdf/130221a_e.pdf

You could be onto something. (Of course, anyone can edit a wiki lol).

I highly doubt it.

The HDMI 2.0 spec isn't even finalized yet. It should be finished in the first half of this year. That leaves Sony very, very little time to implement and test it. Plus, Sony isn't likely to be pushing 4K 60fps, one of the main new features of the spec. The only TVs that support it are in the tens-of-thousands-of-dollars range. Over the lifetime of the console 4K TVs will come down in price, though, and will slowly become more mainstream. It just seems like the schedule for that would be really tight, and it's unnecessary future-proofing.
 
I highly doubt it.

The HDMI 2.0 spec isn't even finalized yet. It should be finished in the first half of this year. That leaves Sony very, very little time to implement and test it. Plus, Sony isn't likely to be pushing 4K 60fps, one of the main new features of the spec. The only TVs that support it are in the tens-of-thousands-of-dollars range. Over the lifetime of the console 4K TVs will come down in price, though, and will slowly become more mainstream. It just seems like the schedule for that would be really tight, and it's unnecessary future-proofing.
You start with just a faster HDMI port, only a slight hardware change from 1.4; the rest are software standards that can be implemented over time, the way we got S3D on an HDMI 1.3 port in the PS3.

A faster HDMI port that can support HDMI 2.0 is a lock; the rest is software.

The PS3 could support two 1080P monitors if there were an adapter for the HDMI port. 1080P S3D is two 1080P views, but only on one monitor. The PS4 will support glasses-free 1080P S3D on a 4K TV, and that requires 4 independent video streams. Will there be enough 4K TVs to make this practical? Maybe not, but by the end of the PS4's life I expect 4K TVs to be in the $5K range. Using the feature/bandwidth for multiple monitors (with an active HDMI adapter) and head-mounted glasses will, I think, come online in 2015.
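On the "4 independent video streams" point, a quick sanity check shows why the count is four (a sketch of the arithmetic, not anything from Sony's specs):

```python
# Four independent 1080P views tile exactly into one UHD 4K frame in a
# 2x2 grid, so glasses-free 1080P S3D on a 4K panel consumes four
# streams' worth of pixels.
view_w, view_h = 1920, 1080
uhd_w, uhd_h = 3840, 2160
assert (uhd_w, uhd_h) == (2 * view_w, 2 * view_h)
assert uhd_w * uhd_h == 4 * (view_w * view_h)
print("4 x 1080P views ==", uhd_w * uhd_h, "pixels, one UHD frame")
```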

If you look at what AMD implemented with DisplayPort, and have followed DisplayPort and HDMI, usually what comes to DP eventually makes it into HDMI. The HDMI 1.4 features are setting up HDMI for home theatre/XTV, and HDMI 2.0 will just extend the specs. DP was supposed to get an inexpensive adapter for multiple monitors, but there is a chicken-and-egg issue: it needs volume to become inexpensive enough to sell. 1080P is getting cheap, which means more consumers can afford multiple TVs on the wall, and game-console volumes could support enough people using multiple monitors, IF the OS supports it, to make the adapter affordable.
 
They called something "P3D" in this document; I wonder if that was a slip-up?

Stereoscopic+mapping+4+.jpg
I understand from a developer that Sony is very interested in a one-camera structured-light (IR) solution. There is a Sony patent on determining depth from IR intensity, but it has low resolution. The Kinect 1 solution was to generate a known pattern ("patterned light") of 30,000 dots (an IR laser and an interference grid) and to calculate the distortion of those dots created by the curves and angles of an object, to determine the different depths within the object.
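For anyone curious, the depth math behind that kind of structured-light sensor is simple triangulation: the projector and camera sit a baseline apart, and a dot landing on a nearer surface appears shifted (disparity) in the camera image. A hedged Python sketch with made-up calibration numbers, not Kinect's real ones:

```python
# Similar triangles give depth Z = f * b / d, where f is the focal
# length in pixels, b the projector-camera baseline, and d the dot's
# observed displacement in pixels. All values below are illustrative.
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    return focal_px * baseline_m / disparity_px

print(depth_from_disparity(580.0, 0.075, 29.0), "metres")
```

The inverse relationship (depth falls as disparity grows) is also why these sensors lose precision quickly at range.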

Currently there are multiple projects on the internet using structured light generated by a DLP projector and a PS Eye camera to 3D-map an object. The problem is the cost of generating the structured-light patterns and the cost of an IR laser powerful enough to give decent imaging. Also, multiple patterns requiring multiple frames increase latency.

The Kinect 2's one-camera 1080P solution is advanced, and Sony has a two-camera solution. The industry, not just Sony, "fears Microsoft" and is looking for a way to compete with a cheaper one-camera solution.

IllumiRoom uses structured light to map the curves and angles so that a corrected projection appears like it's on a flat surface.
 