
Interesting finds showing PS4, XB1 and PS3 media plans

You're still not getting my point. All of the stuff you're talking about is probably all being done by the XMB itself. The video "windows" and such are not the same as a "window" that you would see on a desktop computer. They are most likely just part of the overall image that the XMB is rendering. Calling them "windows" implies a degree of separation that doesn't really make sense. Using a full window system in this context would be highly inefficient.
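The distinction can be sketched with a toy compositor (my own illustration, entirely hypothetical; `blank`, `blit`, and the tiny framebuffer are invented for the example, not anything from Sony's code): a single renderer copies each video region into one shared image, so the "windows" never exist as separately managed objects.

```python
# Toy sketch: "windows" that are just regions composited into one
# framebuffer by a single renderer, as opposed to independent windows
# managed by a window system.
def blank(w, h, fill=0):
    """Create a w-by-h framebuffer as a 2D list."""
    return [[fill] * w for _ in range(h)]

def blit(fb, tile, x, y):
    """Copy a tile (a 2D list) into the framebuffer at (x, y)."""
    for dy, row in enumerate(tile):
        for dx, px in enumerate(row):
            fb[y + dy][x + dx] = px

framebuffer = blank(8, 4)
video_a = [[1, 1], [1, 1]]   # stand-ins for decoded video frames
video_b = [[2, 2], [2, 2]]
blit(framebuffer, video_a, 0, 0)
blit(framebuffer, video_b, 4, 1)
# The "windows" only exist as pixels inside the one image the UI draws.
```

Contrast this with a real window system, where each window is an independent surface with its own borders, lifecycle, and event stream managed by a window server.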
I understand that the "XMB" is the full-screen framework that everything writes to and that it does have "windows". You are arguing that the task level at which these windows can be opened, or how easily, or with what "standards", doesn't make them independent real "windows" like in an X11 windowing system, which supports among other things borders and chrome around the windows, remote desktop, and redirecting streams. That's true, and it only shows Sony made a custom "desktop" with a subset of the same functionality.

I take issue with calling the console UIs "desktops" because they are so far removed from the actual desktop metaphor that desktop computers use. This is admittedly getting into semantics, but given some of the other assumptions you're making, I feel the need to clarify.

Also, as said above, what you are referring to isn't the multitasking I'm talking about. I'm talking about having fully separate applications on the screen at the same time. Not just there being a video embedded somewhere in the interface. Having videos embedded in the UI doesn't even necessarily require true concurrency.

I'd need more information about both of these topics in order to give more insight.

The reason the PS3 OS is lacking in chat features isn't because it is too difficult to implement. It's because there isn't enough RAM. Switching over to a web app would actually make that problem worse, since web apps use far more RAM than native ones.

Similar reasoning applies to the rest of the OS. Sure, they would have the features to reimplement the XMB in HTML/JS, but it would be much slower than the current version, and would use a lot more RAM.

Also, just because you can rewrite an app doesn't mean that you should. That is a terrible reason to rewrite something. It's only worth it if there are significant advantages to doing it, which there really wouldn't be in this case. "It is easier to write the apps in HTML/JS, than as native apps" is not a significant enough benefit when you already have a competently written, working native implementation.
All this is essentially the same misunderstanding of a browser desktop.

1) Nearly all native programs necessary to support a browser as well as the Javascript engine are loaded as part of the kernel.
2) Every application uses those same program APIs that the browser uses and they can be called from C++, Lua, Javascript, Java, Mono and more.
3) Apps for the browser desktop can be C++ or signed third party WebMAF apps.

Everything runs under a framework (desktop) that has support for a browser. All W3C extensions for a browser are APIs implemented in native programs that can be called by a browser's Javascript engine, following a STANDARD published by the W3C. C++ programs can call those same APIs.

The advantages are numerous: reuse of the same code, common APIs used by every app, and a smaller always-loaded kernel, which usually results in faster app loads, including for the browser. When I say the Vidipath platforms have W3C extensions like Java and DLNA, I mean that the platform has to have Java and DLNA as native programs with APIs that can be called by EVERY app no matter what its execution engine is: C++, Mono, Lua, Javascript, etc. For W3C WebRTC support there are native-language programs supporting W3C standard APIs for the system camera, video processing and audio, as well as audio echo cancellation.

All or most of the Stability updates are rewrites to the native libraries to support new APIs, while still using those same native libraries. We may not see new features being supported, but the "hooks" for the new features are there. When a new GTKwebkit API is developed and added as a Javascript feature supported by Webkit, it's really a native-language program being used by a scripting language (Javascript), and that same native program can be used by Lua or Mono or Java, but sometimes changes need to be made to the native program to support a new API (call/feature).

HTML5 <video> is a native-language program supporting a number of video features (APIs) that are reused over and over by nearly everything. This includes extensions (APIs) for DRM, DASH and more. HTML5 <video> routines are now used by DLNA and require DASH, are used by Video Chat and require DASH, and will be used by the 4K Blu-ray digital bridge, which requires the MSE DRM extensions that Playready and Playready ND will hook into.

If a video chat app used the HTML5 <video> routines, it would be a much, much smaller app. As an example, Netflix on the PS3 is a stand-alone app and includes DRM, player, Javascript engine, video framework and native libraries, which take about 37 megs (more when in memory). If it was implemented in the browser it would require -0- space beyond the bookmark and icon. As a WebMAF app, you still need the MAF framework but are using the system browser libraries, so about 10 MB (guess).

Audio chat takes even less memory, but it could not be included in-game on the PS3 either (this was discussed with the release of PS3 firmware 3.0), either because there was limited memory or because some games from EA didn't follow Sony recommendations and would break if Sony implemented video chat as part of the OS, including "PulseAudio" routines, which were only recently implemented for Gnome audio APIs.

The original PS3 OS native-language libraries did not have the APIs they have now, and the early versions created apps like Chat that were mostly stand-alone, with very little reuse of common native libraries. An XMB rewrite COULD result in a smaller kernel and faster loads at the same time. The PS3 is getting a rewrite to support Playready because Sony wants the PS3 to support Vidipath. Vidipath will require many new features that require the use of MODERN browser native libraries. IF you have them loaded, then why would you still use apps that are primarily stand-alone, like Chat, when you can create a C++ version that is extremely small using the native libraries already loaded for Vidipath features, and at the same time update it to use a common "Friend/Contact" list, calendar, notebook, event log....

There is a common reason Gnome supports a video editor and the PS3 and PS4 support a video editor...it's a nice feature but the real reason is the HTML5 <video> native library has APIs that make creating a basic video editor extremely easy.

To clear up previous questions; the GTKwebkit project creates APIs for features that need to be supported by Apple or Google changes in webkit. The APIs use native libraries typically found on Gnome Linux builds which may need rewrites to support those APIs. Sony duplicates those same features/APIs using BSD native libraries. Since Webkit is supported across multiple OS there are native libraries Sony can use on BSD. Sony makes minor changes to the Vector graphic routines for browser chrome but uses the GTK APIs without change.
 

Pokemaniac

Member
I understand that the "XMB" is the full-screen framework that everything writes to and that it does have "windows". You are arguing that the task level at which these windows can be opened, or how easily, or with what "standards", doesn't make them independent real "windows" like in an X11 windowing system, which supports among other things borders and chrome around the windows, remote desktop, and redirecting streams. That's true, and it only shows Sony made a custom "desktop" with a subset of the same functionality.

The problem with what you're saying is that what you're calling "windows" in the Playstation OSes have little to nothing in common with windows in a traditional windowing system. You're trying to force a connection where none exists.

All this is essentially the same misunderstanding of a browser desktop.

1) Nearly all native programs necessary to support a browser as well as the Javascript engine are loaded as part of the kernel.

I'm just going to stop you right here. This is a profoundly terrible idea. Not only would it make the entire system generally unstable, it would also be hilariously insecure. If this was actually the case, then any rogue JavaScript could potentially compromise literally the entire system with barely any effort.

I'm going to assume that you actually just mean that Webkit is loaded as part of the OS, because that actually sounds like a reasonable design.

2) Every application uses those same program APIs that the browser uses and they can be called from C++, Lua, Javascript, Java, Mono and more.
3) Apps for the browser desktop can be C++ or signed third party WebMAF apps.

Everything runs under a framework (desktop) that has support for a browser. All W3C extensions for a browser are APIs implemented in native programs that can be called by a browser's Javascript engine, following a STANDARD published by the W3C. C++ programs can call those same APIs.

The advantages are numerous: reuse of the same code, common APIs used by every app, and a smaller always-loaded kernel, which usually results in faster app loads, including for the browser. When I say the Vidipath platforms have W3C extensions like Java and DLNA, I mean that the platform has to have Java and DLNA as native programs with APIs that can be called by EVERY app no matter what its execution engine is: C++, Mono, Lua, Javascript, etc. For W3C WebRTC support there are native-language programs supporting W3C standard APIs for the system camera, video processing and audio, as well as audio echo cancellation.

C++ bindings for JavaScript APIs are cool, but generally slower and more resource heavy than directly using a native implementation.

All or most of the Stability updates are rewrites to the native libraries to support new APIs, while still using those same native libraries. We may not see new features being supported, but the "hooks" for the new features are there.

That is the exact opposite of a stability update. Stability updates don't add functionality, they fix it.

When a new GTKwebkit API is developed and added as a Javascript feature supported by Webkit, it's really a native-language program being used by a scripting language (Javascript), and that same native program can be used by Lua or Mono or Java, but sometimes changes need to be made to the native program to support a new API (call/feature).

If they're really using the WebkitGTK API, then that is a very overly simplistic view of what's happening. Calling the features "native programs" is kinda misleading. It is all native, but it is a single giant program which implements everything. The WebkitGTK API essentially just manipulates the Webkit rendering engine.

Just to make absolutely sure we're on the same page, the WebkitGTK API only allows interaction with Webkit itself. If you wanted to directly use one of the supporting libraries, you'd go and interact with it directly.

HTML5 <video> is a native-language program supporting a number of video features (APIs) that are reused over and over by nearly everything. This includes extensions (APIs) for DRM, DASH and more. HTML5 <video> routines are now used by DLNA and require DASH, are used by Video Chat and require DASH, and will be used by the 4K Blu-ray digital bridge, which requires the MSE DRM extensions that Playready and Playready ND will hook into.

If a video chat app used the HTML5 <video> routines, it would be a much, much smaller app. As an example, Netflix on the PS3 is a stand-alone app and includes DRM, player, Javascript engine, video framework and native libraries, which take about 37 megs (more when in memory). If it was implemented in the browser it would require -0- space beyond the bookmark and icon. As a WebMAF app, you still need the MAF framework but are using the system browser libraries, so about 10 MB (guess).

Audio chat takes even less memory, but it could not be included in-game on the PS3 either (this was discussed with the release of PS3 firmware 3.0), either because there was limited memory or because some games from EA didn't follow Sony recommendations and would break if Sony implemented video chat as part of the OS, including "PulseAudio" routines, which were only recently implemented for Gnome audio APIs.

You clearly fundamentally misunderstand how programs use RAM. It is not simply a cache for data that is on disk or the Internet. That is an important job, but it is not the only one. It is also used for the working memory of an application. In case you're not aware, this working memory essentially contains the state of the application. Web browsers are very complex applications, with lots of data necessary to keep track of their state, and that causes them to use a lot of RAM. Any web app is going to inherently use a lot more RAM than a comparable native application thanks to the overhead of the web browser that it is running in.
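A toy sketch of that point (an illustration in Python, not a measurement of any real browser; the dictionary "state" is invented for the example): even before caching any content, merely holding an application's state costs RAM.

```python
# Toy illustration: RAM is not just a cache. An application's working
# state occupies memory on top of any content it loads, and tracemalloc
# lets us watch that happen for Python's own allocations.
import tracemalloc

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()

# Stand-in for an app's state: layout nodes, session data, and so on.
state = [{"node": i, "children": list(range(10))} for i in range(10_000)]

after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

overhead = after - before  # several megabytes for the state alone
```

A browser carries this kind of state for the DOM, JavaScript heap, layout, and network stacks simultaneously, which is why a web app inherits a large baseline cost.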

Also, as I elaborated on a bit in my response to your PM which I just sent yesterday, WebMAF cannot possibly be using MAF if it is using the Webkit rendering engine. WebMAF is almost definitely just a Nintendo Web Framework-style container/runtime for web apps.

The original PS3 OS native-language libraries did not have the APIs they have now, and the early versions created apps like Chat that were mostly stand-alone, with very little reuse of common native libraries. An XMB rewrite COULD result in a smaller kernel and faster loads at the same time. The PS3 is getting a rewrite to support Playready because Sony wants the PS3 to support Vidipath. Vidipath will require many new features that require the use of MODERN browser native libraries. IF you have them loaded, then why would you still use apps that are primarily stand-alone, like Chat, when you can create a C++ version that is extremely small using the native libraries already loaded for Vidipath features, and at the same time update it to use a common "Friend/Contact" list, calendar, notebook, event log....

Because you don't just rewrite things when a shiny new library appears. Especially not on an embedded system like the PS3. Rewriting applications isn't something to be taken lightly. It involves significant effort and is something of a last resort tactic.

There is a common reason Gnome supports a video editor and the PS3 and PS4 support a video editor...it's a nice feature but the real reason is the HTML5 <video> native library has APIs that make creating a basic video editor extremely easy.

You're mixing up cause and effect. GNOME has a video editor because GNOME aims to provide a full desktop environment including auxiliary utility applications that the user might just expect to be there. Sony has a video editor because they thought one would be good to have in their OS. You don't make programs because you have libraries, you incorporate libraries that provide features that would be useful to have in your program.

To clear up previous questions; the GTKwebkit project creates APIs for features that need to be supported by Apple or Google changes in webkit. The APIs use native libraries typically found on Gnome Linux builds which may need rewrites to support those APIs. Sony duplicates those same features/APIs using BSD native libraries. Since Webkit is supported across multiple OS there are native libraries Sony can use on BSD. Sony makes minor changes to the Vector graphic routines for browser chrome but uses the GTK APIs without change.

I think you may be conflating all of the libraries used by Webkit with Webkit itself. That might explain some of the weirdness in a lot of what you're saying.
 
The problem with what you're saying is that what you're calling "windows" in the Playstation OSes have little to nothing in common with windows in a traditional windowing system. You're trying to force a connection where none exists.
The connection is in part the understanding that the PS3 hardware can support multiple video windows, and that the PS3 XMB has more tasks of greater CPU use (5 video windows) than the PS4, which does have a "Browser Desktop". Similar functions are seen in both. The difference is that the PS3 at this time is custom Sony, while the PS4 is industry standard, VERY similar to the Gnome desktop in function but not look and feel.

I'm just going to stop you right here. This is a profoundly terrible idea. Not only would it make the entire system generally unstable, it would also be hilariously insecure. If this was actually the case, then any rogue JavaScript could potentially compromise literally the entire system with barely any effort.

I'm going to assume that you actually just mean that Webkit is loaded as part of the OS, because that actually sounds like a reasonable design.
The PS4 has a browser desktop, and its open-source software list includes Lua, Java, Javascript and Mono, which are all engines that can call and use the same native libraries, and all likely have bindings to Cairo.

You forget that the PS4 has an ARM trustzone processor for security, and Playready is going to be used by both. That the PS3 does not have a trustzone processor is a possible reason for the PS3 OS to be different from the PS4 and have restrictions with fewer features.

C++ bindings for JavaScript APIs are cool, but generally slower and more resource heavy than directly using a native implementation.
All OS native programs designed to be called by other programs have APIs that other native programs use; libc is an example, and the native program APIs in libc are also used by Javascript and can be used by Lua, Java and Mono as well. You are overthinking this, or don't understand that the only difference is a greater effort to reuse the same native libraries, the browser native libraries. In every other respect a C++ program using those same libraries is still a native-language program and will run at the same speed, since disk access is not needed. This scheme also reduces memory copying and memory fragmentation, which was a major goal of the Gnome Mobile initiative (zero copy).
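The libc point can be put in runnable form (an illustration only; Python's ctypes stands in for the "script engine binding", and the PS3/PS4 bindings themselves are not public): the scripting runtime ends up calling the exact same native symbol that a C or C++ program links against.

```python
# Illustration: a scripting runtime binding directly to a native library.
# Python's ctypes plays the role of the "Javascript/Lua/Mono engine";
# the symbol called is the same one a C/C++ program would link against.
import ctypes
import ctypes.util

# Locate and load the C runtime library (falling back to the common
# Linux soname if find_library can't resolve it).
libc = ctypes.CDLL(ctypes.util.find_library("c") or "libc.so.6")

libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

n = libc.strlen(b"native code")  # computed by libc, not by Python
```

No copy of `strlen` ships with the script; the already-loaded native library is reused, which is the reuse being described above.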

A WebMAF app can use C++ and/or Javascript, and uses those same native libraries that support a browser. If performance or security is an issue it can use all C++. Whether WebMAF stands for Mozilla Application Framework (with the Nuanti webkit port) or Mobile Application Framework, it is a FRAMEWORK designed to reuse the webkit native libraries and the Javascript engine. A framework is a software structure with standards by which other apps can plug into the OS, and in this case it's based on Webkit and native support libraries.

WebMAF = Mozilla Application Framework in Detail Proof => https://www.linkedin.com/pub/ian-arundale/28/167/640 "PlayStation® 3/PlayStation® 4 (WebMAF/Nuanti Webkit)"

The PS4 Plex app is a signed WebMAF app. store.sonyentertainmentnetwork.com/#!/en-gb/apps/plex/cid=EP4544-CUSA01703_00-WEBMAF000000PLEX

http://www.nuanti.com/gecko-to-webkit said:
This report chronicles some of the community work we've done at Nuanti to restore and enhance functionality in GNOME applications that were limited by the Mozilla Gecko browser engine.

In some cases, the issues were severe enough that the applications were dropped by distributors, leading to popular and successful software failing to reach end-users: a worst-case scenario for any Open Source project.

For each of these applications, Nuanti engineers were available to replace (or assist in replacing) the faulty Gecko component with WebKit GTK+, restoring full functionality as well as introducing new capabilities and enhancing application performance.

That is the exact opposite of a stability update. Stability updates don't add functionality, they fix it.
I put quotes around Stability because Sony uses that label for everything that does not include new features.

Just to make absolutely sure we're on the same page, the WebkitGTK API only allows interaction with Webkit itself. If you wanted to directly use one of the supporting libraries, you'd go and interact with it directly.
No, not in all cases. To go to a lower level: a GTK webkit routine can use Javascript calling GTK native library APIs, massaging data and returning it to the webkit core, and in that case, to support a new webkit feature published by Apple, the routine can only be used if Javascript is used. The GTK APIs generally use existing native library APIs used in Gnome. For clarity: GTK creates APIs in native programs, or uses existing APIs in native programs, to support features published by Webkit that come primarily from work by Google and Apple; Javascript is not needed in every case. These create the standards that native libraries supporting the GTK version of webkit have to support. Some programs, like Gstreamer, are common across most browsers and have set API standards that others follow even if they are using a different player. Gstreamer was a program used by Gnome (among others) before webkit.

The Gnome Mobile initiative was to pare down to the absolute minimum the number of libraries needed to support the GTKWebkit browser; Android did the same. Android's Java engine can use those same libraries that are used to support a webkit browser on an Android platform. If it helps, think of the PS4 as using an Android OS, and the Android apps as similar to WebMAF. Both the Android apps and WebMAF use a FRAMEWORK for support. The framework is different but has common elements, because both the PS4 and Android use a POSIX OS (Linux and BSD Unix) and both use a version of eglib. Routines from Android's version of eglib were ported to Gnome's Glib because they were smaller, faster and contained new needed features. Sony is using eglib in the PS4, which is an embedded CE (consumer electronics = smaller/simpler) BSD version of the glib found in Gnome.

Because you don't just rewrite things when a shiny new library appears. Especially not on an embedded system like the PS3. Rewriting applications isn't something to be taken lightly. It involves significant effort and is something of a last resort tactic.
Sony hasn't rewritten the PS3 XMB or the Chat app or the Netflix app since 2012 and before. Some of the libraries used by apps have been rewritten or changed; that is part of the Stability updates. You can revise and rewrite apps, but Sony doesn't, while Google, Apple and Microsoft do. The WebMAF framework (regardless of who created it) that Sony is using for apps can be used for the XMB apps. That literally nothing has been upgraded is taken by many to mean the PS3 is dead, but Playready is being ported to the PS3, and the 2015 PDF to the FCC has the PS3 supporting Vidipath. So if it's not dead, then Sony must be waiting for a major rewrite to the PS3 with more support for the WebMAF framework AND browser, as both are quite tightly tied together.

I take this one step further and assume that Sony doesn't want to reboot the PS3 every time it runs an app. This is in part a security issue and possibly a quick and dirty memory fragmentation fix. Gnome Mobile's Zero Copy loads the libraries once, and since everything uses and reuses them, this reduces memory fragmentation. It can also be used to ensure there are no changes to this kernel image in memory... Playready must do something similar to detect tampering with the DRM or player routines.
 

AmyS

Member
Apple partner Imagination unveils PowerVR 'super-GPU' with 512 ALU cores for game consoles



The following is an early educated guess at the Apple TV specs (grain of salt), but it matches, on the high end, the above PowerVR GPU. If Apple does want a game console as the Apple TV then it could be more accurate than not.


I do hope you're right.

If the new Apple TV has the high-end PowerVR GT7900 GPU configuration, it should be on par with the Tegra X1 powered Android TV Shield console.

 
Taking the previous two posts (AmyS and McSpidey) on Apple TV/Shield and PS Now servers running on custom PS3s....

The large number of ARM STBs that play games and support Android TV or Apple TV, with game performance coming this year comparable to an Xbox 360, plus PS Now, sorta indicates a market for the Xbox 360 and PS3 still exists, provided a refresh allows the price, power levels and security needed for IPTV.

Is this possible...is it coming....who knows. Shield is $199, Apple TV is speculated at $149+ and a PS3 is selling now for $219.00. Diskless PS3 should be $159?

If Sony is porting Playready to the PS3 and wants it to be a Vidipath STB at this late date then it has years of life.
 

Pokemaniac

Member
The connection is in part the understanding that the PS3 hardware can support multiple video windows, and that the PS3 XMB has more tasks of greater CPU use (5 video windows) than the PS4, which does have a "Browser Desktop". Similar functions are seen in both. The difference is that the PS3 at this time is custom Sony, while the PS4 is industry standard, VERY similar to the Gnome desktop in function but not look and feel.

The only thing that the PS4 UI and the GNOME desktop have in common is that they serve a vaguely similar purpose. I don't really see any other similarity between the two. I'd also like to know which "industry standard" this supposedly is.

The PS4 has a browser desktop, and its open-source software list includes Lua, Java, Javascript and Mono, which are all engines that can call and use the same native libraries, and all likely have bindings to Cairo.

You forget that the PS4 has an ARM trustzone processor for security, and Playready is going to be used by both. That the PS3 does not have a trustzone processor is a possible reason for the PS3 OS to be different from the PS4 and have restrictions with fewer features.

ARM trustzone only really applies to software running on the ARM CPU. The actual overall security ramifications of this rely heavily on how exactly the ARM CPU is actually used. At any rate, it would definitely not impede malicious code from gaining full control of the main x86_64 CPU.

To be absolutely clear, the reason what you are saying sounds ridiculous is because you are implying that the Webkit code will be running in the kernel. It is okay for the kernel to cache it in memory, but actually running the Webkit stuff in kernel mode would be a ridiculously stupid thing to do.

All OS native programs designed to be called by other programs have APIs that other native programs use; libc is an example, and the native program APIs in libc are also used by Javascript and can be used by Lua, Java and Mono as well. You are overthinking this, or don't understand that the only difference is a greater effort to reuse the same native libraries, the browser native libraries. In every other respect a C++ program using those same libraries is still a native-language program and will run at the same speed, since disk access is not needed. This scheme also reduces memory copying and memory fragmentation, which was a major goal of the Gnome Mobile initiative (zero copy).

You originally described this rather poorly. You seemed to be implying that everything would be running through web standards through Webkit.

Library reuse is good where appropriate.

A WebMAF app can use C++ and/or Javascript, and uses those same native libraries that support a browser. If performance or security is an issue it can use all C++. Whether WebMAF stands for Mozilla Application Framework (with the Nuanti webkit port) or Mobile Application Framework, it is a FRAMEWORK designed to reuse the webkit native libraries and the Javascript engine. A framework is a software structure with standards by which other apps can plug into the OS, and in this case it's based on Webkit and native support libraries.

Mozilla Application Framework is a framework meant to build desktop applications that run in Gecko. If you're not using Gecko, you're not using the Mozilla Application Framework. I don't know how much more clear I can make this.

Regarding the rest of this, the only solid info that I can find on WebMAF is this:
http://develop.scee.net/research-technology/ said:
WebMAF

Video streaming application framework

WebMAF is a framework for a video streaming application, using a web browser and a video player to provide standard video streaming on PlayStation® platforms.

The native code is supported and extended by SCEE R&D, making it easier for video service partners to bring their content to PlayStation®.

That heavily suggests that WebMAF is what I suspected it was all along: a tool/runtime that allows you to package and run web apps as Playstation applications. Using native code in apps running in an environment like this would kinda defeat the purpose of using it to begin with.

I put quotes around Stability because Sony uses that label for everything that does not include new features.

I don't see any quotes.

However, even with that it isn't really right. Typically you mainly add code to add new APIs. Saying "rewrite" like you did gives the impression that they rebuild the whole thing every time.

No, not in all cases. To go to a lower level: a GTK webkit routine can use Javascript calling GTK native library APIs, massaging data and returning it to the webkit core, and in that case, to support a new webkit feature published by Apple, the routine can only be used if Javascript is used. The GTK APIs generally use existing native library APIs used in Gnome. For clarity: GTK creates APIs in native programs, or uses existing APIs in native programs, to support features published by Webkit that come primarily from work by Google and Apple; Javascript is not needed in every case. These create the standards that native libraries supporting the GTK version of webkit have to support. Some programs, like Gstreamer, are common across most browsers and have set API standards that others follow even if they are using a different player. Gstreamer was a program used by Gnome (among others) before webkit.

You're arguing semantics. The WebkitGTK API is purely for manipulating Webkit. Sure, that might indirectly use other stuff, but that doesn't matter to the application calling the APIs. The actual application is only interacting with Webkit.

The Gnome Mobile initiative was to pare down to the absolute minimum the number of libraries needed to support the GTKWebkit browser; Android did the same. Android's Java engine can use those same libraries that are used to support a webkit browser on an Android platform. If it helps, think of the PS4 as using an Android OS, and the Android apps as similar to WebMAF. Both the Android apps and WebMAF use a FRAMEWORK for support. The framework is different but has common elements, because both the PS4 and Android use a POSIX OS (Linux and BSD Unix) and both use a version of eglib. Routines from Android's version of eglib were ported to Gnome's Glib because they were smaller, faster and contained new needed features. Sony is using eglib in the PS4, which is an embedded CE (consumer electronics = smaller/simpler) BSD version of the glib found in Gnome.

What you seem to think WebMAF is is waaaaaaaay broader than what it probably actually is. You're basically conflating WebMAF with every single library available as part of the operating system. WebMAF, as I explained above, is most likely a very specific environment meant for running web apps. What you are talking about is basically just the OS itself. All libraries that applications can use are just part of the OS. There is no special name for it.

Sony hasn't rewritten the PS3 XMB or the Chat app or the Netflix app since 2012 and before. Some of the libraries used by apps have been rewritten or changed; that is part of the Stability updates. You can revise and rewrite apps, but Sony doesn't, while Google, Apple and Microsoft do. The WebMAF framework (regardless of who created it) that Sony is using for apps can be used for the XMB apps. That literally nothing has been upgraded is taken by many to mean the PS3 is dead, but Playready is being ported to the PS3, and the 2015 PDF to the FCC has the PS3 supporting Vidipath. So if it's not dead, then Sony must be waiting for a major rewrite to the PS3 with more support for the WebMAF framework AND browser, as both are quite tightly tied together.

There is also the third alternative that all parties involved are satisfied with the current state of the applications and feel no need to make major changes to them. Or maybe they're not completely satisfied, but don't really want to put in the investment that they'd need to actually do it.

Trying to compare Google or Apple in this instance doesn't make sense, as they make general purpose OSes which aren't tied to what is essentially a single piece of hardware. Microsoft's Xbox 360 is the only really valid comparison, and even that's slowed down a bunch. The only device that currently benefits from development on the PS3 OS is the PS3, so it makes sense that development on it is slowing down as the hardware reaches the end of its natural life.

I take this one step further and assume that Sony doesn't want to reboot the PS3 every time it runs an app. This is in part a security issue and possibly a quick-and-dirty memory fragmentation fix. Gnome Mobile's zero-copy approach loads the libraries once, and since everything uses and reuses them, this reduces memory fragmentation. It can also be used to ensure there are no changes to the kernel image in memory... Playready must do something similar to detect tampering with the DRM or player routines.

Do you have any evidence that the system is actually restarting? If it did, then I'd expect the times to enter and exit software to be more comparable to how it was on the Wii U at launch.

I wouldn't be too surprised if the XMB is restarting, but that is not the whole OS, it is just the main interface of the OS. If that is happening, then it would probably be mainly due to memory limitations.
 
The only thing that the PS4 UI and the GNOME desktop have in common is that they serve a vaguely similar purpose. I don't really see any other similarity between the two. I'd also like to know which "industry standard" this supposedly is.

ARM trustzone only really applies to software running on the ARM CPU. The actual overall security ramifications of this rely heavily on how exactly the ARM CPU is actually used. At any rate, it would definitely not impede malicious code from gaining full control of the main x86_64 CPU.

To be absolutely clear, the reason what you are saying sounds ridiculous is because you are implying that the Webkit code will be running in the kernel. It is okay for the kernel to cache it in memory, but actually running the Webkit stuff in kernel mode would be a ridiculously stupid thing to do.
The W3C HTML5 and OpenGL standards. W3C set standards for Voice, Gesture (coming), Gamepad, Chat, Video streaming and more; Vidipath requires these W3C extensions.

My understanding is that the final version of the PS4 OS will have most of the API-called routines running in Southbridge, in the 256MB of Southbridge memory: HTML5 <video> with media and DRM extensions for sure, plus Playready, DLNA, facial and voice recognition, gamepad, Bluetooth, video RTC routines, Java and more. Anything that needs to be always running (EAS emergency alerts, incoming calls, wake on key phrase, second screen routines, Miracast) will run in that 256 MB of memory, either for security or power reasons or because the hardware that performs the function is ARM, in the ARM SoC serving as Southbridge.

So what if something takes over the x86 APU? It can't access any system resources without going through the ARM SoC, and the worst case would be a resource-intensive webpage in the x86 APU browser trying to delete or encrypt files on the hard disk, or to access files on your local network. Each of these has to go through Southbridge, and in every case a user dialog generated by Southbridge can ask if you want to allow it. Or Southbridge could simply not allow any file access by a browser program without security keys that it recognizes.
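The mediation model being described (every resource request from the untrusted x86 side passing through a separate policy-enforcing processor that can allow, deny, or prompt the user) can be sketched as a toy broker in Python. The policy entries and request names here are purely illustrative, not Sony's actual interface:

```python
# Toy sketch of a "Southbridge as gatekeeper" broker: the untrusted
# side never touches resources directly; every request is checked
# against a policy and is allowed, denied, or deferred to the user.

ALLOW, DENY, ASK_USER = "allow", "deny", "ask_user"

# Illustrative policy table: (requester, operation) -> decision.
POLICY = {
    ("system_app", "read_file"): ALLOW,
    ("browser", "read_file"): DENY,    # no file access without keys
    ("browser", "network"): ASK_USER,  # prompt before LAN access
}

def broker(requester, operation, user_consents=False):
    """Decide a request the way a mediating processor might.
    Anything not explicitly listed is denied by default."""
    decision = POLICY.get((requester, operation), DENY)
    if decision == ASK_USER:
        return ALLOW if user_consents else DENY
    return decision

print(broker("system_app", "read_file"))           # allow
print(broker("browser", "read_file"))              # deny
print(broker("browser", "network", True))          # allow (user said yes)
```

The point of the pattern is that the policy table and the user prompt live on the trusted side, so compromising the requester does not let it skip the check.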

Mozilla Application Framework is a framework meant to build desktop applications that run in Gecko. If you're not using Gecko, you're not using the Mozilla Application Framework. I don't know how much more clear I can make this.
This is from a Sony developer: "(WebMAF/Nuanti Webkit)". That is either naming WebMAF and Nuanti Webkit together, which would mean the PS4, Vita and PS3 are using Nuanti Webkit, or it means the Nuanti Webkit version of Mozilla's MAF. Since we know from the Webkit disclosure it's not Nuanti's own webkit but a semi-custom webkit using GTK APIs, which Nuanti's webkit port of the Mozilla Application Framework can support, but without Gecko.

The original argument was that Sony is using GTK APIs, as shown by their use of WebMAF/Nuanti Webkit, and it's also in the Sony Webkit disclosures. Either could show this, but both together prove it 200%. If you want to nit-pick, then yes, it's not a pure MAF; but MAF, as you pointed out to me, is eliminating Gecko too.

Regarding the rest of this, the only solid info that I can find on WebMAF is this:

That heavily suggests that WebMAF is what I suspected it was all along: a tool/runtime that allows you to package and run web apps as Playstation applications. Using native code in apps running in an environment like this would kinda defeat the purpose of using it to begin with.
You missed the point that the WebMAF app framework is C++, and it can call Java or Lua or Mono or Javascript, or just run straight C++ code. It is not as limited as you suggest. I've read that XTV apps primarily use Java but can use Javascript. Java will be used by IoT apps at the lowest power levels, so Java is a Southbridge ARM SoC engine.

That the Sony site says WebMAF is used for streaming apps is not proof of anything because all current apps on the PS3 and PS4 are streaming apps. There will eventually be other apps that support text editing and more.

What you seem to think WebMAF is is waaaaaaaay broader than what it probably actually is. You're basically conflating WebMAF with every single library available as part of the operating system. WebMAF, as I explained above, is most likely a very specific environment meant for running web apps. What you are talking about is basically just the OS itself. All the libraries that applications can use are just part of the OS. There is no special name for that.
And I'm saying that all applications can use the same WebMAF framework. For example; MAF was used to create the entire Firefox browser. MAF needed to have access to every system resource to do this right?


There is also the third alternative that all parties involved are satisfied with the current state of the applications and feel no need to make major changes to them. Or maybe they're not completely satisfied, but don't really want to put in the investment that they'd need to actually do it.
Netflix will not be allowed to stream 4K unless the DRM is embedded and the platform security is robust enough to support it. Currently on the PS3, the DRM lives inside the Netflix app, and soon it won't be allowed to stream 1080P unless it's updated to use embedded resources. The PS3 won't be allowed to use DSS (Downloadable Security Scheme) or connect to a DSS platform unless it has embedded DRM. File access and playback are limited, and the PS3 can't connect to an over-the-air tuner because there is no certified embedded system DRM. Miracast isn't allowed for the same reason. Lots of things will change when Playready is embedded... maybe not as fully as I envision, but it will be a major rewrite with a full set of Vidipath features.

Trying to compare Google or Apple in this instance doesn't make sense, as they make general purpose OSes which aren't tied to what is essentially a single piece of hardware. Microsoft's Xbox 360 is the only really valid comparison, and even that's slowed down a bunch. The only device that currently benefits from development on the PS3 OS is the PS3, so it makes sense that development on it is slowing down as the hardware reaches the end of its natural life.
The Playready port and the use of the PS3 as a Vidipath platform mean the PS3 is not end-of-life. End-of-life as an AAA game console, maybe, but not as a Vidipath STB and casual game console.

Do you have any evidence that the system is actually restarting? If it did, then I'd expect the times to enter and exit software to be more comparable to how it was on the Wii U at launch.
Yes: watch a first-time boot and then the reboots before and after Netflix. Time it too. Sony uses a snapshot boot kernel, which vastly speeds up boot times.

I wouldn't be too surprised if the XMB is restarting, but that is not the whole OS, it is just the main interface of the OS. If that is happening, then it would probably be mainly due to memory limitations.
The entire OS is rebooting each time, except on return from the browser and I think video chat. That rules out a memory limitation as the reason for the reboot, and lessens the case that it's about memory fragmentation issues.
 
I don't understand a single thing on any of this.
We are discussing the PS3 and PS4 media features, how they are implemented, a coming update to the PS3 and PS4 OS and the changes that will bring. It starts with an understanding of how Sony is supporting apps.

1) The PS3 is getting a Playready update to support Vidipath, and the PS4 will also support Vidipath. Both may support a Downloadable Security Scheme for cable TV.
2) Vidipath and OTA antenna support require an OS that recognizes and supports metadata rules and flags in OTA TV programming. To this point the PS3 doesn't, or doesn't do this securely, so Sony has to lock down certain OS features.
3) When Sony opens up those features because it has Playready support, Netflix can no longer use the current app, as it is not embedded and could be cracked; then, with OS copy and file access, movies could be moved off the PS3 hard disk.
4) Vidipath requires WiFi Direct to support Miracast and second screen features.
5) Miracast requires a common DRM in the handheld and in the PS3 and PS4.
6) XTV and the browser create vulnerabilities, which I think Sony understands and has provided for; Pokemaniac thinks the odds favor a breach in security. The potential vulnerabilities create the issues I stated in #3 above.

7) Lots of new features, or rather features that have been waiting on a secure embedded DRM in the PS3 and PS4. Embedded means protected, but more importantly ALL media is processed/played by the same embedded player, and if it carries metadata and/or flags they are handled properly per the DRM rules. All commercial media is encrypted and can be stored anywhere. This opens up some of the PS3 OS features that were locked down. Playready DRM will authorize multiple platforms in the home by domain and possibly owner name. All Playready-certified (Vidipath) platforms in the home on the same domain can share media, either with copies on a network drive moved between platforms, or by DLNA and DTCP-IP protected streaming.

8) Multiple Vidipath platforms are coming at the end of this year, and some are ARM STBs with game console features.
9) A Vidipath STB or smart TV is needed to support ATSC 2.0, HbbTV, TNT 2.0 and other Xtended TV features, so Vidipath platforms are needed; if they don't have a tuner, then a USB or network tuner is needed for OTA, possibly with some of the coming Downloadable Security Schemes made possible by the FCC's elimination of the CableCARD requirement.

Network DLNA TV tuners are coming at affordable prices, and many accessories will be usable across most Vidipath platforms, since they support the same W3C standards. More features, driven by competition for our dollars and for our viewing of commercials.
 

Pokemaniac

Member
The W3C HTML5 and OpenGL standards. W3C set standards for Voice, Gesture (coming), Gamepad, Chat, Video streaming and more; Vidipath requires these W3C extensions.

This doesn't really make the PS4 UI "industry standard". It was built using industry standards, but that doesn't make it industry standard itself.

My understanding is that the final version of the PS4 OS will have most of the API-called routines running in Southbridge, in the 256MB of Southbridge memory: HTML5 <video> with media and DRM extensions for sure, plus Playready, DLNA, facial and voice recognition, gamepad, Bluetooth, video RTC routines, Java and more. Anything that needs to be always running (EAS emergency alerts, incoming calls, wake on key phrase, second screen routines, Miracast) will run in that 256 MB of memory, either for security or power reasons or because the hardware that performs the function is ARM, in the ARM SoC serving as Southbridge.

Some of those things sound like phone features that a PS4 couldn't possibly implement.

Do you have a source for this info?

So what if something takes over the x86 APU? It can't access any system resources without going through the ARM SoC, and the worst case would be a resource-intensive webpage in the x86 APU browser trying to delete or encrypt files on the hard disk, or to access files on your local network. Each of these has to go through Southbridge, and in every case a user dialog generated by Southbridge can ask if you want to allow it. Or Southbridge could simply not allow any file access by a browser program without security keys that it recognizes.

Having full control over the main CPU allows you to 1) access any data in RAM that the CPU has access to, 2) easily launch attacks on the ARM CPU's kernel, and 3) do anything that you can get the ARM CPU to allow you to do. Getting access to the hard drive could prove a challenge, but there's a pretty good chance that you could get network access pretty easily.

Also, I kind of doubt that the ARM CPU can actually draw dialogs on the screen. I've seen designs like this before, but never one where the secondary CPU can actually draw to the screen. As I said above, I'd rather like to see your sources on the ARM CPU's functionality.

This is from a Sony developer: "(WebMAF/Nuanti Webkit)". That is either naming WebMAF and Nuanti Webkit together, which would mean the PS4, Vita and PS3 are using Nuanti Webkit, or it means the Nuanti Webkit version of Mozilla's MAF. Since we know from the Webkit disclosure it's not Nuanti's own webkit but a semi-custom webkit using GTK APIs, which Nuanti's webkit port of the Mozilla Application Framework can support, but without Gecko.

The original argument was that Sony is using GTK APIs, as shown by their use of WebMAF/Nuanti Webkit, and it's also in the Sony Webkit disclosures. Either could show this, but both together prove it 200%. If you want to nit-pick, then yes, it's not a pure MAF; but MAF, as you pointed out to me, is eliminating Gecko too.

Both of these pieces of evidence are circumstantial at best. If you notice, the Mozilla Application Framework is never called WebMAF, only MAF. You assume that they're both the same thing, but don't have any real basis for doing so. The fact that that company has ported applications from Gecko to Webkit before doesn't prove anything, since that company just generally works on Webkit. GTK also doesn't prove anything, since it is a general purpose library used by lots of stuff.

Also, you seem to have interpreted my comment about Servo wrongly. That's probably my fault, because I didn't make it clear enough. If/when Servo replaces Gecko, Firefox will no longer be using the Mozilla Application Framework. It is not simply XUL that has been deprecated, but essentially the entire framework. When I said that Mozilla seems to be essentially trying to rebuild the browser from the ground up, I wasn't kidding.

You missed the point that the WebMAF app framework is C++, and it can call Java or Lua or Mono or Javascript, or just run straight C++ code. It is not as limited as you suggest. I've read that XTV apps primarily use Java but can use Javascript. Java will be used by IoT apps at the lowest power levels, so Java is a Southbridge ARM SoC app.

Using binary components in MAF is only possible through XPCOM or NPAPI. XPCOM is only supported by Gecko, and if any significant portion of your web app requires NPAPI, you probably did something wrong.

EDIT: Clarifying note: NPAPI is a standard thing which Webkit supports and is not specific to MAF.

That the Sony site says WebMAF is used for streaming apps is not proof of anything because all current apps on the PS3 and PS4 are streaming apps. There will eventually be other apps that support text editing and more.

And I'm saying that all applications can use the same WebMAF framework. For example; MAF was used to create the entire Firefox browser. MAF needed to have access to every system resource to do this right?

MAF isn't simply what was used to create Firefox. MAF, for all intents and purposes, is Firefox. Any MAF app is essentially just Firefox with a bunch of the XUL replaced.

Also, why the fsck would Sony release a text editing app for the PS4? I see no rational use case for something like that.

EDIT: One thing I forgot to mention when I originally wrote this. The key phrase from the description is "using a web browser and a video player". That suggests that it is a framework specifically for web apps, similar to the Nintendo Web Framework, but probably with a special video player component, likely that Trilithium player.
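If WebMAF really is a packaging runtime that pairs a browser with a video player, a packaged app could be little more than a manifest pointing at an entry page. A minimal Python sketch of that idea follows; the `app.json`-style field names are entirely hypothetical, invented for illustration rather than taken from any WebMAF documentation:

```python
import json

# Purely hypothetical manifest for a packaged web app; the field
# names are invented for illustration, not from any WebMAF docs.
manifest_text = json.dumps({
    "name": "ExampleStreamingApp",
    "entry": "index.html",
    "components": ["browser", "video_player"],
})

def load_app(text):
    """Validate the sketch manifest and report which runtime
    components (browser, video player) the app wants started."""
    m = json.loads(text)
    for field in ("name", "entry", "components"):
        if field not in m:
            raise ValueError("missing field: " + field)
    return m["entry"], set(m["components"])

entry, parts = load_app(manifest_text)
print(entry)                     # index.html
print("video_player" in parts)   # True
```

The shape matters more than the names: the runtime supplies the browser and player, and the app is just declarative content plus an entry point, which is why native code inside such a package would be out of place.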

Netflix will not be allowed to stream 4K unless the DRM is embedded and the platform security is robust enough to support it. Currently on the PS3, the DRM lives inside the Netflix app, and soon it won't be allowed to stream 1080P unless it's updated to use embedded resources. The PS3 won't be allowed to use DSS (Downloadable Security Scheme) or connect to a DSS platform unless it has embedded DRM. File access and playback are limited, and the PS3 can't connect to an over-the-air tuner because there is no certified embedded system DRM. Miracast isn't allowed for the same reason. Lots of things will change when Playready is embedded... maybe not as fully as I envision, but it will be a major rewrite with a full set of Vidipath features.

It is possible that Netflix may decide to update. They also might not. It is also possible that Playready will be added only as part of the Vidipath app, and not in the OS itself. You can't just assume that the new functionality will be used just because it exists.

The Playready port and the use of the PS3 as a Vidipath platform mean the PS3 is not end-of-life. End-of-life as an AAA game console, maybe, but not as a Vidipath STB and casual game console.

I didn't say that it was at the end of its life. It's probably getting pretty close, though. I'd be shocked if it was still supported 2-3 years from now. I don't think Sony really wants to keep manufacturing those Cell CPUs.

Yes: watch a first-time boot and then the reboots before and after Netflix. Time it too. Sony uses a snapshot boot kernel, which vastly speeds up boot times.

The entire OS is rebooting each time, except on return from the browser and I think video chat. That rules out a memory limitation as the reason for the reboot, and lessens the case that it's about memory fragmentation issues.

I don't have a readily available PS3 to test the speed on. Do you have any more direct evidence that this happens? It's been a little while since I've used a PS3, but I don't recall it having any of the usual signs that a reboot was going on in the background. You can often tell that this is happening by carefully watching the behavior of connected peripherals.
 
Jeff said:
My understanding is that the final version of the PS4 OS will have most of the API-called routines running in Southbridge, in the 256MB of Southbridge memory: HTML5 <video> with media and DRM extensions for sure, plus Playready, DLNA, gesture/facial and voice recognition, gamepad, Bluetooth, video RTC routines, Java and more. Anything that needs to be always running (EAS emergency alerts, incoming calls, wake on key phrase, second screen routines, Miracast) will run in that 256 MB of memory, either for security or power reasons or because the hardware that performs the function is ARM, in the ARM SoC serving as Southbridge.

Some of those things sound like phone features that a PS4 couldn't possibly implement. Do you have a source for this info?
Yup, phone features, because it's using ARM TrustZone and Xtensa processors: http://www.neogaf.com/forum/showthread.php?t=916219 There are power mode regulations in place or coming. This was mentioned at the PS4 launch as the reason for Southbridge, but only for things like network standby; nothing else was mentioned, and a few seem to think that means it only handles standby functions. PC video streaming devices and Blu-ray players have 20-some-watt language around them, but no legally binding regulations except in some countries and in California, or soon there. This is also interesting and a possible power saver.

1) HTML5 <video> with media and cryptography extensions. Playready requires that the player, codecs, metadata parsing routines, keys and cryptography be embedded/protected in a TEE. ARM TrustZone can support a TEE, and the Xtensa processor used for codecs can also be used to encrypt/decrypt and encode/decode an audio or video stream. It sits on the ARM AXI bus, which is controlled by an ARM CPU with TrustZone support.

In the XB1 and in AMD APUs and dGPUs, the TEE is an ARM SoC block inside the APU and uses TrustZone. In the PS4, Sony moved the ARM blocks out of the APU to Southbridge so they could use GDDR5 with the APU.

Voice and gesture middleware on AMD APUs is ARM code running on the Xtensa processors that are also used by the UVD (Universal Video Decoder). Middleware performs basic functions and provides APIs. Middleware can be ARM code that x86 code passes data to and accepts data from. How this is done in the PS4, over PCIe from the x86 APU to the ARM Southbridge SoC, could be custom or could use an IOMMU and a common addressing scheme. Trusted boot and fast data transfer requirements might require an IOMMU on either side of the PCIe link, which would then be an HSA design. I have no idea how this is accomplished.

If you research Xtensa processors (TrueAudio) and low-power key-phrase voice wake, which is also supported by several phones and is or will be in the XB1, it uses the Xtensa processor in a special low-power mode.

Miracast, Blu-ray, IPTV streaming, playing a Playready-encrypted movie, DLNA... everything enters the ARM SoC as an encrypted stream and exits as an HDCP 1.4 or 2.2 encrypted stream to the HDMI port. This complies with TEE requirements.
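The "encrypted in, HDCP-encrypted out" flow described above is essentially a transcryption pipeline: decrypt inside the trusted boundary, re-encrypt for the output link, and never expose plaintext outside it. Here is a toy Python sketch of that shape, using XOR with fixed keys as a stand-in for the real ciphers (AES-class content encryption, HDCP link encryption); it illustrates the data flow only and is in no way real cryptography:

```python
def xor_stream(data, key):
    # Stand-in "cipher": XOR with a repeating key. NOT real crypto;
    # it only makes the decrypt/re-encrypt structure visible.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

CONTENT_KEY = b"content-key"   # known only inside the "TEE"
LINK_KEY = b"hdcp-link-key"    # negotiated with the display

def tee_transcrypt(encrypted_chunk):
    """Decrypt inside the trusted boundary and immediately re-encrypt
    for the output link; plaintext never escapes this function."""
    plaintext = xor_stream(encrypted_chunk, CONTENT_KEY)
    return xor_stream(plaintext, LINK_KEY)

movie = b"frame-data"
incoming = xor_stream(movie, CONTENT_KEY)   # as stored or streamed
to_display = tee_transcrypt(incoming)       # what leaves the SoC
# The display, holding the link key, recovers the frames:
assert xor_stream(to_display, LINK_KEY) == movie
```

Whatever ciphers and hardware Sony actually uses, this is the property a TEE certification cares about: the content key and the decrypted frames exist only inside the protected block.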

Also, I kind of doubt that the ARM CPU can actually draw dialogs on the screen. I've seen designs like this before, but never one where the secondary CPU can actually draw to the screen. As I said above, I'd rather like to see your sources on the ARM CPU's functionality.
Define a primary CPU. The PS4 boots the Southbridge ARM SoC, and its root of trust boots the x86 APU. The PSN black-and-white icon shown from a cold boot, just before the PS4 desktop displays, is generated by the ARM SoC. I'm old enough to have worked with PCs and the Atari ST before they had GPUs, and one of the Cell's features is graphics support. The Xtensa processor package can also support UI graphics and games.

The key to everything, including HEVC support, is the Xtensa processors.


Also, you seem to have interpreted my comment about Servo wrongly. That's probably my fault, because I didn't make it clear enough. If/when Servo replaces Gecko, Firefox will no longer be using the Mozilla Application Framework. It is not simply XUL that has been deprecated, but essentially the entire framework. When I said that Mozilla seems to be essentially trying to rebuild the browser from the ground up, I wasn't kidding.
Google's Blink is agnostic as to the toolkit, just as Webkit is, but Google's libraries will likely be used and some of the framework APIs could change. Samsung's Tizen is now using Blink with Qt, and this resulted in CableLabs switching from the Qt toolkit/APIs to the GTK toolkit/APIs: they still wanted to use webkit, and the Qt branch of webkit wouldn't have as much support going forward, since Samsung had been a big contributor.

Also, why the fsck would Sony release a text editing app for the PS4? I see no rational use case for something like that.
The PS4 will not be just a game console, nor just an IoT and media hub. Edit: This is not understood... Vidipath/HTML5 requires support for network printers. That could be aimed only at platforms like PCs, except if you have read the papers by the founding members of some of the open source software from 1986. Read this link, as I think it provides an understanding of the PS3 and PS4 design and future features; but Sony dropped the ball on the PS3 security scheme, and that may impact features on the PS3.

It is possible that Netflix may decide to update. They also might not. It is also possible that Playready will be added only as part of the Vidipath app, and not in the OS itself. You can't just assume that the new functionality will be used just because it exists.
Playready requires a TEE embedded environment for higher resolutions and for any platform that can copy media over a network or stream to other platforms. 4K media requires Playready 3 and Playready ND. Bottom of the OP.

I don't have a readily available PS3 to test the speed on. Do you have any more direct evidence that this happens? It's been a little while since I've used a PS3, but I don't recall it having any of the usual signs that a reboot was going on in the background. You can often tell that this is happening by carefully watching the behavior of connected peripherals.
More direct than actually seeing it happen....I'm sure it's documented but that is not direct evidence.

It sounds like you are more informed on some of the technical terms, but you don't have the big picture, which requires reading articles from the last two years. Follow the thread below and read the blue links. Many have contributed, and it's also fun to see in hindsight posts that predicted what we see today and predict what's coming over the next few years.

Game Consoles to replace Cable boxes and the connected home starts in 2014 2015

Job posting and posts indicating the PS3 and other platforms will be using WebMAF and Playready DRM.

PS3 (likely not a production firmware) supporting RVU in 2010

Job postings "Heavy single page HTML5 media apps"

More job postings for HTML5 apps using OpenGL or similar
 
ATSC 2.0 passed candidate status early this year and can be implemented by OTA TV stations and ATSC 3.0 has reached candidate status with implementation scheduled for 2020. This is within the life of the PS4 and XB1.

ATSC 2.0 PDF. A current 1080P TV can't support this; it requires a 2013-or-later smart TV, or a USB or network TV tuner and something like a Vidipath STB = XB1, PS4 and PS3.

ATSC 3.0 will require a new tuner but the PS4 and XB1 can support HEVC and the other features of ATSC 2.0 will be supported by ATSC 3.0

HbbTV is very similar to the XTV coming with ATSC 2.0, and Vidipath platforms can support both. Sony has for years supported HbbTV in its smart TVs sold in Europe.

HbbTV services and devices at IBC 2015

Quick video for consumers showing what HbbTV can do.

HbbTV web site
 
Playready Overview for DSTAC (FCC Downloadable Security to replace the Cable Card)

Microsoft making the case that its Playready supports an end-to-end solution for video in all its forms.

What's happening: Enhanced Content (4K) requires ECP (Enhanced Content Protection) = Playready 3 and Playready ND, which use the CENC common encryption format.
ECP requirements will raise the bar for all commercial content.
HTML5 <video> apps and DASH will enable interoperable media distribution using the common encryption format (CENC).

HTML5 app and DASH media frameworks enable interoperable commercial media applications = HTML5 <video> + EME + MSE.
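For concreteness: Common Encryption (CENC, ISO/IEC 23001-7) is what lets one encrypted file serve several DRM systems. The DRM signaling rides in 'pssh' boxes inside the MP4 container, each tagged with a 16-byte SystemID identifying the DRM (Playready, Widevine, etc.). A minimal Python parser for a version-0 'pssh' box, following the published box layout; the sample box bytes are synthetic:

```python
import struct

def parse_pssh_v0(box: bytes):
    """Parse a version-0 'pssh' box per ISO/IEC 23001-7 (CENC):
    size(4) 'pssh'(4) version(1) flags(3) SystemID(16) data_size(4) data."""
    size, box_type = struct.unpack(">I4s", box[:8])
    assert box_type == b"pssh" and size == len(box)
    version = box[8]
    assert version == 0            # v1 adds a KID list; not handled here
    system_id = box[12:28].hex()   # identifies the DRM system
    (data_size,) = struct.unpack(">I", box[28:32])
    return system_id, box[32:32 + data_size]

# Synthetic example: zero flags, a made-up SystemID, 4 bytes of payload.
sample_system_id = bytes(range(16))
payload = b"DATA"
box = struct.pack(">I4sB3s16sI", 32 + len(payload), b"pssh", 0,
                  b"\x00\x00\x00", sample_system_id, len(payload)) + payload

sid, data = parse_pssh_v0(box)
print(sid[:8], data)   # 00010203 b'DATA'
```

A player walks the file's 'pssh' boxes, picks the SystemID matching a DRM it supports, and hands that box's opaque payload to the CDM via EME; the media samples themselves are encrypted once, in the common format.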


Playready ND is ahead of the Revolution presentation; a must-read on plans and pricing for Playready ND on iOS and Android.

4K Enhanced Content Protection = Playready ND in-home streaming support to iOS and Android (porting kits to OEMs, TVs, STBs, PCs, and silicon: Sony, Samsung, Nokia, Philips, HTC (maker of Android and Windows smartphones)) from Windows 10 PCs, the XB1 and ?PS4? My view is yes, 4K Playready ND streaming from the PS4 also. This is likely the reason for the Microsoft-Sony.com and Sony-Microsoft.com domain registrations by Microsoft. They will be the first to support 4K Blu-ray players with a digital bridge that streams 4K Blu-ray over the home network using Playready ND to their platforms (Android TVs and phones, as well as Microsoft Surface, with Miracast streaming to TVs).

Playready ND is ECP 4K protection for streaming between platforms in the home. There is no other source of 4K content for in-home streaming except 4K Blu-ray players and side-loaded Playready content (the Ultraviolet model).

Anyone have another source for 4K in home content that can be streamed?
 

Pokemaniac

Member
Yup, phone features, because it's using ARM TrustZone and Xtensa processors: http://www.neogaf.com/forum/showthread.php?t=916219 There are power mode regulations in place or coming. This was mentioned at the PS4 launch as the reason for Southbridge, but only for things like network standby; nothing else was mentioned, and a few seem to think that means it only handles standby functions. PC video streaming devices and Blu-ray players have 20-some-watt language around them, but no legally binding regulations except in some countries and in California, or soon there. This is also interesting and a possible power saver.

You completely missed the point of what I said. Some of those features are things that make absolutely no sense to implement on a device that is not directly connected to a phone network.

1) HTML5 <video> with media and cryptography extensions. Playready requires that the player, codecs, metadata parsing routines, keys and cryptography be embedded/protected in a TEE. ARM TrustZone can support a TEE, and the Xtensa processor used for codecs can also be used to encrypt/decrypt and encode/decode an audio or video stream. It sits on the ARM AXI bus, which is controlled by an ARM CPU with TrustZone support.

In the XB1 and in AMD APUs and dGPUs, the TEE is an ARM SoC block inside the APU and uses TrustZone. In the PS4, Sony moved the ARM blocks out of the APU to Southbridge so they could use GDDR5 with the APU.

Voice and gesture middleware on AMD APUs is ARM code running on the Xtensa processors that are also used by the UVD (Universal Video Decoder). Middleware performs basic functions and provides APIs. Middleware can be ARM code that x86 code passes data to and accepts data from. How this is done in the PS4, over PCIe from the x86 APU to the ARM Southbridge SoC, could be custom or could use an IOMMU and a common addressing scheme. Trusted boot and fast data transfer requirements might require an IOMMU on either side of the PCIe link, which would then be an HSA design. I have no idea how this is accomplished.

If you research Xtensa processors (TrueAudio) and low-power key-phrase voice wake, which is also supported by several phones and is or will be in the XB1, it uses the Xtensa processor in a special low-power mode.

Miracast, Blu-ray, IPTV streaming, playing a Playready-encrypted movie, DLNA... everything enters the ARM SoC as an encrypted stream and exits as an HDCP 1.4 or 2.2 encrypted stream to the HDMI port. This complies with TEE requirements.

Define a primary CPU. The PS4 boots the Southbridge ARM SoC, and its root of trust boots the x86 APU. The PSN black-and-white icon shown from a cold boot, just before the PS4 desktop displays, is generated by the ARM SoC. I'm old enough to have worked with PCs and the Atari ST before they had GPUs, and one of the Cell's features is graphics support. The Xtensa processor package can also support UI graphics and games.

The key to everything, including HEVC support, is the Xtensa processors.

The primary CPU is the one that application code runs on. In this case it is the x86_64 CPU. It doesn't matter that the ARM CPU is technically "in control"; it almost definitely doesn't have application code running on it. The Wii and Wii U use a somewhat similar setup, though in those cases it is definitely just an ARM CPU.

If the PS4 really is set up like you think (I have some doubts, especially since the slide you say is from a Sony presentation seems to have actually come from AMD to promote Kaveri), then the ARM CPU would likely be able to draw directly to the screen.

As for precisely what it is responsible for drawing, without more information, that is up for debate.

Google's Blink is toolkit-agnostic, just as WebKit is, but Google's libraries will likely be used and some of the framework APIs could change. Samsung's Tizen now uses Blink with Qt, and this led CableLabs to switch from the Qt toolkit/APIs to the GTK toolkit/APIs: they still wanted to use WebKit, and the Qt branch wouldn't have as much WebKit support since Samsung was a big contributor.

What does any of this have to do with what I said?

The PS4 will not be just a game console; it will also be an IoT and media hub. Edit: This is not understood... Vidipath/HTML5 requires support for network printers. This could just be for platforms like PCs, unless you have read the papers by the founding members of some of the open-source software from 1986. Read this link, as I think it provides an understanding of the PS3 and PS4 design and future features; but Sony dropped the ball on the PS3 security scheme, and that may impact features on the PS3.

Just FYI, all the links in that post are dead, so I'm going purely off the text of the post.

None of what is in that post makes any sort of argument for why a game console would get any sort of text editing app. Printer support does not imply text editing. There are plenty of things that you can print just with a web browser alone.

Playready requires a TEE embedded environment for higher resolutions and for any platform that can copy media over a network or stream to other platforms. 4K media requires Playready 3 and Playready ND. See the bottom of the OP.

I wasn't aware of that. That makes it pretty likely that PlayReady will be implemented at the OS level. However, that still just brings us back to the point that just because something is there doesn't imply that it will be used.

More direct than actually seeing it happen? I'm sure it's documented, but documentation is not as direct as evidence you can observe.

Looks can be deceiving. Just because the system looks like it is restarting doesn't mean that it is. If this really is documented, why don't you try to find some of this documentation?

It sounds like you are better informed on some of the technical terms but don't have the big picture, which requires reading articles from the last two years. Follow the thread below and read the blue links. Many have contributed, and it's also fun to see in hindsight posts that predicted what we see today and predict what's coming in the next few years.

Game consoles to replace cable boxes, and the connected home starts in 2014-2015

Job posting and posts indicating the PS3 and other platforms will be using WebMAF and Playready DRM.

PS3 (likely not a production firmware) supporting RVU in 2010

Job postings "Heavy single page HTML5 media apps"

More job postings for HTML5 apps using OpenGL or similar

Look, I'm not particularly interested in cable boxes in general. I find the whole cable model to be somewhat archaic. I just came in here because your posts set off a number of red flags in my head indicating that you may not fully understand what you're talking about. I think that this conversation has proved that correct. You clearly aren't clueless, but you generally seem to only have a fairly high level understanding of software. This has led you to make some bad conclusions.
 
You completely missed the point of what I said. Some of those features are things that make absolutely no sense to implement on a device that is not directly connected to a phone network.
Perhaps I can explain if you tell me which features you are talking about. The leaked Xbox 720 roadmap had several phone features for the Fortaleza glasses that are not in the XB1. The Yukon design had two GPUs: one for performance/games and one for the UI, supporting the low-power modes needed by XTV. The design has changed, but the roadmap is still in place, just delayed.

The primary CPU is the one that application code runs on. In this case it is the x86_64 CPU. It doesn't matter that the ARM CPU is technically "in control"; it almost definitely doesn't have application code running on it. The Wii and Wii U use a somewhat similar setup, though in those cases it is definitely just an ARM CPU.
Some applications will run entirely on the ARM SoC as Southbridge, some will run on both x86 and ARM, and none can run on only the x86 APU. To my mind this makes the primary CPU the ARM. If it were only a game console you would be correct, but it's more.

If the PS4 really is set up like you think (I have some doubts, especially since your slide supposedly from a Sony presentation seems to have actually come from AMD to promote Kaveri), then the ARM CPU would likely be able to draw directly to the screen.
The slides are from a Sony presentation on TrueAudio, borrowed from AMD and Tensilica. The block on the right is the Tensilica DPU, while the block on the left is the x86 CPU/GPU. The PS4 supports TrueAudio using an Xtensa processor, and if Xtensa processors are used for audio, they will have to be used for video codecs as well, or the codec libraries from Tensilica are useless. Real-time HEVC decoding takes four phone CPUs or a small GPU, while real-time HEVC encoding requires vastly more processing power. The XB1 supports HEVC encoding and decoding at the same time using Xtensa processors.

The AXI bus and Xtensa processors internally support Network on Chip (NoC). Modern PowerPC and ARMv8 server chips support NoC and are called Oban designs. There was a 2011 Oban design for the Xbox 360 that made it to tapeout and was canceled.

From Tensilica Xtensa literature:

"Tensilica DPUs can provide the 2D and 3D graphics support required to drive the TV menu system and play games." = second smaller GPU that both Microsoft and Sony had patents for and was mentioned as needed in the leaked Xbox 720 powerpoint and by Sony in letters to Energy Star and EU on Game Console power modes.

Clearly DPUs can also support software video codecs, Playready and WMDRM10 (DTCP-IP), as well as HDCP 2.x.

As for precisely what it is responsible for drawing, without more information, that is up for debate.
The video frame buffer and video player have to be in the Southbridge to support the TEE required by Playready. Unencrypted video cannot be allowed outside the TEE: it is drawn into the video buffer and then encrypted before leaving the Southbridge for the custom HDMI chip. I fully understand I am using the wrong term, "drawn," for effect. Full-screen video doesn't require a GPU, and the PS3 XMB shows that a GPU is not needed for a UI. The Xtensa processor is used in Blu-ray players for the UI and can even play games, which speaks to its power for the simple low-power GPU features needed by XTV.

TEE-level DRM for online transactions requires that the trust icons displayed on-screen to the customer be generated inside the TEE.

Just FYI, all the links in that post are dead, so I'm going purely off the text of the post.

None of what is in that post makes any sort of argument for why a game console would get any sort of text editing app. Printer support does not imply text editing. There are plenty of things that you can print just with a web browser alone.
The post is about a living-room STB that does it all. Text editing is just an example.

Every cloud app now used by tablets and phones, including Microsoft Word and Excel, can be used on an XB1, and something like it on the PS4. Remote Desktop from Windows 10 to the PS4 and XB1 will be implemented as a browser app.

I wasn't aware of that. That makes it pretty likely that PlayReady will be implemented at the OS level. However, that still just brings us back to the point that just because something is there doesn't imply that it will be used.
Embedded DRM is not case by case. Every media file has to be parsed by Playready for flags and metadata. Antenna TV is unencrypted but carries DRM flags governing how it can be used. If the platform does not honor the flags, it cannot be allowed to receive antenna TV media, and/or it cannot save or move any media off the platform. Once a secure Playready implementation is on the PS3, it can have a TV tuner in the US. This has been the holdup in the US, not seen in some European countries where the PS3 is allowed to connect to a TV tuner.
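The flag-honoring rule described above can be sketched roughly as follows. The flag names and the policy mapping are purely illustrative (my invention for this sketch), not an actual Playready API:

```python
# Hypothetical sketch of the rule above: a platform must honor broadcast DRM
# flags before it can be trusted with a tuner. Flag names are made up here.

def allowed_actions(flags):
    """Map content usage flags to what the platform may do with the media."""
    actions = {"view": True, "record": True, "export": True}
    if flags.get("copy_never"):
        actions["record"] = False
        actions["export"] = False
    elif flags.get("copy_once"):
        actions["export"] = False  # one local copy, but it can't leave the box
    return actions

# Unencrypted antenna TV still carries usage flags the platform must respect:
print(allowed_actions({"copy_once": True}))
# → {'view': True, 'record': True, 'export': False}
```

The point of the sketch is only that flag parsing happens for every file, encrypted or not, before any record/move action is permitted.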

Look, I'm not particularly interested in cable boxes in general. I find the whole cable model to be somewhat archaic. I just came in here because your posts set off a number of red flags in my head indicating that you may not fully understand what you're talking about. I think that this conversation has proved that correct. You clearly aren't clueless, but you generally seem to only have a fairly high level understanding of software. This has led you to make some bad conclusions.
Nope. Even with your technical corrections, my understanding of the PS4 remains the same, while I find your knowledge base has significant holes that impact your understanding of the PS4 and PS3 OS and feature implementation. The last two points about DRM above are examples.

If you know how things work you can see what others miss => http://www.neogaf.com/forum/showthread.php?t=1086342 about the PS4 supporting HDCP 2.2.

Edit: I read through the posts on this page and I think I see where you and I lost sync. The PS3 and PS4 are not PCs; they are embedded platforms, and that allows snapshot booting and GNOME Mobile's zero-copy technique, which originally targeted handhelds. You don't understand what zero copy means, and you make assumptions based on how a PC works. A PC loads a program from hard disk, or copies it from cache, into a program area in memory where CPU register access is much faster, then starts executing it. A zero-copy embedded platform copies the OS programs into memory once at boot and afterwards only moves a CPU pointer to the start of routines that are always resident in memory. This zero-copy technique lends itself to a browser-desktop scheme where multiple native-language programs are loaded in memory and reused without needing to copy them to a program area: just the CPU register is changed to point to the start of each of the programs providing the APIs for the browser desktop. The programs never move, so they don't fragment memory.
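The zero-copy dispatch idea above can be sketched as a resident routine table. This is my reading of the scheme, not Sony code; the routine names are made up:

```python
# Illustrative sketch: routines become resident once at "boot", and launching
# an app is just redirecting a lookup to an entry point that never moves.

def video_player():          # hypothetical resident routine
    return "video player running"

def web_browser():           # hypothetical resident routine
    return "browser running"

# Built once at boot; the routines are never copied or relocated afterwards.
RESIDENT = {"video": video_player, "browser": web_browser}

def launch(name):
    """'Launching' = changing which entry point is jumped to; zero copies."""
    return RESIDENT[name]()

print(launch("video"))    # → video player running
```

Because nothing is ever copied or relocated, there is no load step and no memory fragmentation, which is the property the post is describing.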

Snapshot boot can only work with an embedded platform where there are no hardware changes. A "picture" is taken of what's in memory and in all the registers, and saved to disk. A subsequent snapshot boot just moves everything back into memory and the registers, and control resumes where it left off when the "picture" was taken. With modern memory and hardware (CPUs and GPUs now support registers that can be put into "hibernation"), the GDDR5 memory automatically goes into self-refresh when the APU is turned off.
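The save/restore cycle described above can be illustrated with ordinary serialization. The state fields here are invented for the sketch; a real snapshot boot works on raw memory and hardware registers, not a Python dict:

```python
import os
import pickle
import tempfile

# Illustrative sketch of snapshot boot: capture the whole machine state once,
# then "boot" by restoring it and resuming where it left off.
state = {"pc": 0x4000, "registers": [0, 1, 2, 3], "framebuffer": b"\x00" * 32}

snap_path = os.path.join(tempfile.gettempdir(), "snapshot.bin")
with open(snap_path, "wb") as f:    # take the "picture" of memory/registers
    pickle.dump(state, f)

with open(snap_path, "rb") as f:    # a later boot: reload the picture
    restored = pickle.load(f)

# Control can now resume exactly at the saved program counter.
print(restored["pc"] == state["pc"])    # → True
```

This only works when the restored image is guaranteed to match the hardware, which is why the post limits it to fixed embedded platforms.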

Playready requires the platform owner to monitor the OS to ensure the DRM is not being circumvented. It's much easier to do this when zero-copy techniques are used.

From a recent VidiPath PDF flyer:

• Approximately 20 members are preparing for certification.
• Service providers' services have launched.
• Providers publicly support VidiPath: Comcast, Time Warner, Cox.
• The program draws on expertise from more than a decade of proven interoperability testing on 25,000+ device models, enabling more than 4 billion devices to share personal content.
• A solution for streaming 4K/Ultra-HD TV profiles will be included in the DLNA guidelines by Q3 2015, enabling users to view content on multiple devices in the home with at least four times the resolution of today's typical full-HD 1080p formats.
• Reference designs are available to speed time to market for certified products.
• Test tools are also available to members for product development, to streamline troubleshooting.
• Products can be submitted to any of five test labs (ICVs) worldwide.
 
• A solution for streaming 4K/Ultra-HD TV profiles will be included in the DLNA guidelines by Q3 2015,

Blu-ray 1080p (2006) was supposed to be a few years in advance of ATSC 2.0, which uses the same codec to support 1080p, S3D, NRT and XTV (Java and JavaScript). The PS3 was designed to support ATSC 2.0 (except for a USB or network tuner). The PS4's UHD Blu-ray is supposed to be a few years in advance of ATSC 3.0, which will use the same HEVC codec and, for XTV, the same JavaScript and Java. It just requires a USB or network tuner.

ATSC 2.0 has been delayed, and it looks like ATSC 3.0 will be released early. Korea wants to broadcast the Winter Olympics in February 2018 in UHD using ATSC 3.0. There are several additional factors that might make ATSC 3.0 come early.

1) Phone TV tuners. Phones use the same modulation scheme that ATSC 3.0 will use, and the same UHF frequencies TVs used to use. With minor changes and firmware updates, a phone could support ATSC 3.0 Mobile at very little additional cost. The FCC may require phones to support ATSC 3.0 Mobile TV for emergency alerts. TVs will be required to be AOAC and turn on with an emergency alert (both the PS4 and XB1 support this AOAC network standby).

2) The FCC wants to auction off additional TV spectrum. In some markets this means not enough channels are available. ATSC 2.0 and 3.0 use H.264 and H.265 (HEVC) respectively, which allow two to four times as many channels in the same bandwidth.
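A quick back-of-the-envelope check of the "two to four times as many channels" claim. The per-program bitrates below are ballpark assumptions for HD broadcast, not figures from any specification:

```python
# Rough illustration of why newer codecs multiply channel capacity.
MUX_MBPS = 19.39  # payload of one 6 MHz ATSC 1.0 broadcast channel (8-VSB)
HD_BITRATE_MBPS = {"MPEG-2": 12.0, "H.264": 6.0, "HEVC": 3.0}  # assumed rates

programs = {codec: int(MUX_MBPS // rate) for codec, rate in HD_BITRATE_MBPS.items()}
print(programs)   # → {'MPEG-2': 1, 'H.264': 3, 'HEVC': 6}
```

Under these assumed rates, one multiplex that carries a single MPEG-2 HD program fits roughly three H.264 or six HEVC programs, consistent with the 2x-4x (or better) figure in the post.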

3) Requiring the consumer to buy an ATSC 2.0 STB and then, two to three years later, an ATSC 3.0 STB and possibly a new antenna does not make sense. Most of the Vidipath STBs being released in 2016 will support HEVC, so with a network or USB tuner they can support both ATSC 2.0 and 3.0.

4) ATSC 3.0's primary short-term use will be to support Mobile TV and 1080p channels. I suspect that sometime in 2016, DLNA network tuners will go on sale that support both ATSC 1.0/2.0 and 3.0. Current UHD TVs will require the same network tuner, and 1080p smart TVs will require a DLNA server that transcodes to 1080p. Dumb 1080p TVs will additionally require a Vidipath STB with HEVC support.

The PS4 and XB1 have multi-stream codecs and are designed to be the HD and UHD DVR/DLNA server, transcoding from HEVC to 1080p (media hub), as well as the Vidipath client for 1080p and UHD TVs. All they need is a USB or network tuner.

W3C extensions to HTML5 include USB and network tuner control, based on the Hauppauge USB tuners and the SiliconDust HDHomeRun. The HDHomeRun Prime is a DLNA tuner that serves an RUI to a Vidipath STB. No tuners currently support ATSC 3.0.

OK, that's the background as I understand it. The problem is chicken-and-egg: getting people to buy Vidipath STBs, whether for cable or antenna TV, so that the majority of the cost of moving to ATSC 3.0 features is pain-free. Most UHD Blu-ray players will be connected to the home network and will have an HTML5 browser. If they support the UHD Blu-ray digital bridge, they will likely support Playready and are de facto Vidipath servers and clients. All they need is a network tuner to support TV. This, I understand, is the plan for the PS4 and XB1.

Edit: ATSC 3.0 UHD TV is also going to support HDR and, by all accounts, can support everything coming for UHD Blu-ray, and vice versa. UHD Blu-ray is not just for movies: 3D and multi-view for augmented-reality tours of museums and the like, and multi-view for live sporting events via UHD TV, are coming. Including the audio, there are 140 planned features for UHD TV. http://www.audioholics.com/hdtv-formats/atsc-3.0

ATSC 1.0 supports 480i to 1080i.
HD Blu-ray was released in 2006, supporting 1080p.
IPTV streaming arrived around 2010, along with Blu-ray S3D.
ATSC 2.0 was supposed to release in 2013, 2014, 2015, then 2016, supporting 1080p, S3D (using the Blu-ray codec), non-real-time transmission (NRT) and XTV.
UHD streaming around 2015.
UHD Blu-ray in 2016.

2013 (PS4 and XB1) to 2015: other Vidipath- and HEVC-capable STBs enter the market, but with no firmware support.
2016: the FCC no longer requires a CableCARD for cable TV, and DSS (Downloadable Security Scheme) starts.
2016: Vidipath STBs with HEVC codecs get firmware updates.
2016: the PS4 and XB1 get firmware updates to support UHD Blu-ray.

2017: UHD (ATSC 3.0) TVs, with broadcasts starting early 2018 in time for the February Winter Olympics in Korea, supporting NRT, 1080p, S3D, UHD (using Blu-ray's HEVC), XTV and 140 planned protocols/features, and extensible.

Notice the short time between the UHD Blu-ray release and the release of UHD TV (ATSC 3.0). I think the time to broadcasting ATSC 3.0 will be equally short. Consumers in Korea need lead time to get ATSC 3.0 tuners so they can receive the UHD broadcast in February 2018, so I expect the tuners to reach market in 2016 (the signaling and carrier are already at candidate status and can already be used for a tuner), with UHD TVs carrying the new tuner by 2017.

The CE industry will need must-have features to get consumers to buy UHD TVs and Blu-ray players, and consumers will need education and advertising.
 
Updates to the FCC DSTAC recommendations and DLNA (Vidipath)

At this time the DSTAC has two recommendations; both would use a Downloadable Security Scheme rather than a CableCARD:

1) A Vidipath app written by the cable company, based on HTML5.
2) A cable TV interface standard based on HTML5 that exposes features so that CE manufacturers can write their own apps. Many liken this to the 2010 AllVid proposal that failed to get traction with cable companies.

CE manufacturers and TiVo are on the AllVid (2) side; the cable industry is on the other (1) side.

The DLNA org just announced

Smart Home devices.

DLNA is currently working on three initial Smart Home use cases:

1) The gym extender use case consists of presenting exercise information on the user’s television rather than on a dedicated screen on the bike (or other exercise devices).
2) In the doorbell use case, someone rings the doorbell and the surveillance camera output is displayed on the living room TV.
3) In the dimming use case, a user starts watching a movie and upon pressing play, the light dims. Later during the playback, the light intensity will adjust to the movie scenes.

VidiPath Cloud Content Addition

DLNA has extended the VidiPath guidelines to enable service providers to deliver Ultra-HD content directly from the cloud to consumer electronics inside and outside the home. The new features include HTML5 cloud caching, the 4K HEVC media format, and an MPEG-DASH media format update for TV services.

DLNA 4.0 Certification Available in 2016 – DLNA Certified 4.0

DLNA 4.0 promises to give consumers an outstanding experience with the latest and most feature-rich DLNA Guidelines. Companies taking advantage of this highest version can differentiate from competitors by using the DLNA 4.0 brand. The bar for DLNA 4.0 was set high and includes the following mandatory features:

Significant changes to improve interoperability and guarantee playback
Mandatory DMR (DLNA Renderer) support for DMP (DLNA Push from Phones to Renderers either from the Phone or a DLNA Server)
MPEG-2 & AVC (extend formats to HD; from 480P, 720P, 1080i to 1080P)
AVC in MP4 containers
Networked Devices Power Save
Diagnostics
IPv6
Of the above, only 1080p and IPv6 are new. 1080p may require a new DTCP-IP link protection, which might be Playready ND. IPv6 is supported by most newer OSs, and most have had dual-stack support for both IPv4 and IPv6 for years (since Vista, in Windows' case).

There is a good thread on NeoGAF about this, "IPv6, PSN and You: Clearing up the Confusion," which cites a PlayStation blog. The benefits of IPv6 are:

1) Lower latency
2) Simpler setup eliminating the need for port forwarding
3) Multiple PlayStation consoles can sit behind one router and support multiplayer sessions of the same game.
4) Easier multicast, which can be used by PlayStation Vue or for serving events to multiple people.

Your ISP, Sony and the game must all support this. Comcast currently supports IPv6, and looking at the devices on my home network, all of them support IPv6, including the PS4.
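If you want to check the dual-stack support mentioned above on your own machine, Python's standard library exposes it directly. Results depend on your OS and network configuration, so no particular output is guaranteed:

```python
import socket

# Quick dual-stack check: is IPv6 compiled in, and which address families
# does the resolver offer for a host?
print("IPv6 support compiled in:", socket.has_ipv6)

def address_families(host):
    """Set of address families (AF_INET/AF_INET6) the resolver offers."""
    return {info[0] for info in socket.getaddrinfo(host, 80, proto=socket.IPPROTO_TCP)}

fams = address_families("localhost")
print("IPv4 available:", socket.AF_INET in fams)
print("IPv6 available:", socket.AF_INET6 in fams)
```

Seeing both families for your router or console's hostname is the "dual stack" the DLNA 4.0 IPv6 requirement refers to.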

Edit: Streaming 1080p requires Playready ND; DLNA 2+ is streaming media, i.e. Vidipath. So DLNA 4 adds Playready ND while DLNA 3 provides Playready 3. Between the two you have UHD Blu-ray (Playready 3 with DLNA 3) and the Blu-ray digital bridge with Playready ND (DLNA 4).
 
DLNA is currently working on three initial Smart Home use cases:

1) The gym extender use case consists of presenting exercise information on the user’s television rather than on a dedicated screen on the bike (or other exercise devices).
2) In the doorbell use case, someone rings the doorbell and the surveillance camera output is displayed on the living room TV.
3) In the dimming use case, a user starts watching a movie and upon pressing play, the light dims. Later during the playback, the light intensity will adjust to the movie scenes.

Man, I love me some smart home functions. Dimming light during movies would be nice to have lol.
 
Microsoft and Sony have chosen Rovi (formerly Macrovision, which purchased TV Guide/Gemstar) to supply metadata for movies and music on their connected platforms, including the game consoles. Sony recently signed a multi-year contract renewing the agreement. Rovi has some 5,000 patents in this field and a service called Rovi Video, with metadata on movies and TV programs in multiple languages, serving 70 countries.

DVR support will require a TV guide using Rovi patents, and we can expect that Sony and Microsoft will charge for this service on both antenna and cable TV. As I mentioned before, and as Google TV exposed in 2010, search routines that use metadata and return cover art and synopses, as well as when and where a program can be found and at what cost and resolution, are going to be a big part of the UI. I expect those to be bundled with the TV guide, if only to give Microsoft and Sony more ways to make money.

Plex DLNA is supported by all the game consoles (Xbox 360, XB1, PS3, PS4), and its primary feature over other DLNA servers is metadata. It's being used to stream ripped copies of DVD and Blu-ray movies, and it provides methods to get metadata from five open-source internet agents. It's illegal in the US, but legal in some EU countries, to rip movies from DVD or Blu-ray for private use. One company has set a legal precedent in the US and can store DVD and Blu-ray discs locked in a digital vault, with a second copy on a hard disk for viewing on a TV; that system is some $3,000.

Some thought must be given to why Microsoft and Sony allow support for what they know is piracy of commercial media. A large percentage of the piracy in the US is legal everywhere else, and, truth be told, I think they understand most people will want the convenience of local copies on hard disk and DLNA streaming. The 300-disc DVD servers sold years ago were $800 and out of most people's price range; Blu-ray disc servers with local copies on hard disk, as I mentioned above, are $3,000. I believe Sony and Microsoft want to take advantage of consumer habits and will support a UHD digital bridge (which supports HD discs too, if content owners allow). UHD discs will have a serial number and can be licensed to a domain; HD discs don't, or didn't, have this ability.

Streaming DRM as it applies to DVD, Blu-ray HD and UHD

Microsoft's Playready ND supports streaming 1080p to 4K content across the home network and rates both media resolution and DRM schemes in hardware to ensure they meet content owners' requirements. Media at DVD resolution is not considered worth protecting by Playready ND, so WMDRM10 must be used. When WMDRM10 is no longer used, 480p and lower resolutions (DVD) will likely no longer be DRM-protected, which I think is coming in a few years. Some Blu-ray movies come with an unprotected DVD-resolution digital copy, provided for DLNA streaming or copying to phones and tablets.

The UHD Blu-ray digital bridge "proposal" is not new. Something like it was proposed in 2010 for HD Blu-ray, but the ecosystem allowing protected streaming from platform to platform wasn't ready. Playready ND and the Vidipath/Playready ecosystem will start with PCs, the XB1 and the PS4 as DVRs and media hubs serving handhelds, tablets and smart TVs connected to the home network. When that is in place, and media content creators can be convinced it will increase sales of Blu-ray discs, a digital bridge allowing local copies on hard disk that can be safely DRM-streamed to other platforms will be implemented.

In EU power-board papers years ago there was a reference to a game console (assume a PS3) controlling a Blu-ray player (it could be an XB1 or PS4, but one that complies with the 27-watt requirement) that streamed Blu-ray over the home network. The issue was the power that would be used (80 watts for a PS3, 120 watts for a PS4) while watching a Blu-ray movie, when EU power requirements for Blu-ray players were 21-27 watts. The EU and Energy Star wanted low-power full-screen IPTV support in game consoles when this feature is supported, and they want low-power IPTV for Vidipath too. This is why the XB1 supports power islands and the PS4 can have the APU off and the GDDR5 in self-refresh while full-screen IPTV or Blu-ray uses only the Southbridge TEE. The same goes for DVR, key-phrase voice turn-on and IoT support.

The speculation on the above timing is based on this logic: Playready ND can reduce the resolution of media to allow it to be streamed to a platform that has less DRM protection, but in the UHD Blu-ray digital bridge proposals, Playready ND is not allowed to do this for HD and UHD Blu-ray. It's unclear, but I think UHD can be converted to HD but not lower, and HD cannot be reduced in resolution at all. This makes sense, as Playready ND won't likely be supported by platforms that can only handle less than 1080p (480p-1080i means WMDRM10).
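My reading of those rules can be sketched as a small policy function. The function name and the policy itself are hypothetical (they restate the post's interpretation), not an actual Playready ND interface:

```python
# Hypothetical sketch of the streaming rules as read above: ND may downscale
# ordinary content to fit a sink's protection level, but HD/UHD disc content
# may never drop below 1080p.
RES_ORDER = ["480p", "720p", "1080i", "1080p", "2160p"]

def nd_stream_resolution(source_res, is_bluray, sink_max_res):
    """Resolution ND would allow toward a given sink, or None (refuse)."""
    if is_bluray and RES_ORDER.index(sink_max_res) < RES_ORDER.index("1080p"):
        return None  # disc content may not be reduced below 1080p
    return min(source_res, sink_max_res, key=RES_ORDER.index)

print(nd_stream_resolution("2160p", False, "720p"))   # → 720p  (downscaled)
print(nd_stream_resolution("2160p", True, "1080p"))   # → 1080p (UHD→HD allowed)
print(nd_stream_resolution("2160p", True, "720p"))    # → None  (refused)
```

Sinks below 1080p would instead fall under WMDRM10, which is why the refusal branch exists at all in this reading.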

Playready ND requires an internet connection at least once every 48 hours, and the UHD digital bridge requires an internet connection to authorize a rip from UHD disc to hard disk. It was iffy whether the first play of a UHD disc would require an internet connection; the latest articles say no.
 
HDD recording on Sony's 2015 Android TVs

Sony launched its 2015 TVs based on the Android TV operating system this summer. Back then, the company promised to enable USB HDD recording and PIP/PAP for Europeans through a later software update. This is the second time Sony has delayed the update. It is now expected to be available in February/March.

What makes this possible?
1) Skype and emergency alerts require always-on standby modes, where the TV is tuned to a channel waiting on an emergency alert and, for Skype, holds a network connection. It's a simple matter to also support DVR.
2) Playready and Vidipath support properly handles flags and DRM.
3) Sony will have to provide program guides, which it will charge a fee for; this is likely also part of Vidipath.
4) The US FCC no longer requires a CableCARD and allows downloadable security, which opens up the US to DVRs in TVs, the PS4, the XB1 and PCs with TEEs (Trusted Execution Environments), plus, in the short term, network or USB tuners.

What could cause delays or different implementation dates:

1) servers providing guides are regional
2) different DRM requirements and agreements with content providers.
 
ATSC 2.0 candidate status started in February 2014; the vote on final release status is April 2016.


Timeline:

October 2015: Microsoft announces a 1080p DVR for antenna TV (1080p will be available for the first time with ATSC 2.0).

January 2016: the FCC no longer requires a CableCARD in cable STBs, and once it requires cable companies to support a method for a Downloadable Security Scheme, CE platforms can implement it. DVRs in CE platforms can then be implemented in the US (see the post above this one).

Is ATSC 2.0 officially authorized, or are they waiting on other standards, like closed captions and subtitles in DASH streams, to reach candidate status (December 23, 2015)? As seen below, it builds on W3C standards already in use today by various "media delivery silos," including broadcaster OTT delivery.

3.6 XML Schema and Namespace
The schema is available at W3C and the namespace is defined there. There are currently no ATSC-defined namespaces or schemas.

4. SYSTEM OVERVIEW
4.1 Features
The technology is SMPTE Timed Text (SMPTE-TT) as defined in SMPTE 2052-1 [9]. SMPTE-TT was chosen as it:
• Supports world-wide language and symbol tables (specifically including non-Latin)
• Supports world-wide image glyph delivery
• Is in use today by various "media delivery silos", including broadcaster OTT delivery
• Is the US FCC closed-caption safe harbor for IP-delivered content
• Supports FCC requirements for both 708 and IP captions
• Is compatible with DECE (UltraViolet) Common File Format Timed Text (CFF-TT) at [13]
All of SMPTE-TT is complex and not required to meet closed-caption and subtitle requirements. A simpler subset is desirable for practical implementation. Therefore, W3C's new "TTML Text and Image Profiles for Internet Media Subtitles and Captions (IMSC1)" [2] is selected, having been designed specifically for needs like broadcast as well as broadband delivery.
In summary:
• Superset of DECE/UltraViolet CFF-TT (TTML + SMPTE-TT extensions)
• Two profiles are included:
  o Text Profile, requiring a font rendering engine in the decoder
  o Image Profile, with PNG files



A/107 - ATSC 2.0 Standard

New features and their approved ATSC standards include:

Non-Real-Time (NRT) Services (A/103)
Interactive Services (A/105)
Advanced Codecs - AVC (A/72), Dolby E-AC-3 (A/53-6 & A/52), MPEG HE-AAC v2, and DTS-HD
Access Control & DRM (A/106)
Complementing our ATSC 2.0 activities, TG1 developed and published specifications for 3D video coding,
 
http://www.multichannel.com/news/fcc/divided-fcc-votes-unlock-set-tops/402680 said:
Federal Communications Commission, though even a supporter had reservations, voted Thursday (Feb. 18) to approve chairman Tom Wheeler's proposal for "unlocking" the cable set top and letting third parties access the content and programming and integrate it into their own navigation devices and online video lineups.

If the chairman's proposal is approved in a final order, cable operators and other multichannel video programming distributors (MVPDs) will have to provide that information for all of their video services to any third-party device or app conforming to technical specs set by a standards body.

That final order likely won't be voted on until spring at the earliest, and there are still standards to be hammered out. But the notice of proposed rulemaking (NPRM) says MVPDs won't have to implement the regulations until two years after they are adopted, so, as Pai pointed out, consumers won't see the change for almost three years at a minimum. That two years is for the development of standards, said FCC Media Bureau Chief Bill Lake.
Vidipath and DVR for antenna TV can be implemented now. Individual cable companies can authorize STBs now, as AT&T authorized a Roku 3 as a cable TV STB with the app written by AT&T. The FCC ruling is to force cable companies to comply with a standard that replaces the cable box with a more open standard than they want to support.
 

Vidipath and DVR for antenna TV can be implemented now. Individual cable companies can authorize STBs now, as AT&T authorized a Roku 3 as a cable TV STB with the app written by AT&T. The FCC ruling is to force cable companies to comply with a standard that replaces the cable box with a more open standard than they want to support.

The article states that cable providers don't have to comply with the ruling for 2 years (in order to set standards and such), so I'm not expecting anything to happen on this front for at least 2/3 years.
 
The article states that cable providers don't have to comply with the ruling for 2 years (in order to set standards and such), so I'm not expecting anything to happen on this front for at least 2/3 years.
Yes, the FCC can't force Cable Companies for 2 years or so but Cable Companies can implement what they wanted, apps written by them, for select platforms that comply with a standard like Vidipath. Go back to page 1 and read about Comcast using Passage and Sony sending a representative to educate Cable Comanies on Standards and Vidipath.
Comcast is one of the companies that is not ready for all-IPTV; it plans to start implementing it in 2017 to be ready by 2020 or earlier for UHD TV. It hopes its infrastructure will be gigabit-ready for DOCSIS 3.1 by the end of 2017. Consumers switch from DOCSIS 3.0 to DOCSIS 3.1 modems during 2017-2018, then all-IPTV is phased in.

There are several issues:

1) Most of the cable companies are not ready for all-IPTV, so a Vidipath gateway is still needed for the next 2+ years.
2) Some cable companies are ready for all-IPTV, but the number of STBs they own is limited unless they use already-existing boxes in consumer hands like the PS4, XB1 and Roku 3. They write an app for the box, essentially complying with the proposal they wanted but didn't get.
3) Knowing that a modern TV will not need a cable box, and that PCs, the XB1 and the PS4 can be cable TV DVRs, will limit investment in cable-company-owned cable boxes.
4) Antenna TV for ATSC 2 and ATSC 3 needs a network tuner that converts RF to IPTV, then a STB identical to what will be used for cable TV on older TVs. Existing 4K TVs do not have a tuner for ATSC 3, so they also need a network tuner but may not need a STB.
5) PCs, the XB1 and the PS4 can be DVRs for antenna TV if they follow Vidipath standards.

NOTICE OF PROPOSED RULEMAKING AND MEMORANDUM OPINION AND ORDER
Adopted: February 18, 2016
 
There are no Vidipath-certified client platforms listed here: http://vidipath.com/product-search/ (choose Device Classes under products). There is now a server: the Myriad CVP-2 Server for the PC platform. In the certification it's listed as an Endpoint (client?) and Server; it probably requires a CableCARD but can legally use a DSS if the cable company allows. There is no definitive post on this product yet.

Something like this might be possible for the PS4 and XB1 with a Network Tuner provided a DSS is allowed.


1) Plans are for VR devices to display 3D TV.
2) UHD Blu-ray and digital bridge proposed by Sony-Panasonic-Fox uses the same standards as UHD TV and gets the industry ready for 4K TV.
2a) Skipping over DLNA 2, which should have launched in 2014 and is now outdated, Vidipath DLNA 3 uses the same standards as the digital bridge and UHD IPTV.
2b) DLNA 4 launches sometime in 2016 and is required for 1080P and IPv6 ** Sony may be waiting for this **
3) UHD TV is launching in 2016 (network tuners needed for older 4K TVs?)
4) Korea plans to have UHD broadcasting Feb 2017 and to broadcast the Olympics Feb 2018
5) Likely the FCC will accelerate the adoption of UHD TV (ATSC 3)
6) Cell phones are likely going to be mandated to support ATSC 3 for Emergency alerts.
7) We are likely to get Antenna TV gateway/DVR boxes to support ATSC 3 in the home. (Like NASNE)
8) 2018 Cable TV can have DSS so no cable box needed and the mandated FCC/DSTAC STB for Cable TV is a reality.

The worldwide 4K TV standard that the US calls ATSC 3 has the main screen as a web page using the C-ENC format for HTML5 <video> MSE/EME delivery (pages 6 and 18). ATSC 2 just uses a web page for XTV, which requires an overlay. Both ATSC 2 and 3 have DRM beyond the copy/no-copy flag, which is a first for broadcast TV. Commercial launch for ATSC 3 is 2016 (page 34).
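For the curious: the C-ENC (Common Encryption, ISO/IEC 23001-7) format mentioned above carries DRM initialization data in a small ISO-BMFF "pssh" box that any conforming player can parse, regardless of which DRM system (Playready, etc.) is in use. A minimal sketch of the version-0 box layout, using the publicly documented PlayReady system ID; the init-data payload here is a placeholder, not real license data:

```python
import struct
import uuid

# PlayReady's DRM system ID is a public constant registered with the
# CENC ecosystem; other DRMs (e.g. Widevine) have their own IDs.
PLAYREADY_SYSTEM_ID = uuid.UUID("9a04f079-9840-4286-ab92-e65be0885f95")

def build_pssh(system_id: uuid.UUID, init_data: bytes) -> bytes:
    """Build a version-0 ISO-BMFF 'pssh' box as used by CENC."""
    body = b"pssh"                       # 4-byte box type
    body += struct.pack(">I", 0)         # version (0) + flags
    body += system_id.bytes              # 16-byte DRM system ID
    body += struct.pack(">I", len(init_data)) + init_data
    size = 4 + len(body)                 # 4-byte size prefix included
    return struct.pack(">I", size) + body

def parse_pssh(box: bytes):
    """Return (system_id, init_data) from a version-0 'pssh' box."""
    size, = struct.unpack(">I", box[:4])
    assert box[4:8] == b"pssh" and size == len(box)
    system_id = uuid.UUID(bytes=box[12:28])
    data_len, = struct.unpack(">I", box[28:32])
    return system_id, box[32:32 + data_len]
```

Because the box layout is DRM-agnostic, one encrypted stream can carry several pssh boxes and let each player pick the DRM it supports, which is exactly what makes a single C-ENC format usable across broadcast, Blu-ray and browser delivery.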


Several Korean broadcasters announced this week that they will begin transmitting ATSC 3.0 OTA broadcasts starting in February 2017. The news comes after the two broadcast networks, SBS and MBC (in conjunction with LG Electronics, ETRI and several equipment vendors) announced the first successful live end-to-end ATSC 3.0 broadcast in the country, and represents perhaps the best confirmation yet that the ATSC 3.0 next generation broadcast standard is on schedule to be completed within the next 12 months.
Korea wants to broadcast the Olympics in 4K Feb 2018.

The FCC has sold so much TV bandwidth that it's going to be difficult to have both the old standard and the new standard on the same channels at the same time. Because of this, and because cell phones can reliably and robustly receive TV in a moving car, there will be a push for ATSC 3 and a quicker-than-normal phase-out of the older, incompatible standard (this is in multiple papers). This may be alleviated with STBs like the PS4 and XB1 connected to network tuners, able to downscale 4K and receive multiple 1080P channels in an HEVC container. This downscaling, and the HEVC multi-view codec also required for 3D, is part of the UHD digital bridge and mentioned in Playready ND papers. The UHD TV signal is already IPTV in the C-ENC format, which is used by Playready and HTML5 <video> MSE/EME.

http://www.tvtechnology.com/broadcast-engineering/0029/atsc-30-brings-flexibility-of-ip-to-broadcast/277732 said:
With ATSC 3.0, you connect your new ~$250 household gateway/router, which probably includes some storage and DVR features, to that OTA antenna. Now every IP device in your home's LAN coverage area has access to everything OTA and Internet (including any "walled garden" of content your ISP might supply).

There is more of a chance that the FCC will move to expedite the transition to ATSC 3.0 as a way to ease the impact of the present-day spectrum reallocation process and the conversion of some UHF TV channels to wireless use.

Sinclair Envisions ATSC 3.0 Royalty Windfall
If the proposed next-gen TV transmission standard is adopted, Sinclair Broadcast Group stands to profit from its contribution of intellectual property, the company's execs say. "You should assume that anybody who is going to watch television, whether it is on virtual reality devices, cars, machines, phones, pads, TV sets is going to [be using] our technology," said CEO David Smith.
3D is coming to 4k TV (Virtual Reality devices), Phones, Pads, cars...it's already in the planning stages as seen in the underlying transport mechanism and W3C extensions.
 
ATSC Approves Critical Piece of Next Generation Broadcast Standard Washington DC and Baltimore now have a single frequency ATSC 3 test network.

The Physical Layer's Bootstrap is the essential core of the new ATSC 3.0 standard and serves as the universal entry point that allows all receiver devices to process and decode information. The new standard is designed to integrate with the Internet and features a long-sought after robust transmission capability that will finally power multiple mobile uses, impressive leaps in video and audio quality, localized programming and advertising and enable hundreds of new datacasting businesses. Using the new standard, broadcasters will now be able to provide robust, mobile, ultra-high definition video and enhanced, immersive audio with geo-targeted programming and advertising, advanced emergency alert functions and single frequency networks to help preserve repeater and translator service. Importantly, it also allows broadcasters to innovate with new non-programming opportunities including everything from distance learning, industry-specific mass data distribution and the backbone of the Internet of Things.
Sony just paid $212 million for a company with IoT 4G radio designs and 4G patents. 4G radio is the standard used by cell phones and ATSC 3.

Single Frequency Networks for ATSC 3 have multiple Cell Phone like towers that are frequency and phase matched to extend the coverage area including multiple cities. For Mobile this means you can start watching a program in one city and still watch it uninterrupted all the way through the next city and beyond.
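The reason frequency- and phase-matching works is that a receiver hears every SFN tower as an "echo" of the same signal; reception stays clean as long as the path-length difference between towers fits inside the OFDM guard interval. A back-of-envelope calculation (the 700-microsecond figure is illustrative; ATSC 3.0 actually supports a range of guard intervals):

```python
SPEED_OF_LIGHT_KM_PER_S = 299_792.458

def max_echo_distance_km(guard_interval_us: float) -> float:
    """Largest path-length difference (km) between two SFN towers'
    signals that still lands inside the OFDM guard interval."""
    return SPEED_OF_LIGHT_KM_PER_S * guard_interval_us / 1_000_000

# A ~700 microsecond guard interval tolerates a path difference of
# roughly 210 km, enough to blanket neighboring cities with one
# frequency and hand off seamlessly as a car moves between them.
coverage_km = max_echo_distance_km(700)
```

Longer guard intervals buy wider tower spacing at the cost of symbol-rate overhead, which is why the standard makes the interval configurable per market.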

By adopting the Bootstrap to a Full Standard, manufacturers can continue their product design work to produce devices capable of delivering on the promises of the new standard, confident that the key feature of the standard is locked. This includes portable tablets, home gateway devices and new transmitters."
With a Network or USB tuner the XB1 and PS4 can be home gateway devices and DVRs serving other platforms in the home over the home network using Vidipath.

"The timing is critical so that new equipment can be in place for the expected deployment to coincide with the repack of broadcast stations as a result of the upcoming Broadcast Incentive Auction. Giving broadcasters, who will be updating their facilities, the option of including Next Gen capabilities is both wise public policy and an important competitive boost with positive benefits for our industry and our viewers.
The adoption of ATSC3 will be accelerated due to the repack and Auction of TV spectrum to cell phones.

Five Things Television Broadcasters Should Know About ATSC 3.0

ATSC 3.0 Could Change Broadcasting As We Know It: If Next Generation TV becomes a reality, it has the potential to dramatically transform the broadcast television landscape. Because the ATSC 3.0 standard is based on Internet Protocol, traditional programming would become just one possible use for a broadcast television channel. Supporters of ATSC 3.0 tout the ability to broadcast in 4K and beyond, to incorporate virtual reality views and higher frame rates, improved audio, better and more targeted emergency alerting, and viewer personalization and activity.
Virtual reality views from TV broadcasting. The same standards used by ATSC 3 are also used by UHD Blu-ray. I made the comment that VR is being released this year because the UHD Blu-ray standards and software stack are nearly the same as VR's, and UHD Blu-ray is being released this year. Of course no one understood, and I got jumped on.
 

NinjaBoiX

Member
I genuinely still don't have a clue what any of this means.

Jeff, what are the real world implications of this?

I pre-emptively ban any abbreviations, numerical codes, technical terms and other assorted jargon in your response. :p
 
I genuinely still don't have a clue what any of this means.

Jeff, what are the real world implications of this?

I pre-emptively ban any abbreviations, numerical codes, technical terms and other assorted jargon in your response. :p
I tried to use fewer abbreviations, or to describe them before I used them. You can select any term you don't understand, right-click, and use Google Search.

4K Antenna TV (ATSC 3) has many features that ALL Consumer Electronics companies have been waiting for. The standards it uses are HTML5 with W3C Extensions which are the same as used by Vidipath, UHD Blu-ray, UHD Blu-ray Digital bridge and VR. It also supports DRM which looks to be Playready which will also be used by Vidipath (DLNA plus a common DTCP-IP streaming DRM) to stream media in the home.

The UHD Blu-ray digital bridge uses Playready ND, which is secure enough to protect 1080P and 4K media streaming in the home. Playready versions below 3 have WMDRM, which can be used to DTCP-IP-protect 480P to 1080i. The Vidipath DLNA 2 standard used Playready below 3 to support streaming media and IPTV (like Netflix). DLNA 2 and DLNA 3 are no longer listed on the Vidipath.org website, only DLNA 4, which supports 1080P streaming and therefore implies Playready 3 and Playready ND.
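Vidipath builds on DLNA, and DLNA devices find each other over the LAN via UPnP's SSDP protocol: a client multicasts an M-SEARCH request and any media server (e.g. a Vidipath gateway) answers with its description URL. A sketch of building that request; the search target shown is the standard UPnP MediaServer device type, and actually sending it would of course require a live network:

```python
SSDP_ADDR = ("239.255.255.250", 1900)  # standard SSDP multicast group

def build_msearch(search_target: str = "urn:schemas-upnp-org:device:MediaServer:1",
                  max_wait_s: int = 3) -> bytes:
    """Build an SSDP M-SEARCH request a DLNA client would multicast
    to discover media servers on the home network."""
    lines = [
        "M-SEARCH * HTTP/1.1",
        f"HOST: {SSDP_ADDR[0]}:{SSDP_ADDR[1]}",
        'MAN: "ssdp:discover"',       # required literal, with quotes
        f"MX: {max_wait_s}",          # max seconds servers may delay
        f"ST: {search_target}",       # device type being searched for
        "", "",                       # request ends with a blank line
    ]
    return "\r\n".join(lines).encode("ascii")
```

Discovery is the DRM-agnostic half of the stack; once a client has found a server this way, the Playready/DTCP-IP link protection discussed above governs what it is actually allowed to stream.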

1) Cable TV had a mandate to support Vidipath by June 2015.
2) There are currently no clients for Vidipath. I believe they were/are waiting for HEVC and the media layers of 4K antenna TV to be finalized; the exact same standard will then be used by all 4K media delivery schemes including UHD Blu-ray (HTML5 <video> MSE EME & Playready 3/ND).

If this is correct, then when we get VR is when the Southbridge routines are ready to support Vidipath (DLNA 2), HEVC IPTV (DLNA 3), 1080P Vidipath (DLNA 4) and UHD Blu-ray, with the UHD Blu-ray digital bridge (DLNA 4 Vidipath) soon after that, and then 4K antenna TV. DVR support for antenna TV with Vidipath streaming comes somewhere in there and will also support 4K antenna TV.

3) The transport mechanism and bootstrap for 4K antenna TV have been agreed upon, so late this year or early next year tuners (DLNA/Vidipath gateway boxes) that support both 4K antenna TV and present-day antenna TV can be sold.
4) The FCC expects the final layers of ATSC 3 (4K antenna TV) to be finished this year or in Q1 2017.
5) The FCC can't sell, or rather the cell phone companies can't use, the TV spectrum the FCC wants to sell until ATSC 3 is implemented. It appears that ATSC 2 is being skipped. They may fast-track ATSC 3 using antenna network gateway boxes or tuner dongles for 4K TVs.

The new HTML5 standards developed for TV are also being used by VR and UHD Blu-ray. Second screen is a W3C extension to HTML5 developed for TV. Any game console that supports second screen is probably planning to support 4K antenna TV = Wii U, NX, XB1 and PS4.

The Wii U, PS3 and Xbox 360 have RVU support, the precursor to Vidipath that was implemented by the two satellite TV companies. PDFs sent to the FCC about Sony's Passage have charts of a PS3 and PS4 supporting RVU and Vidipath. Microsoft, in Playready ND whitepapers, has game consoles using it to support DVR and live TV streaming. The UHD digital bridge proposals list Playready ND as the DRM allowing 4K streaming of a UHD Blu-ray movie stored on the player's hard disk.

All this is documented in my threads on NeoGAF.

The PS4 will support 4K blu-ray
This one we are in
All Playstation Platforms to use Playready which points to a significant PS3 update
Hardware for Media Hub features in both the XB1 and PS4 "kinda confirmed"
Game Consoles to replace Cable boxes and the connected home starts in 2014
 

Bgamer90

Banned
4K antenna TV huh? Very cool. So, if a new Xbox releases then I guess the Xbox Hauppauge TV tuner accessory would get some type of update for this.

Some TV antennas are already 4K ready too.
 

Epcott

Member
I read the thread title to myself "Something, something new media plans... PS4, PS3..."

Clicked the thread to eagerly read news.

Scratch my head and feel like the batman gif of Bruce trying to read in his dream. Thought to myself "What are they going on about..."

When I realized "Rigby?", looked for OP name, and there it was.


I'm going to try making sense of this. But from the reactions I've seen so far, I'm not sure I will.

Edit: What exactly is a Playready server?
 

Bgamer90

Banned
Some quotes in the link above (http://www.prnewswire.com/news-rele...eneration-broadcast-standard-300241854.html):

"This is a historic moment for broadcast television; opening our industry to a spectrum of opportunities..."

"The Physical Layer's Bootstrap is the essential core of the new ATSC 3.0 standard and serves as the universal entry point that allows all receiver devices to process and decode information. The new standard is designed to integrate with the Internet and features a long-sought after robust transmission capability that will finally power multiple mobile uses, impressive leaps in video and audio quality, localized programming and advertising and enable hundreds of new datacasting businesses. Using the new standard, broadcasters will now be able to provide robust, mobile, ultra-high definition video and enhanced, immersive audio with geo-targeted programming and advertising, advanced emergency alert functions and single frequency networks to help preserve repeater and translator service. Importantly, it also allows broadcasters to innovate with new non-programming opportunities including everything from distance learning, industry-specific mass data distribution and the backbone of the Internet of Things. For more information on the Bootstrap and ATSC 3.0, please visit http://atsc.org/standards/atsc-3-0-standards/.
 
4K antenna TV huh? Very cool. So, if a new Xbox releases then I guess the Xbox Hauppauge TV tuner accessory would get some type of update for this.

Some TV antennas are already 4K ready too.
TV channels are TV channels so an antenna does not care. What changes are the multiple towers and more robust delivery scheme. In some areas the TV may have an internal antenna that works. The FCC may mandate that Cell phones support 4K antenna TV. Cars will have 4K antenna TV reception that works while the car is moving.

Yup, the antenna TV DVR announced by Microsoft using the tuner you mentioned supports 1080P. 1080P is only available with ATSC 2 or 3, and if Microsoft knows ATSC 2 is being skipped, then the DVR is designed to support ATSC 3. Microsoft announced multi-view HEVC profile 10 support, and multi-view HEVC is needed for ATSC 3.
 

Bgamer90

Banned
TV channels are TV channels so an antenna does not care. What changes are the multiple towers and more robust delivery scheme. In some areas the TV may have an internal antenna that works. The FCC may mandate that Cell phones support 4K antenna TV. Cars will have 4K antenna TV reception that works while the car is moving.

Yeah, I'm just pretty pumped about this news since I'm a cord cutter.

Yup, the antenna TV DVR announced by Microsoft using the tuner you mentioned supports 1080P. 1080P is only available with ATSC 2 or 3, and if Microsoft knows ATSC 2 is being skipped, then the DVR is designed to support ATSC 3. Microsoft announced multi-view HEVC profile 10 support, and multi-view HEVC is needed for ATSC 3.

Very cool.
 
I read the thread title to myself "Something, something new media plans... PS4, PS3..."

Clicked the thread to eagerly read news.

Scratch my head and feel like the batman gif of Bruce trying to read in his dream. Thought to myself "What are they going on about..."

When I realized "Rigby?", looked for OP name, and there it was.


I'm going to try making sense of this. But from the reactions I've seen so far, I'm not sure I will.

Edit: What exactly is a Playready server?
Hold down the left mouse button and scroll across "Playready Server" and release. Right mouse click on the highlighted text and choose Google search. Have fun.
 
Yeah, I'm just pretty pumped about this news since I'm a cord cutter.
Very cool.
The roadmap has been 2020 for this but it may be accelerated. Korea wants to broadcast the Olympics in 4K Feb 2018. That means the tuners have to be shipping in 2017.

If you remember, Vidipath was proposed as DLNA 2 with Playready 2, and in early 2015 DLNA 3 and DLNA 4 popped up on the Vidipath.org site. Only DLNA 4 is now listed.

Sony is a founding member of the DTLA and DLNA orgs, so we have this DLNA 4 being supported on a PS4 with Firmware 4 this October. Notice the 4s, which may come in handy in advertising. For China, PS4 plus Firmware 4 = 8 = PS3 plus Firmware 5, and 8 is a very lucky number in China.
 

liquidtmd

Banned
Sony is a founding member of the DTLA and DLNA orgs so we have this DLNA 4 being supported on a PS4 with Firmware 4 this October. Notice the 4's that may come in handy in advertising. For China PS4 plus Firmware 4 = 8 = PS3 plus Firmware 5 and 8 is a very lucky number in China

Have you been drinking?
 
I'm looking forward to him proclaiming there wasn't a big bang, but rather a large firmware update that started the universe.

"Rigbian Universal Firmware Theory" will be the title of the manifesto the police will eventually find.
 

tr00per

Member
If I understand it correctly, that is very interesting, but if sony is pushing ps vue, why would they try to partner with cable companies? And would they even agree to that? Also, is the existing ps4 model compatible? I tried to read and understand it, I really did.
 

NinjaBoiX

Member
Thanks for trying Jeff but I still don't get what this will do for me. I think it's something to do with PS4 being able to display 4K movies?

It's OK, I'm a luddite, I probably wouldn't get the most out of it anyway!
Your ban did not work. Please try again later.
lol
 