
Sony outlines a long term roadmap for Playstation tech: 8K, 300fps, 3D chips and cats

This benefits all three companies, not just Sony. Yet Nintendo still made a very modest system, and it seems MS is following suit. I expect the same with Sony. All three systems will debut around $350 IMO.

I still don't know at what price they're gonna debut, mainly because Nintendo seems to be going for a fairly competent machine in specs, and is also adding a touchscreen controller.
Will Microsoft throw a Kinect in every box and use a tablet controller too, like the recent rumors say? Those costs can add up, I guess.
Sony's plans are still largely unknown, but they're probably aiming for a Kinect-like device (or an evolution of PS Move, using peripherals again) and PS Vita integration for the touchscreen controls.
 

tinfoilhatman

all of my posts are my avatar
Really have these guys learned nothing?!?!?

Targeting 4-8K resolutions is pointless; unless you have a 100in screen, no one will notice the difference between 1080p and even 4K
 
I actually believe Sony is in a good position to launch a powerful yet affordable PS4.
Blu-ray costs are nothing special nowadays, they could go for an improved Cell CPU to keep costs down and keep PS3 BC, network parts are also pretty much commodity features now; they just need to get a good amount of RAM and a good GPU.

Yup, I just want an updated Cell if that means full BC for the PS4, and for Sony to portion more of the budget into RAM and the GPU. I really do not want to have to repurchase a Team Ico 4K collection, though I know I would.
 

cilonen

Member
4K, in general, can't come soon enough. I want my video to be indistinguishable from actually seeing something in real life.

Of course, gaming at that resolution is going to take a ton of power, but as a video display standard, bring 4K and even 8K on.
 

tinfoilhatman

all of my posts are my avatar
4K, in general, can't come soon enough. I want my video to be indistinguishable from actually seeing something in real life.

Of course, gaming at that resolution is going to take a ton of power, but as a video display standard, bring 4K and even 8K on.


Do you really believe the TV networks are going to run out and buy all brand new cameras and video equipment? Hell, there aren't even any TV stations that are in 1080p, and they have NO plans to upgrade to 4-8K.

pointless scam to try and get people to keep buying new TVs
 

cilonen

Member
Do you really believe the TV networks are going to run out and buy all brand new cameras and video equipment? Hell, there aren't even any TV stations that are in 1080p, and they have NO plans to upgrade to 4-8K.

pointless scam to try and get people to keep buying new TVs

Not all, no. Enough to make it worthwhile though. It's already been used for the broadcast of popular events in Japan to outdoor public locations and BBC in the UK are planning similar event broadcasts to outdoor public screens during the 2012 Olympics. By 2020 I expect to have a set in my house.
 

quest

Not Banned from OT
Do you really believe the TV networks are going to run out and buy all brand new cameras and video equipment? Hell, there aren't even any TV stations that are in 1080p, and they have NO plans to upgrade to 4-8K.

pointless scam to try and get people to keep buying new TVs

More importantly, there is no way to broadcast OTA at such a resolution. 4K is just a pipe dream; broadcast TV will be stuck at 720p and 1080i for a very long time.
 

cilonen

Member
More importantly, there is no way to broadcast OTA at such a resolution. 4K is just a pipe dream; broadcast TV will be stuck at 720p and 1080i for a very long time.

They could do OTA in 2006. Granted, it was in test conditions and the maximum distance was ~2m, but it was possible in a lab 5 years ago.

Fibre and IP are the way forward though to free up the airspace for greater mobile data networks.
 

tinfoilhatman

all of my posts are my avatar
More importantly, there is no way to broadcast OTA at such a resolution. 4K is just a pipe dream; broadcast TV will be stuck at 720p and 1080i for a very long time.


This is only ONE of the many, many, many hurdles that 4K sets and broadcasts would have to overcome.

Again, unless you have a 100+ inch screen or are sitting 3 feet from your screen, 4K sets won't be that big of a deal to people walking around and looking at sets in Walmart, Costco, Best Buy, etc.

The human eye can only perceive so much detail in a certain amount of flat space. That's why we have 1080p sets: it was determined that 1920x1080 was the optimal resolution for the size of sets most people have in their living rooms; anything more than that and you just don't get much in return.
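The claim above can be sanity-checked with the usual ~1 arcminute visual-acuity rule of thumb. A rough sketch (the acuity figure and the geometry here are simplifying assumptions on my part, not numbers from any post in this thread):

```python
import math

def max_useful_resolution(diagonal_in, distance_ft, aspect=(16, 9),
                          acuity_arcmin=1.0):
    """Horizontal pixel count beyond which adjacent pixels blur together,
    assuming ~1 arcminute of visual acuity (a common rule of thumb)."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)  # screen width from diagonal
    distance_in = distance_ft * 12
    # Smallest feature the eye can resolve at this viewing distance.
    resolvable_in = distance_in * math.tan(math.radians(acuity_arcmin / 60))
    return int(width_in / resolvable_in)

# A 40" 16:9 set viewed from 6 feet resolves roughly 1600-1700 columns:
# above 720p's 1280, below 1080p's 1920.
print(max_useful_resolution(40, 6))
```

By this crude model, 4K only pays off on very large screens or at short viewing distances, which is roughly what both sides here are arguing about.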

Also, it's not only an OTA broadcast issue; when you talk about Dish and DirecTV satellite transmission, they are at the limit of their existing capabilities with 1080p/MPEG-4.

So outside of possible 4K Blu-rays on 120+ inch screens, 4K+ is useless.
 

Ranger X

Member
I love how tech people like to pitch their money out the window and have stupid dreams.

1) 8000x4000 resolution is completely useless until the day we all have theater screens at home. Even just at 1080p, the human eye cannot see the difference from 720p as soon as you're 4 feet away from your 40 inch TV.

2) 300fps is also completely useless. At 60fps you already get the same smoothness as looking around in real life. 60fps is also fast enough for anybody's reaction time.
 
Why do people keep telling me what my eyes can and can't see?

At 4ft, the difference between 720p and 1080p is obvious and it doesn't stop there.
 
I love how tech people like to pitch their money out the window and have stupid dreams.

1) 8000x4000 resolution is completely useless until the day we all have theater screens at home. Even just at 1080p, the human eye cannot see the difference from 720p as soon as you're 4 feet away from your 40 inch TV.

2) 300fps is also completely useless. At 60fps you already get the same smoothness as looking around in real life. 60fps is also fast enough for anybody's reaction time.

He didn't mean 300fps in the way you are thinking of.
 

waxer

Member
1) 8000x4000 resolution is completely useless until the day we all have theater screens at home. Even just at 1080p, the human eye cannot see the difference from 720p as soon as you're 4 feet away from your 40 inch TV.

Am I missing something? Why does everyone insist there is no perceivable difference with a resolution gain unless you have your face against the screen? I game on my PC at about 6 feet on a 40" screen. The difference between my old 720p set and newer 1080p when swapping them over was huge. And that's before I got my prescription glasses. The difference in things like text on the screen when comparing them makes that pretty obvious.
 
I love how tech people like to pitch their money out the window and have stupid dreams.

1) 8000x4000 resolution is completely useless until the day we all have theater screens at home. Even just at 1080p, the human eye cannot see the difference from 720p as soon as you're 4 feet away from your 40 inch TV.

2) 300fps is also completely useless. At 60fps you already get the same smoothness as looking around in real life. 60fps is also fast enough for anybody's reaction time.

Tell that to Tim Sweeney.
 
Why do people keep telling me what my eyes can and can't see?

At 4ft, the difference between 720p and 1080p is obvious and it doesn't stop there.

There was this JPEG with distance and resolution that is still ingrained in people's minds.
It's like people's standards don't go up. I don't even care if I have a hard time noticing it; I just want it, case closed. If I have to earn money, I'd better be spending it ;p on stupid shit I don't need.
 
Am I missing something? Why does everyone insist there is no perceivable difference with a resolution gain unless you have your face against the screen? I game on my PC at about 6 feet on a 40" screen. The difference between my old 720p set and newer 1080p when swapping them over was huge. And that's before I got my prescription glasses. The difference in things like text on the screen when comparing them makes that pretty obvious.

You're not going to be able to sell that to the majority of people for a long long while though, which is all that matters.
 

Mlatador

Banned
I hope 4K resolution NEVER EVER becomes a reality, unless someone invents screens that don't have native resolutions anymore.

Assuming that LCD/OLED technology is here to stay, you can't just change native resolutions (which have become a standard, like Full HD) every now and then! You would screw the customers over pretty hard. The upscaling madness with all its shitty side effects would never end!

Since current TVs and screens rely so much on a native resolution, the world needs to set a standard that won't be touched for the next 50 years or so, until that problem gets solved.
 
Really have these guys learned nothing?!?!?

Targeting 4-8K resolutions is pointless; unless you have a 100in screen, no one will notice the difference between 1080p and even 4K
Read the thread! 8K-resolution and higher video streams will be used for medical imaging, with downscaling to lower-resolution monitors and zooming to see details. 300FPS is a video-streaming budget to support up to 8 1080P streams for Augmented Reality. It is not for one video stream at 300fps!

PlayStation memories coming for the PS3 will support zooming into video streams...possibly 4K at some point, as plans call for 4K still-picture support SOON, and that is essentially a repeating video loop of one 4K frame.

http://hdfpga.blogspot.com/2011/12/x265-development-open-source-hevch265.html said:
x265 Development - An Open Source HEVC / H.265
Hopefully x265 will be like x264, which has been the best open source implementation of H.264. One of the x264 pioneers, Min Chen, started the x265 project to push open source development for HEVC / H.265. His goals target embedded systems, FPGA, GPU, and multi-core systems. We are looking forward to it.

HEVC or H.265 (nickname) is targeted at next-generation HDTV displays and content capture systems which feature progressive scanned frame rates and display resolutions from QVGA (320x240) up to 1080p and Ultra HDTV (7680x4320), as well as improved picture quality in terms of noise level, color gamut and dynamic range. The performance goal is that HEVC should provide 2x better video compression performance than AVC (H.264) high profile, at the expense of increased computational complexity (so hardware implementations would be important ). HEVC will significantly reduce bandwidth requirements with comparable image quality for video conferencing and streaming.
Looks like some multi-core handhelds will support x265 for IPTV streaming. HEVC will significantly reduce bandwidth requirements with comparable image quality for video conferencing and streaming. Edit: OpenCL is supported by the 4 GPU cores in the Vita, so the Vita can support H.265 for IPTV.

For those who haven't been exposed to what's coming, from the NeoGAF thread Ultra HDTV standard agreed upon:

For everyone asking why this tech is useful:

  • While you'd need a crazy large screen to visually resolve the full resolution, you can actually resolve above 1080p with larger screens (and especially projectors)
  • The physical display resolution is useful for doing things like polarized 3D at 1080p without display shuttering
  • This makes lenticular 3D (glasses-free) more viable as it dramatically improves viewing angles
  • For people with large screens, subtly improves 1080p (and lower) content via upscaling. Obviously that assumes good upscaling, but the math for 1080p->UHD is more straightforward versus 480p->1080p.
  • More is obviously better.
Last month a worldwide standard was agreed upon, and in the US ATSC 3.0 is under consideration. H.265 (HEVC) is making this possible. ATSC 2.0 is coming soon using H.264 (MPEG-4), allowing broadcast 3-D and 1080P. ATSC 1.0 uses MPEG-2.

http://www.vcodex.com/h265.html said:
Table 4 of the document compares the compression performance of the HEVC test model ("HM") and the H.264 test model ("JM"). On average, HEVC out-performs H.264 by 39% for random access scenarios (e.g. broadcast) and by 44% for low delay scenarios (e.g. video calling).

This means that the HEVC codec can achieve the same quality as H.264 with a bitrate saving of around 39-44%.

High Efficiency Video Coding (HEVC) is a new Standard under development by the ISO and ITU-T. The Moving Picture Experts Group (MPEG) and Video Coding Experts Group (VCEG) have set up a Joint Collaborative Team on Video Coding (JCT-VC) with the aim of getting the new standard ready for publication in 2012/2013. It's likely to be published jointly as a new MPEG standard and a new ITU-T standard, possibly H.265
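Taking the quoted 39-44% savings at face value, the bitrate arithmetic is simple. The example input bitrates below are illustrative assumptions of mine, not figures from the article:

```python
def hevc_bitrate(h264_mbps, saving):
    """Bitrate needed for H.264-equivalent quality, given a fractional saving."""
    return h264_mbps * (1 - saving)

# A hypothetical 20 Mbps H.264 broadcast stream at the quoted 39% saving:
print(round(hevc_bitrate(20, 0.39), 1))  # ~12.2 Mbps
# A hypothetical 5 Mbps H.264 video call at the quoted 44% saving:
print(round(hevc_bitrate(5, 0.44), 1))   # ~2.8 Mbps
```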

The US is already planning a transition to ATSC 3.0 4K & possibly 8K broadcast TV. I still remember all the "4K is not happening" arguments. This is made possible by H.265.

Sony has a PDF outlining what's possible using the existing 6 Mhz Broadcast TV bandwidth and the new H.265 codec. 4K TV channels, Multi-viewpoint, full 1080P 3-D, multi-channel Web plus advertising.

Flying_Phoenix said:
I'm predicting 15 years until I see this in stores.
Sony is stating 3 years to affordable OLED 4K & glassless 3-D TV. 4K broadcast TV will not be backwardly compatible with ATSC 1.0 or ATSC 2.0. ATSC 2.0 and 3.0 will have to share the TV spectrum, which will retard acceptance. A PS4 with an inexpensive external network TV tuner could accept 1024QAM ATSC 3.0, decode, and down-convert if needed. The PS3 might be limited to 24hz 4K (an HDMI port limitation), which is not a 4K TV standard. This could be happening in larger cities by 2020. ATSC 3.0 (the US name) is going to be a worldwide standard, so anything created for the PS3 or PS4 would work everywhere. Why mention the Sony PS3 and PS4? Because they have, or will have, a Cell processor which can handle the H.265 codec. Intel's Ivy Bridge had to have dedicated codec hardware to handle it.

4K screens aside, there are other uses proposed for the bandwidth needed for 4K. Have you read the Sony PDF on A Revolutionary Digital Broadcasting System: Achieving Maximum Possible Use Of Bandwidth?

In ATSC's PUBLISHED roadmaps, they only go out to 2015, and there is no mention of ATSC 3.0 implementation. This worldwide standard is being pushed by others, not the United States ATSC. Correct me if I'm wrong, but China, Korea, Japan and Brazil are in the news for UHD, I think because it gives their electronics industries an advantage in the next TV standard. Roadmaps have 2018 as the start of consumer UHD equipment rollout, with 2020 being the first broadcast of that standard. That's 7 years till we see UHD being supported in TVs. That's a projected roadmap from 2007.

I agree with others in this thread that 4K is the upper end of human vision at normal or slightly sub-normal viewing distances; I think white papers from Sony confirm this. The BBC is experimenting with 4K and 8K IPTV. What I thought not possible was Howard Stringer's comment that Sony might start an IPTV network to compete with cable TV: too much bandwidth needed to support that many consumers. H.265 and custom schemes being developed by Sony might change that. With a Cell processor in a networked PS4 or Sony 4K TV or 4K Blu-ray player, firmware-upgradeable, Cell-equipped Sony products could decompress in real time, and less bandwidth would be needed. We will probably see IPTV supporting 4K monitors soon, with some resolution above 1080P (depending on network speed) and a higher refresh rate.

Why is Intel putting a hardware 4K codec in Ivy Bridge? Blu-ray media @ 4K or IPTV @ 4K. I assume that 4K computer monitors are going to be widely available before 4K TV broadcast.

And what's to keep DirecTV and cable boxes from using H.265, which can allow something above 1080P and below 4K resolution in a normal 6 MHz TV channel bandwidth NOW? (Beyond not having media and not having a CPU powerful enough to support H.265.) The Sony PDF mentions that going from MPEG-2 to H.265 will allow 10 SD video channels instead of 6 in a 6 MHz TV channel. Cable companies are always trying to make the best use of their limited RF spectrum.
http://www.multichannel.com/article/474978-The_Next_Big_Video_Squeeze.php said:
In February 2012, a draft of HEVC is expected to be circulated for comments, and the first edition of the standard should be finished in January 2013.

Initially, the clear winners for HEVC will be mobile network operators. “If you look at any of the market data, 70% of the traffic will be video in the next year,” Blackman said.

HEVC will also help broadcasters and cable ops deliver Ultra HD formats, which provide four to 16 times the resolution of current 1080p HDTV.

Elemental, whose customers include Comcast and Avail-TVN, expects eventually to incorporate HEVC into its software-based encoding solution that is based on off-the-shelf graphics processing units.
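The 6-vs-10 SD channel figure from the Sony PDF mentioned above is easy to reproduce with rough numbers. The ATSC channel payload and per-stream bitrates here are my own ballpark assumptions, not figures from the PDF:

```python
CHANNEL_MBPS = 19.39   # approx. ATSC payload of one 6 MHz broadcast channel
MPEG2_SD_MBPS = 3.2    # assumed bitrate of one MPEG-2 SD stream
HEVC_SD_MBPS = MPEG2_SD_MBPS * (1 - 0.40)  # ~40% saving, per the HEVC claims

print(int(CHANNEL_MBPS // MPEG2_SD_MBPS))  # MPEG-2: 6 streams per channel
print(int(CHANNEL_MBPS // HEVC_SD_MBPS))   # HEVC:  10 streams per channel
```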



Sony has announced that the PS3 will support still pictures (video out) to 4K monitors @ 4K resolution in early 2012. Add a codec supporting 4K, and the PS3 outputs 4K moving pictures @ 24 fps instead of the same 4K frame over and over at 24fps.

http://www.digitaltrends.com/home-theater/sony-ushers-in-the-4k-home-theater-with-vw1000es-projector/ said:
Of course, this naturally raises the question: Where exactly will 4K content come from? Sony Pictures has more than 60 theatrical releases shot in native 4K resolution, but the means of actually transferring all that data to consumers simply doesn’t exist yet. Sony reps claim the company is in talks with the Blu-ray Disc Association to iron out a standard compression scheme for squeezing 4K movies onto discs, and has already promised a 4K release of the next Spider-Man movie, but the July 2012 release date for that flick should be telling: Sony won’t yet talk timelines on when 4K movies could hit shelves.
A Sony-developed compression scheme or H.265? Timeline: 2012 for a Sony custom scheme, or 2013 using H.265/HEVC?

Can anyone confirm that Sony 4K titles being released on DVD or Blu-ray have Cinavia DRM protection while other 2K titles do not? (Sony protecting the more valuable IP.)

http://www.digitaltrends.com/home-theater/sony-ushers-in-the-4k-home-theater-with-vw1000es-projector/ said:
As with 3D, the best 4K delivery solution for the home is likely to be Blu-ray Disc. “The physical format can do it,” declares Don Eklund, executive VP of technologies at Sony Pictures Technologies, thanks to new compression algorithms. Most notable is HEVC, or High Efficiency Video Codec, which is now in advanced development. It’s considerably more efficient than the AVC codec now commonly used on Blu-rays while remaining similarly free of artifacts, and it will allow a 4K film to fit on a mass-replicated 50-gigabyte, two-layer Blu-ray Disc. “I’ve seen samples of what that codec can do with 4K at a 30-megabit-per-second bitrate compared to what AVC can do at 50 Mb per second, and it actually looks a little bit better at 30 than AVC looks at 50,” Eklund says.

4K blu-ray on PS3 with firmware update?
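Eklund's 30 Mbps figure is what makes the disc math work out. A quick check, assuming decimal gigabytes (as disc capacities are rated) and a constant bitrate:

```python
def disc_hours(capacity_gb, mbps):
    """Hours of video a disc holds at a constant bitrate."""
    bits = capacity_gb * 8 * 1000**3   # GB -> bits
    return bits / (mbps * 1e6) / 3600

print(round(disc_hours(50, 30), 1))  # 50 GB at 30 Mbps HEVC: ~3.7 hours
print(round(disc_hours(50, 50), 1))  # 50 GB at 50 Mbps AVC:  ~2.2 hours
```

So a two-layer 50 GB disc comfortably holds a feature-length 4K film at 30 Mbps, while 50 Mbps AVC leaves much less headroom.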

http://www.hometheater.com/content/4k-revolution-page-2 said:
What’s missing, however, is a 4K Blu-ray technical standard that would allow the manufacture of players and discs. Putting that specification together will require a concerted effort by the member companies of the Blu-ray Disc Association. “There are discussions starting, but they’re very early discussions,” notes Eklund. “We just finished 3D, and these things are pretty hard work. But Sony and the other key companies are looking at it very hard.”

Still, the reality is that it’s going to take a while before those discussions bear fruit. With few manufacturers (save Sony) actively promoting native 4K displays for the home and no installed consumer base paving a profit path, other manufacturers and studios in the Blu-ray consortium just aren’t heavily incentivized yet to come on board. It’s the classic chicken-and-egg scenario. And even when the group is unified on a goal, as it was with Blu-ray 3D, it can still take more than a year to develop a standard. Then the manufacturers will have to move the final specification into player hardware as the studios begin mastering discs to it. Translation: Don’t give up hope; we’ll get there eventually. But don’t hold your breath.

There is one more option for driving a 4K display with native 4K content: Make it yourself. A 9-megapixel digital camera—common today—can capture the equivalent of a 4K digital still image. Eklund says Sony’s PlayStation group even worked with the projector division to create a slide viewer for the PS3 that will allow it to send 4K stills stored on the game console to the VPL-VW1000ES for native viewing. And although 4K movie cameras remain pricey professional tools today, Sony envisions a complete 4K ecosystem for the home that will include 4K home movie cameras as well.

Chicken-and-egg answer: from the above two articles, 4K Blu-ray appears to use a standard drive (four-layer, greater than 2X speed) but to require more CPU power for the H.265 codec and larger video frame buffers. The PS3 has this now, and a PS4 will also. Since they are firmware-updateable, Sony can release a 4K version even if standards have not been set. The risk to consumers is in the 4K media (Blu-ray discs) they might purchase.

http://en.wikipedia.org/wiki/Blu-ray_Disc said:
Although the Blu-ray Disc specification has been finalized, engineers continue to work on advancing the technology. Quad-layer (100 GB) discs have been demonstrated on a drive with modified optics[150] and standard unaltered optics.[151] Hitachi stated that such a disc could be used to store 7 hours of 32 Mbit/s video (HDTV) or 3 hours and 30 minutes of 64 Mbit/s video (Cinema 4K).

On January 1, 2010, Sony, in association with Panasonic, announced plans to increase the storage capacity on their Blu-ray Discs from 25GB to 33.4 GB via a technology called i-MLSE (Maximum likelihood Sequence Estimation). The higher-capacity discs, according to Sony, will be readable on current Blu-ray Disc players with a firmware upgrade. No date has been set to include the increased space, but according to Blu-ray.com "it will likely happen sometime later this year."[159]

Chip included in the Sony 4K up-scaling Blu-ray player
 
Sony Publishes a Patent for a Kinect style "USER-DRIVEN THREE-DIMENSIONAL INTERACTIVE GAMING ENVIRONMENT"



An invention is provided for affording a real-time three-dimensional interactive environment using a depth sensing device. The invention includes obtaining depth values indicating distances from one or more physical objects in a physical scene to a depth sensing device.

"Embodiments of the present invention provide real-time interactive gaming experiences for users. For example, users can interact with various computer-generated objects in real-time. Furthermore, video scenes can be altered in real-time to enhance the user's game experience. For example, computer generated costumes can be inserted over the user's clothing, and computer generated light sources can be utilised to project virtual shadows within a video scene. Hence, using the embodiments of the present invention and a depth camera, user's can experience an interactive game environment within their own living room. "

"The processing system 174 can be implemented by an entertainment system, such as a Sony.RTM. Playstation.TM. II or Sony.RTM. Playstation.TM. I type of processing and computer entertainment system. It should be noted, however, that processing system 174 can be implemented in other types of computer systems, such as personal computers, workstations, laptop computers, wireless computing devices, or any other type of computing device that is capable of receiving and processing graphical image data."

Augmented Reality support: so we now have multiple data points suggesting Sony is going to support AR on the Vita and PS4. AR on the PS3???? Accessories (a new camera) coming by Sept 2012 for the PS3?
Khronos published a PDF outlining, among other things, AR by Sept 2012

The OpenMAX IL 1.2 camera component is also updated with the following advanced capabilities:

• Enhanced Focus Range, Region and Status support;
• Field of View controls;
• Flash status reporting;
• ND Filter support;
• Assistant Light Control support;
• Flicker Rejection support;
• Histogram information;
• Sharpness control;
• Ability to synchronize shutter opening and closing events with audio playback.

Timing
The patent was filed on 26th October 2011, by PlayStation Eye creator Dr Richard Marks. The patent was published on 16th February 2012.
Khronos meeting discussing AR Nov 2011 with Sept 2012 target date (Leveraging Browser technology)
OpenMax IL 1.2 Spec internally released Nov 7, 2011 and published Feb 2012 (supports hardware APIs necessary for AR)
 
From last year's GDC: http://www.youtube.com/watch?feature=player_detailpage&v=XiQweemn2_A#t=3s

The practical limit of human vision is 8000 x 4000 @ 72FPS with a 90 degree field of view (about 4 feet from a 100 inch screen, or 2 feet from an approx. 40 inch screen). HDTV viewing is 2560 x 1600 at a 30 degree field of view (above 1080P but below 4K, depending on how far you are from the screen).
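The raw pixel throughput implied by those numbers puts the gap in perspective; a small sketch using the figures from the talk as quoted above:

```python
def gpix_per_sec(width, height, fps):
    """Pixels drawn per second, in gigapixels."""
    return width * height * fps / 1e9

vision_limit = gpix_per_sec(8000, 4000, 72)  # claimed limit of human vision
hdtv_60 = gpix_per_sec(1920, 1080, 60)       # today's 1080p60 target

print(round(vision_limit, 2))         # ~2.3 Gpix/s
print(round(vision_limit / hdtv_60))  # ~19x today's 1080p60
```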

The current PS3 and Xbox do a 2nd-order approximation, 2 bounces of light for on-screen images, and at 1080P that is .25 Teraflops. A 3rd-order approximation with 3 bounces of light needs 2.5 Teraflops, or 10 times PS3 power.

2 more generations till we reach human vision limits.
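The generational arithmetic in the last few paragraphs is straightforward. Numbers are as quoted from the talk; the 10x-per-generation factor is the post's own assumption:

```python
PS3_TFLOPS = 0.25        # 2nd-order (2-bounce) lighting at 1080P, per the talk
BOUNCE_COST_FACTOR = 10  # claimed cost multiplier for one more light bounce

# 3rd-order (3-bounce) lighting at 1080P:
print(PS3_TFLOPS * BOUNCE_COST_FACTOR)  # 2.5 TFLOPS

# Two ~10x console generations from the PS3 baseline:
print(PS3_TFLOPS * 10 ** 2)             # 25 TFLOPS
```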

3D stacking is mentioned, including the same IBM stacking picture I posted.

The video is well worth watching. Given the information in this video, the next generation console MUST be at least 10 times more powerful. 3rd-order approximation for lighting and true 1080P would need more than what some are speculating for next generation.

Article on the above video by ars technica

The GlobalFoundries PDF has some interesting information.
1) 2-year lead time to production
2) 20nm and 3D stacking sometime after 2012, with high-k gate-first technology.

So we now have major players, both GlobalFoundries and IBM fabs, ready to 3D-stack sub-32nm dies by 2013.

And yes, they are counting on tech that has not been perfected being available on a timetable 2+ years out, as is the industry norm...

Both Microsoft and Sony waiting till at least next year makes sense! The rumors of IBM working with Sony on a Cell 2 several years ago would then make sense if 2+ years are needed from design to fab. So the CPU design has been done; they're waiting on a GPU and processes to make a 10X next-generation game console economically practical. Kepler is somewhere above 2 and less than 3.1 TFLOPS, but too hot and too expensive.

How the rumor that Cell is dead got started:

Last month Sony confirmed that it reached a preliminary agreement with Toshiba to sell off its production facilities for advanced chips (including the Cell in PS3). The sale of Sony's semiconductor business to Toshiba is estimated to be worth about 100 billion yen ($861 million).

Now, however, the word from the Nikkei Business Daily in Japan (via Solid State Technology) is not only is Sony selling its Cell production facilities, but the electronics company is also planning to withdraw from its R&D project with IBM and Toshiba, which sought to reduce the Cell chip to 32nm and below. IBM and Toshiba are expected to carry on the project, though.

In addition, Sony intends to cancel all capital investments in production of 45nm and later Cell chips. The current Cell chips being made for PS3 are using a 65nm process, SCE head Kaz Hirai confirmed. Rather than pursue Cell development, Sony is planning to strengthen its work in CCD and CMOS image sensors. Sony executive deputy president Yutaka Nakagawa explained to the Nikkei that the move is part of Sony's "asset light" strategy to end chip manufacturing investments after 65nm. The focus will switch to design over manufacturing.

"We plan to minimize the investment that is required to make packaged IC chips smaller," said Nakagawa. "Manufacturing cutting-edge packaged IC chips is not considered as important as it once was. The most important thing is what type of chip a company decides to produce, so we will increase the number of designers depending on the chip's purpose. The fact that we will stop operating an advanced chip plant does not mean that we are downgrading the importance of the chip business."

He added, "About 100 employees who worked on designing the Cell and other advanced microprocessors have already been reassigned to divisions where manufacturing technologies for image sensors and analog IC chips are being developed."

Rumor: the Cell 2 chip is in development and is headed for PlayStation 4 (Oct 2011; + 2-year lead time to production = beginning of 2014). The timing adds support to this unconfirmed information.

Mind that this information comes from a trusted source: apparently, the Cell chip's successor is already being designed. That doesn't particularly surprise us, but two other details have also reached us: it is being developed by the Barcelona Supercomputing Center, and its most immediate application will be PlayStation 4.

Apparently it is in Barcelona where the Cell 2 chip (according to our source, that's the official name, a predictable one given the Cell at the heart of PlayStation 3) is being designed, with the help of one of those computing behemoths that are also used for research and development in many other fields, like medicine. As with the original Cell, the chip's most immediate use once its development is completed will be PlayStation 4, Sony's new console, about which, so far, nothing is known.

There are a lot of good discussions from 2008 about Cell-x86-Larrabee in a PS4 and even a mini-Cell in a new PSP. What has changed? OpenCL. The quad GPU in the Vita was designed with OpenCL in mind (known since 2009), and it can be used for codecs, math, AR and more; a mini-Cell is not needed in the Vita. 2 Cell SPUs are about equal to 32 math elements in a GPU, and Cell can easily be used with OpenCL (32-64 SPUs). Backward compatibility requires at least 6 SPUs, and if a GPU is being used 100% for graphics, it can't be used via OpenCL for physics or AI; a CPU is still needed, and SPUs have no IP cost.

banned source add opa-ages dot com to /forums/topic/12493-detailed-cell2-info-ibm-will-continue-cell-development-for-its-mainframes-but-next-gen-scei-console-wont-use-cell2/

Barcelona Supercomputing Center

http://research.nvidia.com/content/BSC-ccoe-summary said:
NVIDIA and BSC started collaborations back in April 2008 with an initial conference presented by David Kirk on “NVIDIA CUDA Software and GPU Parallel Computing” in Barcelona. In 2010, BSC became the first NVIDIA CUDA Research Center in Spain, with Mateo Valero as PI and Prof. Nacho Navarro as the managing director of the center. In addition to active research programs and many highly regarded publications, the CUDA Research Center also recognizes BSC’s efforts in CUDA education, highlighted by the second edition of the (PUMPS) Summer School, “Programming and Tuning Massively Parallel Systems”. Prof. Nacho Navarro organized the event, co-sponsored by the University of Illinois, the HiPEAC NOE and NVIDIA, with distinguished faculty members Dr. David B. Kirk of NVIDIA and Prof. Wen-mei Hwu of the University of Illinois.

Now as a CUDA Center of Excellence, BSC will utilize GPU computing equipment and grants provided by NVIDIA to support a growing number of research and academic programs. For the next three years, BSC and UPC plan to build an Education Program on Parallel Programming using CUDA, to provide a cluster-aware programming environment for GPUs, to optimize multi-GPU runtime management with GMAC, to support GPU acceleration from the task-based StarSs programming model and its OmpSs implementation, and to build a new GPU-based cluster prototype system to explore the potential of low-power GPU clusters as high-performance platforms.
My best guess is there is some truth to the rumor: the PS4 OS may in part be designed at BSC, but Cell2, if it's coming, is probably an IBM-Sony collaboration. Cluster computing, massively parallel systems, the Cell SPU and OpenCL all lend themselves to the original Cell vision. And if BSC is designing the PS4 OS, or parts of it, then I think Nvidia is again making the PS4 GPU.

The above could support the PS4 having Nvidia Tegra CPUs and an Nvidia GPU, so we now have rumors supporting CPU types x86, Power7 and Tegra (ARM), and GPUs from both AMD and Nvidia. My best GUESS is Power7 + more than 6 SPUs and an Nvidia GPU. Tuning is needed to assign tasks and to split off parallel processing elements that perform those tasks using OpenCL; this should include multiple SPUs in the CPU.

This is now getting interesting: next generation is going to use many techniques to improve performance, and multiple CPUs and GPUs rather than higher clock speeds are part of this. OpenCL makes highly parallelized designs practical, including multiple SPUs. Since OpenCL code is somewhat cross-platform, like OpenGL, it supports the original Cell vision of distributed processing. For its part, I think Sony invested in Cell for coming multimedia like 3D, 4K and 8K video; these new resolutions are going to need codecs that require orders of magnitude more CPU power than we had in 2005. Cell may well be dead, replaced by modern GPUs running CUDA or OpenCL, or Cell may now be practical because of OpenCL and appear in multiple new Sony products.
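The reason OpenCL matters here is its execution model: you write one kernel and the runtime dispatches it once per work-item, whether the device underneath is a GPU, a multicore x86, or (in IBM's OpenCL SDK) a bank of Cell SPUs. A toy Python sketch of that dispatch model, purely illustrative and not a real OpenCL binding:

```python
# Conceptual sketch of the OpenCL execution model: the same kernel runs
# once per work-item across the whole global range. Function names here
# are illustrative stand-ins, not real OpenCL API calls.

def run_kernel(kernel, global_size, *buffers):
    """Dispatch `kernel` once per work-item, like an NDRange enqueue."""
    for gid in range(global_size):
        kernel(gid, *buffers)

def saxpy(gid, a, x, y, out):
    # Each work-item handles exactly one element -- classic data parallelism.
    out[gid] = a * x[gid] + y[gid]

x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * 4
run_kernel(lambda gid, *bufs: saxpy(gid, 2.0, *bufs), len(x), x, y, out)
print(out)  # [12.0, 24.0, 36.0, 48.0]
```

On real hardware the work-items run in parallel across SPUs or GPU lanes instead of in a loop, which is exactly why the same code can target either device.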

I also find it interesting that BSC, Collabora and Fluendo are all based in Barcelona, Spain, and that Collabora and Fluendo announced a partnership promoting the GStreamer multimedia framework through a cross-platform software development kit (SDK) targeting desktop and server platforms like Linux, Windows and Mac OS X, and very soon leading mobile platforms such as Android. 1080p, 3D, 4K and 8K with zooming, up-scaling and down-scaling need lots of processing power.

In the NeoGAF thread "PS3 Web Browser Discussion - big upgrade rumoured for long time, but no concrete news" I reported that Collabora was sent a PS3 developer kit to "see what they could do"; this was at the end of 2007. GStreamer is a multimedia framework that at the time was being adopted as the multimedia (HTML5 <video>) part of WebKit for POSIX platforms like Linux and Unix, and as part of both Firefox and Opera. There were issues integrating it with OpenMAX IL, and a newer version of OpenMAX was delayed from 2008 to 2011 (OpenMAX 1.2). GStreamer 1.0 and OpenMAX 1.2 both address issues with the GStreamer-OpenMAX wrapper and provide for next-generation AR support.

OpenMAX IL provides low-level APIs and codec support for multimedia and is supplied by the hardware vendor. It's used by Android and Sony to support multimedia, and if GStreamer is used as the upper-level player, a GStreamer-to-OpenMAX wrapper is needed; with that, OpenMAX IL serves the same function as a GStreamer core plus codec plugins. It appears that up to this point Sony has been using the AVM+ Flash open source for DASH players, while companies like Netflix have been using gstreamer-openmax. AVM+ can't be used for commercial IPTV (without paying Adobe), and to this point Sony has not had a video streaming IPTV DASH player service like Netflix. My best guess is they were waiting for OpenMAX IL 1.2 and GStreamer 1.0 and will now use GStreamer for multiple multimedia features: streaming AR, a DASH player for IPTV, DLNA, video conferencing, video editing and more (my opinion).

GStreamer is now production ready (with GStreamer 1.0 and OpenMAX 1.2) and can support multiple platforms using open standards.
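The core idea behind GStreamer, and what the OpenMAX wrapper has to preserve, is a pipeline of linked elements, with buffers flowing from a source through decoders to a sink. A toy Python model of that element-chaining pattern (illustrative only; real GStreamer elements are C plugins linked via the GStreamer API, and the names below are made up):

```python
# Toy model of the GStreamer pipeline idea: data flows
# source -> decoder -> sink, each element transforming the buffer
# before pushing it downstream.

class Element:
    def __init__(self, name, transform):
        self.name = name
        self.transform = transform
        self.downstream = None

    def link(self, nxt):
        """Connect this element's output to the next element's input."""
        self.downstream = nxt
        return nxt  # returning nxt lets links chain left to right

    def push(self, buf):
        buf = self.transform(buf)
        return self.downstream.push(buf) if self.downstream else buf

# A stand-in pipeline in the spirit of: filesrc ! decoder ! videosink
src = Element("filesrc", lambda b: b)                  # pass-through source
dec = Element("decoder", lambda b: b.upper())          # pretend "decoding"
sink = Element("videosink", lambda b: f"[render {b}]")  # pretend rendering
src.link(dec).link(sink)

print(src.push("frame0"))  # [render FRAME0]
```

Swapping the decoder element for a hardware-backed one is how a GStreamer-to-OpenMAX wrapper slots in: the pipeline shape stays the same while the vendor codec does the work.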

Two multicore platforms are concentrating enormous attention due to their tremendous potential in terms of sustained performance: Cell BE and NVIDIA GPGPUs.

Other libraries/environments that support Cell BE programming:
• Data Communications and Synchronization Library (DaCS) and Accelerated Library Framework (ALF), included in the latest releases of the IBM SDK
• Cell Superscalar (CellSs), developed at Barcelona Supercomputing Center
• Multicore Plus SDK Software, developed by Mercury Computer Systems Inc.
• RapidMind Multicore Development Platform, for AMD and Intel multicore x86 CPUs, ATI/AMD and NVIDIA GPUs and Cell BE

Meeting on Parallel Routine Optimization and Applications – May 26-27, 2008

The conclusion in 2008 was that an Nvidia GPU is affordable and easier to program (CUDA) than Cell. OpenCL changes the latter; affordability I can't answer. The reason for posting the cite is that it has a bearing on Cell's viability, and the Barcelona Supercomputing Center is mentioned again, supporting the above rumor of at least an OS based on a highly parallelized Cell/Nvidia SPU/GPU design in a PS4.

http://www.training.prace-ri.eu/uploads/tx_pracetmo/gpuvideo1.pdf

Mapping Iterative Medical Imaging Algorithm on Cell Accelerator

the algorithm running on one SPE is over 5 times faster than on one core of the AMD Opteron processor. For 360 subsets, the Cell BE is 2.7 times faster than AMD Opteron processor. Note that for larger subsets, the number of DMA transfers between the local store and main memory increases on the Cell BE, increasing execution time. However, compared to AMD Opteron processor, the Cell BE still performs better.
If Sony is going to use the custom CPU/GPU they develop for the PS4 for medical imaging (NMR or CT to video and display), an x86 is not practical except to manage Cell SPUs or the GPU. Power7 is 2.7 times faster than x86 and can be used to manage SPUs, and there is software owned by Sony to do this.
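The quoted numbers are worth a quick back-of-envelope pass, because the per-SPE and whole-chip figures tell different stories. One SPE beats one Opteron core by over 5x, so six SPEs scaling linearly would be ~30x; the measured 2.7x for the full 360-subset workload shows how much DMA traffic between local store and main memory erodes that ideal. A sketch of that arithmetic (the 5x and 2.7x come from the cite; the 6-SPE count and the linear-scaling ideal are my assumptions):

```python
# Back-of-envelope from the quoted medical-imaging benchmark
# (the 5x and 2.7x figures are from the cite; the rest is assumption):
spe_vs_opteron_core = 5.0   # one SPE vs one Opteron core
cell_vs_opteron_360 = 2.7   # whole Cell BE vs Opteron, 360 subsets
usable_spes = 6             # assumed SPE count available to the app

# Ideal linear scaling across the SPEs, ignoring DMA overhead:
ideal = usable_spes * spe_vs_opteron_core
print(ideal)                            # 30.0x in the no-overhead limit

# Fraction of that ideal the measured large-subset run actually achieved:
print(cell_vs_opteron_360 / ideal)      # 0.09 -> ~9% of ideal scaling
```

That gap is the cite's own caveat: more subsets mean more DMA transfers between local store and main memory, so the advantage shrinks even though Cell still wins outright.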

Medical Imaging

Computed Tomography (CT) reconstruction is a computationally and data-intensive process applied across many fields of scientific endeavor, including medical and materials science, as a non-invasive imaging technique. A typical CT dataset obtained with a CCD-based X-ray detector, such as that at the Australian Synchrotron with 4K×4K pixels captured over multiple-view angles, is in the order of 128GB. The reconstructed output volume is in the order 256GB. CT data sizes increase at 1.5 times the number of pixels in the detector, while the data-processing load generally increases as the square of the number of pixels, hence data storage, management and throughput capabilities become paramount. From a
computational perspective, CT reconstruction is particularly well suited to mass parallelisation whereby the problem can be decomposed into many smaller independent parts. We have achieved significant performance gains by adapting our XLI software algorithms to a two-level parallelisation scheme, utilising multiple CPU cores and multiple GPUs on a single machine. In turn, where data sizes become prohibitively large to be processed on a single machine, we have developed an integrated CT reconstruction software system that is able to scale up and be deployed onto large GPU-enabled HPC clusters.
Medical imaging is a bigger opportunity than I first thought if Sony is going to be working with the raw data out of an NMR. Upgrades to the old computers and imaging systems that are now almost 10 years old could massively speed up image processing; I would guess this is an open market for older NMR and CT equipment.
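The quoted 128 GB figure checks out with some plausible assumptions, which is a useful sanity check on how quickly these datasets outgrow a single machine. The 4K×4K detector and 128 GB total are from the cite; the bytes-per-pixel and view count below are my assumptions chosen to make the arithmetic close:

```python
# Sanity-checking the quoted Australian Synchrotron CT numbers.
# Detector size and the 128 GB dataset are from the cite; bytes per
# pixel and the view count are assumptions that make the math close.
pixels = 4096 * 4096            # "4Kx4K" detector
bytes_per_pixel = 2             # assume 16-bit raw X-ray samples
views = 4096                    # assumed number of view angles

dataset_bytes = pixels * bytes_per_pixel * views
print(dataset_bytes / 2**30)    # 128.0 (GiB), matching the quote

# Scaling rules from the quote: data grows at ~1.5x the pixel count,
# processing load grows with the square of the pixel count.
pixel_growth = 4.0              # e.g. a 4Kx4K -> 8Kx8K detector (4x pixels)
data_growth = 1.5 * pixel_growth
load_growth = pixel_growth ** 2
print(data_growth, load_growth)  # 6.0 16.0
```

So a detector upgrade that quadruples the pixel count would, by the cite's own scaling rules, roughly sextuple the data and multiply the compute load sixteen-fold, which is why the reconstruction work gets pushed onto GPU clusters.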