
PS4 architect knew in 2007 that 'clearly we had some issues with PlayStation 3'

Skenzin

Banned
2006
2007
2008
2009
2010
2011
2012
2013 Year of the Playstation confirmed!

Actually, sounds like Sony got their head out of their butt. This guy's resume is impressive.
 
I dare someone to say something bad about Mark Cerny. I dare you!



Designing and manufacturing a modern CPU is way more difficult than you imagine. There is no way Sony could do it on their own. IBM and Toshiba helped on the Cell, and neither has any interest in Cell anymore. That leaves Intel, AMD or ARM.

Very well


He does not have the Kaz swag.
 

nib95

Banned
2006
2007
2008
2009
2010
2011
2012
2013 Year of the Playstation confirmed!

Actually, sounds like Sony got their head out of their butt. This guy's resume is impressive.

2009 was the year of the PlayStation for me (Demon's Souls, Uncharted 2, inFamous, Killzone 2, R&C, etc). 2010 was strong too though... in fact, now that I think about it, so was 2011 lol. 2012 was the PS3's weakest year in a while, but 2013 looks to be rectifying that.
 

kuroshiki

Member
And people were wondering why Kaz wasn't at the PS4 event. I bet he won't even be on stage at E3. I think Sony wants Cerny to be the man this time.

Kaz is now the head of all of Sony. Before, he was head of SCE.

No one expects Steve Ballmer to show up and announce the Xbox 720 at E3, right?
 
D

Deleted member 17706

Unconfirmed Member
I gotta say, I really like this Cerny guy.
 

RayMaker

Banned
Lol, I wouldn't be surprised if a few in here have shrines or photos of him in their homes.

Yeah, Cerny has done a good job if he is responsible for the PS4 hardware, and he did a good job at the conference too, but he's no Peter Moore; he's more like a Don Mattrick but with more passion.
 

benny_a

extra source of jiggaflops
Yeah, Cerny has done a good job if he is responsible for the PS4 hardware, and he did a good job at the conference too, but he's no Peter Moore; he's more like a Don Mattrick but with more passion.

Cerny is certainly no Don Mattrick and doesn't strive to be Peter Moore.

He is closer to Miyamoto with a more technical background and current emphasis.
 
This post right here is evidence that people will find reasons to complain about anything and everything. I've ripped my entire library in super-high-bitrate MP3, and on my Sennheiser HD800s and my Onix Rocket MK2s, the sound quality difference is minimal at best.

Define "super high bitrate MP3". Some people think 320 kbps is overkill, but if you're listening on HD800s then I would assume 1000 kbps or more? Even with my Grados I can tell the difference between a WAV file and FLAC.
 
Wow, he decided to use off-the-shelf PC parts and unified memory - why didn't anyone else ever think of that? The x86 processor? I wonder how he learned of this little-known processor? LOL. This is just silly.

I believe Sony made the right decision given their loss of dominance, but the spin is hysterical. Sony didn't make the PS3 by accident. The strategy was to create a highly customized architecture so that games would remain exclusive to PlayStation and would be difficult to port to other platforms. Had the PS3 dominated like the PS2, it would have been an excellent strategy that kept many games off of PC/Xbox, or at least heavily delayed them after the PS3 release. Using off-the-shelf PC parts to stay in line with the PC and next-gen Xbox is a recognition that the strategy has failed and exclusives are unobtainable. At best the PS4 is just another x86 box featuring ports from other x86 boxes; it will never be anything like the PS2 while using such common hardware.

Also, please stop comparing this guy to Miyamoto; people living in the real world find this comparison insulting to their intelligence. I admire Cerny's insights and accomplishments, but keep them in check. It's like you're trying to compare Dave Grohl to John Lennon.
 

stryke

Member
Also please stop comparing this guy to Miyamoto, people living in the real world find this comparison insulting to their intelligence. I admire Cerny's insights and accomplishments, but keep them in check. It's like you're trying to compare Dave Grohl to John Lennon.

What people find insulting is not up to you.
 
Define "super high bitrate MP3". Some people think 320 kbps is overkill, but if you're listening on HD800s then I would assume 1000 kbps or more? Even with my Grados I can tell the difference between a WAV file and FLAC.

What sort of music are you listening to? A lot of music is not produced particularly well, so no matter what you do, it'll still sound like shit most of the time. You should be enjoying the music (and that doesn't require lossless audio), not the hardware, anyway.

Honestly speaking, I've dabbled in fairly high-end loudspeakers, amplification, and sources, and I've found that 320 kbps is more than enough of a happy medium between small file size and good sound quality. Obviously go lossless if you're archiving the source audio, but most people don't care about that.
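For a rough sense of the file-size trade-off being discussed, audio size is just bitrate times duration. A quick back-of-the-envelope sketch (the 4-minute track length is an illustrative assumption; 1411 kbps is the data rate of 16-bit/44.1 kHz stereo PCM, i.e. an uncompressed CD rip):

```python
def file_size_mb(bitrate_kbps, seconds):
    """Approximate audio file size: bitrate (kilobits/s) * duration,
    converted from kilobits to megabytes (8 bits per byte)."""
    return bitrate_kbps * 1000 * seconds / 8 / 1_000_000

# A 4-minute (240 s) track at common rates:
for kbps in (128, 320, 1411):
    print(f"{kbps:5d} kbps -> {file_size_mb(kbps, 240):.1f} MB")
```

So a 320 kbps MP3 of a 4-minute track is roughly 9.6 MB, versus about 42 MB uncompressed, which is the "small file size vs. quality" middle ground the post describes.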
 
any thread involving the genius that is Mark Cerny is a great chance to post this video again

Wow. This was terrific. I wish the historical questions were arranged in more of a linear fashion instead of skipping around, but it's a pretty great Q&A anyway.

I knew he started young and was some sort of child prodigy (after all, he created Marble Madness for Atari when he was barely 18), but the part where he talks about trying to program a 3D D&D game with overhead combat in 1977, using a 3D projection rendering scheme he derived from trigonometry when he was 12 years old, was priceless.
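The kind of projection scheme he describes deriving from trigonometry boils down to similar triangles: a point's screen coordinates are its 3D coordinates scaled by its distance from the viewer. A minimal sketch (the function name and focal-length parameter are my own, purely illustrative):

```python
def project(x, y, z, focal=1.0):
    """Perspective-project a 3D point onto the plane z = focal.
    Similar triangles give x' = focal * x / z, y' = focal * y / z."""
    if z <= 0:
        raise ValueError("point must be in front of the viewer")
    return (focal * x / z, focal * y / z)

# A point twice as far away appears half the size:
assert project(2, 2, 1) == (2.0, 2.0)
assert project(2, 2, 2) == (1.0, 1.0)
```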
 

antic604

Banned
I'm really surprised at the bashing Ken Kutaragi is receiving here for the PS3.

Sure, from a business/image PoV it wasn't wise to go with Cell, but the architecture was so far ahead of its time that it's still relevant - as an idea - even today. It's telling that it is still impossible to emulate the Cell in software on current PCs or the PS4. Cell is a vector processor, much like what compute shaders on GPUs are becoming, so I actually wouldn't be surprised if it resurfaces somewhere again in a few years.

Just read the interviews with people like John Carmack or Tim Sweeney to see that they envision a future where fixed-function hardware (like current GPUs) is ditched in favor of much more flexible designs. The drawback would be that devs would have to do basic stuff like drawing and texturing triangles themselves, but on the other hand there would be much more freedom in how they could create geometry, texturing, effects, etc.

Remember the old days when Quake 2 or Unreal had a 'software rendering' option? In many cases the software results (volumetric smoke, colored lights, etc.) were better than what could be achieved with then-available graphics cards, because in software you could do anything, while with hardware you're limited to the options given to you by the driver/API (e.g. DirectX) and the card.

Ken's vision was exactly that, but it proved too challenging for most dev teams (even some first-party ones), which could achieve the same results with much less effort on the X360 or PC. But some positives remained - in recent years we've seen numerous devs (DICE, Irrational, Criterion, to name just the top few) employing multi-threaded job pipelines designed for the PS3's parallel Cell/SPUs, which, it turns out, were also hugely beneficial to performance on PCs and the X360. So in that sense not all of Ken's work was wasted. Simply, the timing was wrong :)
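The SPU-style job pipelines mentioned above can be sketched roughly like this (a toy illustration, not actual Cell code - a Python thread pool stands in for the SPUs, and the chunk size and doubling "transform" are arbitrary): work is split into small, self-contained jobs that each carry their own batch of data, so any number of workers can chew through them in parallel.

```python
from concurrent.futures import ThreadPoolExecutor

def transform_chunk(chunk):
    """One self-contained 'job': transform a small batch of values.
    On the Cell, each SPU would DMA a chunk like this into its local
    store and process it independently of the others."""
    return [x * 2.0 for x in chunk]

def run_jobs(data, chunk_size=4, workers=3):
    """Split data into fixed-size jobs, process them on a worker pool,
    and flatten the results back into one list (order preserved)."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(transform_chunk, chunks)
    return [v for chunk in results for v in chunk]

print(run_jobs(list(range(8))))  # each value doubled, order preserved
```

The same decomposition is why these pipelines ported well: a job queue maps just as naturally onto X360 or PC cores as onto SPUs.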
 

mrklaw

MrArseFace
In that game changers video he actually relates a tale from earlier in his career when he pulled multiple all-nighters in a row...and ended up in the ER.

So I hope he is getting more sleep these days ;)

He's also nearly 50, looks pretty good to me.
 

Durante

Member
I'm really surprised at the bashing Ken Kutaragi is receiving here for the PS3.

Sure, from business / image PoV it wasn't wise to go with Cell, but the architecture was so far in the future that it's still relevant - as an idea - even today.
Absolutely. But most people here aren't into computer architecture, so they just see Cell as an expensive failure. That's an understandable perspective, but really doesn't do the vision and foresight of the architecture justice.

It's like you're trying to compare Dave Grohl to John Lennon.
This is a good analogy, since I prefer the former in both cases :p
 

Alx

Member
Absolutely. But most people here aren't into computer architecture, so they just see Cell as an expensive failure. That's an understandable perspective, but really doesn't do the vision and foresight of the architecture justice.

But the industry isn't about selling ideas and architectures, it's about selling products. If Kutaragi was visionary and too early, it's just as bad as if he was too late. The timing was wrong indeed, but timing is essential when creating a product that people would buy. If you choose a technology that isn't ready yet for your market (because it's expensive, or unreliable, or the users aren't ready,...), it's your responsibility if it fails to sell or fulfill its tasks.

Most tech-savvy people know about the technologies of tomorrow. The difficult part is choosing the right moment to bring them to market. Inertial sensors have existed for years, but Nintendo had to wait for the right moment to succeed with the Wii; just like they failed with the Virtual Boy because it was "too visionary", and now the time seems right for the Oculus Rift.
 

antic604

Banned
But the industry isn't about selling ideas and architectures, it's about selling products. If Kutaragi was visionary and too early, it's just as bad as if he was too late. The timing was wrong indeed, but timing is essential when creating a product that people would buy. If you choose a technology that isn't ready yet for your market (because it's expensive, or unreliable, or the users aren't ready,...), it's your responsibility if it fails to sell or fulfill its tasks.

Most tech-savvy people know about the technologies of tomorrow. The difficult part is choosing the right moment to bring them to market. Inertial sensors have existed for years, but Nintendo had to wait for the right moment to succeed with the Wii; just like they failed with the Virtual Boy because it was "too visionary", and now the time seems right for the Oculus Rift.

True, but it's very rare for brilliant inventors/thinkers to combine that skill with marketing sense. Also, I wonder whether it's simply a case of trial and error - I doubt you can point to a single person or company that introduces innovative and economically successful products on a continuous basis. It's as much about luck as it is about skill.
 

wilflare

Member
I'm really surprised at the bashing Ken Kutaragi is receiving here for the PS3.

Sure, from a business/image PoV it wasn't wise to go with Cell, but the architecture was so far ahead of its time that it's still relevant - as an idea - even today. It's telling that it is still impossible to emulate the Cell in software on current PCs or the PS4. Cell is a vector processor, much like what compute shaders on GPUs are becoming, so I actually wouldn't be surprised if it resurfaces somewhere again in a few years.

Just read the interviews with people like John Carmack or Tim Sweeney to see that they envision a future where fixed-function hardware (like current GPUs) is ditched in favor of much more flexible designs. The drawback would be that devs would have to do basic stuff like drawing and texturing triangles themselves, but on the other hand there would be much more freedom in how they could create geometry, texturing, effects, etc.

Remember the old days when Quake 2 or Unreal had a 'software rendering' option? In many cases the software results (volumetric smoke, colored lights, etc.) were better than what could be achieved with then-available graphics cards, because in software you could do anything, while with hardware you're limited to the options given to you by the driver/API (e.g. DirectX) and the card.

Ken's vision was exactly that, but it proved too challenging for most dev teams (even some first-party ones), which could achieve the same results with much less effort on the X360 or PC. But some positives remained - in recent years we've seen numerous devs (DICE, Irrational, Criterion, to name just the top few) employing multi-threaded job pipelines designed for the PS3's parallel Cell/SPUs, which, it turns out, were also hugely beneficial to performance on PCs and the X360. So in that sense not all of Ken's work was wasted. Simply, the timing was wrong :)

I wonder if the PS4 still allows for some level of Cell-like processing/programming or even flexibility

http://www.eurogamer.net/articles/digitalfoundry-inside-playstation-4
Low-level access and the "wrapper" graphics API
In terms of rendering, there was some interesting news. Norden pointed out one of the principal weaknesses of DirectX 11 and OpenGL - they need to service a vast array of different hardware. The advantage of PlayStation 4 is that it's a fixed hardware platform, meaning that the specifics of the tech can be addressed directly. (It's worth pointing out at this point that the next-gen Xbox has hardware-specific extensions on top of the standard DX11 API.)

"We can significantly enhance performance by bypassing a lot of the artificial DirectX limitations and bottlenecks that are imposed so DirectX can work across a wide range of hardware," he revealed.
 

see5harp

Member
The PS4's memory setup is much more elegant (X720 development will 'suffer' for the next 7+ years in comparison) and the system is overall stronger (+50% GPU power) than the leaked Durango specs. The PS4 is delivering on all fronts (hardware, OS functions, controller input, launch titles) at this point. The Durango might deliver strongly on several fronts (OS?), but in power and hardware elegance it's now the lesser of the two.
Cerny is also being applauded for the general attitude turn-around within PlayStation.

I'm fairly certain I've seen you and others talk about how the unified memory setup in 360 was not a huge advantage at all when referring to the PS3 and that having a split pool was just as good. Talk about a 180.
 

Mifune

Mehmber
Also please stop comparing this guy to Miyamoto, people living in the real world find this comparison insulting to their intelligence. I admire Cerny's insights and accomplishments, but keep them in check. It's like you're trying to compare Dave Grohl to John Lennon.

Nobody's saying he's as great as Miyamoto. Just that he's closer to a Miyamoto than a Peter Moore or Mattrick.
 

tirminyl

Member
any thread involving the genius that is Mark Cerny is a great chance to post this video again

Thanks for posting this again. I am finally watching it after putting it on my watch later list.

What he was doing at 12 blows my mind. As much as I love technology, I didn't get to touch a computer until I was in 7th or 8th grade, and access was rare and only at school. I didn't have my own machine (or even regular access to one) until high school around '99, when I FINALLY got into learning how to program with HTML, Pascal, animation, etc. I really wish I could have started earlier.
 