
What happened to Ken Kutaragi and J Allard?

JordanN

Banned
I don't really miss Kutaragi, his arrogance cost Sony and the Playstation brand a lot during the PS3's early years.
I kinda wish he stayed around for the PS4 though. The current hardware is too close to the XBO which is already weak in itself whereas he could have pushed for a 3 Teraflop GPU. But that's it. Mark Cerny should have control over everything else.
 

Fezan

Member
[image: j-allard-ngai-shot.jpg]

ok what happened here ?
 

pr0cs

Member
I wish J Allard was still at Microsoft. The early years of the 360 were absolutely incredible, and it makes me sad to see how far the Xbox brand has fallen since his departure. We'd be in for an absolutely explosive generation if Allard was still at the helm, and going up against the PS4.

Agreed, Allard and Moore made a pretty exciting team, unlike the bean counters we have now as the 'faces of xbox'.
Oh how the mighty have fallen.
 

Ty4on

Member
This would be fantastic! J was definitely "with it" when it came to being at the forefront of the technology culture. I think he would be a good enough visionary to make Microsoft more than what they are now.

Totally my (and their) train of thought! He made some (expensive) mistakes, but Apple made the G4 Cube, attempted PowerPC on laptops (and later apologized), and made the super expensive, portless MacBook Air before the actually pretty decent MacBook Air. With the Courier, MS could have been in a totally different spot. Even if it had turned out too expensive, it would have been a great halo product to show MS could make and pioneer great tablets, instead of feeling like they're stuck in 2004.

They need someone who can look in a direction that isn't just tiles :/
 

Sorcerer

Member
Wow at Allard using Apple products while working at MS.

Was he just being rebellious or did he love Apple products?

I guess one could always claim research.

I wonder if he ever thought of jumping ship to Apple?

Pretty ballsy.
 

Fezan

Member
Wow at Allard using Apple products while working at MS.

Was he just being rebellious or did he love Apple products?

I guess one could always claim research.

Pretty ballsy.

Allard was very much an advocate for the importance of design in technology, and that's the same reason he loved using Apple products.
 
I've always been curious about Crazy Ken. Is there some kind of write-up or history about him I can read that also goes into why he was so fuckin crazy?
 

JPKellams

Member
J. Allard also has a big stake/is an angel investor in TheClymb.com (which is awesome... Gilt for outdoors stuff). I think his passion is fully in cycling now.
 

angelic

Banned
Speaking of which, N'Gai Croal... completely forgot about him, but he used to be a pretty big name in games journalism.

Thank god, he's a massive troll. Wagon circler on twitter, awful on GT, very pleased he's no longer prominent.
 

Lazy8s

The ghost of Dreamcast past
I imagine Kutaragi has been moving to more of a marketing position while Allard continues down the road of technology path finding.
 

DrZeus

Member
Sony should hire Allard. It would only be fair, with Harrison doing the secret agent man gimmick in MS offices.
 

Melchiah

Member
I imagine Kutaragi has been moving to more of a marketing position while Allard continues down the road of technology path finding.

I have a hard time picturing Mr. Kutaragi blooming in the marketing field, considering his infamous "two jobs" blunder.
 
If Ken was still at Sony, the PS4 would be composed solely of Cell processors and Blu-ray. It would have lazers shooting out of it and every time you turned it on it would dubstep.
 
I kinda wish he stayed around for the PS4 though. The current hardware is too close to the XBO which is already weak in itself whereas he could have pushed for a 3 Teraflop GPU. But that's it. Mark Cerny should have control over everything else.

And be harder to develop for/port to? After the PS3, no more of that, please.

Who cares if they're architecturally similar at this point? The PC will always be at the forefront of tech anyways. At least now ports will be tons simpler.
 

Mahonay

Banned
I kinda wish he stayed around for the PS4 though. The current hardware is too close to the XBO which is already weak in itself whereas he could have pushed for a 3 Teraflop GPU. But that's it. Mark Cerny should have control over everything else.
$599.
 

JordanN

Banned
And be harder to develop for/port to? After the PS3, no more of that, please.

Who cares if they're architecturally similar at this point? The PC will always be at the forefront of tech anyways. At least now ports will be tons simpler.

Eh, why would it be harder? Ironically, their lessons from the PS3 should have helped them with their next console; didn't it give developers an early taste of parallel processing? Also, using the PC as a comparison is pointless since consoles are clearly doing their own thing.

I think Sony could have designed a more powerful console without having to resort to that. No more than $500, hopefully, which would still look very attractive compared to the XBO. The switch to AMD would also mean they're not getting screwed again.
 
Eh, why would it be harder? Ironically, their lessons from the PS3 should have helped them with their next console; didn't it give developers an early taste of parallel processing? Also, using the PC as a comparison is pointless since consoles are clearly doing their own thing.


I think Sony could have designed a more powerful console without having to resort to that. No more than $500, hopefully, which would still look very attractive compared to the XBO. The switch to AMD would also mean they're not getting screwed again.

They never really learned; and it's just plain esoteric to begin with. What case is there to be made for a bunch of asymmetric processing units over symmetric ones?

And you're right about consoles doing their own thing, but also wrong; they've been sharing more games than ever with the PC. Anything that can reduce port times between the 3 and not come at the expense of the strongest one is a plus.
 

Jubern

Member
Ken Kutaragi got an award at the CEDEC 2013 conference last week, for his achievements.

Funny thing is that the biggest sponsor of the entire event was Nintendo, whose name can be seen on all the pictures of Kutaragi posing lol

(Thanks to Florent Gorges for the story)
 

Lazy8s

The ghost of Dreamcast past
Those lamenting the change from the old Kutaragi-era approach to PlayStation hardware design to Sony's more "PC"-like hardware design of today simply don't understand the fundamental balances that a proficient hardware design must achieve.

Three case studies: the Emotion Engine, Cell, and the Graphics Synthesizer. Each was essentially the result of pushing a poor design to the next level with a relatively huge silicon budget. While they might've been interesting from a purely academic standpoint, to see what those inefficient approaches looked like at a higher scale, that scale did nothing to correct the underlying flaws.

The Graphics Synthesizer relied on multi-pass rendering to create effects rather than multi-texturing. Somehow, Sony didn't figure out what the graphics engineers at all of the other companies in the industry understood post-Voodoo1: that multi-texturing was a more efficient approach to applying effects, and that any loss in flexibility compared to multi-pass was more than made up for by the extra performance and efficiency gained. The GS's approach was EOLed with its demise, and the industry has successfully marched on down its path of smarter design.
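
For a rough back-of-envelope picture of why multi-texturing won out (my own illustrative numbers, not anything from actual GS or Voodoo specs): layering N effects via multi-pass means rasterizing the surface N times and read-modify-writing the framebuffer each time, while multi-texturing combines the layers in the pixel pipeline and writes once.

    /* Hedged sketch: framebuffer traffic of multi-pass blending vs.
       single-pass multi-texturing. Layer/pixel counts are assumptions. */
    #include <stdio.h>

    int main(void) {
        int layers = 4;          /* base + lightmap + detail + gloss, say */
        int pixels = 640 * 480;  /* one full-screen surface */

        /* Multi-pass: each layer rasterizes the surface again and
           read-modify-writes the framebuffer to blend. */
        long multipass_fb_ops = (long)layers * pixels * 2;  /* read + write */

        /* Multi-texturing: one pass samples all layers, one write. */
        long multitex_fb_ops = (long)pixels;

        printf("multi-pass framebuffer ops:    %ld\n", multipass_fb_ops);
        printf("multi-texture framebuffer ops: %ld\n", multitex_fb_ops);
        return 0;
    }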

The Emotion Engine and Cell both fail for the same reason as one another. Their job within a game system was to serve as a strong CPU. The job of a CPU, not just for spreadsheets but even for games, is to handle serial workloads full of dependent data, conditional operations, and branches.
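
If you want a concrete picture of what "serial workloads full of dependent data, conditional operations, and branches" means, here's a throwaway C sketch (all names made up) of the per-entity game logic a CPU actually spends its time on; wide vector units like the EE's VUs or Cell's SPEs don't speed this kind of thing up.

    #include <stdbool.h>
    #include <stdio.h>

    typedef struct { int hp; bool alerted; int state; } Agent;

    /* Each step depends on the previous result and branches on runtime
       data -- classic CPU territory, not vector-unit territory. */
    static void update_agent(Agent *a, int player_noise) {
        if (a->hp <= 0) { a->state = 0; return; }            /* dead */
        if (!a->alerted && player_noise > 50) a->alerted = true;
        a->state = a->alerted ? (a->hp < 20 ? 2 : 1) : 3;    /* flee/attack/idle */
    }

    int main(void) {
        Agent a = { 15, false, 3 };
        update_agent(&a, 80);
        printf("state after update: %d\n", a.state);  /* 2 = flee */
        return 0;
    }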

In a well balanced system, cores/chips are specialized to handle a specific type of common workload, and these different specialists (CPU for serial, GPU for parallel, video cores/DSPs for certain specific algorithms, etc) should ideally master their own work independently yet be able to work in harmony with the other specialists in the system. This is the heterogeneous processing model that's finally being embraced today.
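
In code terms, the heterogeneous model boils down to something like this toy routing sketch (entirely hypothetical names, just to make the idea concrete): each kind of work goes to the unit built for it, instead of bolting GPU-style vector hardware onto the CPU.

    #include <stdio.h>

    typedef enum { WORK_SERIAL, WORK_DATA_PARALLEL, WORK_MEDIA } WorkKind;
    typedef enum { UNIT_CPU, UNIT_GPU, UNIT_DSP } Unit;

    /* Send each job to the specialist suited to it. */
    static Unit pick_unit(WorkKind kind) {
        switch (kind) {
            case WORK_DATA_PARALLEL: return UNIT_GPU;  /* shading, particle sims */
            case WORK_MEDIA:         return UNIT_DSP;  /* audio mix, video decode */
            default:                 return UNIT_CPU;  /* game logic, AI, I/O */
        }
    }

    int main(void) {
        printf("%d %d %d\n", pick_unit(WORK_SERIAL),
               pick_unit(WORK_DATA_PARALLEL), pick_unit(WORK_MEDIA));
        return 0;
    }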

The little MIPS and PowerPC cores of the massive EE and Cell dies, respectively, were nowhere near enough to serve as strong CPUs. The extra area on each die was filled with a bunch of math/vector units, which made them more suited to assisting with graphics. The problem is, giving what is essentially GPU silicon to a CPU is just imbalanced and less efficient.

Those who argue that the Emotion Engine's and Cell's ALUs were more flexible than even those of a DirectX 11+ GPU miss the point of why GPU evolution follows ever-advancing models of restricted functionality like DirectX feature sets in the first place. From the beginning, GPU designers could have built graphics processors as flexible as CPUs, but the limited die-area budgets they had in the early days meant they would've had almost no performance to power those flexible pipelines.

Instead, they realized that they could accelerate a set of fixed-functions and get a better return on investment of the silicon they were using. As new fabrication processes afforded more and more silicon, they had the choice of speeding up those fixed graphics functions even more or spending the silicon on being able to do a wider set of graphics functions. Once the visual return on investment from being able to do new effects, or at least similar effects in a more efficient way, started to outweigh just doing more of the same old graphics function, the feature set of GPUs would expand to a new level, characterized by an evolving API like DirectX and such.

So, designing CPUs like the Emotion Engine and Cell around a ton of FLOPs missed the point: it didn't use the silicon to boost the functions a CPU is actually supposed to do, and it spent that silicon on work a GPU would've done more efficiently, with a better return on investment.
 