
The Cell Chip - how will MS and Intel face the music?

xexex

Banned
http://www.theregister.co.uk/2005/02/03/cell_analysis_part_two/page2.html

The Cell Chip - how will MS and Intel face the music?

By Andrew Orlowski in San Francisco
Published Thursday 3rd February 2005 14:29 GMT

Analysis A number of readers consider Intel and Microsoft the two dumbest companies ever to file a 10Q, and rejoice at the prospect of an upstart - almost anyone will do - dethroning them. But be careful what you wish for: it might come true.

Such a dismissive and negative view of these two giants isn't fair, of course. They're very different animals, and lumping the two together overlooks all kinds of interesting internal tensions and contradictions. Besides, anyone who can remember that fine purveyor of ladies' lingerie by mail - "gazelle.com" - or any of the other bubble companies that sprang up in this parish during the dotcom bubble, will really know what a dumb company looks like. But for the sake of argument, we shall let the proposition stand.

Since the two giants established their hegemony around 15 years ago, we've seen many candidates threaten to unseat the PC duopoly. RISC, Unix, and Internet appliances (with or without Java) were all taken seriously as competitive threats in Santa Clara and Redmond. The Cell, from Sony, IBM and Toshiba, is the latest; it will be unveiled next week in San Francisco and will ship later this year in the PlayStation3 console and in enterprise infrastructure from IBM. But what will a world with Cell supreme look like?

There are two striking aspects to this wonder-chip, if you read our digest of Microprocessor Report's analysis of the Cell patent this week. If you haven't, speed-read it; here they are again.

The first is that Cell is designed to be a component in a massively distributed, global computing infrastructure. It's hardware specifically designed for "grid computing". A world full of Cell chips allows an entirely different infrastructure to take the place of today's transaction-based data centers. Software processes will scavenge the resources of the local Cell instantiation first, but if more execution resources are available over a local area network, or even on the other side of the world, they'll go and find them, and execute there.
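To make that scavenging behavior concrete, here is a minimal C sketch of the placement policy described above - try the local Cell's execution units first, then a LAN peer, then a remote grid node. Every name in it (cell_try_local, grid_find_lan, grid_find_remote) is hypothetical; this is an illustration of the idea, not a real Cell API.

#include <stdio.h>
#include <stdbool.h>

/* Where a task ended up running (illustrative only). */
typedef struct { const char *location; } exec_target;

/* Hypothetical probes, stubbed out so the sketch is runnable.
 * A real system would query actual hardware and network state. */
static bool cell_try_local(exec_target *t)   { t->location = "local SPU";   return true; }
static bool grid_find_lan(exec_target *t)    { (void)t; return false; /* nothing free next door in this stub */ }
static bool grid_find_remote(exec_target *t) { t->location = "remote node"; return true; }

/* Pick an execution target, nearest resources first. */
static bool place_task(exec_target *t)
{
    if (cell_try_local(t))  return true;  /* free execution unit in this box? */
    if (grid_find_lan(t))   return true;  /* spare cycles on the local network? */
    return grid_find_remote(t);           /* otherwise, go looking worldwide */
}

int main(void)
{
    exec_target t;
    if (place_task(&t))
        printf("task placed on: %s\n", t.location);
    return 0;
}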

All previous generations of computers have been based on the idea that it's only efficient to execute the task required inside the box itself, or on one nearby. If this doesn't pan out, the machine is designed to refuse the offer gracefully and give up - and then offer to call the supplier on your behalf for an upgrade. Similarly, none of today's operating systems (we'll come on to Sun in a moment) can migrate workloads across the planet. And we all know what they do when they're overtaxed.

So the Cell architecture has the potential to make computing global. And so a model where you rent computer cycles as a utility, and don't really care where they come from, becomes possible. This is a bit like the old days of the time-shared computing bureau, only this time round, you'll have a choice of utility providers. At least, that's the idea.

The second striking aspect is that this black box finally seals the era of the mythic, have-a-go-hero hacker. You know - the one who's forever saving the world from evil - like Jeff Goldblum in Independence Day, or a blogger at an O'Reilly conference, with his RSS feed.

The Cell is designed to make sure media, or third-party programs, stay exactly where the owner of the media or program thinks they should stay. While most microprocessor designers agonize about how to make memory accesses as fast as possible, the Cell designers have erected several barriers (four, by our count) to ensure memory accesses are as slow and cumbersome as possible - if need be.
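As an illustration of that gating, here is a toy C model in which every access must pass a rights check before any bytes move. Everything in it - the rights flags, the region layout, the function names - is invented for the sketch; the real barriers live in Cell's hardware, not in application code like this.

#include <stdio.h>
#include <stdbool.h>
#include <string.h>

/* Hypothetical rights flags a region's owner might grant. */
enum { RIGHT_READ = 1, RIGHT_COPY = 2 };

typedef struct {
    const char *owner;    /* who controls this region */
    int         rights;   /* what a requester may do with it */
    char        data[32];
} media_region;

/* The barrier: no matching right, no bytes. */
static bool region_access(const media_region *r, int want, char *out, size_t n)
{
    if ((r->rights & want) == 0)
        return false;              /* request refused at the gate */
    strncpy(out, r->data, n - 1);
    out[n - 1] = '\0';
    return true;
}

int main(void)
{
    media_region song = { "label", RIGHT_READ, "licensed audio bits" };
    char buf[32];

    if (region_access(&song, RIGHT_READ, buf, sizeof buf))
        printf("read allowed: %s\n", buf);
    if (!region_access(&song, RIGHT_COPY, buf, sizeof buf))
        printf("copy refused by the rights check\n");
    return 0;
}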

Neither idea is new, but the wild card here is Sony. Not one of the previous threats to the PC hegemony - such as 'CISC vs RISC' - has involved putting one of these devices into the home. But that's where it will go. Cell is designed to scale from handhelds to vast data centers, but the market will begin not with a few enterprising early adopters, like the micro or the TiVo did, but with a mass market. The Cell will soon be used in quantities that even the giddiest marketing chiefs of Intel at Home, or Windows Media Center, would not dare type into a spreadsheet forecast. That's enough reason to take notice.

Now, we have to ask - what chances does it have of succeeding?

There are technical reasons that weigh heavily on whether each of these propositions will succeed. For example, Intel's Itanium depends on compilers parallelizing the code, and has foundered because this is difficult, and cheaper, dumber chips do a better job. Cell hardware will need really great compilers to work. But in the end, technical arguments like these won't be the decisive factors. We have to step right back and look at how and why people depend on computer technology, and exactly who in the world stands to benefit from each of the possible "victory" scenarios - and there are many.
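To see why the compiler carries so much of the load, compare these two C loops (our example, not the article's). The first is trivially splittable across Cell-style cores because no iteration depends on another; the second is an inherently serial recurrence that no compiler can parallelize for you, however clever.

#include <stdio.h>
#include <stddef.h>

/* Parallel-friendly: each element is independent, so a compiler
 * (or programmer) can carve the loop into per-core chunks. */
static void scale(float *out, const float *in, float k, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = in[i] * k;          /* no cross-iteration dependencies */
}

/* Compiler-hostile: each iteration feeds the next, so the chain
 * stays serial no matter how many cores are available. */
static float recur(float x, size_t n)
{
    for (size_t i = 0; i < n; i++)
        x = x * x * 0.5f + 0.1f;     /* nonlinear recurrence, no easy split */
    return x;
}

int main(void)
{
    float in[4] = { 1.0f, 2.0f, 3.0f, 4.0f }, out[4];
    scale(out, in, 2.0f, 4);
    printf("scaled: %g %g %g %g\n", out[0], out[1], out[2], out[3]);
    printf("serial result: %g\n", recur(0.3f, 10));
    return 0;
}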

My business cycle, your aura

About 30 years ago Asian manufacturers began to imitate Western technology such as automobiles and mainframe computers very successfully. In many cases, customers were far happier with the Asian imitations than the Western originals, which were soon shown to be more expensive and less reliable than the upstarts. Buyers were prepared to put up with human interface eccentricities - such as un-programmable VCRs and DVD remotes with gazillions of redundant, but identically-sized buttons - because it was worth it. (Mass market Asian design is truly awful - remote controls look like a crocodile's back: the 'Play' button is exactly the same size as 247 other buttons competing for your attention, some of which perform arcane functions we can only guess at - perhaps '2.5x playback' and 'pan'. Nonetheless, these rule the market - which puts human interface design in a depressing context. It doesn't seem to matter.)

Shortly after this splurge of cheap imports made the game clear, the people who fund technology development in the West decided to place a few strategic bets. With higher labor costs, and the R&D burden, the West couldn't compete with Asian manufacturers. It focused on materials innovation (which is where Intel really leads, with its incredible process technology) and on companies who obediently squeeze cost out of their businesses, by making computing products as cheap as possible. The East would be permitted to manufacture the goods, while the West would retain leads in semiconductor technology and system software.

The result of this is today's PC industry, the horizontal Dell model, and it's one where nothing quite works as it should, and no vendor really has any idea what people do with their computer products.

(By the by, you can begin to see why Linux has such appeal to Asian manufacturers. It's not because it's particularly good - it's not. In computer science research it's Stone Age technology. But that doesn't matter. It's good enough for a systems builder - for whom a great big zero point zero-zero has suddenly, and quite magically, appeared on his bill of materials spreadsheet.)

Then, while Asian manufacturers concentrated on building real things that people want to use, Western "information technology" went off on a strange pursuit of quite irrelevant abstractions, such as the "internet", "push", "search", and any buzzword containing the word "multimedia" or "information".

But people have little patience for abstractions - they simply want to see a movie, share a tune, check their bank account, or get money out, or talk to an aunt. None of these fantastic, abstract brainfarts of the PC-era or the "Internet-age" really helped people very much: because they weren't ends in themselves, but rather ill-thought-out means to an end. Along the way, we saw a huge, incontinent explosion of investment capital in these abstractions, a bubble generated by a few selfish people who thought they could overlay the mother of all abstractions, the word "market" on top of these other abstractions. This ended, fairly predictably, in tears, but the desire to make money from abstractions-squared hasn't gone away, and will crop up again in this story, and we mention it here with a purpose.

We all know what happened. Once we exclude the quadopoly of services companies who lucked out (Google, eBay/PayPal, Amazon and Yahoo!) after the bubble burst, we see that things are pretty much exactly where they were in 1995.

In addition, Intel's process lead has been cut from three-to-five years down to a few months at most. Cell is a rare example of IBM breaking the Western consensus, and pooling its own semiconductor expertise into a global project. There is no Western "lead" in semiconductor technology anymore. Advanced microprocessors can be, and are being, developed on the most sophisticated process technology, without regard to any Wintel duopoly. In volume, too. All of which means that this era of self-deception, a hegemony fueled in the popular media by so many fictions, is probably over for good now. And no American or European technology company has conquered the living room, or really made itself pervasive in any aspect of our lives except ... in computing itself. And what use is that?

And now, rather brazenly, comes Cell.

Meet the utility company

The Cell proposition, we now see, has a couple of appeals to an audience which appreciates that computing technology hasn't got very much better in ten years, only slightly cheaper. The respective marketing departments of IBM, Sony and Toshiba have decided not to make these too explicit at this early stage.

The first is the economic appeal of outsourcing computing cycles, if not entire IT departments. If you think the United States or the EU would never outsource the production of such a vital resource to such an unknown, in a potential trouble region, then just think oil.

The Grid argument proposes that it doesn't really matter where your data centers are, and the Cell at home proposes that you don't need a PC to apply computer processing to digital media. The Cell promises to make computing as we know it - both commercially and privately - invisible. But can anyone claim to deliver on such bold propositions?

Let's look at the business computing proposition first.

The idea that computing power is a utility isn't new. Back in 1999 many companies were investing heavily in the idea. HP, Intel, IBM and Sun all made the same bet, but only IBM and Sun seem to have stayed the course. Intel's global data center project was ignominiously written off a couple of years ago. Now Sun's vision, based on Scott McNealy's corny car analogy, is all that's left to compete with Cell, and Sun explained some of the numbers this week. (Read this for more details.) Sun has spent a lot of money to make Solaris as flexible and responsive to this utility model as it can, and we can now see this was a wise bet. Sun too has complex, multithreaded machines in the pipeline, but only Sun, or maybe Google, has an OS that can bounce processes around the world.

So many factors must be overcome for such a model, either on IBM's terms or Sun's terms, to succeed, that only a fool would predict the future. We know that business managers demand ever increasing returns on investment and that IT investment can bring genuine short-term competitive advantages. The utility idea turns the conventional IT department inside out, and IT managers must become cunning procurers of cheap cycles or simply bespoke software houses. Tending to the machines no longer brings a competitive advantage, the logic goes. But if there isn't a level playing field - a fair market for cycles, and open and equitable access to useful APIs and other software resources - then the utility model collapses. Hardware itself isn't enough to make utility computing come true.

Remember that the politics of utility companies are keenly fought over. The desire of speculators to make an Enron out of a simple utility (to overlay the abstract concept of "market" where "market" doesn't really need to exist) has not gone away. Utility computing isn't magic, and doesn't make such problems go away. Experience has taught us that people really seem to prefer local utility companies, because they're cheaper and more accountable.

So for now, grid computing looks like a theory in search of some evidence. A bit like a Cell looking for some hardware resources. It's a great theory - but we're not convinced people want it.

But that's nothing compared to the problems Cell will have in the home, among people who want to share the bits of popular culture they value.

We can digitize it for you wholesale

The internet may one day be seen as a short-lived quirk, a weird Channel 79 on cable TV. "Occasionally useful" sums up how many people see it today, quite reasonably, or at least the 50 per cent who sign up. There are enough jokes, novelties, prospects of dates, or enough free music and porn to make this hopelessly unreliable and whacky channel just about worth it. That's all though.

However, the internet has undeniably contributed to an explosion in the distribution of culture product, and that's largely because the devices allowed us to do it. People value this end of the technology proposition very much indeed. It's the one aspect that exceeds the hype. But it only took place because copyright holders couldn't restrict the exchange of these cultural goods. Technology pundits tend to see the internet and argue that it's changing society. In fact, the reverse is true. Internet technologies are just the latest in a long line of tools that we've adopted because they allow us to store and transmit our culture.

Such tensions between rights holders and the public are not unique - and there's nothing unique about "digital culture" either. Copyright holders really aren't interested in stopping stuff moving about; they're simply interested in being paid. We've been here many times before. Radio at first presented a much more dramatic threat to song sheet salesmen and honky tonk piano players than computers and the internet present today, but we all got over it.

Each time, a social settlement has been reached that allowed people to recompense the industries who invested in those storage and transmission technologies. The same deal takes place over and over again. We're in a funny time at the moment, because that settlement hasn't been reached yet. But it will be, and once it is, the Cell vendors had better start thinking of a more attractive sales pitch than "stops you doing stuff."

"Let's you do stuff" will be a start.

We can begin to see how important Cell is, even if we can see how its designers' ambitions will never be fulfilled. Already, we suspect, Cell is a catalyst that is sweeping a lot of half-baked business plans into the bin where they really belong.

So to return to where we started: for Intel and Microsoft, Cell does indeed pose a deep philosophical challenge. The answers each company must produce to counter Cell aren't technical - faster bits, swishier graphics - but relate to how they can engineer social settlements that benefit both themselves and their customers, and traditionally technology companies haven't been very good at these. Intel, at least, with its new appetite for lobbying, shows it understands that the future isn't entirely a technical challenge. At this early stage, only Intel is facing the future with one eye open. ®
 

IJoel

Member
I think a "Cell Tech 1.21 JiggaFLOPS Official Thread" is warranted before we continue littering the forum with threads of articles that say nothing substantial about it.
 

DCharlie

And even i am moderately surprised
am i being retarded or something, or did mainframes suddenly no longer exist???
 

ourumov

Member
[image: kaigai01.jpg]


Intel is already building its vaporware... well, vaporware that seems more reliable to me than most of the CELL stuff we have been hearing.
 

Ryudo

My opinion? USED.
I keep saying this;

If "Cell" doesnt = toy story graphics, then its a big fucking joke.
 

gofreak

GAF's Bob Woodward
Ryudo said:
I keep saying this;

If "Cell" doesnt = toy story graphics, then its a big fucking joke.


Fear not...Xbox already has you covered!!1


Bill Gates said:
"We're approaching the level of detail seen in Toy Story 2"

"One of the basic premises of the Xbox is to put the power in the hands of the artist," Blackley said, which is why Xbox developers "are achieving a level of visual detail you really get in 'Toy Story.'"

Woah! Looks like you don't need a next-gen system! ;)

Seriously, though, the Toy Story jibes are getting real old. Especially when you consider that others (MS) promised even more (the multiple references to "photorealism" in the original Xbox unveiling attest to that).
 

Ryudo

My opinion? USED.
gofreak said:
Seriously, though, the Toy Story jibes are getting real old. Especially when you consider that others (MS) promised even more (the multiple references to "photorealism" in the original Xbox unveiling attest to that).

Nah, they are not. They are making Cell sound like some sort of Cray with laser beams and bees :lol
 

marsomega

Member
In all honesty, CELL will have to make Intel and AMD change things around, but the reasons are misunderstood. Pick up any book or article to see the work that goes into computing clusters. Especially the extensive R&D on the software side. When you read the description of CELL and how it gets rid of the extensive software R&D and cost (EDIT: to an extent, it just makes it easier), you'll see why even if CELL doesn't live up to the performance numbers, it will still be the best alternative in both hardware setup and software. As well as represent the future of computing clusters; heck, it already makes computing clusters look ancient and impractical by nature alone.
 

Fafalada

Fafracer forever
DCharlie said:
am i being retarded or something, or did mainframes suddenly no longer exist???
Might as well face it DC, your entire work-life was just a figment of someone's imagination :lol
 

Pimpwerx

Member
Drensch said:
We heard that the EE was going to be the greatest chip ever too...
It was great at what it did. It's an FP machine, and a damned good one. The GS was the weak link in the system, although it did some amazing things as well. It took a while for PC cards to catch up to it in bandwidth. The PS2 looks kinda crappy in hindsight, but it's been 5 years. When it came out, the internet was gushing over it, and that was just for the crappy tech demos. They looked like gold at the time, but like ass now. PEACE.
 