
Jonathan Blow Criticizes MS’s Claim of Increasing Servers to 300K, Calls It A Lie!

Klocker

Member
"We're provisioning for developers for every physical Xbox One we build, we're provisioning the CPU and storage equivalent of three Xbox Ones on the cloud," he said. "We're doing that flat out so that any game developer can assume that there's roughly three times the resources immediately available to their game, so they can build bigger, persistent levels that are more inclusive for players. They can do that out of the gate."

They are promising 3x the CPU and storage, and presumably memory, available on the servers for every Xbox sold. Assuming they do not need to account for anything near 100% utilization at all times, that should be totally doable.
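Back-of-the-envelope, with every number below invented, just to show why utilization is the whole game here:

    # Toy capacity math -- all inputs are made-up assumptions, not MS figures.
    consoles_sold = 5_000_000
    cloud_multiplier = 3            # "three Xbox Ones in the cloud" per console
    peak_concurrency = 0.15         # assume only 15% of consoles online at peak
    units_needed = consoles_sold * cloud_multiplier * peak_concurrency
    print(f"{units_needed:,.0f} Xbox-equivalents of capacity")  # 2,250,000, not 15,000,000

If they only provision for peak concurrent load, the promise costs a fraction of what the naive one-to-one reading suggests.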
 

Godslay

Banned
The main issue is that no matter the number of servers, some of the tasks they are claiming to off-load "to the cloud" are just bullshit.

"Lag for lighting is not that important, so we could do that with the cloud, you know."
Really? How?
Please, explain to me this technical process where you render a simpler scene with your local hardware and then use the magic cloud to calculate lighting and add it on top of the previous render, without suffering any latency-related issues.

Jesus Christ, people are so gullible when it comes to marketing bullshit these days.

Go back and read the paper I linked to in this thread. If they can do it for this task, it can easily be done with latency-insensitive tasks. It's easy to be ignorant on an issue and call it bullshit. Using distributed computing for gaming computations is coming sometime in the future, no matter how gullible you think people are.
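To make the concept concrete, a minimal fire-and-forget sketch; every name is made up and the remote call is just a stub:

    import threading
    import time

    def remote_compute(task, payload):
        # Stub for a cloud RPC; a real round trip might be 100 ms to seconds.
        time.sleep(0.5)
        return f"result of {task} for {payload}"

    def offload(task, payload, on_result):
        # Kick the job off the main thread; fold the answer in whenever it lands.
        worker = lambda: on_result(remote_compute(task, payload))
        threading.Thread(target=worker, daemon=True).start()

    offload("ai_planning", "200 NPCs", lambda r: print("applied:", r))
    print("game loop keeps running, never blocked")
    time.sleep(1)  # keep the demo process alive long enough to see the callback

The game never waits on the answer; it just gets smarter a few frames later. That's the whole trick with latency-insensitive work.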
 

Fusebox

Banned
I think what it really comes down to is: what is the difference between 300,000 physical servers that accomplish the task and 300,000 virtual servers that accomplish the task?

Because it's meaningless. If I have a physical server with a certain amount of resources, then create 10 virtual machines on that physical server, all devoted to doing the exact same processing task, I've done nothing but make it unnecessarily complicated. 1 physical server, 10 virtual servers, same amount of resources, still a single physical point of failure.

Then if I bragged about having 10 servers available in my marketing material I'd expect to be called on it.
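Back-of-the-envelope, with an assumed consolidation ratio (purely for argument):

    claimed_servers = 300_000     # the marketing number
    vms_per_host = 10             # pure assumption, for illustration only
    physical_hosts = claimed_servers // vms_per_host
    print(physical_hosts)         # 30000 -- same silicon, a 10x bigger-sounding number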
 

hadareud

The Translator
I wonder if the Xbox One truthing is going to catch on on a large scale.

There's a distinct gap in the market at the moment, with the Boston bombings and the London stabbing now over.

A valiant start by Mr Blow. What a guy.
 

baphomet

Member
People actually thought they were going to have 300k physical servers? I assumed virtual servers right off the bat.
 

Sentenza

Member
Never said it was.
But Microsoft did, it's bullshit, and they deserve to be called out for it.
Which, on a side note, is exactly the issue Blow is referring to in his first tweet:

http://i.imgur.com/d4zxWii.png
 

Godslay

Banned
But Microsoft did, it's bullshit, and they deserve to be called out for it.
Which, on a side note, is exactly the issue Blow is referring to in his first tweet:

http://i.imgur.com/d4zxWii.png

From said article:

While latency-sensitive actions will be handled by a user's Xbox One console, Microsoft claims its cloud architecture can pre-calculate elements like lighting and physics modeling, leading to increased in-game performance.

It might be bullshit, but it doesn't contradict my statement.
 

KHarvey16

Member
Because it's meaningless. If I have a physical server with a certain amount of resources, then create 10 virtual machines on that physical server, all devoted to doing the exact same processing task, I've done nothing but make it unnecessarily complicated. 1 physical server, 10 virtual servers, same amount of resources, still a single physical point of failure.

Then if I bragged about having 10 servers available in my marketing material I'd expect to be called on it.

I don't understand why this is a problem for you. If I need 10 instances to run 10 individual processes, why does it matter if I can accomplish that with 1 machine?

But Microsoft did, it's bullshit, and they deserve to be called out for it.
Which, on a side note, is exactly the issue Blow is referring to in his first tweet:

http://i.imgur.com/d4zxWii.png

That headline says nothing about dynamic lighting.
 

commedieu

Banned
That headline says nothing about dynamic lighting.

The headline says... "more cloud processing BS," which was in reference to its operations when the tweet was made. Cloud processing bullshit applies, in this context, to everything under the sun that was said.

Twitter doesn't allow for huge amounts of text, and is used for quick blurbs of communication.
 

KHarvey16

Member
The headline says... "more cloud processing BS," which was in reference to its operations when the tweet was made. Cloud processing bullshit applies, in this context, to everything under the sun that was said.

Twitter doesn't allow for huge amounts of text, and is used for quick blurbs of communication.

It says cloud processing can be used to enhance lighting, and Tuco was mentioning dynamic lighting before as being latency sensitive. Not all lighting is dynamic lighting.
 

commedieu

Banned
It says cloud processing can be used to enhance lighting, and Tuco was mentioning dynamic lighting before as being latency sensitive. Not all lighting is dynamic lighting.
Which all falls under the blanket of cloud bullshit, though. Opinion, but it seems that you understood that the tweet headline wasn't a full thought, and that it was in reference to something else. Even if it's just lighting via the cloud, it's kinda still the same shitty smell. As far as the haters go.

I thought that was the problem.

If the lighting in question doesn't need to be dynamic it can be precomputed before they stamp the discs. No cloud needed.

Seems to be more along the lines of your concern. Kharv
 
 

KHarvey16

Member
Which all falls under the blanket of cloud bullshit, though. Opinion, but it seems that you understood that the tweet headline wasn't a full thought, and that it was in reference to something else. Even if it's just lighting via the cloud, it's kinda still the same shitty smell. As far as the haters go.

I thought that was the problem.



Seems to be more along the lines of your concern. Kharv

If the lighting in question doesn't need to be dynamic it can be precomputed before they stamp the discs. No cloud needed.

Latency-insensitive doesn't mean precomputed when the game is compiled. It could be based on events that happened in the past, in terms of gameplay or what's on screen.
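A toy sketch of what that could look like; all names invented, and the only point is that the result depends on past gameplay and arrives whenever it arrives:

    import queue
    import threading
    import time

    results = queue.Queue()

    def cloud_rebake(recent_events):
        # Pretend cloud job: rebuild ambient lighting from what players just did.
        time.sleep(1.0)  # a full second of turnaround is acceptable here
        results.put(f"lightmap reflecting {len(recent_events)} recent events")

    events = ["wall destroyed", "lamp shot out", "fire started"]
    threading.Thread(target=cloud_rebake, args=(events,), daemon=True).start()

    for frame in range(60):                # two seconds of a 30 fps loop
        try:
            bake = results.get_nowait()    # swap in the new bake when it shows up
            print(f"frame {frame}: applied {bake}")
        except queue.Empty:
            pass                           # keep rendering with the old bake
        time.sleep(1 / 30)

Not precomputed on the disc, not needed this frame. That's the middle ground people keep ignoring.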
 

Godslay

Banned
Dynamic lighting is an MS example and exactly the kind of claim people are calling bullshit on.

So many of these Xbone conversations need this reminder every few posts. It's getting maddening. "Why don't you wait for Major Nelson to tell us more?" Because I'm reacting to what Phil Harrison already said, that's why!

Give me a source. I don't doubt it; I just want to see the word "dynamic" for myself.
 

Aeonitis

Neo Member
I, as a foolish consumer of the past, have learned not to get excited by cannibalistic corporations' buzzwords and statistics, which have nothing to do with improving the creativity, fun, and value in a game. I have full admiration for accomplished people like J.B. who are concerned for the future creators coming up their path of success, not just paving the way but also fighting for the path to stay open, with integrity and respect. Not many applaud that... I find buying an Xbox One laughable, but that is just my opinion. 7th gen turned me sour on consoles. I might even buy an Xbox One just to swap it for an Atari 2600; I'd be much happier that way, to be honest.
 
I would feel better about guys like Blow and Phil Fish if they ever backed up what they said. It's hard to believe it's not just him being emotional or looking out for his own interests.

Why not elaborate on why MS is lying? No one thinks of this guy as an expert on cloud computing. Now, when he says XBLA policies suck, I believe him, as he is a subject matter expert there. It's hard not to just chalk this type of stuff up to playing politics.
 

Godslay

Banned
No, you're right, pre-baked lighting could easily be what they mean by "enhanced" lighting with cloud processing. Which still leaves us... where? That's like saying you can have better textures if another computer "processes" them and you get them sent over the Internet. How does that make sense? Is it displaying a streaming CG clip that is better than what the hardware can handle? How does a GPU suddenly outperform itself due to streaming Internet data?

Well, for one, if another computer is doing it, doesn't that mean that local resources aren't devoted to it? If the answer is yes, then the CPU/GPU doesn't outperform itself; it's simply devoted to an entirely different task.

I don't know exactly what they are planning, but I will say that I can see how there COULD be tangible benefits, and it has been proven in the lab to some degree. I don't know how they arrived at their numbers, but I doubt they are realistic. That doesn't mean throw the baby out with the bathwater.

I'll politely excuse myself from this thread now.
 
I would feel better about guys like Blow and Phil Fish if they ever backed up what they said. It's hard to believe it's not just him being emotional or looking out for his own interests.

Why not elaborate on why MS is lying? No one thinks of this guy as an expert on cloud computing. Now, when he says XBLA policies suck, I believe him, as he is a subject matter expert there. It's hard not to just chalk this type of stuff up to playing politics.

He's speaking against MS so it's all fine.
 

commedieu

Banned
Well, for one, if another computer is doing it, doesn't that mean that local resources aren't devoted to it? If the answer is yes, then the CPU/GPU doesn't outperform itself; it's simply devoted to an entirely different task.

I don't know exactly what they are planning, but I will say that I can see how there COULD be tangible benefits, and it has been proven in the lab to some degree. I don't know how they arrived at their numbers, but I doubt they are realistic. That doesn't mean throw the baby out with the bathwater.

I'll politely excuse myself from this thread now.

Why aren't PC devs, who have created infinitely technically superior products to everything known to the console world, making the leap into cloud gaming for their products though? Microsoft isn't the first to come up with cloud.

That alone, for me, along with the SimCity failure, is reason enough to look forward to clear skies.
 

Dipswitch

Member
Why aren't PC devs, who have created infinitely technically superior products to everything known to the console world, making the leap into cloud gaming for their products though? Microsoft isn't the first to come up with cloud.

Because they have to pay for those cloud computing resources? Indefinitely.
 

Hana-Bi

Member
Why aren't PC devs, who have created infinitely technically superior products to everything known to the console world, making the leap into cloud gaming for their products though? Microsoft isn't the first to come up with cloud.

That alone, for me, along with the SimCity failure, is reason enough to look forward to clear skies.

Well, most developers probably don't have the infrastructure or the money to implement such a service for millions of gamers. It would be ridiculous to implement such cloud features for just a handful of games, with thousands of servers running for many years...
 

Godslay

Banned
Why aren't PC devs, who have created infinitely technically superior products to everything known to the console world, making the leap into cloud gaming for their products though? Microsoft isn't the first to come up with cloud.

That alone, for me, along with the SimCity failure, is reason enough to look forward to clear skies.

Do they have the server infrastructure to do so? Or the cloud programming model/tools to do it? It's not as simple as just saying we've got X users, let's utilize their cycles! Control nodes (aka servers) orchestrate how work is distributed to the individual clients.

For the record, AMD and NVIDIA have their hands in the cookie jar. Slightly different technology: more processing "in the cloud" and streaming to devices.
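As a toy illustration of that control-node idea (nothing here is a real Azure or Xbox API, just invented names):

    from queue import Queue

    class ControlNode:
        # Toy orchestrator: hands units of work to whichever client asks next.
        def __init__(self, tasks):
            self.pending = Queue()
            for t in tasks:
                self.pending.put(t)
            self.done = {}

        def request_work(self):
            return None if self.pending.empty() else self.pending.get()

        def submit(self, task, result):
            self.done[task] = result

    node = ControlNode([f"sim-chunk-{i}" for i in range(4)])
    for client in ("box-1", "box-2", "box-3", "box-4"):
        task = node.request_work()
        node.submit(task, f"computed by {client}")
    print(node.done)

The point is that somebody has to own the work queue; clients can't just volunteer cycles into the void.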
 

spwolf

Member
You're apparently gone but I will reply anyway: while cloud processing could be beneficial on a micro/macro scale (pulling small amounts of data from a large simulation, which is basically what MMOs have been doing for a couple of decades), it isn't parallel processing. Having the Xbone APU ignore lighting while it pulls data on lighting from the Internet means... it still has to process that lighting. Unless there is some kind of unknown technological factor at play here, I don't get how this would work beyond a more powerful processor doing the work and streaming it completely OnLive-style.



Are you implying PC games cannot afford whatever this magic tool is? Despite being light years ahead of consoles in these areas this entire time?

It would actually make a lot more sense to do an OnLive-type service, where you add it to any PC, any Windows Phone, any Windows RT tablet...

... this makes no sense at all... you buy a $400 box just to offload some small data calculations to the cloud, which you can do on your local box as well, and cache for later use anyway.

If they sold a $49-$99 thin client for this, then OK, perfect sense: pay $15 per month to play over it... but the XO? It's a laughable concept. It will not happen, and they are digging themselves a hole.

Plus, them saying it needs an internet check once per 24 hours does not make sense with this in mind, because this means you would be connected and streaming game data from the cloud all the time... it makes no sense at all, at all. So devs will make this cloud power-up optional, so if you lose internet it still works offline, but if you have net, then it is online? Taking up their cloud instances, which cost money?

lol
 

itxaka

Defeatist
Why aren't PC devs, who have created infinitely technically superior products to everything known to the console world, making the leap into cloud gaming for their products though? Microsoft isn't the first to come up with cloud.

That alone, for me, and the SIM failure, is reason enough to look forward to clear skys.

Because it sucks!

You gotta use the cloud (or servers, as they were known before) for good things like savegames, screens, settings sync, etc...

Also, because unless people pay, keeping servers up costs money, and PC games have a loong, looong life.
 
It'll be cool to see what MS and Sony do this gen with cloud services. I used OnLive early on and was not impressed.

I am excited to see what happens. I want to believe, but I struggle streaming 1080p video with my shitty download and upload speeds.
 
This guy really thinks he's someone important. I guess there are people who care what he thinks. If MS is bullshitting, we will know soon enough.
 

commedieu

Banned
Well, most developers probably don't have the infrastructure or the money to implement such a service for millions of gamers. It would be ridiculous to implement such cloud features for just a handful of games, with thousands of servers running for many years...
/\/\/\/\
You mean like World of Warcraft, or Planetside 2, or Guild Wars 2, or...


Do they have the server infrastructure to do so? Or the cloud programming model/tools to do it? It's not as simple as just saying we've got X users, let's utilize their cycles! Control nodes (aka servers) orchestrate how work is distributed to the individual clients.

For the record, AMD and NVIDIA have their hands in the cookie jar. Slightly different technology: more processing "in the cloud" and streaming to devices.

There is no requirement from consumers that would encourage a developer to rely on cloud processing. And we aren't in a world where it's been shown to be a superior method of doing anything that has gaming applications. You can buy a PC game, run it offline, and enjoy 100% of that experience: those visuals, that model quality, everything. This is why it's a hard issue for Microsoft to get me to swallow. I know MS isn't on the forefront when it comes to streaming textures/stream loading or any of that stuff. So for MS to come up with this, when no one else (that does it better) has, doesn't really strike me as something to be excited about. If we had a PC model of an open-world Crysis-type title relying on the cloud, with millions of people playing, and it was second to none in engineering/tech/whatever you want to call it, I'd have no real point.

As it is now? Microsoft is telling me that Milo is going to make everything better again. (Please understand the Milo ref as just pop words and PR.)

Because it sucks!

You gotta use the cloud (or servers, as they were known before) for good things like savegames, screens, settings sync, etc...

Also, because unless people pay, keeping servers up costs money, and PC games have a loong, looong life.

That's what I'm seeing. But we are here, with people still doing the "well, we don't know! The future could be..." Fuck the future. You're selling a game console, not a way of life. Sony showed off the Cell, had pointless tech demos, then started applying that to their games. Nothing like Uncharted had ever stream-loaded on a console before. That's something that I, as a consumer, can see and know how it will translate to my games: little to no loading times while playing. The cloud shit is just... "well, you could do game saves, or lighting?!?" If it doesn't look superior or perform better than PC/PS4 titles, it's just causing people to maybe have weird glitches during times when other people are watching Netflix/using the net on your connection.

Egad at the whole fucking thing. Every part of the MS reveal is "No, don't trust what you know. But trust what we are telling you, stupid!" Our console with less physical power has more virtual power, and doesn't require a connection but requires a connection.
 

Sentenza

Member
That headline says nothing about dynamic lighting.
You are in full denial at this point.
If it's pre-baked lighting they are talking about, then what's the role of cloud computing, exactly, in achieving something that even old hardware can manage with virtually no additional workload?
Where's the advancement they are babbling about? Pre-baked lighting is called that precisely because there's virtually no additional runtime computation involved.

Besides, even ignoring this, that headline claims that The Cloud could be used for lighting, physics and so on.
No, it couldn't; anything related to rendering can't be processed through cloud computing, because it's too sensitive to issues like latency and bandwidth.

We are talking about stuff that is usually handled by the GPU and requires gigabytes per second of data transfer, with an output of AT LEAST one frame every... I don't know... 0.05 seconds? How could an average internet connection be sufficient, both in terms of latency and bandwidth?
How could these lighting/physics calculations travel with what's optimistically a delay of 30ms, then SOMEHOW tie in with the base render and give you this hypothetical enhanced frame with the power of the cloud?

This is a joke. It's not even remotely believable.
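Run the numbers from above yourself:

    frame_budget_ms = 0.05 * 1000   # one frame every 0.05 s = 50 ms, a generous 20 fps floor
    round_trip_ms = 30              # the *optimistic* network delay
    print(frame_budget_ms - round_trip_ms)  # 20.0 ms left to do ALL the actual rendering

The round trip alone eats most of the frame budget, and that's before the cloud does any work at all.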
 

Suikoguy

I whinny my fervor lowly, for his length is not as great as those of the Hylian war stallions
Do they have the server infrastructure to do so? Or the cloud programming model/tools to do it? It's not as simple as just saying we've got X users, let's utilize their cycles! Control nodes (aka servers) orchestrate how work is distributed to the individual clients.

For the record, AMD and NVIDIA have their hands in the cookie jar. Slightly different technology: more processing "in the cloud" and streaming to devices.

Do you know what you are talking about?

Try COMPLETELY different technology.

One renders complete frames, on a server, in sequence, which are encoded and streamed like a movie to a client. This is a proven, although laggy, technology (see OnLive).

The other tries to offload certain non-real time calculations to another server, which is not even a little bit proven yet.
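To caricature the two models in toy code (all class names invented):

    class OnLiveServer:
        # Model A: the server renders every frame; the client decodes a video stream.
        def render_frame(self, input_state):
            return f"encoded video frame for {input_state}"

    class Console:
        # Model B: the console renders every frame; cloud data is optional garnish.
        def __init__(self):
            self.cloud_extras = None
        def apply_if_ready(self, extras):
            self.cloud_extras = extras
        def render_frame(self):
            return f"local frame (cloud extras: {self.cloud_extras})"

    print(OnLiveServer().render_frame("player input"))   # per-frame, latency-critical
    box = Console()
    box.apply_if_ready("rebaked lighting from 2 s ago")  # stale-tolerant, occasional
    print(box.render_frame())

One is a video pipeline with a controller attached; the other is a local renderer that occasionally receives homework back from a server.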
 

commedieu

Banned
You are in full denial at this point.
If it's pre-baked lighting they are talking about, then what's the role of cloud computing, exactly, in achieving something that even old hardware can manage with virtually no additional workload?
Where's the advancement they are babbling about? Pre-baked lighting is called that precisely because there's virtually no additional runtime computation involved.

Besides, even ignoring this, that headline claims that The Cloud could be used for lighting, physics and so on.
No, it couldn't; anything related to rendering can't be processed through cloud computing, because it's too sensitive to issues like latency and bandwidth.

We are talking about stuff that is usually handled by the GPU and requires gigabytes per second of data transfer, with an output of AT LEAST one frame every... I don't know... 0.05 seconds? How could an average internet connection be sufficient, both in terms of latency and bandwidth?
How could these lighting/physics calculations travel with what's optimistically a delay of 30ms, then SOMEHOW tie in with the base render and give you this hypothetical enhanced frame with the power of the cloud?

This is a joke. It's not even remotely believable.

He argues semantics/technicalities/devil's advocate 24/7. Just FYI.
 
You mean like World of Warcraft, or Planetside 2, or Guild Wars 2, or...

So now we're only talking about MMOs? That doesn't mean they want to spend the money... They're already making obscene amounts of money in monthly subscriptions; why waste money and probably see little to no return? You really think that would make a big spike in WoW users? They still have to allocate resources to do the various tasks; just because they already have servers doesn't mean it's free for them. Either way it costs money, and I doubt they want to spend it, and we don't know how reliable it would be on lower-speed connections either.

Why aren't PC devs, who have created infinitely technically superior products to everything known to the console world, making the leap into cloud gaming for their products though? Microsoft isn't the first to come up with cloud.

That alone, for me, along with the SimCity failure, is reason enough to look forward to clear skies.

Because they don't want to spend the money? Are you guys really this shocked? I'm not saying one thing is true or not, but after the fucking abysmal PC ports at times this gen, I would think it's abundantly clear to all how much money/effort most companies want to put into their PC version.
 

Fusebox

Banned
I don't understand why this is a problem for you. If I need 10 instances to run 10 individual processes, why does it matter if I can accomplish that with 1 machine?

It's highly unlikely that MS needs 300,000 servers to run 300,000 individual processes; more likely they're just hoping people see the high number and get impressed.
 

KHarvey16

Member
You are in full denial at this point.
If it's pre-baked lighting they are talking about, then what's the role of cloud computing, exactly, in achieving something that even old hardware can manage with virtually no additional workload?
Where's the advancement they are babbling about? Pre-baked lighting is called that precisely because there's virtually no additional runtime computation involved.

Besides, even ignoring this, that headline claims that The Cloud could be used for lighting, physics and so on.
No, it couldn't; anything related to rendering can't be processed through cloud computing, because it's too sensitive to issues like latency and bandwidth.

We are talking about stuff that is usually handled by the GPU and requires gigabytes per second of data transfer, with an output of AT LEAST one frame every... I don't know... 0.05 seconds? How could an average internet connection be sufficient, both in terms of latency and bandwidth?
How could these lighting/physics calculations travel with what's optimistically a delay of 30ms, then SOMEHOW tie in with the base render and give you this hypothetical enhanced frame with the power of the cloud?

This is a joke. It's not even remotely believable.

Denial? Me? All we see in every one of these threads is how none of this is remotely possible because of...reasons. All of the research being poured into cloud computing, and all the assets and money dedicated to making it work, are just a huge waste of time, and I know better than them because of...reasons.

I'm not in denial; I'm hopeful these assets can be wielded effectively to bring me a better gaming experience. I appreciate healthy skepticism, and questioning whether MS can do these things, or whether developers will, is perfectly fine. But the wholesale dismissal of it even being possible on a conceptual level is ridiculous; it requires you to assume far more than we know.

Why do you keep insisting they're talking about doing things in the cloud that need to be returned within that frame? Of course, if you set up impossible demands, you can claim the system can't possibly accomplish those tasks. Perhaps you don't know what tasks are being offloaded? Perhaps you don't understand the methods they're using to partition tasks between the local console and the cloud? There was a paper someone linked to in the other thread regarding AI being improved substantially using this, even accounting for a 1-second turn-around time. Read it here:

http://research.microsoft.com/pubs/72894/NOSSDAV2007.pdf
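Very loosely, the shape of what that paper measures (made-up names; the actual method is in the PDF): ask the remote planner for the expensive answer, and fall back to cheap local AI if it doesn't come back in time.

    import concurrent.futures
    import time

    def cloud_ai(world):
        time.sleep(1.0)                 # pretend remote planner, ~1 s turnaround
        return "flanking maneuver from the expensive planner"

    def local_ai(world):
        return "charge straight ahead"  # cheap fallback, always instant

    with concurrent.futures.ThreadPoolExecutor() as pool:
        future = pool.submit(cloud_ai, {"enemies": 12})
        try:
            plan = future.result(timeout=1.5)   # tolerate the round trip...
        except concurrent.futures.TimeoutError:
            plan = local_ai({"enemies": 12})    # ...degrade gracefully if it's late
    print(plan)

The same fallback answers the "what if my internet dies" question: the game keeps working, just with the dumber planner.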
 
So.

If I'm playing a game and it looks good and the AI is challenging, but then my internet goes out, will the game suddenly look not so great and be stuck on easy mode?
 

Replicant

Member
But Microsoft did, it's bullshit, and they deserve to be called out for it.
Which, on a side note, is exactly the issue Blow is referring to in his first tweet:

http://i.imgur.com/d4zxWii.png

So one minute it's all shiny and wonderful, and the next I find myself in a Silent Hill-like environment when my internet goes kaput?

That's not the kind of horror game I was looking for, MS!
 