
Jonathan Blow Criticizes MS’s Claim of Increasing Servers to 300K, Calls It A Lie!

hey_it's_that_dog

benevolent sexism
Jonathan Blow, who is making an "exclusive" game for the PS4, is bad-mouthing the PS4's direct competitor? I wouldn't pay attention to him; he has a horse in this race and will say whatever it takes so people buy more PS4s over XBones. Every PS4 sale is a possible Witness sale = money in his pocket.

Regardless of whether he's right or not, who do you think will have more servers running by launch, Sony or MS?

So let's take into account all his preferences re: MS and Sony.

He clearly favors Sony right now.

Now let's ask the more relevant question: Were his complaints about MS policy factually untrue? If his anti-MS "bias" merely resulted in him openly voicing valid criticisms, then how does attacking his preferences undermine his arguments?

I don't care what platform he prefers, I care whether his statements are true. It's illogical to infer that he is biased just because he has a preference, especially when he's been fairly transparent about the reasons for his preference. He didn't like key things about XBLA. He does like key things about PS4.

So I think we can drop the simple-minded claims of "teh bias" that crop up whenever he comments on something lately and focus on the content of his statements.
 

Aeonin

Member
I said that the local hardware would start them and the cloud would then pick up once the latency catches up and continue the smoke effects without bogging down the CPU.

You must realize the latency could never catch up - or else we wouldn't have this issue in the first place.

Move the camera an iota of an inch in any direction and you have an entirely brand new set of data for the machine to crunch.
 

hey_it's_that_dog

benevolent sexism
Blow needs to shut the fuck up and worry about making video games.

Other than taking the time to tweet a few times, it seems like he is worrying about making video games.

Good contribution, though. Definitely worth the great deal of time it must have taken to make a post on the internet.
 
I meant specifically to this thread. Show me evidence of your claim that the infrastructure doesn't exist.

Dude, they are all over the fucking thread. Here's one posted just before mine:

I wouldn't discount the potential performance gains, either. For one they said it was like having one One at home, and three Ones in the cloud, so 4x the performance and not 40x. That obviously doesn't make the One four times as fast and will only have a limited number of applications, but in situations where using the cloud works, I have no doubt we'll see these kinds of gains (if a developer really decides to use the cloud / always-online).

And the infrastructure doesn't exist yet because what Microsoft is describing is bandwidth-limited, processor-intensive make believe bullshit.
 

fallingdove

Member
Do you have any evidence that something like this isn't possible?

Or are you talking out of your ass too?

Rendering is low-latency, i.e. the type of calculation the cloud is not going to be able to perform.

And the fact that you think that just because there is a lot of smoke in the Forza trailer it must be 'the cloud' is baseless nonsense.

I do believe that the cloud can and will be used for some offloaded calculations, but we are talking about things like A.I. routines, maybe lighting. I could see the cloud tracking how a player is playing and recalculating the AI response to anticipate play style. That sort of thing would be awesome.
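The latency-tolerant AI offload described here can be sketched in a few lines. This is a toy model, not anything Microsoft has described: the `cloud_ai_service` name, the stats fields, and the 50 ms round trip are all made up for illustration. The point is that the frame loop never blocks on the network; it keeps using its current AI parameters and swaps in a cloud-computed update whenever one happens to arrive.

```python
import queue
import threading
import time

def cloud_ai_service(stats):
    # Stand-in for a remote analysis service; the name, the stats
    # fields, and the 50 ms round trip are all hypothetical.
    time.sleep(0.05)  # far longer than a single frame
    return {"ai_aggression": min(1.0, stats["deaths_by_rush"] / 10)}

class AIDirector:
    """Keeps using the current AI parameters; swaps in a cloud result
    whenever one arrives. Frames never wait on the network."""

    def __init__(self):
        self.params = {"ai_aggression": 0.5}  # sane local default
        self._results = queue.Queue()

    def request_update(self, stats):
        # Fire-and-forget: the reply lands in a queue, not the frame loop.
        threading.Thread(
            target=lambda: self._results.put(cloud_ai_service(stats)),
            daemon=True,
        ).start()

    def tick(self):
        # Non-blocking: apply a cloud result only if one is ready.
        try:
            self.params = self._results.get_nowait()
        except queue.Empty:
            pass
        return self.params

director = AIDirector()
director.request_update({"deaths_by_rush": 8})
for _ in range(20):             # simulate 20 frames at ~100 fps
    params = director.tick()
    time.sleep(0.01)
print(params["ai_aggression"])  # 0.8 once the "cloud" has replied
```

Because the update is applied asynchronously, a dropped or late response costs nothing: the AI just keeps its last-known parameters.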
 

FordGTGuy

Banned
You must realize the latency could never catch up - or else we wouldn't have this issue in the first place.

Move the camera an iota of an inch in any direction and you have an entirely brand new set of data for the machine to crunch.

How would it never catch up? Do you understand the concept of latency?

Rendering is low-latency, i.e. the type of calculation the cloud is not going to be able to perform.

And the fact that you think that just because there is a lot of smoke in the Forza trailer it must be 'the cloud' is baseless nonsense.

I do believe that the cloud can and will be used for some offloaded calculations, but we are talking about things like A.I. routines, maybe lighting. I could see the cloud tracking how a player is playing and recalculating the AI response to anticipate play style. That sort of thing would be awesome.

I never, not once, on GAF claimed that it can be used for low latency computations.

They've already said it would be capable of continuing local calculations through the cloud once it caught up, I was only giving a wild example of this.

Dude, they are all over the fucking thread. Here's one posted just before mine:



And the infrastructure doesn't exist yet because what Microsoft is describing is bandwidth-limited, processor-intensive make believe bullshit.

I'm glad you're the authority on this. Do you have any evidence that the infrastructure doesn't exist, or that it can't work?
 
I'm not the one with the burden of proof.
I said that the local hardware would start them and the cloud would then pick up once the latency catches up and continue the smoke effects without bogging down the CPU.

and your car will have moved off three seconds before the cloud gets back to you with your 'enhanced' smoke.
 

Aeonin

Member
I do believe that the cloud can and will be used for some offloaded calculations but we are talking about things like A.I. Routines, maybe lighting. I could see the cloud tracking how a player is playing and recalculate the ai response to anticipate play style. That sort of thing would be awesome.

Think about a big multiplayer USER vs CPU game. Man, that just gets my gears a goin'. Not to say that AI is that great now, but it could be really awesome.
 

FINALBOSS

Banned
To the people who are in support of this 300K cloud computing BS--do you HONESTLY think ANY multiplatform game will take advantage of it? If your answer is yes, you're sorely mistaken.
 

Perkel

Banned
Did you even read what I said about smoke effects?

I said that the local hardware would start them and the cloud would then pick up once the latency catches up and continue the smoke effects without bogging down the CPU.

Do you even understand that you need to:

- send data (internet latency) >
- process data (which again takes time) >
- send data back (again internet latency), which won't be small (since we're talking about a volumetric effect) >
- and lastly render it?

All this needs to happen in under 33.3 ms, or in Forza's case 16.6 ms.

It is not even physically possible to do it.

It is either that, or smoke that appears a second or more after the car created it, which would simply be a bad idea.


It is not :

1. Start Smoke
2. Compute
3. .....
4. Profit
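The round-trip arithmetic above is easy to sanity-check. Every latency figure below is an illustrative assumption, not a measurement, but even optimistic numbers blow past the frame budget:

```python
# Sanity check of the round-trip budget above. Every latency figure
# here is an illustrative assumption, not a measurement.

FRAME_BUDGET_30FPS_MS = 1000 / 30   # ~33.3 ms
FRAME_BUDGET_60FPS_MS = 1000 / 60   # ~16.7 ms (Forza targets 60 fps)

upload_ms   = 20.0  # client -> data center, optimistic consumer link
compute_ms  = 5.0   # server-side simulation step
download_ms = 20.0  # data center -> client; volumetric data is not small
render_ms   = 4.0   # drawing the returned effect locally

round_trip_ms = upload_ms + compute_ms + download_ms + render_ms
print(round_trip_ms)                            # 49.0
print(round_trip_ms <= FRAME_BUDGET_30FPS_MS)   # False
print(round_trip_ms <= FRAME_BUDGET_60FPS_MS)   # False
```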
 

spuit*11

Banned
People who think cloud computing is somehow going to tangibly enhance their gameplay experience on current and near-future internet infrastructure are in for a massive disappointment.

Just wondering, are the people in this thread that are so convinced this is not a complete PR fairy tale the same people that gobbled up everything EA said about cloud computing in relation to their SimCity game?
Or are you guys cherry picking your cloud computing fantasies?
 

FordGTGuy

Banned
Do you even understand that you need to:

- send data (internet latency) >
- process data (which again takes time) >
- send data back (again internet latency), which won't be small (since we're talking about a volumetric effect) >
- and lastly render it?

All this needs to happen in under 33.3 ms, or in Forza's case 16.6 ms.

It is not even physically possible to do it.

It is either that, or smoke that appears a second or more after the car created it, which would simply be a bad idea.


It is not :

1. Start Smoke
2. Compute
3. .....
4. Profit

Do you have the reading comprehension or understanding of basic concepts to realize that I said it would start on the local hardware and then continue via cloud computation?

I can type "ping" into a console.

I'll take that as a no.
 

DesertFox

Member
Dude, they are all over the fucking thread. Here's one posted just before mine:



And the infrastructure doesn't exist yet because what Microsoft is describing is bandwidth-limited, processor-intensive make believe bullshit.

*Ahem*
Blow is full of shit himself.

Of course Azure is running on virtualized hardware, anything else wouldn't really be feasible. And it's pretty much standard, AWS is all "VMs", as is Google Compute.

As for the 300k number: while no one in the industry gives out this sort of information, it's unlikely that 300k represents anywhere near the total number of physical machines. So 300k could very well be the number of dedicated servers just for the XB1. In 2008 Microsoft mentioned they were adding tens of thousands of servers a month. That was before they even launched Azure. In 2009 they opened their Chicago data center, which can hold anywhere from 200k to 400k servers. That's one data center, almost 5 years ago. Since then they've opened additional data centers in Iowa, Ireland, Amsterdam and Virginia, to name a few.
At a conference in Las Vegas last week, Michael Manos, Microsoft's senior director of data center services, said in a keynote speech that the first floor of a data center being built by the software vendor in the Chicago area will hold up to 220 shipping containers, each preconfigured to support between 1,000 and 2,000 servers, according to various news reports and blog posts.

That means the $500 million, 550,000-square-foot facility in the Chicago suburb of Northlake, Ill., could have as many as 440,000 Windows servers on the first floor alone — or up to 11 times more than the total of 40,000 to 80,000 servers that conventional data centers of the same size typically can hold, according to Manos. He was quoted as saying that Microsoft also plans to install an undisclosed number of servers on the building's second floor, which will have a traditional raised-floor layout.

http://www.computerworld.com/s/arti...uilds_first_major_container_based_data_center


So yes, Blow is full of shit.

Sources:




http://www.globalfoundationservices...icrosoft-cloud-scale-data-center-designs.aspx

And what are your sources for calling this server farm "make believe"?

To the people who are in support of this 300K cloud computing BS--do you HONESTLY think ANY multiplatform game will take advantage of it? If your answer is yes, you're sorely mistaken.

I'm skeptical that it can be used to enhance gameplay in any way, but I'm not too stubborn to accept that the hardware exists.
 

Godslay

Banned
^^ Broken link to computer world

Dude, they are all over the fucking thread. Here's one posted just before mine:



And the infrastructure doesn't exist yet because what Microsoft is describing is bandwidth-limited, processor-intensive make believe bullshit.

Well, it is unrealistic to try to determine performance gains when it hasn't been tested or utilized yet. There are tangible benefits to reap, though.

As far as the infrastructure goes, it will be there, and yes, it is a real thing whether you believe it's bullshit or not.
 

Toparaman

Banned


The cloud is really... A BUSH!!

dun dun DUN!


I love all of you.

Isn't this guy making Witness and shit?

Honestly, why would he even care, man? The game looked like ass. It could be done on my 3-year-old GPU.

That's your criteria for "looks like ass"? Okay.

Games that look like ass: Bioshock, Wind Waker, Okami, Crysis, Super Mario Galaxy, Skyward Sword, Half Life 2, Portal 1/2, Team Fortress 2, etc.

If what you're trying to say is that the game doesn't demonstrate the PS4's power, I agree.


Anyway, does anyone else find it hilarious that a thoughtful intellectual like Blow has taken on the role of seething company fanboy? I'm not saying that he actually is a fanboy, because he's made some good points, but it's just funny imagining him getting all pissed off at Microsoft, while wearing a "SONY RULES" t-shirt.
 

ymmv

Banned
I'm not the one with the burden of proof.

I merely stated a hypothetical example; I have no proof to back it up and I never claimed to have any.

However, if you're going to claim that it's not possible, you have the burden of proof to back up that claim.

What the hell? You wrote:

In fact, if you watch the Forza 5 trailer you can see a ton of smoke lingering after one of the McLarens drifts around the corner. I'm actually guessing this is what they are doing to pull it off.

I have to disprove your flights of fancy? You're the one thinking this stuff up in the first place. Why not come up with proof that the smoke was the result of distributed computing, instead of merely the Xbox One's CPU/GPU doing the hard work?
 

leroidys

Member
A big issue I see is with how the console would handle not knowing for sure when it can expect certain data back. The way a modern computer normally handles contingent calculations is that it computes both branches beforehand (or all of the ones with a high probability of being taken), and then when the condition is evaluated it takes the right branch.

What happens if you start dropping packets? The smoke (or whatever other graphical effect) would have to drop to some level worse than what the console could normally produce locally, or just disappear altogether, undermining any benefit of doing the computation in the cloud in the first place.

I just don't see it being feasible for enhancing graphics.
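The speculative approach described here (kick off the expensive path, but never let a frame wait past its budget) might look something like this toy sketch. The function names, the delays, and the 16 ms budget are all assumptions for illustration:

```python
import concurrent.futures
import time

pool = concurrent.futures.ThreadPoolExecutor(max_workers=4)

def local_smoke(step):
    # Cheap effect the console can always produce within the frame.
    return f"low-detail smoke {step}"

def cloud_smoke(step, delay_s):
    # Stand-in for the remote computation; the delay simulates the
    # network round trip (or dropped packets that never come back).
    time.sleep(delay_s)
    return f"high-detail smoke {step}"

def frame(step, cloud_delay_s, budget_s=0.016):
    # Speculatively kick off the expensive path, but never let the
    # frame wait past its budget: fall back to the local result.
    future = pool.submit(cloud_smoke, step, cloud_delay_s)
    try:
        return future.result(timeout=budget_s)
    except concurrent.futures.TimeoutError:
        return local_smoke(step)

print(frame(1, cloud_delay_s=0.001))  # high-detail smoke 1
print(frame(2, cloud_delay_s=0.200))  # low-detail smoke 2 (cloud too late)
```

Note the implication leroidys raises: the fallback path has to exist anyway, so the cloud can only ever add detail on top of what the console could already do alone.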
 

FordGTGuy

Banned
What the hell? You wrote:



I have to disprove your flights of fancy? You're the one thinking this stuff up in the first place. Why not come up with proof that the smoke was the result of distributed computing, instead of merely the Xbox One's CPU/GPU doing the hard work?

Do you understand the meaning of a guess?

Here I'll help:

predict something: to form an opinion about something without enough evidence to make a definite judgment

I never claimed that it's even going to work in the first place; I just want the people claiming that it isn't going to work to show actual evidence beyond their own opinions.
 

AZ Greg

Member
So once things are offloaded to the cloud, is it just going to create another thing that gets in the way of image quality? If they do use it for graphical effects, will we be dealing with pop-in and other undesirable effects depending on internet latency?
 

JCizzle

Member
^^ Broken link to computer world



Well, it is unrealistic to try to determine performance gains when it hasn't been tested or utilized yet. There are tangible benefits to reap, though.

As far as the infrastructure goes, it will be there, and yes, it is a real thing whether you believe it's bullshit or not.

Let's give them the benefit of the doubt and concede that yes, there really are 300k actual servers out there solely focused on Xbox Live. They still haven't told us what benefit that provides the user or what function it serves in games. It seems very unlikely that it will be a seamless thing for third parties to code for in a way that provides even minor benefits in non-MMO environments.
 

leroidys

Member
So once things are offloaded to the cloud, is it just going to create another thing that gets in the way of image quality? If they do use it for graphical effects, will we be dealing with pop-in and other undesirable effects depending on internet latency?

Yeah, I can't really think of a situation where you wouldn't get crazy, constant pop-in. I do systems-y stuff but I don't do graphics, so who knows; there are a lot of people working on this who are way smarter than me. But based on what I know about computer architecture and latency, I find it highly dubious.
 

Godslay

Banned
A big issue I see is with how the console would handle not knowing for sure when it can expect certain data back. The way a modern computer normally handles contingent calculations is that it computes both branches beforehand (or all of the ones with a high probability of being taken), and then when the condition is evaluated it takes the right branch.

What happens if you start dropping packets? The smoke (or whatever other graphical effect) would have to drop to some level worse than what the console could normally produce locally, or just disappear altogether, undermining any benefit of doing the computation in the cloud in the first place.

I just don't see it being feasible for enhancing graphics.

Basically it would require a time window for clients to return the data, as well as redundancy, so that if a node drops off another node replaces its data. If you start to drop packets or lose the connection, it would require a local fallback.

Graphics seem like the most difficult lever to pull, as they have to happen very fast and you see the results immediately.
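The time-window-plus-redundancy scheme described here can be sketched as follows. Everything in this snippet is hypothetical: the per-node delays stand in for healthy versus dropped responses, and the 50 ms window is an arbitrary deadline.

```python
import concurrent.futures
import time

pool = concurrent.futures.ThreadPoolExecutor(max_workers=16)

def node_compute(node_id, delay_s):
    # Hypothetical cloud node; delay_s models a healthy (fast)
    # response or a dropped/late one.
    time.sleep(delay_s)
    return f"result from node {node_id}"

def cloud_request(delays, window_s=0.05):
    """Send the same job to several redundant nodes, accept the first
    answer inside the time window, otherwise fall back locally."""
    futures = [pool.submit(node_compute, i, d) for i, d in enumerate(delays)]
    done, _ = concurrent.futures.wait(
        futures,
        timeout=window_s,
        return_when=concurrent.futures.FIRST_COMPLETED,
    )
    if done:
        return done.pop().result()
    return "local fallback"   # e.g. a cheaper effect computed on-console

# One healthy node out of three is enough to meet the deadline:
print(cloud_request([1.0, 0.0, 1.0]))  # result from node 1
# Every node late (packet loss, congestion): the fallback kicks in:
print(cloud_request([1.0, 1.0, 1.0]))  # local fallback
```

The redundancy costs extra server work and bandwidth, which is part of why latency-critical graphics look like the hardest case.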
 

Aeonin

Member
Do you have the reading comprehension or understanding of basic concepts to realize that I said it would start on the local hardware and then continue via cloud computation?

You have a severe lack of understanding when it comes to any of this.

I suggest you stop harping on people's reading and actually read the messages; they describe the most basic problem with this issue.

If the cloud were to compute graphics, you would not get the result back in time, even if the effect has been onscreen for however long. Consider the dynamic nature of games and graphics.
 

DesertFox

Member
To everyone here who is skeptical that cloud based gaming can work, please try it right now for yourself here:

OnLive

Don't get me wrong, I'm not saying that I agree that MS can pull off splitting game calculations between local hardware and cloud processing (Cloud gaming vs offloading seem like entirely different beasts) - but for those of you who think that a game cannot be played over the cloud for latency reasons, please try it for yourself.
 

Perkel

Banned

Nailed it. This also corresponds to some people in this thread.

As for real cloud-compute ideas, it would be awesome to see procedural sandbox games generated via server farm.

Something like the Dwarf Fortress world generator, but in full 3D and on a much, much bigger scale with many more things in it. Probably a client-based single-player RPG.

Do you have the reading comprehension or understanding of basic concepts to realize that I said it would start on the local hardware and then continue via cloud computation?



I'll take that as a no.

Believe what you want, but you won't see it in games.
 

ymmv

Banned
Do you understand the meaning of a guess?

I never claimed that it's even going to work in the first place; I just want the people claiming that it isn't going to work to show actual evidence beyond their own opinions.

What is it? Are you guessing or providing evidence to spite the naysayers? You're blind to the contradiction in your statements.
 

2MF

Member
This is not entirely true; it highly depends on the game and the type of game it is.

There is a lot of latency-insensitive stuff you can pull off, and even more stuff that can start on the local hardware and continue on the cloud once it catches up.

Global lighting and weather are pretty good examples because you have no direct control over them as a player.

In a next-gen Gears of War or Halo you could have entire battle scenes in the background done in the cloud that don't affect the player but are completely visual, leaving any local and low-latency computations to the local hardware.

http://en.wikipedia.org/wiki/Global_illumination

As for weather, that sounds possible but very contrived. Same for the far-away battle scenes: how much benefit is there to having them on the cloud compared to just pre-rendering them or rendering them at low detail locally?

There is also latency sensitive stuff that can be pulled off as well.

Doing latency sensitive stuff with high latency is a contradiction. What do you mean?
 

leroidys

Member
To everyone here who is skeptical that cloud based gaming can work, please try it right now for yourself here:

OnLive

Don't get me wrong, I'm not saying that I agree that MS can pull off splitting game calculations between local hardware and cloud processing (Cloud gaming vs offloading seem like entirely different beasts) - but for those of you who think that a game cannot be played over the cloud, please try it for yourself.

This is just streaming video though...
 

ymmv

Banned
To everyone here who is skeptical that cloud based gaming can work, please try it right now for yourself here:

OnLive

Don't get me wrong, I'm not saying that I agree that MS can pull off splitting game calculations between local hardware and cloud processing (Cloud gaming vs offloading seem like entirely different beasts) - but for those of you who think that a game cannot be played over the cloud, please try it for yourself.

OnLive is cloud-based rendering, not distributed computing.

Sony's Gaikai is fundamentally different from what MS is proposing.
 

Aeonin

Member
To everyone here who is skeptical that cloud based gaming can work, please try it right now for yourself here:

OnLive

Don't get me wrong, I'm not saying that I agree that MS can pull off splitting game calculations between local hardware and cloud processing (Cloud gaming vs offloading seem like entirely different beasts) - but for those of you who think that a game cannot be played over the cloud, please try it for yourself.

No doubt, cloud-based gaming is totally here. It's cloud-enhanced gaming that is in its infancy at the moment.
 

2MF

Member
To everyone here who is skeptical that cloud based gaming can work, please try it right now for yourself here:

OnLive

Don't get me wrong, I'm not saying that I agree that MS can pull off splitting game calculations between local hardware and cloud processing (Cloud gaming vs offloading seem like entirely different beasts) - but for those of you who think that a game cannot be played over the cloud for latency reasons, please try it for yourself.

When you play a game on OnLive, the whole game engine is running with low-latency local resources; it's just doing so on OnLive's server. The only part that's cloud-based is the streaming of the final framebuffer.
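For what it's worth, the OnLive-style split described here can be modeled in a few lines: the client uploads only input, while the whole engine runs server-side and streams frames back. This toy sketch ignores real encoding and networking entirely; the structure is the point.

```python
# Toy model of the OnLive-style split: the client uploads only input;
# the whole engine runs server-side and streams frames back.
# "Encoding" here is just a string; the structure is the point.

def server_tick(game_state, player_input):
    # Everything latency-critical happens here, next to the engine.
    game_state["x"] += player_input                  # run the simulation
    frame = f"encoded frame at x={game_state['x']}"  # render + encode
    return game_state, frame

def client_session(inputs):
    state = {"x": 0}
    frames = []
    for player_input in inputs:   # upstream: a few bytes per tick
        state, frame = server_tick(state, player_input)
        frames.append(frame)      # downstream: compressed video frames
    return frames

print(client_session([1, 1, -1])[-1])  # encoded frame at x=1
```

This is why OnLive says nothing about split computation: from the engine's point of view, everything is still local; only video crosses the internet.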
 

DesertFox

Member
OnLive is cloud-based rendering, not distributed computing.

Sony's Gaikai is fundamentally different from what MS is proposing.

No doubt, cloud-based gaming is totally here. It's cloud-enhanced gaming that is in its infancy at the moment.

I agree with both of you, cloud computing is not the equal of OnLive. The point I'm trying to make is that if they can tweak their software to the point where user input over the internet can appear to be "real time" while rendering is done elsewhere, then perhaps something similar actually is achievable with Xbox One and distributed computing.

Like I said, I remain skeptical. But I was also skeptical of OnLive before I tried it.

When you play a game on Onlive, the whole game engine is running with low-latency local resources. It's just doing so on Onlive's server. The only part that's cloud-based is the streaming of the final framebuffer.

I don't think that's correct. You're essentially watching a video stream while the client-side software sends your input to the cloud rendering system. It's done fast enough that there is little to no perceptible input lag.
 

Perkel

Banned
To everyone here who is skeptical that cloud based gaming can work, please try it right now for yourself here:

OnLive

Don't get me wrong, I'm not saying that I agree that MS can pull off splitting game calculations between local hardware and cloud processing (Cloud gaming vs offloading seem like entirely different beasts) - but for those of you who think that a game cannot be played over the cloud for latency reasons, please try it for yourself.

That is cloud gaming, not cloud compute (which is supposed to help local rendering).

It's a completely different thing, where you send only your control data (a very, very small amount) to the server; all rendering and everything else is done on the server, and the server sends you video of the result.
 

DesertFox

Member
That is cloud gaming, not cloud compute (which is supposed to help local rendering).

It's a completely different thing, where you send only your control data (a very, very small amount) to the server; all rendering and everything else is done on the server, and the server sends you video of the result.

I already said in my initial post that I agree they are two fundamentally different things. I'm only trying to provide some insight into what I believe to be the closest existing technology to what MS is proposing. I don't see how cloud computing could realistically assist real-time rendering taking place on a local machine either.

Maybe those people lucky enough to have Google Fiber could benefit from it? My cable internet connection? Probably too slow...

Only when the server is near you, which is the whole point of OnLive's server placement and also why OnLive had problems (costly infrastructure). A guy from NY must use NY servers; a dude from LA needs to use LA servers.

If you are not near a server, the lag is unbearable.

Ah - I must have lucked out then with where I'm located. Never looked that far into how successful it was elsewhere.
 
I love how FordGT keeps calling everyone a fanboy and questioning everyone's intelligence while at the same time posting the most ridiculous and completely oblivious posts, lol.

Hilarious, really.
 

Perkel

Banned
I agree with both of you, cloud computing is not the equal of OnLive. The point I'm trying to make is that if they can tweak their software to the point where user input over the internet can appear to be "real time" while rendering is done elsewhere, then perhaps something similar actually is achievable with Xbox One and distributed computing.

Like I said, I remain skeptical. But I was also skeptical of OnLive before I tried it.



I don't think that's correct. You're essentially watching a video stream while the client-side software sends your input to the cloud rendering system. It's done fast enough that there is little to no perceptible input lag.

Only when the server is near you, which is the whole point of OnLive's server placement and also why OnLive had problems (costly infrastructure). A guy from NY must use NY servers; a dude from LA needs to use LA servers.

If you are not near a server, the lag is unbearable.
 

2MF

Member
I don't think that's correct. You're essentially watching a video stream while the client-side software sends your input to the cloud rendering system. It's done fast enough that there is little to no perceptible input lag.

That's exactly what I meant. Except the last part, since the lag is quite perceptible at least when playing with a mouse (I didn't try with a controller).
 

Micerider

Member
In a next-gen Gears of War or Halo you could have entire battle scenes in the background done in the cloud that don't affect the player but are completely visual, leaving any local and low-latency computations to the local hardware.

If a rendering element is not impacted by the player's input, you might as well pre-render it, which is much more convenient than "cloud" calculation...
 

DesertFox

Member
That's exactly what I meant. Except the last part, since the lag is quite perceptible at least when playing with a mouse (I didn't try with a controller).

Sorry, your sentence was confusing; you said "local resources" before you mentioned OnLive's servers. It made it sound like you were saying the client was running the engine.
 