
Jonathan Blow Criticizes MS’s Claim of Increasing Servers to 300K, Calls It A Lie!

Foaloal

Member
Oh c'mon now?

Are you honestly saying, on GAF for that matter, that cloud computing isn't possible?

Why isn't it possible? Please explain what Microsoft is doing and how they're doing it and why it cannot work.

Thanks.

He is talking about the XO going from "10x more powerful" to "40x more powerful" than an xbox 360 with the power of the cloud. Do you really believe the cloud will add the power of 30 xbox 360s to each Xbone?
 

ymmv

Banned
And yet he does not, which is the point I am trying to make. He is bias and should not be taken seriously in this context.

If Sony had said that the PS4 was going to be three times more powerful than the Xbone thanks to the power of the cloud, they'd be criticized too.

<whitney-show-me-the-receipts.jpg>
 

FordGTGuy

Banned
He is talking about the XO going from "10x more powerful" to "40x more powerful" than an xbox 360 with the power of the cloud. Do you really believe the cloud will add the power of 30 xbox 360s to each Xbone?

Will they make it 40x more powerful? No.

Is it possible? Definitely.

If Sony had said that the PS4 was going to be three times more powerful than the Xbone thanks to the power of the cloud, they'd be criticized too.

<whitney-show-me-the-receipts.jpg>

On NeoGAF?

More like praised.
 

2MF

Member
The 40x performance gains number is bs, but there are tangible benefits to offloading computations in a distributed manner. It is possible.

Game engines are very dependent on low-latency calculations, and have very few latency-tolerant parts. Yeah, you can put some AI on the cloud so that fish can move out of the way, but that saves you just a small amount of CPU.

Graphics rendering is definitely a low-latency thing.
 

FordGTGuy

Banned
Game engines are very dependent on low-latency calculations, and have very few latency-tolerant parts. Yeah, you can put some AI on the cloud so that fish can move out of the way, but that saves you just a small amount of CPU.

Graphics rendering is definitely a low-latency thing.

This is not entirely true; it depends heavily on the game and what type of game it is.

There is a lot of latency-insensitive stuff you can pull off, and even more that can start on the local hardware and be continued in the cloud once it catches up.

Global lighting and weather are pretty good examples because you have no direct control over them as a player.

On a next-gen Gears of War or Halo you could have entire battle scenes in the background computed in the cloud that don't affect the player and are purely visual, leaving the low-latency computations to the local hardware.
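
To make the split concrete: the idea is that per-frame work never waits on the network, while latency-tolerant jobs get kicked off and folded back in whenever they return. A toy Python sketch of that pattern (the thread pool just stands in for remote servers; the battle function and all the numbers are made up):

import time
from concurrent.futures import ThreadPoolExecutor

cloud = ThreadPoolExecutor(max_workers=4)      # stand-in for remote servers

def simulate_background_battle(seed):
    time.sleep(0.2)                            # pretend the round trip takes ~200 ms
    return {"seed": seed, "casualties": seed * 3}

pending = []
for frame in range(60):                        # one second of a 60 fps loop
    # latency-sensitive work (input, physics, rendering) stays local - omitted here

    if frame % 30 == 0:                        # occasionally kick off tolerant work
        pending.append(cloud.submit(simulate_background_battle, frame))

    # fold in whatever has finished; never block the frame waiting for it
    finished = [f for f in pending if f.done()]
    pending = [f for f in pending if not f.done()]
    for f in finished:
        result = f.result()                    # e.g. update distant scenery or stats

    time.sleep(1 / 60)                         # spend the rest of the frame budget

cloud.shutdown()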
 

Alx

Member
He is talking about the XO going from "10x more powerful" to "40x more powerful" than an xbox 360 with the power of the cloud. Do you really believe the cloud will add the power of 30 xbox 360s to each Xbone?

Continuously, certainly not. For peak calculations? Yeah, why not?
When you ask Siri to recognize a command, it is processed by a server that is much more powerful than a regular iPhone. Certainly more than 4 times. So for the duration of that task, the cloud increased the power of the iPhone by a factor of 4 (or more).
 

yurinka

Member
Cloud or smoke? I think improving graphics with cloud computing is bullshit, especially when games were not supposed to require being always online.
 

Godslay

Banned
This is not entirely true; it depends heavily on the game and what type of game it is.

There is a lot of latency-insensitive stuff you can pull off, and even more that can start on the local hardware and be continued in the cloud once it catches up.

Global lighting and weather are pretty good examples because you have no direct control over them as a player.

On a next-gen Gears of War or Halo you could have entire battle scenes in the background computed in the cloud that don't affect the player and are purely visual, leaving the low-latency computations to the local hardware.

There is latency-sensitive stuff that can be pulled off as well.
 

FordGTGuy

Banned
There is latency-sensitive stuff that can be pulled off as well.

Yep. For instance, in a game like Forza, when you start a burnout the local hardware can start the smoke effects, and once the servers catch up they can continue those effects without bogging down the CPU.

In fact, if you watch the Forza 5 trailer you can see a ton of smoke lingering after one of the McLarens drifts around the corner; I'm guessing this is how they are pulling it off.
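
Purely as a hypothetical illustration of that hand-off (nothing here is based on anything Turn 10 has shown; the functions and numbers are invented): the console starts a cheap local smoke sim immediately, and once a precomputed remote result for the same effect arrives, playback switches over to it.

import random

def local_step(particles):
    # cheap local update: drift upward with a little jitter
    return [(x + random.uniform(-0.1, 0.1), y + 0.5) for x, y in particles]

def fake_remote_simulation(initial, frames):
    # pretend a server simulated the rest of the effect and returned it in a batch
    states, current = [], initial
    for _ in range(frames):
        current = local_step(current)          # a real server would do something richer
        states.append(current)
    return states

particles = [(0.0, 0.0)] * 8                   # burnout starts: spawn smoke locally
remote_states = None
for frame in range(120):
    if frame == 20 and remote_states is None:
        # ~20 frames in, the "cloud" result covering the rest of the effect lands
        remote_states = iter(fake_remote_simulation(particles, 100))
    if remote_states is not None:
        particles = next(remote_states)        # hand off: play back the remote frames
    else:
        particles = local_step(particles)      # until then, keep simulating locally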
 

Dunlop

Member
If Sony had said that the PS4 was going to be three times more powerful than the Xbone thanks to the power of the cloud, they'd be criticized too.

<whitney-show-me-the-receipts.jpg>

They did mention multiple times that they would have the world's fastest gaming network

<whitney-show-me-the-receipts.jpg>
 

ymmv

Banned
This is not entirely true; it depends heavily on the game and what type of game it is.

There is a lot of latency-insensitive stuff you can pull off, and even more that can start on the local hardware and be continued in the cloud once it catches up.

Global lighting and weather are pretty good examples because you have no direct control over them as a player.

But what's the point? You could perhaps move a number of CPU-intensive calculations to the cloud, but it also creates additional overhead and latency. Plus every single "cloud-powered" subroutine should have an equivalent "local" version running on the Xbone - unless you want games that won't even run without an internet connection. Another thing: I don't think that processing power will be free for devs to use. I suspect publishers will have to rent/buy server capacity if they want to use distributed computing in their games.
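
That fallback point is worth spelling out: every cloud-assisted routine needs a timeout and a cheaper local path behind it so the game still works offline. A minimal Python sketch, with a made-up endpoint and payload:

import json
import urllib.request

def cloud_ai_plan(world_state, url="http://example.invalid/ai", timeout=0.05):
    """Ask a server for an AI plan; fall back to a cheap local plan on any failure."""
    try:
        req = urllib.request.Request(url, data=json.dumps(world_state).encode(),
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return json.loads(resp.read()), "cloud"
    except (OSError, ValueError):
        # offline, too slow, or a garbled response: degrade gracefully
        return {"action": "patrol"}, "local"

plan, source = cloud_ai_plan({"enemies": 3, "player_pos": [10, 4]})
print(source, plan)                            # prints the "local" plan when there is no server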
 
People find it hard to believe because they didn't say "(virtual) servers". They just said "servers". And 300,000 physical servers dedicated to Xbox Live is hard to believe.

Really though, you would have to be a complete idiot when it comes to modern-day network infrastructure to assume they were physical. They really shouldn't have to clarify.
 

FordGTGuy

Banned
But what's the point? You could perhaps move a number of CPU-intensive calculations to the cloud, but it also creates additional overhead and latency. Plus every single "cloud-powered" subroutine should have an equivalent "local" version running on the Xbone - unless you want games that won't even run without an internet connection. Another thing: I don't think that processing power will be free for devs to use. I suspect publishers will have to rent/buy server capacity if they want to use distributed computing in their games.

My point is that it's possible even if people on GAF think it's not.

Moving a few CPU-intensive operations to the cloud would be huge in freeing up the CPU for local work.

As for offline, I would expect an LoD drop, most effects to be lessened, and a few gone completely until you reconnect.
 

maeh2k

Member
Of course he's right that it's very hard to use the power of the cloud in games. Some of the examples that were mentioned like lighting may even be very challenging to impossible. But that stuff is still being researched and experimented with. I wouldn't discount it as bullshit just yet, even if the first games that will utilize it may be a long way off.
Other applications like AI in the cloud seem quite feasible. Maybe not for every type of game, though.

I wouldn't discount the potential performance gains, either. For one they said it was like having one One at home, and three Ones in the cloud, so 4x the performance and not 40x. That obviously doesn't make the One four times as fast and will only have a limited number of applications, but in situations where using the cloud works, I have no doubt we'll see these kinds of gains (if a developer really decides to use the cloud / always-online).

Don't forget that the One will be used for years. If someone said that right now on the 360, after eight years, you could do some computation in the cloud so that the computation would run four times as fast as on a single 360, then that would sound pretty believable to me, since computers tend to be a lot faster than the 360 now.


When I think about that cloud stuff I don't really think realtime lighting. How about a game like Civilization? It should be relatively easy and use little bandwidth to do the whole AI computation in the cloud. In those kinds of games, waiting for AI moves can take quite a bit of time. It could theoretically also come with other benefits, e.g. if someone made an AI that uses machine learning and could learn from all the games that are played, that might be interesting.
 

FordGTGuy

Banned
Of course he's right that it's very hard to use the power of the cloud in games. Some of the examples that were mentioned like lighting may even be very challenging to impossible. But that stuff is still being researched and experimented with. I wouldn't discount it as bullshit just yet, even if the first games that will utilize it may be a long way off.
Other applications like AI in the cloud seem quite feasible. Maybe not for every type of game, though.

I wouldn't discount the potential performance gains, either. For one they said it was like having one One at home, and three Ones in the cloud, so 4x the performance and not 40x. That obviously doesn't make the One four times as fast and will only have a limited number of applications, but in situations where using the cloud works, I have no doubt we'll see these kinds of gains (if a developer really decides to use the cloud / always-online).

Don't forget that the One will be used for years. If someone said that right now on the 360, after eight years, you could do some computation in the cloud so that the computation would run four times as fast as on a single 360, then that would sound pretty believable to me, since computers tend to be a lot faster than the 360 now.


When I think about that cloud stuff I don't really think realtime lighting. How about a game like Civilization? It should be relatively easy and use little bandwidth to do the whole AI computation in the cloud. In those kinds of games, waiting for AI moves can take quite a bit of time. It could theoretically also come with other benefits, e.g. if someone made an AI that uses machine learning and could learn from all the games that are played, that might be interesting.

Forza 5 is apparently using it as a launch title. It's nice to finally see an intelligent post in one of these threads.
 

Godslay

Banned
Of course he's right that it's very hard to use the power of the cloud in games. Some of the examples that were mentioned like lighting may even be very challenging to impossible. But that stuff is still being researched and experimented with. I wouldn't discount it as bullshit just yet, even if the first games that will utilize it may be a long way off.
Other applications like AI in the cloud seem quite feasible. Maybe not for every type of game, though.

I wouldn't discount the potential performance gains, either. For one they said it was like having one One at home, and three Ones in the cloud, so 4x the performance and not 40x. That obviously doesn't make the One four times as fast and will only have a limited number of applications, but in situations where using the cloud works, I have no doubt we'll see these kinds of gains (if a developer really decides to use the cloud / always-online).

Don't forget that the One will be used for years. If someone said that right now on the 360, after eight years, you could do some computation in the cloud so that the computation would run four times as fast as on a single 360, then that would sound pretty believable to me, since computers tend to be a lot faster than the 360 now.


When I think about that cloud stuff I don't really think realtime lighting. How about a game like Civilization? It should be relatively easy and use little bandwidth to do the whole AI computation in the cloud. In those kinds of games, waiting for AI moves can take quite a bit of time. It could theoretically also come with other benefits, e.g. if someone made an AI that uses machine learning and could learn from all the games that are played, that might be interesting.

AI stuff works for FPS too. Oddly enough.
 

Vol5

Member
If MS gave any kind of shit about power they would have a better GPU. The amount of damage control coming from them is laughable. PS4 is more powerful. Deal with it.
 

Tellaerin

Member
And yet he does not, which is the point I am trying to make. He is bias and should not be taken seriously in this context.

Biased. Bias is a noun, biased is an adjective.

I have no idea whether fucking that up is some meme I don't know about or people just don't know the difference, but either way, it's annoying.
 

Hana-Bi

Member
If Forza at 60fps looks better than Drive Club at 30fps, I don't care what magic sauce lies in the cloud.

I hope we hear some examples of cloud computing at E3...
 

Perkel

Banned
Yep. For instance, in a game like Forza, when you start a burnout the local hardware can start the smoke effects, and once the servers catch up they can continue those effects without bogging down the CPU.

In fact, if you watch the Forza 5 trailer you can see a ton of smoke lingering after one of the McLarens drifts around the corner; I'm guessing this is how they are pulling it off.

Global lighting and weather are pretty good examples because you have no direct control over them as a player.

You have absolutely no idea what you are talking about; in a few minutes you will probably say that thanks to the cloud, textures will be bigger and there will be more trees in the forest.

Weather simulation, as in when rain happens: YES
Better weather effects on screen: NO

If you have 33ms (30FPS) or 16.6ms (60FPS) per frame, you can't offload any rendering/graphical task to the cloud, because most people's latency is above 50ms, sometimes even 100ms.

This is essentially why you either have small multiplayer games (FPS games) that can do proper action-based gameplay, or you have games like WoW where everything is far from action. And that is just for simple data like gameplay logic.

The cloud as you describe it will happen, but not in the next 5-10 years.
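
The frame-budget arithmetic behind that is easy to check (using the 50ms and 100ms round-trip figures above):

# how many whole frames a cloud round trip costs at each frame rate
frame_budget_ms = {"30 FPS": 1000 / 30, "60 FPS": 1000 / 60}
for label, budget in frame_budget_ms.items():
    for rtt in (50, 100):                      # round-trip latency in milliseconds
        print(f"{label}: a {rtt} ms round trip spans {rtt / budget:.1f} frames")
# At 60 FPS a 100 ms round trip arrives ~6 frames after it was needed, which is
# why anything that has to land in the current frame stays on the local hardware.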
 

FordGTGuy

Banned
If Forza at 60fps looks better than Drive Club at 30fps, I don't care what magic sauce lies in the cloud.

I hope we hear some examples of cloud computing at E3...

I think the best thing they could do is have Xbox Ones running on normal connections, one offline and one online, side by side, showing the differences in real time.

I really don't expect this though.

You have absolutely no idea what you are talking about; in a few minutes you will probably say that thanks to the cloud, textures will be bigger and there will be more trees in the forest.

Weather simulation, as in when rain happens: YES
Better weather effects on screen: NO

If you have 33ms (30FPS) or 16.6ms (60FPS) per frame, you can't offload any rendering/graphical task to the cloud, because most people's latency is above 50ms, sometimes even 100ms.

This is essentially why you either have small multiplayer games (FPS games) that can do proper action-based gameplay, or you have games like WoW where everything is far from action. And that is just for simple data like gameplay logic.

The cloud as you describe it will happen, but not in the next 5-10 years.

You really don't understand the idea of latency-insensitive computations, do you?

Are you really so blinded by your fanboyism?
 

fallingdove

Member
Yep for instance in a game like Forza, when you go to start a burnout the local hardware can start the smoke effects and once the servers catch up they can continue these effects without bogging down the CPU.

In fact if you watch the Forza 5 trailer you can see a ton of smoke lingering after one of the McLarens drift around the corner I'm actually guessing this is what they are doing to pull it off.
You must write for tales from my ass.
 

ryamkajr

Banned
God, he is living up to his last name - Blow(s). One "decent" game does not make you an expert on the business of games.
He has become as whiny a b*tch as Molyneux is an egotistical marketer.

I wish these d-bags would just shut up and make games.
 
Jonathan Blow, who is making an "exclusive" game for the PS4, is bad-mouthing the PS4's direct competitor? I wouldn't pay attention to him; he has a horse in this race and he will say whatever he wants so people buy more PS4s over XBones. Every PS4 sale is a possible Witness sale = money in his pocket.

Regardless of whether he's right or not, who do you think will have more servers running by launch, Sony or MS?
 
God, he is living up to his last name - Blow(s). One "decent" game does not make you an expert on the business of games.
He has become as whiny a b*tch as Molyneux is an egotistical marketer.

I wish these d-bags would just shut up and make games.

Nothing he said warrants this kind of personal attack.
 

ymmv

Banned
Do you have any evidence that something like this isn't possible?

Or are you talking out of your ass too?

Because it could be true, it must be true? Why not first assume that what you see on screen is powered by the Xbox One instead of attributing it to the power of the cloud? It's not like a bit of hovering smoke is beyond the power of even current gen consoles ...
 

StuBurns

Banned
God, he is living up to his last name - Blow(s). One "decent" game does not make you an expert on the business of games.
He has become as whiny a b*tch as Molyneux is an egotistical marketer.

I wish these d-bags would just shut up and make games.
Thank God you used an asterisk, or I'd have thought your post was obnoxious.
 

Auto_aim1

MeisaMcCaffrey
God, he is living up to his last name - Blow(s). One "decent" game does not make you an expert on the business of games.
He has become as whiny a b*tch as Molyneux is an egotistical marketer.

I wish these d-bags would just shut up and make games.
Seriously? He's an intelligent guy. There's no need to dismiss him like that. Besides, Microsoft needs to clarify a lot of things; people are going to question things they doubt or that don't sound right.
 
There is some military-grade naivete going on in here. It doesn't matter how many computations the cloud is capable of because all it will be doing is pushing Mountain Dew ads to your in-game billboards and storing screenshots. There is no infrastructure for delivering what MS is claiming. Every performance multiplier being thrown around here is just laughable.
 

Perkel

Banned
You really don't understand the idea of latency-insensitive computations, do you?

Are you really so blinded by your fanboyism?

I don't think you know what is latency-insensitive and what is sensitive in the first place.
Smoke effects/lighting just prove that.

Cloud computation is nothing new. Every MMO game is doing cloud compute, be it player positions, stats, gameplay, etc. It is cloud compute.
 

Aeonin

Member
You must write for tales from my ass.

Haha - perfect response. Gave me a good hearty laugh.

driver116 said:
Cloud rendering, what a load of BS

Indeed.

How anyone with any knowledge of low-latency graphics could be anything but skeptical is beyond me.

The idea of leaving smoke effects on a car to the cloud is almost "polish the graphics on level 5". You wouldn't get that shit back in any relevant time frame.
 

Alx

Member
When I think about that cloud stuff I don't really think realtime lighting. How about a game like Civilization? It should be relatively easy and use little bandwidth to do the whole AI computation in the cloud. In those kinds of games, waiting for AI moves can take quite a bit of time. It could theoretically also come with other benefits, e.g. if someone made an AI that uses machine learning and could learn from all the games that are played, that might be interesting.

Yeah, turn-based games seem like an easy way to use cloud-based AI (too bad it's a rare genre today :D).
Since I'm no expert in game architecture, I always had the example of chess games in mind, because the AI is based on testing all possible sequences of moves. While a console might be able to look 5 moves ahead with its local hardware, it would be easy to send a small amount of data (the current board configuration) to a powerful server, have it look 10 moves ahead, and get the result back. And voilà, you're playing against a supercomputer, while using it only for a few milliseconds here and there.
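
For what it's worth, that kind of turn-based offload really is tiny on the wire, and the whole win is in search depth. A toy Python illustration with a made-up game and move count (nothing chess-specific):

import json

def search(position, depth):
    """Count positions visited by a brute-force lookahead (stand-in for minimax)."""
    if depth == 0:
        return 1
    return sum(search(position + [move], depth - 1) for move in range(8))  # 8 fake moves

position = [3, 4, 12]                          # some encoded board state
payload = json.dumps(position)                 # what would actually cross the network
print(f"payload: {len(payload)} bytes per turn")

print("console, depth 3:", search(position, 3), "positions")   # 8**3 = 512
print("server,  depth 6:", search(position, 6), "positions")   # 8**6 = 262,144
# The deeper search costs one round trip, which a turn-based game can easily hide
# behind the opponent's "thinking" animation.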
 

FordGTGuy

Banned
Because it could be true, it must be true? Why not first assume that what you see on screen is powered by the Xbox One instead of attributing it to the power of the cloud? It's not like a bit of hovering smoke is beyond the power of even current gen consoles ...

I'm not the one with the burden of proof.

I merely stated a hypothetical example; I have no proof to back it up and I never claimed to have any.

However, if you're going to claim that it's not possible, the burden of proof is on you to back up that claim.

Haha - perfect response.

Indeed.

How anyone with any knowledge of low-latency graphics could be anything but skeptical is beyond me.

The idea of leaving smoke effects on a car to the cloud is almost "polish the graphics on level 5". You wouldn't get that shit back in any relevant time frame.

Another person who has no understanding of what we are talking about.

I don't think you know what is latency-insensitive and what is sensitive in the first place.
Smoke effects/lighting just prove that.

Cloud computation is nothing new. Every MMO game is doing cloud compute, be it player positions, stats, gameplay, etc. It is cloud compute.

Did you even read what I said about smoke effects?

I said that the local hardware would start them and the cloud would then pick up once the latency catches up and continue the smoke effects without bogging down the CPU.
 

Godslay

Banned
There is some military-grade naivete going on in here. It doesn't matter how many computations the cloud is capable of because all it will be doing is pushing Mountain Dew ads to your in-game billboards and storing screenshots. There is no infrastructure for delivering what MS is claiming. Every performance multiplier being thrown around here is just laughable.

Back your assertion with data. Nobody is throwing multipliers around.
 

zou

Member
Blow is full of shit himself.

Of course Azure is running on virtualized hardware; anything else wouldn't really be feasible. And it's pretty much standard: AWS is all "VMs", as is Google Compute.

As for the 300k number: While no one in the industry gives out this sort of information, it's unlikely that 300k represents anywhere near the total number of physical machines. So 300k could very well be the number of dedicated servers just for XB1. In 2008 Microsoft mentioned they were adding tens of thousands of servers a month. That was before they even launched Azure. In 2009 they opened up their Chicago data center, which can hold anywhere from 200k-400k servers. That's one data center, almost 5 years ago. Since then they've opened additional data centers in Iowa, Ireland, Amsterdam and Virginia, to name a few.

So yes, Blow is full of shit.

Sources:

At a conference in Las Vegas last week, Michael Manos, Microsoft's senior director of data center services, said in a keynote speech that the first floor of a data center being built by the software vendor in the Chicago area will hold up to 220 shipping containers, each preconfigured to support between 1,000 and 2,000 servers, according to various news reports and blog posts.

That means the $500 million, 550,000-square-foot facility in the Chicago suburb of Northlake, Ill., could have as many as 440,000 Windows servers on the first floor alone, or up to 11 times more than the total of 40,000 to 80,000 servers that conventional data centers of the same size typically can hold, according to Manos. He was quoted as saying that Microsoft also plans to install an undisclosed number of servers on the building's second floor, which will have a traditional raised-floor layout.

http://www.computerworld.com/s/arti...uilds_first_major_container_based_data_center


http://www.globalfoundationservices...icrosoft-cloud-scale-data-center-designs.aspx
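
For reference, the container math in the quoted piece checks out:

containers = 220
low, high = containers * 1_000, containers * 2_000
print(low, high)                    # 220,000 to 440,000 servers on the first floor alone
print(high / 40_000)                # 11.0 - the "up to 11 times" figure vs. 40,000 servers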
 

leroidys

Member
Yep. For instance, in a game like Forza, when you start a burnout the local hardware can start the smoke effects, and once the servers catch up they can continue those effects without bogging down the CPU.

In fact, if you watch the Forza 5 trailer you can see a ton of smoke lingering after one of the McLarens drifts around the corner; I'm guessing this is how they are pulling it off.

I don't think real-time smoke rendering is a good candidate due to how much data is involved.
 