
Why don't MS offer dedicated servers to 3rd party MP games?

gofreak

GAF's Bob Woodward
So it's not 'free' for devs? At least beyond a certain point?

The biggest 'revolution' in MS's cloud talk was the idea that it would be provisioned for 'free' or as built-in to the Live fee.

Any dev on any platform can spin up some cloud servers at a cost for their game, if they want.
 

CariusD

Member
Uh.. because there will still be people playing on the Xbox?

What I meant is, while this is impressive stuff, I don't see 3rd parties going out of their way in terms of extra dev/QA time for things like extra-detailed vegetation interaction, compared to the amounts that have been shown from a local physics system, on top of the actual bill for the servers.

For multiplayer systems I can imagine it is a good price for smaller studios.
 

Eoin

Member
What I meant is, while this is impressive stuff, I don't see 3rd parties going out of their way in terms of extra dev/QA time for things like extra-detailed vegetation interaction, compared to the amounts that have been shown from a local physics system, on top of the actual bill for the servers.

For multiplayer systems I can imagine it is a good price for smaller studios.
Age of Ascent is a PC game though, isn't it? That tweet doesn't really tell us anything about cost for the Xbox One.
 

Qassim

Member
What I meant is, while this is impressive stuff, I don't see 3rd parties going out of their way in terms of extra dev/QA time for things like extra-detailed vegetation interaction, compared to the amounts that have been shown from a local physics system, on top of the actual bill for the servers.

For multiplayer systems I can imagine it is a good price for smaller studios.

It can be used for something as simple as multiplayer servers. If Microsoft give developers a good deal (which it sounds like they are) and there is good integration into the SDK, it may be silly not to use them. The advantage of cloud platforms like this is that you can rapidly scale up and down based on demand without any of the cost of running the infrastructure yourself.

I thought that these servers were MS platform exclusive; if they aren't, then I admit that weakens my argument.

Microsoft provide the Azure platform for anyone to use for almost any purpose. https://azure.microsoft.com/en-us/

Apple have been known to use Microsoft Azure to help deliver some of their own services that are in direct competition with other Microsoft services. To an Azure customer, Microsoft is just a service provider, and if they placed competition-based limits on the platform, no one would use it. Even if someone weren't immediately planning to use Azure to compete with Microsoft, what if Microsoft decided to move into another area of business and they then became competitors? That's why these cloud service platforms (Google's, Amazon's, Microsoft's, etc.) remain neutral.

Consolidating infrastructure may help bring down costs, so I wouldn't be surprised if, despite the discounts on the Xbox side, we see developers use the same Azure platform to host services for the PlayStation versions of their games too. If they want to use one of these cloud platforms anyway, it may reduce costs significantly to have all versions of the game running off a single infrastructure (reduced cost of maintaining different toolsets for deployment, simpler troubleshooting, etc.).
 

jimi_dini

Member
I don't see why, based on this tweet, https://twitter.com/lee_stott/status/486437448348860417, 3rd parties would pay $6.5 million a year per 1 million users when the majority of sales are going to the PlayStation.

Wait, why are players forced to pay for online, when 3rd parties have to pay for servers as well? And does this mean that 3rd parties will have to pay each year? Which means they could just shut down those servers prematurely? Or are 3rd parties required to pay until "the two" gets released?
 

CariusD

Member
Wait, why are players forced to pay for online, when 3rd parties have to pay for servers as well? And does this mean that 3rd parties will have to pay each year? Which means they could just shut down those servers prematurely? Or are 3rd parties required to pay until "the two" gets released?

Because of the need for profits; and yes, the third parties would need to pay as long as they use them. They could shut down whenever they wanted to before, nothing changes that. And I assume they could work out an overarching plan that would be per studio per year, not necessarily per project.
 

Bsigg12

Member
Because

A) That's actually a pretty decent price for the level of service being provided and

B) You can use these servers for PlayStation games.

I'm pretty sure the heavily discounted price for using them only applies to what you use for Xbox. If you went on to use them for PlayStation, they probably move the rate back to their standard corporate cloud compute prices.
 
I'm pretty sure the heavily discounted price for using them only applies to what you use for Xbox. If you went on to use them for PlayStation, they probably move the rate back to their standard corporate cloud compute prices.

That price was for regular use; the game in question was on PC. It'd be even cheaper on Xbox.
 
This is what I am allowed to share.

sequenz010wuvh.gif


Running in real-time on XBO. Very early WIP, so don't mind the lighting and so on. It is a very basic frequency test where the grass splines update 12 times a second. This is nothing special so far. The cool thing, though, is that the start and end points of our splines, influenced by wind and objects, are being calculated by Azure. This means the physics calculations you see are costing us pretty much no local power (excluding GPU, of course). We can use the saved power for other things, like AI, animations and so on. We are very proud of it, especially since we completely eliminated any chance of clipping. I just wanted to add that here.

And no, this won't be a golf/grass/whatever simulator - I just thought maybe it would be interesting to see ;)

dam...
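The 12 Hz server tick described above implies the client has to smooth between server snapshots to render at 60 fps. Here is a minimal, purely hypothetical sketch of that interpolation; none of the names or numbers come from the studio's actual code:

```python
# Hypothetical client-side smoothing of low-frequency server physics:
# Azure sends new spline endpoints 12 times a second while the game
# renders at 60 fps, so the client blends between the last two states.

SERVER_HZ = 12
RENDER_HZ = 60

def lerp(a, b, t):
    return a + (b - a) * t

def endpoint_at_frame(prev_state, next_state, frame_in_tick):
    """Blend two 12 Hz server snapshots across the render frames that
    fall between them (60 / 12 = 5 frames per server tick)."""
    frames_per_tick = RENDER_HZ // SERVER_HZ
    t = frame_in_tick / frames_per_tick
    return [lerp(p, n, t) for p, n in zip(prev_state, next_state)]

# Endpoint of one grass blade moving from x=0.0 to x=1.0 over a tick:
print(endpoint_at_frame([0.0], [1.0], 2))   # -> [0.4]
```

The same pattern would apply to any quantity the server computes at a lower rate than the client renders.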
 

JaggedSac

Member
This is what I am allowed to share.

sequenz010wuvh.gif


Running in real-time on XBO. Very early WIP, so don't mind the lighting and so on. It is a very basic frequency test where the grass splines update 12 times a second. This is nothing special so far. The cool thing, though, is that the start and end points of our splines, influenced by wind and objects, are being calculated by Azure. This means the physics calculations you see are costing us pretty much no local power (excluding GPU, of course). We can use the saved power for other things, like AI, animations and so on. We are very proud of it, especially since we completely eliminated any chance of clipping. I just wanted to add that here.

And no, this won't be a golf/grass/whatever simulator - I just thought maybe it would be interesting to see ;)

This is pretty cool. What kind of bandwidth does this sort of thing use?

EDIT: I see you are not banned. Must have been verified.
 

tkalamba

Member
This is what I am allowed to share.

sequenz010wuvh.gif


Running in real-time on XBO. Very early WIP, so don't mind the lighting and so on. It is a very basic frequency test where the grass splines update 12 times a second. This is nothing special so far. The cool thing, though, is that the start and end points of our splines, influenced by wind and objects, are being calculated by Azure. This means the physics calculations you see are costing us pretty much no local power (excluding GPU, of course). We can use the saved power for other things, like AI, animations and so on. We are very proud of it, especially since we completely eliminated any chance of clipping. I just wanted to add that here.

And no, this won't be a golf/grass/whatever simulator - I just thought maybe it would be interesting to see ;)

Now I want to see more
 

hipbabboom

Huh? What did I say? Did I screw up again? :(
I was hoping to read more discussion on this topic. It's weird how the thread got quiet after the reasoning proved to be valid. Surely there's more to say than "MS is being dishonest."

I was a little deflated when I heard the emphatic statement that the cloud could not be used for lighting techniques. I was hoping there would be a way to do real-time voxel volumes for photon tracing, to say something like... if a dynamic object is at this point in space, here are the static rays bouncing off of static objects that could be used for local calculations... clearly this isn't my area of expertise, but I've always been interested in such styles of massively parallel real-time remote calculations.
 

arhra

Member
I was hoping to read more discussion on this topic. It's weird how the thread got quiet after the reasoning proved to be valid. Surely there's more to say than "MS is being dishonest."

I was a little deflated when I heard the emphatic statement that the cloud could not be used for lighting techniques. I was hoping there would be a way to do real-time voxel volumes for photon tracing, to say something like... if a dynamic object is at this point in space, here are the static rays bouncing off of static objects that could be used for local calculations... clearly this isn't my area of expertise, but I've always been interested in such styles of massively parallel real-time remote calculations.

Nvidia have done some preliminary research into cloud-offloaded lighting solutions. There doesn't seem to be any reason why it can't work, just practical considerations of algorithm design and whether you can amortize the calculations across enough users to make it economically viable.
 

tkalamba

Member
I was hoping to read more discussion on this topic. It's weird how the thread got quiet after the reasoning proved to be valid. Surely there's more to say than "MS is being dishonest."

I was a little deflated when I heard the emphatic statement that the cloud could not be used for lighting techniques. I was hoping there would be a way to do real-time voxel volumes for photon tracing, to say something like... if a dynamic object is at this point in space, here are the static rays bouncing off of static objects that could be used for local calculations... clearly this isn't my area of expertise, but I've always been interested in such styles of massively parallel real-time remote calculations.

I'm in the same boat. I really want to know more about this and the discussion seems to have died.
 

hipbabboom

Huh? What did I say? Did I screw up again? :(
Nvidia have done some preliminary research into cloud-offloaded lighting solutions. There doesn't seem to be any reason why it can't work, just practical considerations of algorithm design and whether you can amortize the calculations across enough users to make it economically viable.

Nvidia's solution seemed pretty interesting, but the way it was explained during the keynote a few years ago where it was demoed made it seem like the client was a glorified thin client: "at a driver level", information was being sent back to a powerful server where the calculation happened at a massively parallel GPGPU level.

The two things that stand out as requirements for this to be viable for games are:
~ The technique would have to be moved to a higher level and compartmentalized, so that the vast majority of the work is done on the server end and the client does little to no calculation when applying the calculated result (in one round trip to the server, for efficiency).

~ I don't know if MS supports or plans to support GPGPU hardware acceleration on Azure. I remember (I'm going to get this wrong) something MS called ILM which was used to do massively parallel work, but like Nvidia's solution, these were applications designed to run on the cloud exclusively, as jobs. The amount of CPU power required for some real-time lighting calculations may be more cost-effective if GPU computing could be leveraged on the cloud end of things.

what he described has been done by MMOs for ages :p

Well, let's be realistic rather than reductive for a moment: no one is saying this is re-inventing networking or even client-server design patterns. This is just the next step in a world where computing is moving more towards parallelism. MMOs have always calculated events server-side and relayed locations and states of objects back to the client, so all the client has to do is render objects in its static world in a certain way. To your point, what's being described seems (to me at least) to employ that general paradigm as a baseline.

Where things differ is this: so far we've described something analogous to a meat grinder, where everyone's input goes in and general output goes back out to everyone. This method is more of a bank analogy, where everyone's money goes into the economy (server) and the interest rate, for example, becomes a variable that everyone has access to but will not use in exactly the same way. Ultimately, you the client do not have the resources, nor does it make sense, to do the real-time calculation of economic models that would be necessary to get that accurate interest rate value, but you can certainly use it to figure out how much a loan is going to cost you over time. This works because the rules of immediacy for interest rates are less impactful to you on a day-by-day basis, though the value might become unreliable if it's a month old. There's a time window for data coming back from the server, somewhere between every few frames and every few seconds, so you would need to be cognizant of that in designing such a system.
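The interest-rate pattern described above, one expensive server simulation publishing a small shared value that each client consumes in its own way, can be sketched roughly as follows. All function names and numbers are invented for illustration:

```python
# Hypothetical illustration of the "shared variable" pattern: the
# server runs one expensive global simulation and publishes a small
# derived result; each client applies that result locally, differently.

def server_compute_wind_field(world_seed, t):
    """Stand-in for an expensive server-side simulation that yields a
    small shared result every client can reuse."""
    # Imagine a costly fluid solve here; we fake it deterministically.
    return {"direction": (world_seed * 31 + t) % 360, "strength": 0.7}

def client_apply_to_grass(wind, blade_stiffness):
    # One client bends grass blades by wind strength scaled by local stiffness.
    return wind["strength"] * (1.0 - blade_stiffness)

def client_apply_to_flag(wind):
    # Another client only cares about the direction component.
    return wind["direction"]

wind = server_compute_wind_field(world_seed=42, t=10)
print(client_apply_to_grass(wind, blade_stiffness=0.5))  # same shared input,
print(client_apply_to_flag(wind))                        # different local uses
```

The point is that the expensive part runs once for everyone, while each client spends only trivial local cycles turning the shared value into its own effect.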
 
I'm in the same boat. I really want to know more about this and the discussion seems to have died.

Well, no one's going to know this discussion is occurring due to the thread title. Also, I'm fairly sure most people are so convinced it's bullshit that they won't be satisfied until it's in their hands and they're playing it. And even then it'll be seen as online DRM.
 

tkalamba

Member
Well, no one's going to know this discussion is occurring due to the thread title. Also, I'm fairly sure most people are so convinced it's bullshit that they won't be satisfied until it's in their hands and they're playing it. And even then it'll be seen as online DRM.

I'm hoping that at the very least, Kampfheld can come and explain more. I understand he can't give away too much as is the nature of his work, but I wouldn't mind reading more about how things are being accomplished etc.
 
Well, no one's going to know this discussion is occurring due to the thread title. Also, I'm fairly sure most people are so convinced it's bullshit that they won't be satisfied until it's in their hands and they're playing it. And even then it'll be seen as online DRM.

This is the biggest problem I see for cloud compute games. People have been burned so many times by overly restrictive DRM schemes that anyone trying to legitimately take advantage of "the cloud" is going to have a hell of a time convincing people to give them a shot. Kampfheld's solution is probably the best way to go, but programming everything twice is a tough sell to publishers.
 

arhra

Member
Nvidia's solution seemed pretty interesting, but the way it was explained during the keynote a few years ago where it was demoed made it seem like the client was a glorified thin client: "at a driver level", information was being sent back to a powerful server where the calculation happened at a massively parallel GPGPU level.

If you read the paper (or just watch the video for a higher-level overview), they actually presented three possible models using different lighting algorithms, and varying amounts of work done on the server:

pZdHhTE.png


Any cloud-augmented client is going to be "thinner" than a comparable one that does everything locally, but there are varying degrees of thinness.

The bigger issue is that, because it's Nvidia research, they naturally based it on compute nodes with GPUs included, which, as you say, Azure doesn't currently offer (although they're apparently considering it), so more infrastructure build-out would be required before it's useful to Xbox.
 

Krilekk

Banned
This is the biggest problem I see for cloud compute games. People have been burned so many times by overly restrictive DRM schemes that anyone trying to legitimately take advantage of "the cloud" is going to have a hell of a time convincing people to give them a shot. Kampfheld's solution is probably the best way to go, but programming everything twice is a tough sell to publishers.

It's just normal and cloud-powered detail levels. I think typical PC games have at least four detail levels from low to ultra, so it's nothing new. The tough sell will be getting them to pay an additional $6.50 per online user.

If the difference between offline and online is big enough, people will go online. I can't actually imagine anybody using the X1 offline; even basic functions don't work without Live.
 
This is the biggest problem I see for cloud compute games. People have been burned so many times by overly restrictive DRM schemes that anyone trying to legitimately take advantage of "the cloud" is going to have a hell of a time convincing people to give them a shot. Kampfheld's solution is probably the best way to go, but programming everything twice is a tough sell to publishers.

Also, if you're a third party, do you create a more dynamic online-only experience for XB1?

It's going to open another Pandora's box if they go down that route; the outcry would be enormous and I don't envy the developer who has to make that decision. But as an XB1 owner I hope they do take that risk.

I hope Microsoft's first parties really take advantage of this, though, and demonstrate what can be done.
 
It's just normal and cloud-powered detail levels. I think typical PC games have at least four detail levels from low to ultra, so it's nothing new. The tough sell will be getting them to pay an additional $6.50 per online user.

If the difference between offline and online is big enough, people will go online. I can't actually imagine anybody using the X1 offline; even basic functions don't work without Live.

If you turn on offline mode they do, but that's beside the point.

I was referring more to dev time. PC settings are generally not programmed separately, right? The flexibility is built into the code. The way Kamp described the process, they essentially had to build the same section of code twice: once for the cloud process, once for offline. I think, anyway. Might be misreading it.
 

tkalamba

Member
If you turn on offline mode they do, but that's beside the point.

I was referring more to dev time. PC settings are generally not programmed separately, right? The flexibility is built into the code. The way Kamp described the process, they essentially had to build the same section of code twice: once for the cloud process, once for offline. I think, anyway. Might be misreading it.

Coding it twice sounds like it could lead to so many problems with QA and bugs. I assume they'd approach it like it's approached on PC, with graphics sliders and the like.
 

hipbabboom

Huh? What did I say? Did I screw up again? :(
If you read the paper (or just watch the video for a higher-level overview), they actually presented three possible models using different lighting algorithms, and varying amounts of work done on the server:

pZdHhTE.png


Any cloud-augmented client is going to be "thinner" than a comparable one that does everything locally, but there are varying degrees of thinness.

The bigger issue is that, because it's Nvidia research, they naturally based it on compute nodes with GPUs included, which, as you say, Azure doesn't currently offer (although they're apparently considering it), so more infrastructure build-out would be required before it's useful to Xbox.

MS needs to get on the ball with this. I get the feeling that's what they've been trying to do with many of the DirectX standards they've been introducing over the past few years, but I expected them to announce this along with DX12 and it's been crickets. It leads me to believe that they don't have a strategy to normalize the competing instruction set standards.

I hope Microsoft's first parties really take advantage of this, though, and demonstrate what can be done.

This is the second reason I'm looking forward to Crackdown; I want to see how they plan to prove out the ideas behind some of the techniques they've only spoken about at a high level until now. I think the future of these technologies is also an exercise in PR management, and until internet performance and availability is as reliable as electricity, the games will hopefully be prudent enough to include ingenious fallbacks for offline play.

Coding it twice sounds like it could lead to so many problems with QA and bugs. I assume they'd approach it like it's approached on PC, with graphics sliders and the like.

Could you believe there was a much simpler time in game development, before the advent of middleware, when the underlying game code was relatively straightforward and most of the variables devs dealt with were driver fragmentation challenges for just about every input-output permutation imaginable?
 

k3rn3ll

Neo Member
MS needs to get on the ball with this. I get the feeling that's what they've been trying to do with many of the DirectX standards they've been introducing over the past few years, but I expected them to announce this along with DX12 and it's been crickets. It leads me to believe that they don't have a strategy to normalize the competing instruction set standards.



This is the second reason I'm looking forward to Crackdown; I want to see how they plan to prove out the ideas behind some of the techniques they've only spoken about at a high level until now. I think the future of these technologies is also an exercise in PR management, and until internet performance and availability is as reliable as electricity, the games will hopefully be prudent enough to include ingenious fallbacks for offline play.



Could you believe there was a much simpler time in game development, before the advent of middleware, when the underlying game code was relatively straightforward and most of the variables devs dealt with were driver fragmentation challenges for just about every input-output permutation imaginable?

Theoretically, couldn't an offline mode work the same way optimization sliders work on PC? I understand it's more complex than that, but that would have to be their ultimate goal without putting off a whole bunch of the community. (Most of whom probably have good enough internet anyway.)
 

Guerrilla

Member
I know GAF "in general" is very pessimistic about the cloud. And yes, somehow I can understand that - because so far, there weren't that many games that really took advantage of server calculations. To be honest, no new-gen game so far really did (Titanfall touched 5% of the possibilities ...). And this is also why I try to be very careful about the words I choose.

No, you can not boost your game's resolution with Azure. And no, you can not create better lighting effects with Azure. But if you focus on it, you can still boost the overall graphical look of your game by a mile. We are currently creating a game. But in fact, we are kind of creating two-in-one: one with Azure available, and one for offline only. Everything you code, you need to code for two scenarios. This is a ton of work. If online = dynamic grass; if offline = static grass ... to say it very simply. And so on. That's why we are currently thinking about going "online-only". But to be very open with you, we have some fear about that. Obviously. The gaming community is very careful when they hear "online-only" ... games like SimCity simply ... well, did it wrong.

If I could show you a screen comparison of our latest build right now - Azure on/off (no, sorry, I can't ...) - you would understand what I am talking about. Wind, dynamically moving vegetation, footprints that stay for hours and even wildlife, nearly without losing any local CPU power. This is just awesome in the right situations.

I know MS has some own projects in the works, too, that will go all-in with the Azure servers. Crackdown is the already known example.

That's all I can say for now. Really. I'm out here! :)

I can definitely understand those concerns, and it's not the same as Destiny, as someone posted earlier, since Destiny is a multiplayer game. If this is single-player there will certainly be some haters for online-only.

On the other hand, I reckon you would get a lot of goodwill if you show more examples of the difference later, when the game is ready to be announced. If you explain it honestly (as you are doing already) and walk people through it, the majority will be OK with it, I think. And never underestimate the power of GAF ;) N4G and so on are basically just cherry-picking GAF conversations, and all the journos of the big publications are reading threads on here.

So positive buzz on here will spread out.


But on another note, I'm baffled this isn't getting more replies. If Phil Spencer just mentions the cloud anywhere, we get 1,000 replies per minute ;) and now a dev comes out and basically tells us the power of da cloud is actually real and has real-life benefits (in some areas at least), and the discussion dies down one page later...
Maybe it really is due to the topic name; maybe there should be a split, with a new topic named accordingly. This seems way too big of a deal to die here...
 

Guerrilla

Member
It's just normal and cloud-powered detail levels. I think typical PC games have at least four detail levels from low to ultra, so it's nothing new. The tough sell will be getting them to pay an additional $6.50 per online user.

If the difference between offline and online is big enough, people will go online. I can't actually imagine anybody using the X1 offline; even basic functions don't work without Live.

Is it really that expensive for X1 games?

Edit: Ah, OK, the $6.50 per user is a flawed calculation based on a tweet regarding a game that's not coming to Xbox One ;)
He just did the following calculation: 100,000 users x 24 x $75 x 365 x 10 for a million users. But I think if you have a million users, there will on average never be more than 100,000 online simultaneously, so you would only need 100k slots, which would cut the price down to $0.65 per user; now add the very probable discount you'd get for an Xbox One game and you're closer to 40 cents per user. Which makes a whole lot more sense and seems a whole lot more realistic. And this isn't even taking into account the decreasing userbase: a single-player game will be mostly played in the first month, so everyone only playing for one month would only cost a 12th of that 40 cents. (This is all pure speculation, of course.)
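The back-of-envelope arithmetic above can be written out explicitly; every number here is speculation from this thread, not a real Azure price:

```python
# Back-of-envelope server cost model from the post above. Every number
# is a speculative placeholder from this forum thread, not a real
# Azure price.

naive_cost_per_user_year = 6.50   # the "$6.5M per 1M users" reading of the tweet

# If at most ~10% of a million users are ever online simultaneously,
# you only need 100k server slots instead of 1M:
peak_concurrency_ratio = 0.10
per_user = naive_cost_per_user_year * peak_concurrency_ratio   # ~$0.65

# Hypothetical Xbox One platform discount (~40% off list price):
per_user_discounted = per_user * 0.6                           # ~$0.39

# If a single-player game is mostly played in its first month, the
# average user occupies a slot for roughly 1/12 of the year:
per_user_one_month = per_user_discounted / 12                  # ~$0.03

print(per_user, per_user_discounted, per_user_one_month)
```

The concurrency ratio dominates the result, which is exactly why pricing per registered user (rather than per concurrent slot) overstates the bill by an order of magnitude.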
 

p3tran

Banned
This is what I am allowed to share.

sequenz010wuvh.gif


Running in real-time on XBO. Very early WIP, so don't mind the lighting and so on. It is a very basic frequency test where the grass splines update 12 times a second. This is nothing special so far. The cool thing, though, is that the start and end points of our splines, influenced by wind and objects, are being calculated by Azure. This means the physics calculations you see are costing us pretty much no local power (excluding GPU, of course). We can use the saved power for other things, like AI, animations and so on. We are very proud of it, especially since we completely eliminated any chance of clipping. I just wanted to add that here.

And no, this won't be a golf/grass/whatever simulator - I just thought maybe it would be interesting to see ;)

thanks for sharing.
the deafening silence, I guess it means that you were heard.
;)

whatever else you might be able to tell us, be sure you do it too
cheers!
 
This is what I am allowed to share.

sequenz010wuvh.gif


Running in real-time on XBO. Very early WIP, so don't mind the lighting and so on. It is a very basic frequency test where the grass splines update 12 times a second. This is nothing special so far. The cool thing, though, is that the start and end points of our splines, influenced by wind and objects, are being calculated by Azure. This means the physics calculations you see are costing us pretty much no local power (excluding GPU, of course). We can use the saved power for other things, like AI, animations and so on. We are very proud of it, especially since we completely eliminated any chance of clipping. I just wanted to add that here.

And no, this won't be a golf/grass/whatever simulator - I just thought maybe it would be interesting to see ;)
I can't wait for Forza Horizon 3, driving through the fields.
 

Kssio_Aug

Member
Honestly, I've been believing in the cloud feature since I read more about the new Crackdown.

I completely understand the general skepticism, but if what they told us about Crackdown and the destruction level is true (and they talk pretty confidently about it), then I guess we will have some really cool games for Xbox One in the near future. And it's very good to have a second opinion about it, especially coming from someone who is working with that feature on another game.

Can't wait to see it in action!
 

Fezan

Member
I know GAF "in general" is very pessimistic about the cloud. And yes, somehow I can understand that - because so far, there weren't that many games that really took advantage of server calculations. To be honest, no new-gen game so far really did (Titanfall touched 5% of the possibilities ...). And this is also why I try to be very careful about the words I choose.

No, you can not boost your game's resolution with Azure. And no, you can not create better lighting effects with Azure. But if you focus on it, you can still boost the overall graphical look of your game by a mile. We are currently creating a game. But in fact, we are kind of creating two-in-one: one with Azure available, and one for offline only. Everything you code, you need to code for two scenarios. This is a ton of work. If online = dynamic grass; if offline = static grass ... to say it very simply. And so on. That's why we are currently thinking about going "online-only". But to be very open with you, we have some fear about that. Obviously. The gaming community is very careful when they hear "online-only" ... games like SimCity simply ... well, did it wrong.

If I could show you a screen comparison of our latest build right now - Azure on/off (no, sorry, I can't ...) - you would understand what I am talking about. Wind, dynamically moving vegetation, footprints that stay for hours and even wildlife, nearly without losing any local CPU power. This is just awesome in the right situations.

I know MS has some own projects in the works, too, that will go all-in with the Azure servers. Crackdown is the already known example.

That's all I can say for now. Really. I'm out here! :)

I was very pessimistic about it because I have worked with cloud processing, but I believe the scenario you have posted is exactly what can be achieved with the cloud for enhancing graphics: changes in parts of the environment that players are not interacting with too directly, updated dynamically.
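The "code everything for two scenarios" burden Kampfheld describes (if online = dynamic grass; if offline = static grass) might look something like this in the simplest case. The function names are invented for illustration, not the studio's actual code:

```python
# Hypothetical sketch of the dual code paths described in the thread:
# every cloud-assisted feature needs an offline fallback, effectively
# doubling the work per feature.

def static_spline_endpoints(frame):
    # Offline path: pre-baked, unmoving grass; same endpoints every frame.
    return [(0.0, 1.0)] * 100

def update_grass(frame, azure_available, azure_client=None):
    if azure_available:
        # Online path: spline endpoints come back from the server.
        return azure_client.fetch_spline_endpoints(frame)
    # Offline path: fall back to cheap static geometry.
    return static_spline_endpoints(frame)

# Offline fallback in action:
blades = update_grass(frame=0, azure_available=False)
print(len(blades))   # 100 static blades
```

Multiply that branch across every system (grass, wildlife, footprints, wind) and the "two games in one" cost, and the temptation to go online-only, becomes easy to see.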
 

JoJo UK

Unconfirmed Member
But won't devs still have to code the game in a way to cater to offline gamers too?
So crappier AI and animations for offline mode?
Well, it won't be crappier, just not enhanced. The offline AI/physics version would just play like current games.
 

Ghost

Chili Con Carnage!
But won't devs still have to code the game in a way to cater to offline gamers too?
So crappier AI and animations for offline mode?

...what? You take the cloud away and you've got the exact same animation & AI, you just haven't got the fancy grass (lighting, whatever it may be that's rendered in the cloud).

Think of it like graphical sliders on a PC game: the game doesn't change as you move the sliders down, it just loses graphical fidelity.
 