
Engadget: the secret sauce of Titanfall is Microsoft's cloud

Need a technical reason why my butt doesn't improve graphics so I can paste that shit after every astroturfing post.

Code:
Numbers Everyone Should Know

[B]L1 cache reference                             0.5 ns
Branch mispredict                              5 ns
L2 cache reference                             7 ns
Mutex lock/unlock                            100 ns (25)
Main memory reference                        100 ns[/B]
Compress 1K bytes with Zippy              10,000 ns (3,000)
Send 2K bytes over 1 Gbps network         20,000 ns
Read 1 MB sequentially from memory       250,000 ns
Round trip within same datacenter        500,000 ns
Disk seek                             10,000,000 ns
[I]Read 1 MB sequentially from network   10,000,000 ns
Read 1 MB sequentially from disk      30,000,000 ns (20,000,000)[/I]
[I]Send packet CA->Netherlands->CA      150,000,000 ns[/I]

http://www.cs.cornell.edu/projects/ladis2009/talks/dean-keynote-ladis2009.pdf

Basically the bolded is where you're doing your calculations and setting up and rendering. Italicised is where the "cloud" is.
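
If anyone wants to paste the arithmetic along with the table, here's a quick back-of-the-envelope sketch in Python. The latency figures are straight from the table above; the only number I've added is the 60 fps frame budget, which is just 1 s / 60:

```python
# How many 60 fps frame budgets does each latency cost?
# Latency figures are from the "Numbers Everyone Should Know" table above.

FRAME_BUDGET_NS = 1_000_000_000 // 60  # ~16.7 ms per frame at 60 fps

latencies_ns = {
    "main memory reference":           100,
    "round trip within a datacenter":  500_000,
    "packet CA -> Netherlands -> CA":  150_000_000,
}

for name, ns in latencies_ns.items():
    # Anything >= 1.0 here cannot possibly contribute to rendering a frame.
    print(f"{name:32s} {ns / FRAME_BUDGET_NS:12.6f} frames")
```

The memory reference is a rounding error inside one frame; the trans-Atlantic round trip eats roughly nine whole frames on its own.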
 
That isn't the point of 90% of CoD, and now Titanfall's design, though. The entire point of these games is a shooter with the skill ceiling crushed, where even shitty players can jump in and feel like they "win". The AI in Titanfall is just there for these players to grind their perk progress on, nothing more. That is literally their entire point; making the AI smart or dangerous means they might as well cut the AI and increase the player cap.

Has this ever actually been said by anyone on the development team? I see this rationale for the dumb AI often.
 
I spent hours playing against bots in the Unreal Tournament 99 demo back in the day... It was all computed locally, nothing fancy, but I was fighting gangs of Albert Einstein clones compared to Titanfall's grunts... in the beta at least. Has anybody ever died to a grunt?

I did once, actually. He got me with a melee attack from behind; I hadn't seen him.
 
mj-spins-michael-jackr9qzj.gif

Perfect gif

Also that article is pure unadulterated creamy smooth horseshit of the finest quality
 
Code:
Numbers Everyone Should Know

[B]L1 cache reference                             0.5 ns
Branch mispredict                              5 ns
L2 cache reference                             7 ns
Mutex lock/unlock                            100 ns (25)
Main memory reference                        100 ns[/B]
Compress 1K bytes with Zippy              10,000 ns (3,000)
Send 2K bytes over 1 Gbps network         20,000 ns
Read 1 MB sequentially from memory       250,000 ns
Round trip within same datacenter        500,000 ns
Disk seek                             10,000,000 ns
[I]Read 1 MB sequentially from network   10,000,000 ns
Read 1 MB sequentially from disk      30,000,000 ns (20,000,000)[/I]
[I]Send packet CA->Netherlands->CA      150,000,000 ns[/I]

http://www.cs.cornell.edu/projects/ladis2009/talks/dean-keynote-ladis2009.pdf

Basically the bolded is where you're doing your calculations and setting up and rendering. Italicised is where the "cloud" is.
... that one packet is going from California to the Netherlands and then back to California, crossing the ocean twice. That would be a rare case which would only happen in the most extreme conditions (not being able to find anyone within half the planet to play against). Under normal conditions that packet would be going to an Azure datacenter within the same region, thus a lower response time than going across the world.
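
Even granting the regional-datacenter point, the arithmetic doesn't get much friendlier. A rough sketch; note the regional ping values here are assumed, optimistic figures for illustration, not measurements:

```python
# How many 60 fps frame budgets does one server round trip cost?
# The ping values below are assumed for illustration, not measured.

frame_budget_ms = 1000 / 60  # ~16.7 ms per frame at 60 fps

for ping_ms in (150, 75, 30):  # cross-ocean down to an optimistic nearby region
    frames = ping_ms / frame_budget_ms
    print(f"ping {ping_ms:3d} ms -> {frames:.1f} frame budgets per round trip")
```

So even a best-case regional round trip still costs roughly two whole frames, which is why a server can run game logic and AI but can't help per-frame rendering.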

Azure is nothing but servers available to anyone who can pay to use them, just like Amazon's AWS, Rackspace, etc. A lot of devs already use AWS and the like for their games. The only difference here is that MS is giving devs Azure for free, and that is the only reason they are using it. There is NO advantage to Azure over any other cloud computing provider. Actually, MS Azure was late to the party with servers in the cloud, and as far as I know it only supports Windows environments, while others support UNIX/Linux environments too. This is nothing unique to the XB1, and if Sony wanted to use these servers they could too. I mean, they are using it for the PC version too, so what's the big deal?

MS needs to stop spreading this BS about their super cloud when it has actually degraded the game. Look at that abysmal A.I. in TF.
Azure does Linux. http://www.windowsazure.com/en-us/documentation/services/virtual-machines/
 
Code:
Numbers Everyone Should Know

[B]L1 cache reference                             0.5 ns
Branch mispredict                              5 ns
L2 cache reference                             7 ns
Mutex lock/unlock                            100 ns (25)
Main memory reference                        100 ns[/B]
Compress 1K bytes with Zippy              10,000 ns (3,000)
Send 2K bytes over 1 Gbps network         20,000 ns
Read 1 MB sequentially from memory       250,000 ns
Round trip within same datacenter        500,000 ns
Disk seek                             10,000,000 ns
[I]Read 1 MB sequentially from network   10,000,000 ns
Read 1 MB sequentially from disk      30,000,000 ns (20,000,000)[/I]
[I]Send packet CA->Netherlands->CA      150,000,000 ns[/I]

http://www.cs.cornell.edu/projects/ladis2009/talks/dean-keynote-ladis2009.pdf

Basically the bolded is where you're doing your calculations and setting up and rendering. Italicised is where the "cloud" is.

Thank you thank you thank you.

Now I can go to forums, spot an astroturfer, and do my very own

"Standby for

Truthbombers
Intercepting
Transient
Astroturfing
Neanderthals

For
All
Learned
Luddites"
 
Yeah...dat silky smooth framerate.

Jeff Gerstmann (Giant Bomb): The frame rate in Titanfall is uneven on the Xbox One and though it's usually fine, it can get downright nasty in specific situations. In one Last Titan Standing match--where every player spawns in a robot suit--several players crammed their mechs into a tight area and began duking it out, and the frame rate dived down to what must have been single digits per second. Even out in wider areas, the game feels a little hitchy from time to time, and there's noticeable tearing throughout. The visuals in Titanfall look nice, but that's mostly due to some solid art and interesting design, not the performance. As of this writing, I haven't seen enough of the PC version to know how well it runs.
http://www.giantbomb.com/reviews/this-is-not-a-titanfall-review-yet/1900-630/


Technically speaking, those battles look impressive, but my eyeballs remain un-melted. Titans, pilots, maps, and weapon effects are all perfectly acceptable, save for the occasional dip below the otherwise-normal 60 frames per second in a huge multi-titan explosion
http://www.ign.com/articles/2014/03/10/titanfall-review-2

Of course, an online multiplayer-only game like Titanfall is only as good as its servers, and how they fare when the eager hordes descend on them remains to be seen. The About the Author section of this review contains more information on the circumstances in which I played it, which weren't always ideal. I experienced a few laggy matches and occasional frame rate issues, but these in-game hitches were the exception to the rule during the many hours I played.
http://www.gamespot.com/reviews/titanfall-review/1900-6415690/

Structurally is where Titanfall shows its limitations, and the reliance on the same old standard match types feels like a concession to the restrictions the rest of the game kicks against. Technically, too, it's not quite the polished gem some may be expecting. Textures don't always impress when wall-running brings you close to them, and the frame rate can still chug when the action gets intense. There's noticeable screen tearing and load times can be a chore. Character models and animations don't always stand up to much scrutiny either. None of this is particularly harmful, but it does take the shine off what many will expect to be a pristine, console-justifying experience.
http://www.eurogamer.net/articles/2014-03-10-titanfall-launch-review
 
Code:
Numbers Everyone Should Know

[B]L1 cache reference                             0.5 ns
Branch mispredict                              5 ns
L2 cache reference                             7 ns
Mutex lock/unlock                            100 ns (25)
Main memory reference                        100 ns[/B]
Compress 1K bytes with Zippy              10,000 ns (3,000)
Send 2K bytes over 1 Gbps network         20,000 ns
Read 1 MB sequentially from memory       250,000 ns
Round trip within same datacenter        500,000 ns
Disk seek                             10,000,000 ns
[I]Read 1 MB sequentially from network   10,000,000 ns
Read 1 MB sequentially from disk      30,000,000 ns (20,000,000)[/I]
[I]Send packet CA->Netherlands->CA      150,000,000 ns[/I]

http://www.cs.cornell.edu/projects/ladis2009/talks/dean-keynote-ladis2009.pdf

Basically the bolded is where you're doing your calculations and setting up and rendering. Italicised is where the "cloud" is.

Watch people claim they can get response time low enough for it to work.


(If they break the laws of physics as we know them maybe. :P)
 
I spent hours playing against bots in the Unreal Tournament 99 demo back in the day... It was all computed locally, nothing fancy, but I was fighting gangs of Albert Einstein clones compared to Titanfall's grunts... in the beta at least. Has anybody ever died to a grunt?

That's great. TF grunts aren't supposed to replace players. They're fodder to be farmed.
 
... that one packet is going from California to the Netherlands and then back to California, crossing the ocean twice. That would be a rare case which would only happen in the most extreme conditions (not being able to find anyone within half the planet to play against). Under normal conditions that packet would be going to an Azure datacenter within the same region, thus a lower response time than going across the world.

Even if the latency is half, or a third, or less, it still isn't fast enough.
 
Gaming industry paying for people to comment like this, sad

Actually the consumers and fans are paying for this shit

You give them your money so they can make a good game, but they spend a large amount of that money on advertising, PR, and astroturfing (not to mention the amount spent on beancounters and management).
 
That's great. TF grunts aren't supposed to replace players. They're fodder to be farmed.

Yes, most, I would think, know this. The problem comes from touting the advanced AI, which is essentially target dummies.

I'm not sold on the claim that the AI is both advanced and just there for farming purposes.
 
Azure is nothing but servers available to anyone who can pay to use them, just like Amazon's AWS, Rackspace, etc. A lot of devs already use AWS and the like for their games. The only difference here is that MS is giving devs Azure for free, and that is the only reason they are using it. There is NO advantage to Azure over any other cloud computing provider. Actually, MS Azure was late to the party with servers in the cloud, and as far as I know it only supports Windows environments, while others support UNIX/Linux environments too. This is nothing unique to the XB1, and if Sony wanted to use these servers they could too. I mean, they are using it for the PC version too, so what's the big deal?

MS needs to stop spreading this BS about their super cloud when it has actually degraded the game. Look at that abysmal A.I. in TF.

9flWqDh.png


We must be using two entirely different services.
 
Huh, must have been a clear day then cause the bots were dumb as shit.

They're actually doing some pretty clever things. Basically, they exist to guide you back to the combat. They tend to avoid focusing on the player, so they're never really a challenge as a result.

I think having enemy AI as smart as anything in Unreal/FEAR/Halo/Gears/AvP would... really hurt the overall play experience. Imagine having to deal with a Skaarj in a multiplayer match. That'd be kinda... hm. I'm not sure it'd work, you know? How would you feel being killed by Titans that had the capabilities of Halo's Elites or Unreal's bots?
 
Forgive my ignorance, but why are so many people doubting this article? Isn't it possible they're telling the truth and not the paid shills people are claiming them to be?

I'm not a technical whiz, so if someone could break it down for me like you would a toddler, or perhaps a person with a severe closed head injury, I'd appreciate it.
 
If you ever feel bad because you don't feel like you're getting enough accomplished in work/school/etc, at least it didn't take you the better part of an entire week to put an article like this together.


Dude, of course he was going to wait for the check to clear the bank before hitting "publish" on the MS provided copy. That can take the better part of a week, easy.
 
Forgive my ignorance, but why are so many people doubting this article? Isn't it possible they're telling the truth and not the paid shills people are claiming them to be?

I'm not a technical whiz, so if someone could break it down for me like you would a toddler, or perhaps a person with a severe closed head injury, I'd appreciate it.

All of this has been done before, for a long time. Dedicated servers. Throw in some false info ("more detailed graphics and the game's silky-smooth frame rate") and you have yourself a shit sandwich. The author makes it sound like the second coming with all the hyperbole.
 
Not understanding the problem some people are having here. MMOs have used server-side scripts for at least a decade; it's not like they're pulling this out of nothing. I think MS made a wise choice by offering these servers to Respawn; this game is clearly going to be a hit, and most likely a system seller.

So people are mad because the article tries to explain in layman's terms how Titanfall works? Not everyone is a tech guru, and console MP has been peer-to-peer for the longest time, so yeah, it is new tech to some people.
 
All of this has been done before, for a long time. Dedicated servers. Throw in some false info ("more detailed graphics and the game's silky-smooth frame rate") and you have yourself a shit sandwich. The author makes it sound like the second coming with all the hyperbole.

Not only has it been done before, it was the standard; you didn't make an FPS without these features. And now companies, some of which are the same ones that took these features out, are trying to market it as some sort of holy grail of online gaming.
 
All of this has been done before, for a long time. Dedicated servers. Throw in some false info ("more detailed graphics and the game's silky-smooth frame rate") and you have yourself a shit sandwich. The author makes it sound like the second coming with all the hyperbole.

But couldn't it help you put more system-side power into things like graphics and frame rate if you're offloading other stuff to the cloud?

Also, has anyone tweeted the author of this article to get him to explain where he got his info?


Okay, this got me.
 
But couldn't it help you put more system-side power into things like graphics and frame rate if you're offloading other stuff to the cloud?

Also, has anyone tweeted the author of this article to get him to explain where he got his info?



Okay, this got me.

Look at the chart at the top of the page from your post; it shows some examples of the latency involved in different things: rendering graphics is at the top and server-to-client communication is at the bottom.

And just look at the numbers involved. Anything you could offload is not related to rendering, so it does almost nothing for graphics.
 
They're actually doing some pretty clever things. Basically, they exist to guide you back to the combat. They tend to avoid focusing on the player, so they're never really a challenge as a result.

I think having enemy AI as smart as anything in Unreal/FEAR/Halo/Gears/AvP would... really hurt the overall play experience. Imagine having to deal with a Skaarj in a multiplayer match. That'd be kinda... hm. I'm not sure it'd work, you know? How would you feel being killed by Titans that had the capabilities of Halo's Elites or Unreal's bots?

The problem here is the logic that challenging bots would just 'ruin' your killstreaks. A challenging, interesting bot would be a lot more fun to fight than the braindead, immersion-breaking fodder in Titanfall, on maps that are pretty devoid of players to start with.

But no, dying to bots would 'suck' in dooty and Titanfall, because you're not trying to have engaging combat, you're chasing a carrot. Elites, grunts, etc. in Halo were quite funny and entertaining to fight; UT bots put up a good fight and can rocket-jump/impact-hammer-jump to take shortcuts, or even jump over walls or terrain, or grab powerups, and they know how to aim a shock combo.
 
Took me a bit longer than it should have to realise I had the "Cloud to Butt" Chrome extension. And here I thought the secret sauce of Titanfall was actually Microsoft's butt.
 
Forgive my ignorance, but why are so many people doubting this article? Isn't it possible they're telling the truth and not the paid shills people are claiming them to be?

I'm not a technical whiz, so if someone could break it down for me like you would a toddler, or perhaps a person with a severe closed head injury, I'd appreciate it.

It would be lag city for these types of games.

Even specialized servers that deal with offloading graphics to the cloud publicly note that GPU cloud computing will NOT benefit FPS games without lag.

http://www.infoworld.com/d/cloud-co...tream-offloads-graphics-work-the-cloud-230978

"Frazzini did warn that a small subset of applications with zero tolerance for latency (fast-paced first-person shooters for instance) would not be suitable for running entirely on AppStream"


So if you can't offload anything GPU-related to the cloud without lag for an FPS, then how is something that isn't even specialized for GPU tasks, like the servers Azure will be using, going to help improve graphics? This is a question to which we have been given no answer, and that is because there isn't one.
 
Many look at Titanfall as the first true next-gen game, offering an experience we haven't seen on last-generation hardware (think: the PlayStation 3 and Xbox 360).

But it's on the Xbox 360. And probably would have been on PS3 if not for moneyhats.

This article is straight up astroturf.
 
9flWqDh.png


We must be using two entirely different services.

I stand corrected. I used it a long time back. But that wasn't my main point. Anyway, GAF is intelligent and can smell the BS MS is trying to spread, but there must be others who are being misled by these false claims.
 
But couldn't it help you put more system-side power into things like graphics and frame rate if you're offloading other stuff to the cloud?

You're not offloading anything, since the server is handling computations the server always would have handled. The advantages of using dedicated servers are the advantages of using dedicated servers. They have always existed in comparison to peer-hosted games that became dominant on the 360. But they're not new outside of the attempt to "brand" the adoption of cloud platforms for cost and flexibility reasons.

Also, has anyone tweeted the author of this article to get him to explain where he got his info?

We know where he got his information: Microsoft PR. The problem is he doesn't appear to understand the technology well enough to understand or objectively evaluate the information he was given and chose to simply print whatever claims were made unquestioned and unchallenged as if they were true.
 
Forgive my ignorance, but why are so many people doubting this article? Isn't it possible they're telling the truth and not the paid shills people are claiming them to be?

I'm not a technical whiz, so if someone could break it down for me like you would a toddler, or perhaps a person with a severe closed head injury, I'd appreciate it.
There was an analogy I was told in class once. I wish I could remember the specifics, but it used cooking (or food preparation in general, really) to describe how quickly you can access certain ingredients. It's something like: the CPU's caches are whatever's in your hand at that moment, RAM is your cupboard, and going to the hard drive or reading off a disc is like driving to the grocery store.

Continuing with this, using the internet is like mail-ordering ingredients. And if you're cooking, that's simply going to be too damn late no matter what you do.
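
Scaling the latency table upthread to kitchen time makes the analogy concrete. A quick sketch; the "one L1 reference equals one second" scaling is arbitrary, and the cooking labels are my own mapping:

```python
# Rescale the latency table so an L1 cache hit (0.5 ns) takes one human second.
# The cooking-analogy labels are illustrative, not part of the original table.

L1_NS = 0.5  # one L1 reference becomes one "second" at this scale

accesses_ns = {
    "L1 cache (in your hand)":        0.5,
    "main memory (the cupboard)":     100,
    "disk seek (drive to the store)": 10_000_000,
    "CA -> NL -> CA (mail order)":    150_000_000,
}

for name, ns in accesses_ns.items():
    seconds = ns / L1_NS
    days = seconds / 86_400
    print(f"{name:32s} {seconds:15,.0f} s  (~{days:,.1f} days)")
```

At that scale a memory reference takes over three minutes, a disk seek takes most of a year, and the trans-Atlantic round trip takes nearly a decade.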
 
But couldn't it help you put more system-side power into things like graphics and frame rate if you're offloading other stuff to the cloud?

Also, has anyone tweeted the author of this article to get him to explain where he got his info?

No, the CPU handles AI and does not handle graphics (other than draw call generation).
Even if it drops the CPU usage a little bit (due to not running the AI), it is also no different than games like DOTA2, LoL, WoW, Diablo 3, Path of Exile or any other games that have a server running bots and thus is not some special advantage.
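
A minimal sketch of what "the server runs the bots" actually means in practice. Everything here (class and function names, the 20 Hz tick, the movement logic) is made up for illustration; it's just the generic authoritative-server pattern, not anything from Titanfall or Source:

```python
# Authoritative-server pattern: AI thinks on the server each tick, and clients
# only ever receive the resulting state. All names here are illustrative.

class Grunt:
    def __init__(self, x: float):
        self.x = x

    def think(self, dt: float) -> None:
        self.x += 1.0 * dt  # shamble toward the objective at 1 unit/s

def server_tick(grunts: list, dt: float) -> list:
    for g in grunts:
        g.think(dt)  # the AI cost is paid here, on the server's CPU
    # The snapshot sent to clients contains positions only, no AI state.
    return [round(g.x, 3) for g in grunts]

grunts = [Grunt(0.0), Grunt(5.0)]
print(server_tick(grunts, 1 / 20))  # one tick at 20 Hz
```

The client's CPU never runs `think()` at all; that is the entire (modest) saving being marketed, and it's the same division of labor any MMO server has used for years.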
 
Because Titanfall's advanced AI is handled by the Azure servers, your Xbox's or PC's innards can be used to achieve more detailed graphics and the game's silky-smooth frame rate.

I'm a little confused. Wouldn't offloading the AI to the servers only free up some of the CPU? And aren't graphics more of a GPU thing (hence the name)...? So why would dedicated servers handling the AI result in better graphics?

And realistically, how much of the CPU's resources would something like the AI in Titanfall even take up? I can't imagine it being a large enough portion to result in any significant gains to framerate (as it is Titanfall's framerate is rather rubbish) or any other part of the game.
 
I know. Just pointing out a flaw with those numbers in this specific situation.

Actually, you weren't. His post said that the cloud was the italicized figures; the one you disagreed with isn't the only italicized line. He (correctly) pointed out an entire range of possible cloud speeds, all of which are many orders of magnitude higher than the local processing trip times. You saying that one particular number in that range is unrealistic says nothing to the argument. (Not to mention that a ping of 150 is hardly impossible, even with regional servers.)
 
I'm a little confused. Wouldn't offloading the AI to the servers only free up some of the CPU? And aren't graphics more of a GPU thing (hence the name)...? So why would dedicated servers handling the AI result in better graphics?

And realistically, how much of the CPU's resources would something like the AI in Titanfall even take up? I can't imagine it being a large enough portion to result in any significant gains to framerate (as it is Titanfall's framerate is rather rubbish) or any other part of the game.

Yeah, it would not.
The article is just rubbish from someone that does not have a clue what they are talking about.
 
I'm a little confused. Wouldn't offloading the AI to the servers only free up some of the CPU? And aren't graphics more of a GPU thing (hence the name)...? So why would dedicated servers handling the AI result in better graphics?

And realistically, how much of the CPU's resources would something like the AI in Titanfall even take up? I can't imagine it being a large enough portion to result in any significant gains to framerate (as it is Titanfall's framerate is rather rubbish) or any other part of the game.

Since it's the Source engine, apparently it's pretty CPU-heavy. Albeit "heavily modified", but who really knows to what extent besides Respawn themselves.

Either way, I don't think the Cloud is really helping much at all and not freeing up much.
 
I just don't get why it's being viewed as the first truly "Next Gen" experience.
Is it because of the method they use to outsource the AI, or because it's on a next-gen console?
 
It's pure PR BS talk. They need/want to offload the AI to the cloud/server because it's a multiplayer game, and it makes more sense to put it there for better performance. Why? Because the location/action info of the bots must be available to all players in the match, and not just on a single console :P
 
I just don't get why it's being viewed as the first truly "Next Gen" experience.
Is it because of the method they use to outsource the AI, or because it's on a next-gen console?

Because Microsoft marketing says so. The reality is that if this were truly "next gen" it would have to be something that could not be done on previous hardware, and in two weeks that is exactly what we'll see.
 
What a steaming pile of bullshit. Did they just post bits written by MS PR?

I'm not commenting on the quality of the game, just this article and its lines taken straight from MS.
 
Man, this article even has gaffers believing... I can only imagine what really uninformed people are thinking.

Biggest red flag for most should be all this "first real next-gen game" BS about a fucking cross-gen and PC game.
 
michael-jordan-laugh.gif


Azure is nothing but servers available to anyone who can pay to use them, just like Amazon's AWS, Rackspace, etc. A lot of devs already use AWS and the like for their games. The only difference here is that MS is giving devs Azure for free, and that is the only reason they are using it. There is NO advantage to Azure over any other cloud computing provider. Actually, MS Azure was late to the party with servers in the cloud, and as far as I know it only supports Windows environments, while others support UNIX/Linux environments too. This is nothing unique to the XB1, and if Sony wanted to use these servers they could too. I mean, they are using it for the PC version too, so what's the big deal?

MS needs to stop spreading this BS about their super cloud when it has actually degraded the game. Look at that abysmal A.I. in TF.


Actually you are wrong. Azure differs from AWS and Rackspace in that, in its initial conception, it was more focused on providing APIs to interface apps directly with cloud computing. Amazon and Rackspace are more geared toward spinning up prepackaged virtual servers in the cloud and running them in traditional modes. MS Visual Studio and a custom version of SQL were geared toward this. Originally you couldn't even spin up a basic virtual server.

The problem was, it wasn't what people wanted. Why write/invest in a customized app that would be tied to MS's proprietary APIs and limit porting to another cloud vendor if MS jacks up the price? Write your app using traditional methods and let it be easily ported to any situation. Hell, the original Azure SQL wasn't even compatible with non-Azure databases.

I think it became more like AWS; that was the way it was going when I was active, though I haven't used it in ages. But the original intent was a backbone for app development. Bottom line is Sony could do the exact same thing on any other cloud provider.
 
Look at the chart at the top of the page from your post; it shows some examples of the latency involved in different things: rendering graphics is at the top and server-to-client communication is at the bottom.

And just look at the numbers involved. Anything you could offload is not related to rendering, so it does almost nothing for graphics.

It would be lag city for these types of games.

Even specialized servers that deal with offloading graphics to the cloud publicly note that GPU cloud computing will NOT benefit FPS games without lag.

http://www.infoworld.com/d/cloud-co...tream-offloads-graphics-work-the-cloud-230978

"Frazzini did warn that a small subset of applications with zero tolerance for latency (fast-paced first-person shooters for instance) would not be suitable for running entirely on AppStream"


So if you can't offload anything GPU-related to the cloud without lag for an FPS, then how is something that isn't even specialized for GPU tasks, like the servers Azure will be using, going to help improve graphics? This is a question to which we have been given no answer, and that is because there isn't one.

You're not offloading anything, since the server is handling computations the server always would have handled. The advantages of using dedicated servers are the advantages of using dedicated servers. They have always existed in comparison to peer-hosted games that became dominant on the 360. But they're not new outside of the attempt to "brand" the adoption of cloud platforms for cost and flexibility reasons.



We know where he got his information: Microsoft PR. The problem is he doesn't appear to understand the technology well enough to understand or objectively evaluate the information he was given and chose to simply print whatever claims were made unquestioned and unchallenged as if they were true.

There was an analogy I was told in class once. I wish I could remember the specifics, but it used cooking (or food preparation in general, really) to describe how quickly you can access certain ingredients. It's something like: the CPU's caches are whatever's in your hand at that moment, RAM is your cupboard, and going to the hard drive or reading off a disc is like driving to the grocery store.

Continuing with this, using the internet is like mail-ordering ingredients. And if you're cooking, that's simply going to be too damn late no matter what you do.

No, the CPU handles AI and does not handle graphics (other than draw call generation).
Even if it drops the CPU usage a little bit (due to not running the AI), it is also no different than games like DOTA2, LoL, WoW, Diablo 3, Path of Exile or any other games that have a server running bots and thus is not some special advantage.

Thanks guys/gals. I appreciate you taking the time to break this stuff down for me :)
 