
Apparently Splatoon 2 tickrate is only 16Hz

jorgejjvr

Member
I play the game, and it works. Don't even experience lag.

What is this tickrate and why does it matter?

Especially since I don't notice any issues with the game itself.

Will my gameplay experience somehow drastically improve when I already don't notice anything wrong with the online experience?
 
Yeah, it's a super common complaint in UC; it's probably the biggest problem with their MP games overall. However, I don't think it manifests quite as badly in UC. I don't experience anywhere near as many trades in UC as I do in Splatoon. In Splatoon probably 1/10 of my kills are trades, or perhaps more. The real projectiles (rather than hitscans) and low time to kill really contribute to that, I would imagine, but the crappy connections of wireless players don't help either.
I agree that WiFi exacerbates netcode issues, but I remember ND saying that 60% of their player base is on WiFi. I imagine Wii U/Switch WiFi usage must be higher than that (80-90% maybe?)

I still don't understand why we haven't gotten a tick rate slider for custom/private games at least. It shouldn't be that hard to implement and it wouldn't affect public games.

PS: I don't think Uncharted or Splatoon are bad games, but I'll never consider them truly competitive. They're fun, casual games to play and that's basically it.

Other games are moving towards a more competitive direction: https://www.reddit.com/r/CoDCompetitive/comments/6ki0rz/wwii_will_have_a_60hz_server_tick_rate_when/
 
I play the game, and it works. Don't even experience lag.

What is this tickrate and why does it matter?

Especially since I don't notice any issues with the game itself.

Will my gameplay experience somehow drastically improve when I already don't notice anything wrong with the online experience?

From Reddit.

What this all means for a Game.

Generally, a higher tick-rate server will yield a smoother, more accurate interaction between players, but it is important to consider other factors here. If we compare a tick rate of 64 (CSGO matchmaking) with a tick rate of 20, the largest delay due to the difference in tick rate that you could possibly perceive is 35ms. The average would be 17.5ms. For most people this isn't perceivable, but experienced gamers who have played on servers of different tick rates can usually tell the difference between a 10 or 20 tick server and a 64 tick one.

Keep in mind that a higher tickrate server will not change how lag compensation behaves, so you will still experience times where you ran around the corner and died. 64 Tick servers will not fix that.

If your TV has an input lag above 35ms on top of the lag of your internet connection, you're probably not going to notice anything.
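
To make the quoted numbers concrete, here's a rough Python sketch of the tick-interval arithmetic (the tick rates are the ones from the quote; it models only the wait for the next server update, ignoring network latency and lag compensation):

def tick_delay_ms(tick_rate_hz):
    # Worst-case and average wait (ms) for the next server update.
    interval = 1000.0 / tick_rate_hz
    return interval, interval / 2.0

for rate in (64, 20, 16):
    worst, avg = tick_delay_ms(rate)
    print(f"{rate:>2} Hz: worst {worst:.1f} ms, average {avg:.1f} ms")

# 64 Hz: worst 15.6 ms, average 7.8 ms
# 20 Hz: worst 50.0 ms, average 25.0 ms
# 16 Hz: worst 62.5 ms, average 31.2 ms
# The ~35 ms figure above is the gap between the 20-tick and 64-tick worst
# cases (50.0 - 15.6 ≈ 34.4 ms); the quoted 17.5 ms average is roughly half of that.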
 
I play the game, and it works. Don't even experience lag.

What is this tickrate and why does it matter?

Especially since I don't notice any issues with the game itself.

Will my gameplay experience somehow drastically improve when I already don't notice anything wrong with the online experience?
Let's put it this way: would you enjoy Splatoon the same if it was 30 fps (like Zelda)?

Do you appreciate fluidity even if you don't know what fps/Hz mean?
 

Patch13

Member
Some math, which may or may not be interesting:

At 60 frames per second ...

* 16Hz is ~4 frames. That gives you a worst case of 8 frames of latency, due to tick, if two consoles push out updates with just the wrong timing.

* A 100ms ping adds 6 frames of latency. (Ping is round trip, so we don't have to add your ping to your opponent's ping; we'll assume that you both have roughly the same ping, or that the average of your pings is 100.)

* We might have another frame of latency due to buffering in the graphics pipeline.

* Our Lady of Google informs me that average human reaction time to a visual stimulus is 0.25 seconds, which is 15 frames.

So given a decent ping, a tick rate of 16Hz updates the game about as quickly as you can react to it -- 8 + 6 + 1 = 15 frames of latency in the hardware matching the 15 frames of latency in the wetware in your brain. You see a thing, and do a thing, and that's, in the worst case, when you see the next thing.

"Pro" players are going to have faster reflexes, and might be able to shave a few frames off of that timing. They probably also live on faster networks with lower ping, though, so the game isn't necessarily updating outside of their reaction window (and remember that the 8 frames of latency due to tick is out worst case scenario -- most of the time, the lag due to tick is going to be somewhere between 4 and 8. And each source of latency doesn't necessarily straightforwardly add to the others).

Basically, my back of the envelope math suggests that 16Hz is okay, for a tick rate, though it doesn't leave much room for latency in the network or video pipeline, or for a lack of latency in the wetware of the squishy human playing the game. A higher tick rate is safer, leaving more room for slop in the ping, and is friendlier to really fast humans. I imagine that the lower tick rate is part of the reason why the team at Nintendo went with regional matchmaking, mitigating some of the worst case scenarios with ping. I suppose time will tell whether the squishy humans feel that the tradeoffs that the dev team made, in terms of battery life, mobile data friendliness, and kindness to the CPU in the machine hosting the match, were worthwhile. I know that I, with the reflexes of someone who is now pushing 40, am not the best judge for how this feels to a strapping young person in their 20s. But I don't think that the situation is quite as dire as you might think, reading through the rest of the thread.
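
If anyone wants to poke at those numbers, here's the same back-of-the-envelope math as a quick Python sketch. It assumes a 60 fps display, the worst-case two-tick delay, and one frame of render buffering -- exactly the assumptions above, nothing measured from the actual game:

import math

FPS = 60  # display frames per second

def latency_frames(tick_rate_hz, ping_ms, render_buffer_frames=1):
    # Worst case: both consoles mistime their updates, so you wait roughly two ticks.
    tick_frames = 2 * math.ceil(FPS / tick_rate_hz)   # ~4 frames per tick at 16 Hz -> 8
    ping_frames = (ping_ms / 1000.0) * FPS            # 100 ms round trip -> 6 frames
    return tick_frames + ping_frames + render_buffer_frames

hardware = latency_frames(tick_rate_hz=16, ping_ms=100)
wetware = 0.25 * FPS                                  # ~250 ms human reaction time
print(f"hardware/network latency: {hardware:.0f} frames")
print(f"human reaction time:      {wetware:.0f} frames")
# hardware/network latency: 15 frames
# human reaction time:      15 frames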
 
I play the game, and it works. Don't even experience lag.

What is this tickrate and why does it matter?

Especially since I don't notice any issues with the game itself.

Will my gameplay experience somehow drastically improve when I already don't notice anything wrong with the online experience?

Read the replies and you'll see why it matters to SOME people. If you're not noticing anything and you're already having fun, you're fine and don't need to worry.

Really don't understand it when people come into a thread about an issue a game has, and then say "well it seems fine to me."
 
Some math, which may or may not be interesting:

At 60 frames per second ...

* 16GHz is ~4 frames. That gives you a worst case of 8 frames of latency, due to tick, if two consoles push out updates with just the wrong timing.

* A 100ms ping adds 6 frames of latency. (Ping is round trip, so we don't have to add your ping to your opponent's ping; we'll assume that you both have roughly the same ping, or that the average of your pings is 100.)

* We might have another frame of latency due to buffering in the graphics pipeline.

* Our Lady of Google informs me that average human reaction time to a visual stimulus is 0.25 seconds, which is 15 frames.

So given a decent ping, a tick rate of 16GHz updates the game about as quickly as you can react to it -- 8 + 6 + 1 = 15 frames of latency in the hardware matching the 15 frames of latency in the wetware in your brain. You see a thing, and do a thing, and that's, in the worst case, when you see the next thing.

"Pro" players are going to have faster reflexes, and might be able to shave a few frames off of that timing. They probably also live on faster networks with lower ping, though, so the game isn't necessarily updating outside of their reaction window (and remember that the 8 frames of latency due to tick is our worst case scenario -- most of the time, the lag due to tick is going to be somewhere between 4 and 8. And each source of latency doesn't necessarily straightforwardly add to the others).

Basically, my back of the envelope math suggests that 16GHz is okay, for a tick rate, though it doesn't leave much room for latency in the network or video pipeline, or for a lack of latency in the wetware of the squishy human playing the game. A higher tick rate is safer, leaving more room for slop in the ping, and is friendlier to really fast humans. I imagine that the lower tick rate is part of the reason why the team at Nintendo went with regional matchmaking, mitigating some of the worst case scenarios with ping. I suppose time will tell whether the squishy humans feel that the tradeoffs that the dev team made, in terms of battery life, mobile data friendliness, and kindness to the CPU in the machine hosting the match, were worthwhile. I know that I, with the reflexes of someone who is now pushing 40, am not the best judge for how this feels to a strapping young person in their 20s. But I don't think that the situation is quite as dire as you might think, reading through the rest of the thread.
Also, remember that the majority of people use TVs to play on their consoles, so the input lag is normally really, really high. Maybe if you use a gaming monitor with an input lag below 1ms and a really good ping you can notice something.
 

mas8705

Member
So what would the chances be of Nintendo fixing this issue? It feels like if they do want to push Splatoon 2 into the spotlight of eSports, they would probably want to tweak up the performance just a touch, correct?
 

LCGeek

formerly sane
I play the game, and it works. Don't even experience lag.

What is this tickrate and why does it matter?

Especially since I don't notice any issues with the game itself.

Will my gameplay experience somehow drastically improve when I already don't notice anything wrong with the online experience?

Lag isn't tickrate.

Tickrate is more related to the response rate of an online game.

You would definitely notice a smoother, more responsive experience if it were there; even a tickrate of 30 would be an improvement.

Also, a tickrate measured in GHz would be insane and almost unnecessary, ignoring the fact that I doubt we have CPUs strong enough to feed such a rate.
 
Also, a tickrate measured in GHz would be insane and almost unnecessary, ignoring the fact that I doubt we have CPUs strong enough to feed such a rate.
The fact that he uses "GHz" for tickrate invalidates his entire post. Pretty sure he doesn't know what he's talking about.

If average human reaction time is 250ms, then that means that cinematic framerates (20-25fps) should "suffice", right? Preposterous to say the least.
 
Human reaction time isn't the same as how many frames you can see. You want to have the most accurate and recent information available to react to. If the server updates 64 times a second, and you see 64 frames per second it may take you a certain number of frames to react, but at least you are reacting to something more accurate than if you only receive 16 updates per second.
 
So this is where people try to invalidate other people's opinions and pretend nothing is happening because they can't notice it? lol
Well, technically, with a normal TV (not a PC gaming monitor) and an average ping it's almost impossible to notice anything... It could be better, that's for sure, but it's not going to solve the "being shot behind a wall" problem that a lot of people think is caused by the low tickrate.
 

Patch13

Member
The fact that he uses "GHz" for tickrate invalidates his entire post. Pretty sure he doesn't know what he's talking about.

Yep. My units were off by several orders of magnitude. Brain fart (I blame it on a hot summer day). Changed to Hz (times per second), as it should be. The frames are correct, even if the units I used to get to the frames were not :)

If average human reaction time is 250ms, then that means that cinematic framerates (20-25fps) should "suffice", right? Preposterous to say the least.

You're mixing up perception and reaction time.

For avoiding unnecessary death, reaction times are important, and that is the time it takes your slow-ass brain to process a visual signal, figure out what to do, and propagate an electrochemical signal to the bits of you that need to do the thing.

As this happens, you can totally perceive smaller units of time, and your brain can process things after the fact and evaluate all the agonizing milliseconds of time that went by while you slowly reacted to your impending inky doom. But that doesn't help you actually react any faster to it.
 

Patch13

Member
Also, remember that the majority of people use TVs to play on their consoles, so the input lag is normally really, really high. Maybe if you use a gaming monitor with an input lag below 1ms and a really good ping you can notice something.

This is a good point, and it makes the math much more complicated (and also much more dire). You could argue that a higher tick rate helps because at least you're not adding *more* latency to the stack. But you could also argue that people playing on a TV (especially if they leave all the post processing stuff on and don't set it to "game mode") don't really have any grounds to complain about some other source of latency :)
 

Burnburn

Member
Well, technically, with a normal TV (not a PC gaming monitor) and an average ping it's almost impossible to notice anything... It could be better, that's for sure, but it's not going to solve the "being shot behind a wall" problem that a lot of people think is caused by the low tickrate.

You know that what type of TV someone is using doesn't matter in this discussion, right? The TV just displays what's happening later. Every display will display what had happened exactly the same, one display just shows it later than the other. You know how a stream shows the exact same thing as the streamer is seeing, only 15 seconds later? Same thing. Having a different TV doesn't mean you're magically gonna get sniped behind a wall less.
 
This is a good point, and it makes the math much more complicated (and also much more dire). You could argue that a higher tick rate helps because at least you're not adding *more* latency to the stack. But you could also argue that people playing on a TV (especially if they leave all the post processing stuff on and don't set it to "game mode") don't really have any grounds to complain about some other source of latency :)
Yes, even on a really good and expensive TV the input lag in game mode is much, much higher than on a gaming monitor. The majority of people here are describing lag issues, not anything related to a low tickrate.
 
You're mixing up perception and reaction time.

For avoiding unnecessary death, reaction times are important, and that is the time it takes your slow-ass brain to process a visual signal, figure out what to do, and propagate an electrochemical signal to the bits of you that need to do the thing.

As this happens, you can totally perceive smaller units of time, and your brain can process things after the fact and evaluate all the agonizing milliseconds of time that went by while you slowly reacted to your impending inky doom. But that doesn't help you actually react any faster to it.
I'm not mixing up anything.

Defending low tick rate (Hz) while having a high display refresh rate (fps) is contradictory. Would you enjoy Splatoon if it ran at 16 fps?

Ideally you'd need 1:1 (60 Hz) to make the most use of the high display refresh rate.

I'd even settle for 1:2 (30 Hz) for public matches as a start. 60 Hz might be too taxing for low-end ADSL connections.

Also, I hope you realize that missing network frames are being interpolated and therefore not accurate at all. Here's an explanation of how this works: https://www.youtube.com/watch?v=6EwaW2iz4iA

This is a good point, and it makes the math much more complicated (and also much more dire). You could argue that a higher tick rate helps because at least you're not adding *more* latency to the stack. But you could also argue that people playing on a TV (especially if they leave all the post processing stuff on and don't set it to "game mode") don't really have any grounds to complain about some other source of latency :)
What kind of "argument" is this?

Would a 60 -> 16 fps downgrade be justified if someone didn't turn game mode on? Would a *universal* downgrade be justified if *some* people didn't turn game mode on?

Would lack of surround sound (a useful option) be justified if most people used stereo headsets?

I can go on and on, if you want... this is a slippery slope.
 
You know that what type of TV someone is using doesn't matter in this discussion, right? The TV just displays what's happening later. Every display will display what had happened exactly the same, one display just shows it later than the other. You know how a stream shows the exact same thing as the streamer is seeing, only 15 seconds later? Same thing. Having a different TV doesn't mean you're magically gonna get sniped behind a wall less.
It matters a lot. That's why pro players use gaming monitors... The "problem" created by the low tickrate is probably smaller than the lag of your TV... That's the point.
 

Patch13

Member
Every display will display what had happened exactly the same, one display just shows it later than the other.

This is incorrect. The horrible eldritch truth lurking below online video games is that everyone's television displays a slightly different version of events.

Having a different TV doesn't mean you're magically gonna get sniped behind a wall less.

The thing is that you only got sniped behind a wall on your TV. On the sniper's screen, you hadn't moved yet, and were sniped out in the open. A bystander might see what you saw, might see what the sniper saw, or might see something in between. (Or might have been looking the other way.)

Tick rate affects how often everybody's console attempts to bring their picture of the world into sync. A faster tick rate reduces the chances that your display and an opponent's display are grossly out of sync, but it doesn't necessarily help you avoid getting sniped. That's all up to how your reflexes, the sniper's reflexes, the game's prediction algorithm, the total system's latency, the unpredictability of everyone's behavior, etc., interact.

It all boils down to high tick rates probably being better. But high tick rates aren't the only moving part in the system, and you can't really consider tick rate in isolation, separate from all the other pieces.
 

jimboton

Member
The fact that he uses "GHz" for tickrate invalidates his entire post. Pretty sure he doesn't know what he's talking about.

If average human reaction time is 250ms, then that means that cinematic framerates (20-25fps) should "suffice", right? Preposterous to say the least.

Some people here take average human reaction time being around 250 ms to mean that the human brain is some kind of discrete system that only samples reality and takes actions at intervals of 250 ms, therefore making any and all events in the Universe that take place in smaller timeframes imperceptible and/or irrelevant. It's kind of hilarious.
 
Tickrate cannot contribute to lag? Or latency?
Of course not, it's part of our imagination!

Some people just don't understand that a higher sampling rate really helps in digital/quantized systems... whether it's audio or network frames. It offers a more accurate representation of data.
 

Seik

Banned
Despite my ignorant comment at the start of this thread, I'd like to thank the people here (and that article) explaining in detail what tick rate is. I'll sleep as a better man tonight. This was a good read, honestly, I love learning that stuff.

Tick rate, I think, is less known amongst most people. The only online 'performance' term I was used to was 'ping'; if not that, it's download/upload rate and all the modem and broadband mumbo jumbo. I think that's the case for most here, despite this being a gaming enthusiast forum, since this is a little deeper than the usual performance statistics we're used to reading/hearing about.

That said, I'm glad I'm not a Splatoon pro, because I'd be pretty pissed, especially since the former game had a higher tick rate. I'd rather be a casual peasant that doesn't notice that stuff too often, honestly. Ignorance is bliss. :lol

Hopefully Nintendo will please that market though; I truly wish it, as they push this game as an eSports title.
 

LCGeek

formerly sane
It matters a lot. That's why pro players use gaming monitors... The "problem" created by the low tickrate is probably smaller than the lag of your TV... That's the point.

We know from early versions of Counter-Strike alone that tickrate is the difference between mods being usable or totally undoable. So I have to firmly say no to that.

Neither of what you mentioned is good.

Tickrate affects hit registration and consistency, so I don't get how you can argue the problems it creates are smaller when they are systematic.
 
Some people here take average human reaction time being around 250 ms to mean that the human brain is some kind of discrete system that only samples reality and takes actions at intervals of 250 ms, therefore making any and all events in the Universe that take place in smaller timeframes imperceptible and/or irrelevant. It's kind of hilarious.
It's mostly console gamers that spout this nonsense, and it tends to generate memes like "24 fps are more cinematic". No one has answered my question of whether Splatoon would be acceptable at 30 or 16 fps.

Disclaimer: I'm a console gamer myself, but I don't buy this BS. Fluidity is a good thing, especially in Competitive MP games.
 

Patch13

Member
Would a 60 -> 16 fps downgrade be justified if someone didn't turn game mode on? Would a *universal* downgrade be justified if *some* people didn't turn game mode on?

No. You're being very silly. 60fps is important because it makes for a more convincing illusion of motion, and because it reduces the rendering pipeline's contribution to lag and latency in the system.

With a networked game, you introduce another potential source of lag and latency to the game's model. But because you can't really expect people to have pings of less than 100ms on a consistent basis, you can't just sync up tick rate with monitor refresh and call it good.

Good netcode involves a combination of frequent-as-possible check-ins with the other machines in the system (tick rate), with interpolation and prediction algorithms to smooth everyone's movement, so that it looks like your opponents are running around in the world, rather than strobing across your vision.

This makes your brain happy, and the result is a convincing illusion that you and your opponents are playing in a shared world. The illusion lasts until you get sniped from behind a wall, or someone warps a few steps, or something else weird happens, and thus begin conversations about what the devs could do to make things better (high tick rates help; lower ping times help; more sophisticated prediction algorithms help, though skilled humans will, of course, be harder to predict).

Regardless, my point was that you're not necessarily dying because of a low tick rate. You might be dying because somebody got the drop on you and killed you before you could react in their version of the world, and you saw the seams in the system as your console caught up.
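
For the curious, here's a tiny, hypothetical sketch of the interpolation idea: the renderer runs at 60 fps but only receives authoritative positions 16 times a second, so it draws other players slightly in the past, blending between the two most recent snapshots. This is just an illustration of the concept, not a claim about how Splatoon's netcode actually works:

TICK_RATE = 16   # network snapshots per second
FPS = 60         # rendered frames per second

def lerp(a, b, t):
    return a + (b - a) * t

def interpolated_position(snapshots, render_time):
    # snapshots: list of (timestamp, position) pairs, oldest first
    for (t0, p0), (t1, p1) in zip(snapshots, snapshots[1:]):
        if t0 <= render_time <= t1:
            alpha = (render_time - t0) / (t1 - t0)
            return lerp(p0, p1, alpha)
    # Past the newest snapshot: hold the last known position.
    # (A real game would extrapolate/predict here instead.)
    return snapshots[-1][1]

# Fake 1D positions arriving every 1/16 s for an opponent running to the right.
snaps = [(i / TICK_RATE, float(i)) for i in range(4)]
for frame in range(8):
    t = frame / FPS
    print(f"frame {frame}: drawn at x = {interpolated_position(snaps, t):.2f}")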
 

Brofield

Member
A simple analogy: a low tickrate is why you see rubberbanding in Mario Kart online, correct?

The video the OP linked does make it a little easier to understand, but I guess the question I want to ask is: is there a way for the tickrate to be improved from the developer's side with a software patch?
 

Unicorn

Member
I play the game, and it works. Don't even experience lag.

What is this tickrate and why does it matter?

Especially since I don't notice any issues with the game itself.

Will my gameplay experience somehow drastically improve when I already don't notice anything wrong with the online experience?
Basically, a lower tick rate results in more trades in fights. Rather than one player winning out, it flubs the facts and both kill each other.
 

LCGeek

formerly sane
With a networked game, you introduce another potential source of lag and latency to the game's model. But because you can't really expect people to have pings of less than 100ms on a consistent basis, you can't just sync up tick rate with monitor refresh and call it good.

In a post-bufferbloat world we can; we just won't, due to the fact that ISPs don't operate to be efficient and most consumers don't know to enable or use anti-bufferbloat tech. Anti-bufferbloat tech has been thoroughly benchmarked and it keeps your network queues running at microseconds for the most part, as it should. The same can equally be said for games not being designed around certain flaws of network tech, or outright disabling them, as they are useless for real-time gaming or smaller networks.
 

KingBroly

Banned
A simple analogy: a low tickrate is why you see rubberbanding in Mario Kart online, correct?

The video the OP linked does make it a little easier to understand, but I guess the question I want to ask is: is there a way for the tickrate to be improved from the developer's side with a software patch?

No...what gave you that idea?
 
With a networked game, you introduce another potential source of lag and latency to the game's model. But because you can't really expect people to have pings of less than 100ms on a consistent basis, you can't just sync up tick rate with monitor refresh and call it good.
What does ping have to do with the tick rate value? And why is tick rate a fixed value, even if I'm playing in a LAN environment with other people (single-digit milliseconds)? What if I'm playing with regional players over the internet (double-digit milliseconds)?

It could be dynamic like this: https://www.reddit.com/r/CoDCompetitive/comments/6ki0rz/wwii_will_have_a_60hz_server_tick_rate_when/

It's more of a bandwidth-related compromise than anything else.

This makes your brain happy, and the result is a convincing illusion that you and your opponents are playing in a shared world. The illusion lasts until you get sniped from behind a wall, or someone warps a few steps, or something else weird happens, and thus begin conversations about what the devs could do to make things better (high tick rates help; lower ping times help; more sophisticated prediction algorithms help, though skilled humans will, of course, be harder to predict).
Finally you admit that a low tick rate can increase deaths behind walls/cover.

Regardless, my point was that you're not necessarily dying because of a low tick rate. You might be dying because somebody got the drop on you and killed you before you could react in their version of the world, and you saw the seams in the system as your console caught up.
An experienced player knows when he dies because of tick rate BS and when someone got the drop on him/her legit.

I've played thousands of hours on Uncharted MP and it's very aggravating when I feel bullets pumping into my body like half a second after I'm already in cover, which results in my death. It doesn't matter if it's Uncharted or Splatoon or Overwatch or Battlefield, the principle is the same in every shooter.
 

Brofield

Member
No...what gave you that idea?

Everyone keeps suggesting that a higher tickrate means a smoother and more accurate response. If I'm watching a player nearly right next to me suddenly teleport further down the track because they just used a mushroom, while I'm still experiencing no lag nor frame drops, is that not due to a low tickrate?
 
Again, it doesn't mean it isn't fun, or that it's a big deal to some; what it does hurt is the game becoming an esport. It is also worth knowing about, and will hopefully influence the devs to improve it.

Still playable and still a good game, and obviously great for some, considering the sales and the number of people playing.
 
No...what gave you that idea?

People just keep showing up in this thread with bizarre misconceptions. I'm not trying to single that guy out to be mean or anything, he had an honest question, but like...god damn. There's a new one every page or two, something new blamed on tick rate. Someone above asked if tick rate affects the game's FPS. Someone in the previous thread thought the game literally ran 30% slower than the previous one, like actual slo-mo (probably because the article has a line that says "the game runs 30% slower").

Props to the people putting up the good fight in explaining and re-explaining constantly.
 

Patch13

Member
Anti-bufferbloat tech has been thoroughly benchmarked and it keeps your network queues running at microseconds for the most part, as it should.

Sure. In an ideal world, everybody would be hunting down and eliminating all sources of lag, as they sat down in front of their gaming monitors on their wired networks, killing all background processes on their PC, making sure that they had the latest driver updates installed for their new GPUs.

In the world that we live in, though, you have to make games work on what people have. And someone playing on a Switch tethered to their phone while they ride the A train to work is not necessarily going to benefit from a higher tick rate. Similarly, someone sitting at home on a nice clean wired network is not necessarily going to lose a match solely because there was a disagreement between machines about whether they jumped behind a wall in time or not.

That Nintendo, though ...
 

Hojaho

Member
Everyone keeps suggesting that a higher tickrate means a smoother and more accurate response. If I'm watching a player nearly right next to me suddenly teleport further down the track because they just used a mushroom, while I'm still experiencing no lag nor frame drops, is that not due to a low tickrate?

Probably the other player lagging.
Depends on who's hosting.
 
Everyone keeps suggesting that a higher tickrate means a smoother and more accurate response. If I'm watching a player nearly right next to me suddenly teleport further down the track because they just used a mushroom, while I'm still experiencing no lag nor frame drops, is that not due to a low tickrate?

No, if they suddenly teleport it's because your game was behind in its interpolated/assumed state and it just got an update that said "no no no actually this guy is way up here now." It's lag.

As said above, a 16 Hz tick rate means a network update every 4 frames. The game is running at 60 frames per second. Your game's state will never be more than 1/16th of a second off from the actual location of things, which is not enough to cause wonky teleporting around. Someone's not going to be a mile away only 4 frames later.
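
Quick back-of-the-envelope version of that, with a completely made-up movement speed (I don't know Splatoon's real movement values, so the number is purely illustrative):

TICK_RATE = 16                # network updates per second
ASSUMED_RUN_SPEED = 10.0      # hypothetical units per second, NOT Splatoon's actual value

# The farthest a player can actually move between two consecutive updates:
max_drift = ASSUMED_RUN_SPEED / TICK_RATE
print(f"max drift between updates: {max_drift:.3f} units ({1000 / TICK_RATE:.1f} ms of movement)")
# max drift between updates: 0.625 units (62.5 ms of movement)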
 

Patch13

Member
An experienced player knows when he dies because of tick rate BS and when someone got the drop on him/her legit.

This thread did open with an "experienced" player posting a video of themselves barreling into a situation like they were Han Solo, and then being sad that they got sniped. I'm not sure that people are really all that great about distinguishing between the times when they made a mistake, and the times they were just on the wrong side of a sync. But I will allow that shenanigans do happen, sometimes not even in your favor :)
 

Moondrop

Banned
Splatoon is my favorite series. I noticed Splatoon 2 felt weird in the first testfire. Kind of like sluggishness but not? Couldn't describe it.

Tried to tell myself it was in my head or just the demo version. Release version same. So many WTF trades. I played it like crazy the first week, still love it, but can't shake the feeling that something is off.
 