I don't know and I'm sure reading won't give the answers either.

What does this even mean and why does it matter?
I play the game, and it works. Don't even experience lag.
What is this tickrate and why does it matter?
Especially since I don't notice any issues with the game itself.
Will my gameplay experience somehow drastically improve when I feel there's nothing to improve every time I jump in a game?
I don't know and I'm sure reading won't give the answers either.
I agree that WiFi exacerbates netcode issues, but I remember ND saying that 60% of their player base is on WiFi. I imagine Wii U/Switch WiFi usage must be higher than that (80-90% maybe?)

Yeah, it's a super common complaint in UC; it's probably the biggest problem with their MP games overall. However, I don't think it manifests quite as badly in UC. I don't experience anywhere near as many trades in UC as I do in Splatoon. In Splatoon probably 1/10 of my kills are trades, or perhaps more. The real projectiles (rather than hitscans) and low time to kill really contribute to that, I would imagine, but the crappy connections of wireless players don't help either.
I play the game, and it works. Don't even experience lag.
What is this tickrate and why does it matter?
Especially since I don't notice any issues with the game itself.
Will my gameplay experience somehow drastically improve when I already don't notice anything wrong with the online experience?
Generally, a higher tick-rate server will yield a smoother, more accurate interaction between players, but it is important to consider other factors here. If we compare a tick rate of 64 (CSGO matchmaking) with a tick rate of 20, the largest delay due to the difference in tick rate that you could possibly perceive is ~35ms; the average would be ~17.5ms. For most people this isn't perceivable, but experienced gamers who have played on servers of different tick rates can usually tell the difference between a 10 or 20 tick server and a 64 tick one.
Keep in mind that a higher tickrate server will not change how lag compensation behaves, so you will still experience times where you ran around the corner and died. 64-tick servers will not fix that.
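For the curious, the ~35ms / ~17.5ms figures above are just the difference in tick intervals; here's a minimal Python sketch (the uniform-arrival averaging assumption is mine):

```python
# The ~35 ms / ~17.5 ms figures above are just the difference in tick
# intervals; the averaging assumes events land uniformly within a tick.

def tick_interval_ms(tick_rate_hz: float) -> float:
    """Time between server simulation updates, in milliseconds."""
    return 1000.0 / tick_rate_hz

for low in (20, 10):
    worst = tick_interval_ms(low) - tick_interval_ms(64)  # a whole extra wait
    print(f"{low} vs 64 tick: worst ~{worst:.1f} ms extra, average ~{worst / 2:.1f} ms")

# 20 vs 64 tick: worst ~34.4 ms extra, average ~17.2 ms
# 10 vs 64 tick: worst ~84.4 ms extra, average ~42.2 ms
```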
I play the game, and it works. Don't even experience lag.
What is this tickrate and why does it matter?
Especially since I don't notice any issues with the game itself.
Will my gameplay experience somehow drastically improve when I already don't notice anything wrong with the online experience?

Let's put it this way: would you enjoy Splatoon the same if it was 30 fps (like Zelda)?
I play the game, and it works. Don't even experience lag.
What is this tickrate and why does it matter?
Especially since I don't notice any issues with the game itself.
Will my gameplay experience somehow drastically improve when I already don't notice anything wrong with the online experience?
Also, remember that the majority of people use TVs to play on their consoles, so the input lag is normally really, really high. Maybe if you use a gaming monitor with input lag below 1ms and a really good ping you can notice something.

Some math, which may or may not be interesting:
At 60 frames per second ...
* 16GHz is ~4 frames. That gives you a worst case of 8 frames of latency, due to tick, if two consoles push out updates with just the wrong timing.
* A 100ms ping adds 6 frames of latency. (Ping is round trip, so we don't have to add your ping to your opponent's ping; we'll assume that you both have roughly the same ping, or that the average of your pings is 100.)
* We might have another frame of latency due to buffering in the graphics pipeline.
* Our Lady of Google informs me that average human reaction time to a visual stimulus is 0.25 seconds, which is 15 frames.
So given a decent ping, a tick rate of 16GHz updates the game about as quickly as you can react to it -- 8 + 6 + 1 = 15 frames of latency in the hardware matching the 15 frames of latency in the wetware of your brain. You see a thing, and do a thing, and that's, in the worst case, when you see the next thing.
"Pro" players are going to have faster reflexes, and might be able to shave a few frames off of that timing. They probably also live on faster networks with lower ping, though, so the game isn't necessarily updating outside of their reaction window (and remember that the 8 frames of latency due to tick is out worst case scenario -- most of the time, the lag due to tick is going to be somewhere between 4 and 8. And each source of latency doesn't necessarily straightforwardly add to the others).
Basically, my back-of-the-envelope math suggests that 16GHz is okay, for a tick rate, though it doesn't leave much room for latency in the network or video pipeline, or for lack of latency in the wetware of the squishy human playing the game. A higher tick rate is safer, leaving more room for slop in the ping, and is friendlier to really fast humans. I imagine that the lower tick rate is part of the reason why the team at Nintendo went with regional matchmaking, mitigating some of the worst-case scenarios with ping. I suppose time will tell whether the squishy humans feel that the tradeoffs that the dev team made, in terms of battery life, mobile data friendliness, and kindness to the CPU in the machine hosting the match, were worthwhile. I know that I, with the reflexes of someone who is now pushing 40, am not the best judge of how this feels to a strapping young person in their 20s. But I don't think that the situation is quite as dire as you might think, reading through the rest of the thread.
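The same arithmetic as a sketch, for anyone who wants to poke at the numbers (assumes the poster's 16 Hz tick, 100 ms ping, one frame of buffering, and 250 ms reaction time):

```python
# The frame budget from the post above, at 60 fps (~16.67 ms per frame).
# Assumptions: a 16 Hz tick (presumably what "16GHz" means), 100 ms ping,
# one frame of GPU buffering, and a 250 ms visual reaction time.

FRAME_MS = 1000.0 / 60

def frames(ms: float) -> float:
    """Convert milliseconds to 60 fps frames."""
    return ms / FRAME_MS

tick_ms = 1000.0 / 16                 # 62.5 ms between updates
worst_tick = frames(2 * tick_ms)      # two consoles mistimed: ~7.5, call it 8
ping = frames(100)                    # ~6 frames
pipeline = 1                          # ~1 frame of buffering
reaction = frames(250)                # ~15 frames of wetware

hardware = worst_tick + ping + pipeline
print(f"hardware: {hardware:.1f} frames (call it 8 + 6 + 1 = 15)")
print(f"wetware:  {reaction:.0f} frames")
```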
So is this where people are going to act like they noticed this all along? lol
Can someone explain this in mortal terms? Like does this affect FPS?
I play the game, and it works. Don't even experience lag.
What is this tickrate and why does it matter?
Especially since I don't notice any issues with the game itself.
Will my gameplay experience somehow drastically improve when I already don't notice anything wrong with the online experience?
The fact that he uses "GHz" for tickrate invalidates his entire post. Pretty sure he doesn't know what he's talking about.

Also, a tickrate measured in GHz would be insane and almost pointless, ignoring the fact that I doubt we have CPUs strong enough to feed such a rate.
Well, technically with a normal TV (not a PC gaming monitor) and an average ping, it's almost impossible to notice anything... It can be better, that's for sure, but it's not going to solve the "being shot behind a wall" problem that a lot of people think is caused by the low tickrate.

So this is where people try to invalidate other people's opinions and pretend nothing is happening because they can't notice it? lol
The fact that he uses "GHz" for tickrate invalidates his entire post. Pretty sure he doesn't know what he's talking about.
If average human reaction time is 250ms, then that means that cinematic framerates (20-25fps) should "suffice", right? Preposterous to say the least.
Also, remember that the majority of people use TVs to play on their consoles, so the input lag is normally really, really high. Maybe if you use a gaming monitor with input lag below 1ms and a really good ping you can notice something.
Well, technically with a normal TV (not a PC gaming monitor) and an average ping, it's almost impossible to notice anything... It can be better, that's for sure, but it's not going to solve the "being shot behind a wall" problem that a lot of people think is caused by the low tickrate.
Yes, even on a really good and expensive TV, the input lag in gaming mode is much, much higher than on a gaming monitor. The majority of people here are describing lag issues, not anything related to a low tickrate.

This is a good point, and it makes the math much more complicated (and also much more dire). You could argue that a higher tick rate helps because at least you're not adding *more* latency to the stack. But you could also argue that people playing on a TV (especially if they leave all the post-processing stuff on and don't set it to "game mode") don't really have any grounds to complain about some other source of latency.
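To put rough numbers on that, here's the earlier worst-case sketch with display latency stacked on top (the display-lag figures are ballpark assumptions, not measurements of any specific screen):

```python
# Ballpark display latencies (assumed, not measured): ~10 ms for a gaming
# monitor, ~30 ms for a TV in game mode, ~100 ms with post-processing on.
# Stacked on the earlier worst-case sketch: tick + ping + pipeline.

FRAME_MS = 1000.0 / 60
BASE_MS = 125 + 100 + FRAME_MS        # worst-case tick, ping, one frame buffer

for display, lag_ms in [("gaming monitor", 10),
                        ("TV, game mode", 30),
                        ("TV, post-processing on", 100)]:
    total = BASE_MS + lag_ms
    print(f"{display:23s} ~{total:.0f} ms total (~{total / FRAME_MS:.0f} frames)")
```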
I'm not mixing up anything.

You're mixing up perception and reaction time.
For avoiding unnecessary death, reaction times are important, and that is the time it takes your slow-ass brain to process a visual signal, figure out what to do, and propagate an electrochemical signal to the bits of you that need to do the thing.
As this happens, you can totally perceive smaller units of time, and your brain can process things after the fact and evaluate all the agonizing milliseconds of time that went by while you slowly reacted to your impending inky doom. But that doesn't help you actually react any faster to it.
What kind of "argument" is this?This is a good point, and it makes the math much more complicated (and also much more dire). You could argue that a higher tick rate helps because at least you're not adding *more* latency to the stack. But you could also argue that people playing on a TV (especially if they leave all the post processing stuff on and don't set it to "game mode") don't really have any grounds to complain about some other source of latency
It matters a lot. That's why pro players use gaming monitors... The "problem" created by the low tickrate is probably smaller than the lag of your TV... That's the point.

You know that what type of TV someone is using doesn't matter in this discussion, right? The TV just displays what's happening later. Every display will display what happened exactly the same; one display just shows it later than the other. You know how a stream shows the exact same thing the streamer is seeing, only 15 seconds later? Same thing. Having a different TV doesn't mean you're magically gonna get sniped behind a wall less.
Every display will display what happened exactly the same; one display just shows it later than the other.
Having a different TV doesn't mean you're magically gonna get sniped behind a wall less.
It matters a lot. That's why pro players use gaming monitors... The "problem" created by the low tickrate is smaller than the lag of your TV... That's the point.
Tickrate and Lag are not the same things.

Are you saying the lag from a low tickrate does not contribute to the overall lag?
Tickrate and Lag are not the same things.
The fact that he uses "GHz" for tickrate invalidates his entire post. Pretty sure he doesn't know what he's talking about.
If average human reaction time is 250ms, then that means that cinematic framerates (20-25fps) should "suffice", right? Preposterous to say the least.
Of course not, it's part of our imagination!

Tickrate cannot contribute to lag? Or latency?
It matters a lot. That's why pro players use gaming monitors... The "problem" created by the low tickrate is probably smaller than the lag of your TV... That's the point.
It's mostly console gamers that spout this nonsense, and it tends to generate memes like "24 fps is more cinematic". No one has answered my question of whether Splatoon would be acceptable at 30 or 16 fps.

Some people here take average human reaction time being around 250 ms to mean that the human brain is some kind of discrete system that only samples reality and takes actions at intervals of 250 ms, therefore making any and all events in the Universe that take place in smaller timeframes imperceptible and/or irrelevant. It's kind of hilarious.
Would a 60 -> 16 fps downgrade be justified if someone didn't turn game mode on? Would a *universal* downgrade be justified if *some* people didn't turn game mode on?
I play the game, and it works. Don't even experience lag.
What is this tickrate and why does it matter?
Especially since I don't notice any issues with the game itself.
Will my gameplay experience somehow drastically improve when I already don't notice anything wrong with the online experience?

Basically, a lower tick rate results in more trades in fights. Rather than one player winning out, it flubs the call and both kill each other.
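A toy simulation of that trade effect: if both players' fatal shots land inside the same tick window, neither shot is processed first and both kills register. The 40 ms duel gap and the tick rates here are assumptions for illustration, not anything measured from Splatoon:

```python
# Toy model of trades: if both players' fatal shots land inside the same
# tick window, neither shot is processed first and both kills register.
# The "fires up to 40 ms later" duel gap is an assumption for illustration.
import random

def trade_rate(tick_rate_hz: float, trials: int = 100_000) -> float:
    interval = 1.0 / tick_rate_hz
    trades = 0
    for _ in range(trials):
        a = random.uniform(0.0, 1.0)           # player A's fatal shot
        b = a + random.uniform(0.0, 0.040)     # player B fires up to 40 ms later
        if int(a / interval) == int(b / interval):   # same tick window
            trades += 1
    return trades / trials

for rate in (16, 64):
    print(f"{rate} Hz tick: ~{trade_rate(rate):.0%} of these duels trade")
# Roughly ~68% at 16 Hz vs ~20% at 64 Hz (varies run to run).
```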
With a networked game, you introduce another potential source of lag and latency to the game's model. But because you can't really expect people to have pings of less than 100ms on a consistent basis, you can't just sync up tick rate with monitor refresh and call it good.
A simple analogy: a low tickrate is why you see rubberbanding in Mario Kart online, correct?
The video OP linked does make it a little easier to understand, but I guess the question I want to ask is whether there's a way for the tickrate to be improved on the developer's side with a software patch?
With a networked game, you introduce another potential source of lag and latency to the game's model. But because you can't really expect people to have pings of less than 100ms on a consistent basis, you can't just sync up tick rate with monitor refresh and call it good.

What does ping have to do with the tick rate value? And why is tick rate a fixed value, even if I'm playing in a LAN environment with other people (single-digit milliseconds)? What if I'm playing with regional players over the internet (double-digit milliseconds)?
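On the fixed-tick question: the tick rate is a property of the host's simulation loop, not of the connection, so it doesn't scale with ping. A minimal sketch of how the two delays combine (the ping values and the uniform-phase assumption are illustrative):

```python
# Ping and tick rate are independent delays: the tick rate is the host's
# simulation/broadcast interval, so even on a LAN an opponent's action
# waits (on average half a tick) before it is sent. Ping values are
# illustrative assumptions.
import random

def ms_until_you_see_it(one_way_ms: float, tick_rate_hz: float) -> float:
    tick_ms = 1000.0 / tick_rate_hz
    wait_for_tick = random.uniform(0.0, tick_ms)  # action lands mid-window
    return wait_for_tick + one_way_ms

for label, one_way in [("LAN", 1), ("regional", 30), ("cross-region", 100)]:
    avg = sum(ms_until_you_see_it(one_way, 16) for _ in range(10_000)) / 10_000
    print(f"{label:12s} (~{one_way} ms one-way): ~{avg:.0f} ms at a 16 Hz tick")
# Even on a LAN, ~31 ms of average delay comes from the 16 Hz tick alone.
```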
This makes your brain happy, and the result is a convincing illusion that you and your opponents are playing in a shared world. The illusion lasts until you get sniped from behind a wall, or someone warps a few steps, or something else weird happens, and thus begin conversations about what the devs could do to make things better (high tick rates help; lower ping times help; more sophisticated prediction algorithms help, though skilled humans will, of course, be harder to predict).

Finally you admit that a low tick rate can increase deaths behind walls/cover.
An experienced player knows when he dies because of tick rate BS and when someone got the drop on him/her legit.

Regardless, my point was that you're not necessarily dying because of a low tick rate. You might be dying because somebody got the drop on you and killed you before you could react in their version of the world, and you saw the seams in the system as your console caught up.
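For anyone wondering what those "seams" look like mechanically: clients commonly extrapolate between updates (dead reckoning) and correct when the next tick lands, which shows up as a warp. A toy 1-D sketch under that assumption; nothing here is confirmed about Splatoon's actual netcode:

```python
# Dead-reckoning toy (1-D): between ticks the client extrapolates from the
# last update's position and velocity; when the next tick arrives it snaps
# to the authoritative position. All numbers here are made up.

TICK_S = 1.0 / 16                  # one update every 62.5 ms
last_pos, last_vel = 0.0, 5.0      # last update: position, units per second

def predicted(t_since_update: float) -> float:
    """Client-side guess while waiting for the next update."""
    return last_pos + last_vel * t_since_update

# Next tick: the opponent actually reversed mid-interval and is back at 0.
server_pos = 0.0
snap = predicted(TICK_S) - server_pos
print(f"client drew them at {predicted(TICK_S):.3f}; server says {server_pos:.3f}")
print(f"visible warp of {snap:.3f} units when the update lands")
```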
No... what gave you that idea?
Anti-bufferbloat tech has been thoroughly benchmarked, and it keeps your network running at microsecond latencies for the most part, as it should.
Everyone keeps suggesting that a higher tickrate means a smoother and more accurate response. If I'm watching a player nearly right next to me suddenly teleport further down the track because they just used a mushroom, while I'm experiencing no lag or frame drops, is that not due to a low tickrate?