I just want to say I am glad pro gaming took over. Back in the day it was only Quake players advocating for 120 fps (for various reasons, including Q3 physics being somewhat broken), 125 Hz mice, and stuff like that. I am talking 20 years ago.
The number of lost souls parroting the old "human eye can only see 30 fps" has gone down considerably over the years. The last 10 years were fantastic in that regard, despite the whole RGB craze.
Even CS servers have a 100 Hz heartbeat these days. Of course, by the time we get 1 kHz displays I'll be too old to enjoy it myself, but it's still likely to put a bittersweet smile on my face.
There are definitely diminishing returns the higher we go with refresh rates. 60 Hz to 240 Hz, for example, is like playing a completely different game, but going from 240 Hz to 360 Hz, even in CS:GO, it's a lot harder to notice a difference.
Personally I believe the newly announced 300 Hz 27" 1440p monitors [0] are going to be the perfect sweet spot for the foreseeable future. I imagine it will be a long time before technology emerges that is a noticeable improvement over this.
[0]: https://www.nvidia.com/en-us/geforce/news/new-g-sync-monitor...
>I just want to say I am glad pro gaming took over.
Yes! I have been crying about latency for nearly 10 years [1]. Computing has always been optimising for throughput, and before Pro Gaming there just wasn't a marketable incentive for companies to work on / minimise latency. Now there finally is!
Even in the best-case scenario the lowest latency is still 25 ms, and in most cases we are still above 50 ms. I think it is worth posting [2], Microsoft Research on input latency. It would be nice if we could get average end-to-end system latency down to the sub-10 ms level, which is what I hope work on VR will bring us.
[1] https://news.ycombinator.com/item?id=6422632
[2] https://www.youtube.com/watch?v=vOvQCPLkPt4
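To make the 25-50 ms figure concrete, here is a rough back-of-the-envelope budget in Python. Every number below is an assumed typical value, purely for illustration - not a measurement from the linked video:

```python
# Rough end-to-end input latency budget. All stage durations are
# assumed typical averages, not measurements from [2].
stages_ms = {
    "mouse polling wait (125 Hz, avg)": 4.0,   # half of the 8 ms period
    "game simulation (one 60 fps tick)": 16.7,
    "render + GPU queue": 10.0,
    "display scan-out + pixel response": 10.0,
}

total = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage:36s} {ms:5.1f} ms")
print(f"{'total':36s} {total:5.1f} ms")   # ~40 ms end to end
```

Even with every stage behaving well, the stages add up fast, which is why shaving any single one (higher polling rate, higher refresh, better scheduling) only gets you partway to 10 ms.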
I believe that nonsense was originally sent into the world by the movie industry as an argument for not increasing film roll sizes and weights to disproportionate levels. Not to mention that the earliest film stock was also highly flammable, giving even more incentive not to make the rolls too big or store too many of them.
I think people were saying 60 fps was the limit, but I still agree.
That being said, in the Quake days I don't think monitors could go over 60 Hz anyway, so even at 120 fps you were not gaining an advantage like the one we have today. From what I remember, however, there were other advantages to high fps in games like Counter-Strike in terms of player movement: the monitor might have "smoothed" the motion back down to 60 Hz, but it still resulted in a more accurate experience.
I forget how refresh rates worked on CRTs though - maybe those could go higher than 60?
And of course you can overclock an LCD monitor quite easily. Most won't go much higher, but there are some that I got to 90 Hz, which (in my opinion) is a massive improvement over 60, and that 30 Hz difference is a much, much larger jump than the next one from 90 to 120 Hz.
Great read. I have one nit-pick recommendation for clarity: the article makes no mention of "input latency" anywhere. Saying just "latency" is confusing, since the term applies to many areas of a game and, in multiplayer games, will typically be read as network latency.
I usually get 10 ms ping in CS:GO... they must have something better? (I have 5 ms right now on a Comcast cable link.) As much as I hate having to call Comcast for any issue, when it works it is pretty good.
Input lag is the time between when you perform an action and when the computer shows it on screen. It depends on your frame rate, refresh rate, and peripheral polling rate, as well as how well the game schedules things (which is what LatencyFleX tries to optimize).
Network ping, on the other hand, is often hidden away. Whether you are on 2 ms ping or 100 ms ping, the bullet always goes where you aim: this is done through rollback netcode [1], which rewinds the server state to the time the action was performed. I'm not saying that having low ping is pointless - it has an effect on things like peeker's advantage - but the effect of network ping is drastically different from the effect of input lag.
[1]: https://ki.infil.net/w02-netcode.html
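As a minimal sketch of the rewind-and-replay idea (not how any particular game implements it - see [1] for the real details): the server keeps a short history of world states, applies a late input at the tick it was actually performed, and re-simulates forward to the present.

```python
import copy

class RollbackServer:
    """Toy rollback netcode: keep a history of world states so a late
    input can be applied at the tick it was actually performed."""

    def __init__(self, state, history_len=128):
        self.tick = 0
        self.state = state
        self.history = {0: copy.deepcopy(state)}
        self.history_len = history_len

    def advance(self, simulate):
        """Run one normal tick and remember the resulting state."""
        self.tick += 1
        self.state = simulate(self.state)
        self.history[self.tick] = copy.deepcopy(self.state)
        self.history.pop(self.tick - self.history_len, None)

    def apply_late_input(self, input_tick, apply_input, simulate):
        """Rewind to input_tick, apply the input, re-simulate to now."""
        state = apply_input(copy.deepcopy(self.history[input_tick]))
        for t in range(input_tick + 1, self.tick + 1):
            state = simulate(state)
            self.history[t] = copy.deepcopy(state)
        self.state = state
```

From the shooter's point of view the hit registers exactly where they aimed; the cost is that other players can be "shot back in time", which is the peeker's-advantage trade-off mentioned above.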
This is about adapting algorithms that deal with congestion of network packets to reduce congestion of a game loop (refreshing as fast as possible, but no faster).
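A hedged sketch of that analogy (this is not LatencyFleX's actual algorithm, just the congestion-control idea transplanted): treat frames piling up behind the GPU like packets queued in a router buffer, and pace frame submission to keep that queue shallow.

```python
class FramePacer:
    """Congestion-control-flavored frame pacing (illustrative only).
    A deep GPU queue is treated like a full router buffer."""

    def __init__(self):
        self.delay_ms = 0.0   # sleep this long before submitting a frame

    def on_frame_presented(self, gpu_queue_depth):
        if gpu_queue_depth > 1:
            # Frames queuing up = "congestion": back off multiplicatively,
            # the way TCP shrinks its window after loss.
            self.delay_ms = max(self.delay_ms, 1.0) * 1.5
        else:
            # Queue drained: ease the pacing back off (additive probe),
            # so we still render as fast as possible - just no faster.
            self.delay_ms = max(0.0, self.delay_ms - 0.2)
```

The payoff is latency: a frame that never sits in a queue is displayed almost as soon as it is rendered, instead of waiting behind two or three older frames.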
The elephant in the room here is that you can pay to win in any game by buying a monitor with a higher refresh rate and a bigger GPU that uses more electricity, giving you up to 2x more time to react.
Fortunately for us humans, that seems to stop at 120 Hz, because most games can't even hold that at a steady rate on a 3090.
Now whether a 300+ W gaming device is interesting in the long run will be answered this year by your electricity bill!
By that logic, every sports player who uses high-quality gear is "paying to win". Is Nadal winning solely based on the quality of his racquet? Of course not. Would he play with a basic or low-quality one? Absolutely not. What's wrong with using the best gear possible?
I don't know where you got this notion that it stops at 120 Hz. It's been proven again and again that even on a monitor with a low refresh rate you still get a better experience by having more fps available - more so when you have both the frames and the refresh rate in your monitor.
Lots of multiplayer games intentionally implement low-vis, low-contrast environments (mud-colored players in mud-colored environments), which is why things like Digital Vibrance and "black enhancers" are so popular. Arguably the competitive advantage of those, tuned to the game [1], exceeds everything else once you've done the basics (120+ Hz, normal-acting hardware).
[1] In a particular game I discovered that abusing the R/G/B controls to produce something that, under normal conditions, looks almost like one of those colorblind simulations would give you a massive advantage, to the point of most players calling hacks.
> The elephant in the room here is that you can pay to win in any game by buying a monitor with a higher refresh rate and a bigger GPU that uses more electricity, giving you up to 2x more time to react.
I'll be honest, it sounds like you have no idea how competitive gaming works, or any sport at all. Your comment sounds exactly like claiming one can get better at football by buying more expensive boots.
I couldn't understand "congestion control relied on packet loss" - could somebody explain? Thanks!
Does it mean that congestion control is triggered by a packet-loss event, which is a signal that a buffer is full?
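Roughly, yes: classic loss-based congestion control (e.g. TCP Reno's AIMD) has no direct view of the router buffers along the path, so a dropped packet is the only congestion signal it gets. A toy Python sketch with an assumed bottleneck queue size, purely for illustration:

```python
BUFFER_LIMIT = 32        # assumed bottleneck queue capacity (illustrative)
cwnd = 1.0               # congestion window, in packets per RTT

for rtt in range(40):
    # The sender can't see the buffer; a drop is its only feedback.
    loss = cwnd > BUFFER_LIMIT
    if loss:
        cwnd = max(1.0, cwnd / 2)   # multiplicative decrease on loss
    else:
        cwnd += 1.0                 # additive increase while no loss
    print(f"rtt {rtt:2d}: cwnd={cwnd:5.1f}{'  LOSS' if loss else ''}")
```

The window grows until the buffer overflows, halves, and grows again - the familiar sawtooth. The downside is that the buffer has to fill (adding queuing delay) before the sender learns anything, which is exactly the problem the article's approach is reacting to.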
Also, a higher refresh rate will give skilled players an advantage, but it's def not pay to win.