I just want to say I am glad pro gaming took over. Back in the day it was only Quake players advocating for 120 FPS (for various reasons, including Q3 physics being somewhat broken), 125hz mice and stuff like that. I am talking 20 years ago.
The number of lost souls parroting the old "human eye can only see 30 fps" has gone down considerably over the years. The last 10 years were fantastic in that regard, despite the whole RGB craze.
Even CS servers have 100 Hz heartbeat these days. Of course, by the time we get 1khz displays I'll be too old to enjoy it myself but still likely to put a bittersweet smile on my face.
There are definitely diminishing returns the higher we go with refresh rates. Going from 60hz to 240hz, for example, is like playing a completely different game. But going from 240hz to 360hz, even in CSGO, it's a lot harder to notice a difference.
Personally I believe the newly announced 300hz 27" 1440p monitors[0] are going to be the perfect sweet spot for the foreseeable future. I imagine it will be a long time before technology emerges that is a noticeable improvement to this.
There are diminishing returns, but 360Hz is still too low to display sharp-looking motion without strobing. 360Hz strobing is visible as a phantom array effect whenever you move your eyes. If you are sensitive to this artifact and instead want motion that looks like real life, you need more like 1000fps/1000Hz. There is no hardware capable of this, but at a high enough frame rate you could probably get away with interpolating the frames along motion vectors, e.g. from 500fps to 1000fps, with very minor latency/artifacting.
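Roughly what I mean by interpolating along motion vectors - a crude sketch, assuming the renderer already hands you per-pixel motion vectors (the names here are made up, and a real implementation would also handle occlusion):

    import numpy as np

    def interpolate_midframe(frame, motion_vectors, t=0.5):
        # frame:          (H, W, 3) rendered image
        # motion_vectors: (H, W, 2) per-pixel displacement to the next frame, in pixels
        # t:              how far into the frame interval to advance (0.5 = halfway)
        h, w, _ = frame.shape
        ys, xs = np.mgrid[0:h, 0:w]
        # Nearest-neighbor backward warp: each output pixel samples the point
        # that would have moved onto it after t of the frame's motion.
        x_src = np.clip(xs - t * motion_vectors[..., 0], 0, w - 1).astype(int)
        y_src = np.clip(ys - t * motion_vectors[..., 1], 0, h - 1).astype(int)
        return frame[y_src, x_src]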
I am not convinced we need to go nearly that high - 300hz puts a crapload more stress on your system performance-wise for not much gain. 90hz is in my opinion already such a massive improvement over 60 that I do not see mind-blowing results even going to 120 or 144. And many pro gamers were using 120hz monitors over 144hz at one point.
Realistically I think the two sweet spots are 120hz and 240hz - not necessarily because they are the best of the best but because they are each divisible by both 24 and 30 (the most common FPS of films and television) AND they offer two tiers of increased performance for different hardware requirements. You can run a much more taxing game at 120 and then if you want to spend the big bucks on the latest hardware move up to 240.
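To make the divisibility point concrete (plain arithmetic, nothing game-specific): each source frame maps onto a whole number of refresh cycles only when the rate divides evenly.

    for hz in (60, 120, 144, 240):
        for fps in (24, 30):
            cycles = hz / fps
            note = "clean" if cycles.is_integer() else "uneven pulldown (judder)"
            print(f"{hz:>3} Hz / {fps} fps = {cycles:>5.2f} refreshes per frame -> {note}")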
As for resolution I completely agree with you - 1440p really is the sweet spot for 27" monitors. If display/DPI scaling improves across operating systems then I think 4K will eventually become the norm for 27" monitors, and it will show some improvement, but again with diminishing returns like the difference between 120 and 240. That being said, as more film content moves to 4K I think we will also start to see 1440p become less popular, as people will want to view content at a resolution that doesn't need scaling.
All of this however is nothing compared to the improvement that a true HDR display brings. A high-end monitor that can show a large increase in dynamic range is such a game changer, and I do not think most people realize it yet. It brings us so much closer to how the human eye really sees that I think it is equivalent to the jump from laserdisc resolution to 4K. And on top of that, now that cameras are also shooting such massive dynamic ranges, it is going to make older content look plain in comparison.
> And many pro gamers were using 120hz monitors over 144hz at one point.
This was done solely to enable strobing: 144 Hz panels at the time were too slow to support it, since strobing at 144 Hz requires scanout speeds equivalent to roughly 200 Hz.
And strobing is only used because sample-and-hold (S&H) displays have too much transition and motion blur at 1xx and 2xx Hz.
> Realistically I think the two sweet spots are 120hz and 240hz - not necessarily because they are the best of the best but because they are each divisible by both 24 and 30 (the most common FPS of films and television)
In principle I agree that 120/240 Hz are better suited to general-purpose use for this reason [1], but on the other hand this really has nothing to do with the hardware and is purely a software limitation. What one would really like to see is video playback causing the variable refresh rate to adjust to a multiple of the exact video frame rate, instead of janky ad-hoc frame-rate conversions.
This is a common theme; hardware is generally much more capable than what the software/drivers allow everywhere you look.
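Conceptually something like this - a hypothetical picker that snaps a VRR display to an exact multiple of the content frame rate (the range limits below are made up, real panels vary):

    def pick_refresh(content_fps, vrr_min=48.0, vrr_max=240.0):
        # Return the highest refresh rate within the panel's VRR range that is
        # an exact integer multiple of the content frame rate, or None.
        best = None
        for k in range(1, int(vrr_max // content_fps) + 1):
            hz = content_fps * k
            if vrr_min <= hz <= vrr_max:
                best = hz
        return best

    print(pick_refresh(23.976))  # 239.76 - 10x playback, no frame-rate conversion
    print(pick_refresh(25.0))    # 225.0  - 9x playback, no 50p problem either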
> All of this however is nothing compared to the improvement that a true HDR display brings
Many people would probably already be quite happy with something that doesn't turn shadows into a foggy, cloudy mess like all IPS panels do, and VA panels as well (but less so).
[1] though 120 Hz does not solve the 50p problem: content produced by broadcasters in 50 Hz countries, which is basically all of the world that isn't the US, cannot easily be converted at playback time. 25p can just be handled like 24p with 1:1 playback; basically nobody notices the slight speed-up / slow-down, and that's how films have been shown on television in 50 Hz countries forever.
There are definitely diminishing returns as refresh rates increase. But your comparison is unfair nonetheless, since you are comparing a quadrupling of the refresh rate with a mere 50% increase, which is more like comparing 60 to 90 than 60 to 240. And with the advent of VR, demand for higher refresh rate panels will only grow, since the difference is much more noticeable in a headset.
As for VR, I think that's an excellent illustration of my point - we know that many people don't do well at 60FPS per eye in VR due to motion sickness. Move up to 90FPS per eye though and there is a massive improvement that I have seen first hand in others. By the time you get to 120FPS the experience feels pretty damn smooth, and while I would of course like to see more frames, I am not convinced going much beyond 120 is really worth it performance-wise, considering you have to render everything twice and the extra compute could instead be spent on the new shiny like ray tracing.
Luckily, when I was working at a VR startup I was one of the few people that never seemed to get motion sickness, so I became the test dummy for everyone's work - they would throw me into something they hadn't optimized at all yet that was only getting 40fps on a system with dual Xeons and quad top-of-the-line Nvidia workstation cards, and while it felt a little weird it for some reason never bothered me :-D
It doesn't make any sense to invest much in displays over ~90hz vs just working on adaptive refresh rates
Your eyes really do work at a pretty low speed. At some point it makes more sense to just track the eyeballs and put updated scenery in front of them at the exact instant the game engine produces it, rather than try to run at some insanely high speed generating frames that aren't actually having any effect on the player's brain
90, 144, 240hz, etc all look better than 60hz because there's less random lag between when the game generates a frame and when it appears on the screen. You can't see an 8ms delay, but you CAN see a variable 0-10ms delay that's happening as the game engine and computer monitor drift in and out of sync again and again.
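A toy illustration of that drift, assuming plain vsync (a finished frame waits for the next refresh); the exact range scales with the refresh period, so the numbers below are illustrative only:

    REFRESH_HZ = 60.0   # display refresh
    GAME_FPS = 55.0     # game loop slightly out of sync with the display

    refresh_period = 1000.0 / REFRESH_HZ   # ~16.7 ms
    frame_period = 1000.0 / GAME_FPS       # ~18.2 ms

    for n in range(12):
        ready = n * frame_period                                  # frame finished rendering
        shown = (ready // refresh_period + 1) * refresh_period    # next vsync after that
        print(f"frame {n:2d}: waited {shown - ready:5.2f} ms before being displayed")

The wait walks from a full refresh period down to nearly zero and then wraps around, which is exactly the in-and-out-of-sync feel described above.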
>I just want to say I am glad pro gaming took over.
Yes! I have been crying about latency for nearly 10 years [1]. Computing has always been optimising for throughput. And before Pro Gaming, there just hasn't been a marketable incentive for companies to work on / minimise latency. Now we finally do!
Even in the best case scenario, the lowest latency is still 25ms, and in most cases we are still above 50ms. I think it is worth posting Microsoft Research's work on input latency [2]. It would be nice if we could get average end-to-end system latency down to the sub-10ms level, which is what I hope work on VR will bring us.
I believe that nonsense was originally sent into the world by the movie industry to have an argument for not increasing film roll sizes and weights to disproportionate levels. Not to mention that the earliest film rolls were also highly flammable, giving even more incentive not to make them too big or store too many of them.
Part of that however is also highly related to motion blur - many big directors have done tests in theaters showing "HFR" content (like 60fps) and audiences on average distinctly said they did not like it. The D-Day scene from Saving Private Ryan is a good example - it was not shot HFR, but they intentionally made the shutter speed faster to give it that "staccato", jerky, gritty sort of feel. While in photography we use all kinds of shutter speeds for different effects (think of using a super fast shutter speed to freeze the propellers of a plane, or a very long shutter speed in a landscape photo so a river becomes a nice smooth blur), the movie industry mostly abides by the "180 degree shutter" rule, meaning that your shutter speed is 1/(2 x fps). So for most cinema shot at 24fps the shutter speed is 1/48 of a second.
The importance of this is that because you are not shooting still frames but rather a series of frames to be played back quickly, this adds a motion blur effect that smooths the transition between frames and creates a sort of artistic look. There are technical limitations to this blurring (medium-fast pans across a scene are a great example - the whole thing becomes too blurred and is hard to see). Any scene with slower-moving objects such as people gets a sort of natural motion blur that many cinematographers believe is an artistically ideal choice.
Now that being said, you do not need to abide by the 180 degree shutter with modern cameras (like Saving Private Ryan), and one can theoretically choose a variety of shutter speeds for different scenes regardless of what FPS one is shooting at. A fast pan could be shot at something like 1/120, and even at 24FPS it will appear much sharper and easier to make out individual objects (although perhaps not quite as smooth in the panning motion). However, you are limited on the slow end to a shutter speed of roughly 1/fps (or your shutter would be open LONGER than the frame itself and defeat the purpose of shooting "frames" in the first place). So theoretically we could move to 48 FPS content and still shoot at 1/48 and have the same amount of motion blur PER FRAME, but also double the number of frames, which would be a large improvement in a technical sense. I haven't seen any films shot this way but I have experimented quite a bit with my own camera shooting at these kinds of speeds and it works quite well. You can also shoot at 24FPS and drag the shutter to 1/24 to get a full stop (double the amount of light) vs normal 24FPS footage if you are shooting in a very dark environment that is already pushing the limits of your camera's sensor. This of course introduces even more motion blur, but depending on the scene it may not be very noticeable or may even introduce interesting artistic looks.
TLDR: I think we should move to 48 FPS and shoot most content at a variety of shutter speeds, the most common being the already standard 1/48s shutter, and either increase or decrease that within reason depending on the nature of the scene and the desired artistic outcome.
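For anyone who wants to play with the numbers, the shutter-angle rule above boils down to a single line of arithmetic (nothing camera-specific here):

    def exposure_time(fps, shutter_angle=180):
        # 180 degrees = the classic "half the frame interval" rule.
        return (shutter_angle / 360.0) / fps

    for fps, angle in [(24, 180), (24, 90), (48, 180), (24, 360)]:
        t = exposure_time(fps, angle)
        print(f"{fps} fps at {angle:>3} deg shutter -> 1/{round(1 / t)} s per frame")

That prints 1/48 for standard 24fps, 1/96 for a sharper 90 degree shutter or for 48fps at 180 degrees, and 1/24 for the dragged-shutter low-light case.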
I think people were saying 60fps was the limit but still I agree
That being said, in the Quake days I don't think monitors could go over 60hz anyway, so even at 120FPS you were not gaining the same kind of advantage we have today. From what I remember, however, there were other advantages to high FPS in games like Counter-Strike in terms of player movement - the monitor might have "smoothed" the motion back down to 60 fps but it still resulted in a more accurate experience.
I forget how refresh rate worked on CRTs though - maybe those could go higher than 60?
And of course you can overclock an LCD monitor quite easily - most will not go much higher, but there are some that I got to 90hz, which (in my opinion) is a massive improvement compared to 60; that 30hz difference is a much, much larger jump than the next jump from 90 to 120hz.
> I forget how refresh rate worked on CRTs though - maybe those could go higher than 60?
Yes, even the standard VGA mode 13h (320x200x8) is 70Hz and many CRTs could do 85Hz. By Quake 3's time, CRTs that could do 120Hz and above were very common. Personally I have such a CRT, as well as another that can do 160Hz.
Also FWIW the refresh rate is only part of the story - CRTs have practically instant "response time", so 120Hz on a CRT vs 120Hz on an LCD feels very different (in favor of the CRT). Supposedly OLED could be made to come close, but personally I haven't seen such a case (and people who have both OLEDs and CRTs still say that CRTs are better there). I have a 165Hz LCD and it doesn't hold a candle to the CRTs I have around in terms of motion feel.
Nowadays you can find small-ish CRTs for dirt cheap on Facebook Marketplace, etc (some people even give them away for free) - I recommend trying to find one that can do 120Hz, if for no other reason than to experience the liquid-butter smoothness of FPS motion (and join us in lamenting its loss in modern monitor tech :-P). Also kinda amusing that when those were new, chances are the PCs they were used with couldn't do high framerates (and low framerates do not feel as bad on a CRT as on an LCD, but I'm not sure if that's related).
A large reason why CRTs are rather excellent in regards to motion is that they're not sample-and-hold displays, resulting in very low duty cycles (dominated by the phosphor's fall time of somewhere between 200-1000 µs, as rise time is basically instant at <20 ns or so and video bandwidth is well above 100 MHz). That's the main reason why a 240 Hz LCD using BFI (black frame insertion) for strobing (D=0.5) can't compete with a 120 Hz CRT (D~0.05 or so).
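The ballpark math behind those duty cycles (phosphor decay varies a lot from tube to tube, so treat these as rough figures):

    refresh_120 = 1 / 120        # ~8.3 ms per refresh

    # CRT: rise is effectively instant, so light output per refresh is
    # dominated by the phosphor decay time.
    phosphor_fall = 0.0005       # ~0.5 ms, somewhere in the 0.2-1 ms range
    print(f"120 Hz CRT duty cycle      ~ {phosphor_fall / refresh_120:.2f}")   # ~0.06

    # 240 Hz LCD doing black frame insertion for 120 Hz content:
    # every other refresh is black, so the image is lit half the time.
    print(f"240 Hz LCD + BFI duty cycle = {(1 / 240) / refresh_120:.2f}")      # 0.50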
Interesting - I had a nice high end 21" CRT at one point I got for free when some tech company went under and told the building maintenance to just trash all of the brand new equipment. Luckily my uncle was that maintenance guy and I got free pick of whatever parts I wanted before it went to the landfill.
I do remember being VERY good at the original Counter-Strike (pre v1.5) comparatively - I know CRTs have very low input lag - I wonder if I was playing at higher fps / hz and didn't even realize it!
From what I remember VGA could actually do some decent resolutions (SVGA and whatnot were more limited) and DVI was starting to rear its head around that time as well. I vaguely remember using a resolution like 1900 x 1200, which is about what modern 1080p HD is doing (slightly higher in fact).
Now this is making me wonder how my plasma TV actually compares - from what I remember plasmas do not have a "hertz" so to speak, but they also weren't really coveted for gaming (although burn-in may be to blame for that). Input lag on it seems decent, but I would guess that might be its big limitation. Surprisingly it does do 10-bit video, and while it won't accept an HDR signal, I suspect the display itself is capable of showing more dynamic range than many of the cheaper "HDR" LCDs.
I'm pretty sure I was playing Counter-Strike at, as reported by the game, 99 fps on my Sony Trinitron back then (2000?). We'd use "low poly" mods (simplified 3D character models) to speed everything up and reach these speeds.
There are still messages on message boards from this era where several people mention reaching that same 99 fps.
It was two decades ago but I'm pretty sure it was 1024x768 @ 99 fps (19" Sony Trinitron). I may be mistaken on the monitor though.
Great read. I have one nit-pick recommendation for clarity: the article makes no mention of "input latency" anywhere. Saying just "latency" is very confusing since the term applies to many areas of a game, and in most cases will typically be attributed to network latency in multiplayer games.
I usually get 10ms ping on CSGO... they must have something better? (I have 5ms right now with a Comcast cable link.) As much as I hate having to call Comcast for any issues, when it works, it is pretty good.
Input lag is the time between when you perform an action and when the computer shows it on screen. It depends on your frame rate, refresh rate, and peripheral polling rate, as well as how well the game schedules things (which is what LatencyFleX tries to optimize).
Network ping, on the other hand, is often hidden away. Whether you are on 2ms ping or 100ms ping, the bullet always goes where you aim at: this is done through rollback netcode [1], which rewinds the server state to the time the action was performed. I'm not saying that having low ping is pointless - it has an effect on things like peeker's advantage - but the effect of network ping is drastically different from the effect of input lag.
> Whether you are on 2ms ping or 100ms ping, the bullet always goes where you aim at
However what you are aiming at may not actually be where you see it. If it's another player or something critical to multiplayer gameplay, then you are going to see what the server has told you you're going to see at that location. And latency means that will be delayed. So you may think you have a clear hit on a target, but with latency the target may have moved and you haven't gotten an update yet. Or by the time your fire command gets to the server, the target has already moved.
This can get pretty confusing especially if the game does client side hit effects (like bullet impacts). You shoot, you see immediate feedback of the bullet hitting, but due to latency your target moved and you missed - so the feedback is false. But if you don't do the client-side hit effect, there's a strange subtle delay that feels wrong. Client tells server they fired, server determines hit location, server sends back "hit here" message, client draws hit effect.
It doesn't matter if the player is actually somewhere else. As long as you aim at them and shoot before they shoot you, even if they are no longer there, the server will "rewind" and say you shot them, up to a certain limit. This is covered in your link :)
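For the curious, a minimal sketch of that server-side rewind, assuming the server keeps a short history of position snapshots - the names here are made up for illustration:

    import bisect

    class LagCompensator:
        MAX_REWIND = 0.2  # seconds; the "certain limit" mentioned above

        def __init__(self):
            self.history = []  # list of (timestamp, {player_id: position})

        def record(self, timestamp, positions):
            self.history.append((timestamp, dict(positions)))
            # Drop snapshots older than the rewind window.
            cutoff = timestamp - self.MAX_REWIND
            while self.history and self.history[0][0] < cutoff:
                self.history.pop(0)

        def positions_at(self, shot_time):
            # Judge the hit against the snapshot closest to when the shooter
            # fired (server time minus the shooter's one-way latency).
            if not self.history:
                return {}
            times = [t for t, _ in self.history]
            i = min(bisect.bisect_left(times, shot_time), len(self.history) - 1)
            return self.history[i][1]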
Every subsystem goes "ah buffering for a few ms or a frame doesn't hurt anybody" and in the end you have click-to-photon latencies of 150+ ms.
This starts with incorrect debouncing in input devices (which can cause 10+ ms of delay on its own, even in dedicated "gaming" hardware!), slow/inefficient poll rates, wrong input handling in games (many still seem to receive input on the "main" thread ticking along at the frame rate, which causes huge amounts of input lag at lower framerates and also clamps inputs to frametimes, causing significant position errors), incorrect V-Sync implementations (like the broken "triple buffering" in D3D9-11 games; this should be a thing of the past nowadays), graphics drivers favoring throughput over latency as they are almost exclusively benchmarked on average fps (it used to be that they could buffer up to 9 frames, or ~150 ms), etc.
Input lag is the period between input and output, not just the time to poll the input. E.g. on a standard 250 Hz mouse, just polling the mouse averages 2 ms of input lag with 2 ms of jitter, and that's before you've even done anything with the input. If you don't have a high refresh rate VRR gaming display and don't want tearing, the same thing repeats: 60hz would be 8.3 +- 8.3 ms. That's halfway to 20 and we haven't even gotten to delays from the game code, where you can choose things like triple buffering for higher FPS at the cost of another frame of latency. Input lag can also include output to the monitor and delay from the monitor, depending on how it's being measured.
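For reference, the arithmetic behind those figures - the average added delay of any sampling stage is half its period, the worst case a full period:

    def sampling_delay_ms(rate_hz):
        period = 1000.0 / rate_hz
        return period / 2, period   # (average, worst case)

    for label, hz in [("125 Hz mouse", 125), ("250 Hz mouse", 250),
                      ("1000 Hz mouse", 1000), ("60 Hz display", 60)]:
        avg, worst = sampling_delay_ms(hz)
        print(f"{label:>14}: avg {avg:4.1f} ms, worst {worst:4.1f} ms")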
In this case it's more about rendering pipeline input lag, so the USB polling delay and the monitor output delay aren't counted; the swapchain delay to prevent tearing or trade latency for FPS is counted though, and then you still have to add in your actual game.
Those are end-to-end system latencies. Your USB input devices and monitor refresh rate alone, excluding anything in between, could easily add 20ms of delay in the worst case scenario. A 60Hz monitor by itself would have 16.6ms of worst-case latency.
There are a variety of factors that add up to many ms of lag. USB and HDMI are a few that come to mind. USB is incredibly complex compared to e.g. PS/2. If you are building an embedded board like a Raspberry Pi and you want to add PS/2, you just add a couple of pins. If you want to add USB, you buy an extra chip.
Polling is often lower latency than interrupts on modern hardware, since it can run at much finer grains and has much less startup overhead than an ISR.
Only in the slowest modes. It is possible to go faster than the 1ms poll rate, though it is back to interrupts:
> Transaction latency
> For low speed (1.5 Mbit/s) and full speed (12 Mbit/s) devices the shortest time for a transaction in one direction is 1 ms.[6] High speed (480 Mbit/s) uses transactions within each micro frame (125 µs)[7] where using 1-byte interrupt packet results in a minimal response time of 940 ns. 4-byte interrupt packet results in 984 ns.
1 or 2, depends on the kind of modem. VDSL2+ modems might have three, though one of those is the line driver which is zero-delay for the purposes of a computer.
Modems/routers don't use polling nor do they need debouncing. The default USB polling rate is 125Hz, which is already 8ms of latency. Debouncing depends on the switch and is generally a couple milliseconds as well.
This is about adapting algorithms that deal with congestion of network packets to reduce congestion of a game loop (refreshing as fast as possible, but no faster).
The elephant in the room here is that you can pay to win in any game by buying a monitor with a higher refresh rate and a larger GPU that uses more electricity, giving you 2x more time to react.
Fortunately for us humans that seems to stop at 120Hz because most games can't even hold that at a steady rate with a 3090.
Now whether a 300+W gaming device is interesting in the long run will be answered this year by your electricity bill!
By that logic, every sports player that uses high quality gear is "paying to win". Is Nadal winning solely based on the quality of his racquet? Of course not. Would he play with a basic or low quality one? Absolutely not. What's wrong using the best gear possible?
I don't know where you got the notion that it stops at 120hz. It's been proven again and again that even with a low refresh rate monitor you still get a better experience by having more fps available, and more so when you have both the frames and the refresh rate in your monitor.
To be fair, it is a big factor in competitive cycling. Such that just getting good is not enough; you have to get good and buy the best equipment.
Not that big of a deal when you are all getting together in real life, since you can see if you were at a disadvantage due to equipment. Online, you don't get to see that.
There's nothing unique about competitive cycling in this regard. Take two tennis players of comparable skill and give one the best racket and the other a $20 one from Walmart. Or in football, do the same with two receivers and give one the latest Nike Vapor Jet gloves and have the other use no gloves. Or golf with clubs. I could go on and on.
Equipment quality will always have an impact in sports.
To an extent, I can't but agree. The number of components in cycling is staggering, though. And I have seen far more people effectively buying a few seconds time in cycling than I have elsewhere.
And at the high end of competitive play, I don't really see a problem. Even at a low level, I don't see a huge problem. But it does exist, which is why I offered this as a "to be fair."
Lots of multiplayer games intentionally implement low-vis, low-contrast environments (mud-colored players in mud-colored environments) which is why things like Digital Vibrance and "Black Enhancers" are so popular. Arguably the competitive advantage of those, tuned to the game [1], exceeds everything else once you've done the basics (120+ Hz, normal-acting hardware).
[1] In a particular game I discovered that abusing the R/G/B controls into giving you something that looks almost like one of those colorblind simulations in normal conditions would give you a massive advantage to the point of most players calling hacks.
> The elephant in the room here is that you can pay to win in any game by buying a monitor with higher reftesh rate and use a larger GPU that uses more electricity to have 2x more time to react.
I'll be honest, it sounds like you have no idea how competitive gaming works, or any sport at all. Your comment sounds exactly like thinking one can be better at football by buying more expensive boots.
I agree - this is not true at all. A really good competitive gamer is going to destroy even a very very good casual gamer even if you gave them a crap rig running at like 30FPS
Definitely there are real advantages to running higher framerates, but it's all diminishing returns. If your rig is fast enough to maintain CONSISTENT frames, that is probably more important than going for the absolute highest - there is a reason many set a minimum / maximum FPS target. Consistency is important.
Most of the really competitive games aren't all that hard to run these days either - even an older but still decent gaming computer can do great. Considering how cheap it is now to build something that will run most games well vs how it used to be, I think the pay-to-play aspect has actually REDUCED quite a bit. You can buy a quality mouse these days for $20-30 that would blow away what we had 10 years ago, and that mouse will probably last as long as you want.
Reminds me of an old racquetball tournament where they would put everyone in the same bracket (top players intermingled with the bottom), but depending on the player's skill they were given a massive handicap. If you were in the very top tier they gave you a racquet that had been strung in a very clever circular way so that there was a literal HOLE right in the middle where the sweet spot was!
I think the opening post was more lamenting that casual play will be made annoying by folks that pay to win. And... that feels likely?
Not really new, mind. LAN parties that had that one person who spent way more than everyone else were a thing. Did they automatically win? No. But they punched above their weight at the party.