Does anyone really need a 1,000 Hz gaming display? | Ars OpenForum

Does anyone really need a 1,000 Hz gaming display?


ERIFNOMI

Ars Tribunus Angusticlavius
11,864
Subscriptor++
Where's the blurbusters guy? He'll explain why this is so awesome.

Of course we need to be able to feed monitors with 1000fps. We'll get there.

E: I know it's mentioned in the article, but that guy goes into even more detail than that in the comments every time this discussion comes up. As someone who's susceptible to both "simulation sickness" and migraines when playing games, I'm glad someone is figuring this shit out for us.
 
Upvote
15 (28 / -13)

ERIFNOMI

Ars Tribunus Angusticlavius
11,864
Subscriptor++
Game runs at 30 fps. Meh.

This is cool, and like others said, if it makes producing the 200ish-Hz panels cheaper, then awesome.

But I don't feel like there are going to be meaningful real-world benefits for the average person from such a high refresh rate.
Your game might run at 30fps. Mine sure don't.
 
Upvote
57 (64 / -7)

keltor

Ars Praefectus
5,538
Subscriptor
If you ask Microsoft researchers they will tell you that a conservative assumption of required frame rate for life-like VR is 1800Hz.
And likely for physical reasons, the panels will have to be faster than this (by multiples). So maybe 3600 or 4800 Hz, and with 16,000 ppi pixel densities. We're a LOOONG way from that.
 
Upvote
-8 (5 / -13)
And likely for physical reasons, the panels will have to be faster than this (by multiples).
What physical reasons?
16000 ppi pixel densities
The referenced paper says (indirectly) that 6-10k should be fine. What makes you think 16k is required?
 
Upvote
22 (22 / 0)

ERIFNOMI

Ars Tribunus Angusticlavius
11,864
Subscriptor++
70 to 120 I can tell, but 120 to 144 is barely noticeable...
This is the problem with inverse measurements. What we're actually measuring is the time between frames, which approaches 0 as Hz approaches infinity, so the same improvement requires larger and larger increases in Hz. A 20 Hz jump from 60 to 80 is a much bigger improvement than a 20 Hz jump from 120 to 140.

70 Hz is ~14.3 ms frame to frame.
120 Hz is ~8.3 ms frame to frame.
144 Hz is ~6.9 ms frame to frame.
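The diminishing returns are easy to check numerically; a quick sketch (the rate pairs are just the examples above):

```python
# Frame time (ms) for a given refresh rate, and how much a +20 Hz
# bump actually saves at two different starting points.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

for lo, hi in [(60, 80), (120, 140)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} Hz saves {saved:.1f} ms per frame")
```

The same 20 Hz step shrinks from ~4.2 ms of savings to ~1.2 ms, which is why high-end refresh-rate bumps feel smaller and smaller.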
 
Upvote
109 (110 / -1)

Emon

Ars Praefectus
3,807
Subscriptor++
If you ask Microsoft researchers they will tell you that a conservative assumption of required frame rate for life-like VR is 1800Hz.
Yeah, people seriously underestimate our eyes' and brain's sensory and perceptive capabilities. People used to go into a fucking RAGE over the notion of "you can't see anything past 24 fps" just because movies were 24 fps. People have been regurgitating "your eye can't see past X" nonsense for a long time.

Movies only appear smooth because of the motion blur captured by exposure time. It's very lifelike and similar to our own perceived motion blur because it's basically just a function of exposure and time. As opposed to something like an LCD that creates a very different type of motion blur that does not look correct to our eyes.

Also, you can actually very much see the limitations of 24 fps if you carefully watch a fast moving object, or if the camera pans quickly. Take your eyes and move them "with" the motion of the object and the individual frames and the low temporal resolution become quite clear.

Or don't, because then you'll have a hard time unseeing it and it'll annoy you forever
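The pan example is easy to quantify; a rough sketch (the 30°/s pan speed is just an illustrative number):

```python
# Angular jump between frames during a camera pan. At low frame
# rates a moderate pan moves a large visual angle per frame, which
# smooth-pursuit eye tracking makes very obvious.
def degrees_per_frame(pan_deg_per_s: float, fps: float) -> float:
    return pan_deg_per_s / fps

print(degrees_per_frame(30.0, 24))    # jump per frame at 24 fps
print(degrees_per_frame(30.0, 1000))  # jump per frame at 1000 Hz
```

At 24 fps the image lurches 1.25° per frame; at 1000 Hz the same pan moves only 0.03° per frame.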
 
Upvote
92 (96 / -4)
Also, you can actually very much see the limitations of 24 fps if you carefully watch a fast moving object, or if the camera pans quickly. Take your eyes and move them "with" the motion of the object and the individual frames and the low temporal resolution become quite clear.
That's exactly the issue the researchers claim necessitates 1800 Hz.
 
Upvote
28 (29 / -1)

ERIFNOMI

Ars Tribunus Angusticlavius
11,864
Subscriptor++
As the saying goes, if the headline is a yes or no question, the answer is always, "No."

In this case that's based on the difference between want, and need. I expect a lot of people will "want" it. But none of them will realistically "need" it.
You didn't read the article, did you? It's not about "seeing" 1000 fps. It's about not inducing blur, which leads to "motion sickness" or "simulation sickness."
 
Upvote
32 (35 / -3)
They need to separate the frame duration from the frame update rate. The blurring they are showing is based on a frame persisting until the next frame. This is different than showing a frame for a short time, then another for a short time some time later.
It really depends on the kind of display. For sample-and-hold displays like what is being demonstrated, the image in question is perfectly fine.
 
Upvote
6 (6 / 0)

Kodiack

Seniorius Lurkius
21
Subscriptor++
One thing that I rarely see mentioned is that the need for VRR (e.g. G-SYNC/FreeSync) is reduced as refresh rates increase. At 1000 Hz, you've only got 1 ms between refreshes, where a "missed" frame is going to result in noticeably less stutter than a missed frame at 60 Hz (16.7 ms); 120 Hz (8.3 ms); or even 240 Hz (4.2 ms).

Effectively, 1000 Hz displays can kind of brute force VRR's job, but without the handful of issues that VRR can introduce. We won't see gamma curve-induced flickering at a fixed refresh rate, nor is there any need for low-framerate compensation. Black-frame insertion is far easier to implement with a fixed refresh rate as well.
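The "brute force" point can be put in numbers: on a fixed-rate display, a frame that misses its deadline persists for one extra refresh period, so the worst-case added judder is simply the refresh interval. A small sketch:

```python
# Worst-case stutter from a single missed frame on a fixed-rate
# display: the old frame stays up for one extra refresh period.
for hz in (60, 120, 240, 1000):
    print(f"{hz} Hz: a missed frame costs ~{1000 / hz:.1f} ms extra")
```

At 1000 Hz the penalty shrinks to ~1 ms, which is why a fixed ultra-high rate can approximate what VRR exists to fix.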
 
Upvote
34 (34 / 0)
Like anti aliasing is less necessary as resolution increases because the sheer number of pixels smooths the jagged edges on its own.
 
Upvote
37 (37 / 0)

ZenBeam

Ars Tribunus Militum
2,951
Subscriptor
It really depends on the kind of display. For sample-and-hold displays like what is being demonstrated, the image in question is perfectly fine.
Which is the distinction I was talking about. Sample and hold is the problem. Sample then off does not have that blurring, and does not require any more bandwidth.
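The sample-and-hold vs. sample-then-off distinction comes down to persistence: while the eye tracks a moving object, the perceived smear is roughly object speed times how long each frame stays lit. A rough sketch (the 1000 px/s speed is just an illustrative number):

```python
# Approximate perceived blur on a display while the eye tracks
# motion: blur (px) ~= speed (px/s) * frame persistence (s).
def blur_px(speed_px_per_s: float, hold_ms: float) -> float:
    return speed_px_per_s * hold_ms / 1000.0

print(blur_px(1000, 16.7))  # 60 Hz, full persistence (sample-and-hold)
print(blur_px(1000, 1.0))   # 1 ms persistence (strobed, or 1000 Hz)
```

Full-persistence 60 Hz smears ~16.7 px; a 1 ms flash (whether from strobing or a genuine 1000 Hz refresh) smears ~1 px, without needing any more bandwidth in the strobed case.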
 
Upvote
8 (8 / 0)

Kodiack

Seniorius Lurkius
21
Subscriptor++
Like anti aliasing is less necessary as resolution increases because the sheer number of pixels smooths the jagged edges on its own.
Exactly, that's a great comparison! Higher pixel density means lower need for high-quality anti-aliasing at a given viewing distance, and we're able to use faster/lower-quality AA at 4K than at 1080p, for example. Many TAA implementations can look fairly blurry in general, but can range from "tolerable" to "good" at 4K. However, at 1080p they may be excessively blurry in some cases.
 
Upvote
10 (10 / 0)
If you ask Microsoft researchers they will tell you that a conservative assumption of required frame rate for life-like VR is 1800Hz.
If you read into it deeper, they say that's a conservative assumption.
Yes, that is exactly what I said, no need for deeper reading, you could have just read my post, because I said that, in my post, the one you quoted, the one where I said "a conservative assumption".
 
Upvote
34 (36 / -2)

WozNZ

Wise, Aged Ars Veteran
197
The technical mastery, epic.

That said, this is still a totally pointless thing. As you jump from, say, 60 Hz -> 120 Hz -> 240 Hz, there are massively diminishing returns, and the hardware requirements to actually run against those limits jump in cost and power demands far beyond what is sane.

The only benefit will be how the cost of "lesser monitors" will fall.
 
Upvote
-14 (3 / -17)
There's a limited budget of GPU horsepower, that can either be spent on reaching ultra high frame rates, or spent on generating ultra photorealistic images and physics. We will always have to choose one of the two: I don't think we'll ever get both at the same time - at least, not within our lifetimes. For my part, I'd rather have ultra-realistic ray-traced volumetric graphics with super high-fidelity physics. I'll gladly live with motion blur in exchange.
 
Upvote
6 (10 / -4)

ERIFNOMI

Ars Tribunus Angusticlavius
11,864
Subscriptor++
There's a limited budget of GPU horsepower, that can either be spent on reaching ultra high frame rates, or spent on generating ultra photorealistic images and physics. We will always have to choose one of the two: I don't think we'll ever get both at the same time - at least, not within our lifetimes. For my part, I'd rather have ultra-realistic ray-traced volumetric graphics with super high-fidelity physics. I'll gladly live with motion blur in exchange.
It's going to be done with image generation, as the article mentioned.

People can complain about those not being "real frames," but if they're only on screen for literally a millisecond, you won't know they're "fake," and in return you get crisp, blur-free motion that doesn't make you throw up. Sign me up.
 
Upvote
17 (19 / -2)