TCL's ultra-fast 4K LCD prototype has us musing about diminishing smoothness returns.
See full article...
> I know it's almost certainly massive overkill, but from a purely technical standpoint it's still awesome to see this kind of progress.

If it means that "lesser" rate monitors (144, 240, etc.) go down in price - then absolutely agree.
> I know it's almost certainly massive overkill, but from a purely technical standpoint it's still awesome to see this kind of progress.

If you ask Microsoft researchers they will tell you that a conservative assumption of required frame rate for life-like VR is 1800Hz.
> Game runs at 30 fps. Meh.

Your game might run at 30 fps. Mine sure don't.
This is cool, and like others said, if it makes producing the 200ish Hz panels cheaper, then awesome.
But I don't feel like there are going to be meaningful real-world benefits for the average person from such a high refresh rate.
> Your game might run at 30 fps. Mine sure don't.

Just making a console game joke.
> If you ask Microsoft researchers they will tell you that a conservative assumption of required frame rate for life-like VR is 1800Hz.

And likely, for physical reasons, the panels will have to be faster than this (by multiples). So maybe 3600 or 4800 Hz, and with 16000 ppi pixel densities. We're a LOOONG way from that.
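Rough arithmetic on the raw bandwidth such panels would need, assuming uncompressed 24-bit RGB (the resolutions and rates are just the ones from this comment; real links add blanking and use compression):

```python
# Back-of-the-envelope uncompressed video bandwidth for the rates above
# (assumes 24 bits per pixel, no blanking intervals, no compression).
def bandwidth_gbit_s(width: int, height: int, hz: int, bpp: int = 24) -> float:
    return width * height * bpp * hz / 1e9

print(f"4K @ 1000 Hz: {bandwidth_gbit_s(3840, 2160, 1000):.0f} Gbit/s")  # ~199
print(f"8K @ 4800 Hz: {bandwidth_gbit_s(7680, 4320, 4800):.0f} Gbit/s")  # ~3822
```

For scale, DisplayPort 2.1's fastest link mode carries 80 Gbit/s, so even the 4K/1000 Hz case already needs heavy compression. A long way indeed.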
> And likely, for physical reasons, the panels will have to be faster than this (by multiples).

What physical reasons?
> 16000 ppi pixel densities

The referenced paper says (indirectly) that 6-10k should be fine. What makes you think 16k is required?
> 70 to 120 I can tell, but 120 to 144 is barely noticeable...

This is the problem with inverse measurements. What we're actually measuring is the time between frames, which approaches 0 as Hz approaches infinity, so the same improvement requires larger and larger increases. A 20 Hz jump from 60 to 80 is a much bigger improvement than a 20 Hz jump from 120 to 140.
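Put in frame-time terms, a quick sketch:

```python
# Equal refresh-rate jumps shrink the frame interval by very different amounts.
def frame_ms(hz: float) -> float:
    return 1000.0 / hz

for lo, hi in [(60, 80), (120, 140), (120, 144)]:
    print(f"{lo} -> {hi} Hz: {frame_ms(lo):.2f} ms -> {frame_ms(hi):.2f} ms, "
          f"saving {frame_ms(lo) - frame_ms(hi):.2f} ms per frame")
```

The 60-to-80 jump cuts over 4 ms per frame; 120-to-144 cuts barely 1.4 ms, which lines up with one being obvious and the other barely noticeable.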
> If you ask Microsoft researchers they will tell you that a conservative assumption of required frame rate for life-like VR is 1800Hz.

Yeah, people seriously underestimate our eyes' and brain's sensory and perceptive capabilities. People used to go into a fucking RAGE over the notion of "you can't see anything past 24 fps" just because movies were 24 fps. People have been regurgitating "your eye can't see past X" nonsense for a long time.
Movies only appear smooth because of the motion blur captured by exposure time. It's very lifelike and similar to our own perceived motion blur because it's basically just a function of exposure and time. As opposed to something like an LCD that creates a very different type of motion blur that does not look correct to our eyes.
Also, you can actually very much see the limitations of 24 fps if you carefully watch a fast moving object, or if the camera pans quickly. Take your eyes and move them "with" the motion of the object, and the individual frames and the low temporal resolution become quite clear.
Or don't, because then you'll have a hard time unseeing it and it'll annoy you forever
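To put rough numbers on both effects (the pan speed is just an illustrative assumption):

```python
# A panning object at 24 fps: how far it jumps between frames, and how
# much a film camera's exposure smears it within each frame.
fps = 24
pan_speed_px_s = 960     # assumption: object crosses a 1920 px frame in 2 s
shutter_fraction = 0.5   # classic 180-degree shutter -> 1/48 s exposure

step_px = pan_speed_px_s / fps                     # jump from frame to frame
blur_px = pan_speed_px_s * shutter_fraction / fps  # smear within one exposure

print(f"Per-frame jump: {step_px:.0f} px")  # 40 px steps
print(f"In-frame blur:  {blur_px:.0f} px")  # 20 px of smear masking the steps
```

When your eye stays still, the 20 px of smear reads as natural motion blur; when your eye tracks the object, the 40 px steps are what you notice.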
> Also, you can actually very much see the limitations of 24 fps if you carefully watch a fast moving object, or if the camera pans quickly. Take your eyes and move them "with" the motion of the object, and the individual frames and the low temporal resolution become quite clear.

That's exactly the issue the researchers claim necessitates 1800 Hz.
> As the saying goes, if the headline is a yes or no question, the answer is always, "No."

You didn't read the article, did you? It's not about "seeing" 1000 fps. It's about not inducing blur. Blur that leads to "motion sickness" or "simulation sickness."
In this case that's based on the difference between want and need. I expect a lot of people will "want" it. But none of them will realistically "need" it.
> They need to separate the frame duration from the frame update rate. The blurring they are showing is based on a frame persisting until the next frame. This is different than showing a frame for a short time, then another for a short time some time later.

It really depends on the kind of display. For sample-and-hold displays like what is being demonstrated, the image in question is perfectly fine.
> One thing that I rarely see mentioned is that the need for VRR (e.g. G-SYNC/FreeSync) is reduced as refresh rates increase. At 1000 Hz, you've only got 1 ms between refreshes, where a "missed" frame is going to result in noticeably less stutter than a missed frame at 60 Hz (16.7 ms); 120 Hz (8.3 ms); or even 240 Hz (4.2 ms).
>
> Effectively, 1000 Hz displays can kind of brute force VRR's job, but without the handful of issues that VRR can introduce. We won't see gamma curve-induced flickering at a fixed refresh rate, nor is there any need for low-framerate compensation. Black-frame insertion is far easier to implement with a fixed refresh rate as well.

Like anti aliasing is less necessary as resolution increases, because the sheer number of pixels smooths the jagged edges on its own.
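The quoted numbers check out; here's the worst-case hitch when a frame misses its refresh window at a fixed rate:

```python
# When a frame misses its refresh window on a fixed-rate display, the
# previous frame persists for (at least) one extra refresh interval.
for hz in (60, 120, 240, 1000):
    interval_ms = 1000.0 / hz
    print(f"{hz:>4} Hz: a missed frame adds up to {interval_ms:.1f} ms of stutter")
```

At 1000 Hz the worst case is 1 ms, small enough that brute force really can stand in for VRR.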
finally! something I can run solitaire on in all its glory!
> It really depends on the kind of display. For sample-and-hold displays like what is being demonstrated, the image in question is perfectly fine.

Which is the distinction I was talking about. Sample-and-hold is the problem. Sample-then-off does not have that blurring, and does not require any more bandwidth.
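The difference is easy to quantify: assuming your eye tracks a moving object, the perceived smear is roughly tracking speed times how long each frame stays lit (the tracking speed here is an assumption for illustration):

```python
# Perceived blur on an eye-tracked object ~= tracking speed x persistence,
# i.e. how long each frame stays on screen.
track_speed_px_s = 1000  # assumed eye-tracking speed across the screen

cases = [
    ("60 Hz sample-and-hold",   1 / 60),    # frame lit until the next one
    ("1000 Hz sample-and-hold", 1 / 1000),  # brute force: 1 ms persistence
    ("60 Hz strobed, 1 ms flash", 0.001),   # sample-then-off at low refresh
]
for label, persistence_s in cases:
    print(f"{label}: ~{track_speed_px_s * persistence_s:.1f} px of smear")
```

Which is the bandwidth point: a 1 ms strobe at 60 Hz gives the same ~1 px smear as 1000 Hz sample-and-hold, without sending ~16x the frames.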
> Which is the distinction I was talking about. Sample-and-hold is the problem. Sample-then-off does not have that blurring, and does not require any more bandwidth.

Sample-then-off still has plenty of its own issues.
"Sample then off still has plenty of its own issues, not a single one of which I can be bothered to mention."Sample then off still has plenty of its own issues.
> Like anti aliasing is less necessary as resolution increases, because the sheer number of pixels smooths the jagged edges on its own.

Exactly, that's a great comparison! Higher pixel density means lower need for high-quality anti-aliasing at a given viewing distance, and we're able to use faster/lower-quality AA at 4K than at 1080p, for example. Many TAA implementations can look fairly blurry in general, but can range from "tolerable" to "good" at 4K. However, at 1080p they may be excessively blurry in some cases.
> > If you ask Microsoft researchers they will tell you that a conservative assumption of required frame rate for life-like VR is 1800Hz.
>
> If you read into it deeper, they say that's a conservative assumption.

Yes, that is exactly what I said, no need for deeper reading, you could have just read my post, because I said that, in my post, the one you quoted, the one where I said "a conservative assumption".
> There's a limited budget of GPU horsepower that can either be spent on reaching ultra-high frame rates, or spent on generating ultra-photorealistic images and physics. We will always have to choose one of the two: I don't think we'll ever get both at the same time - at least, not within our lifetimes. For my part, I'd rather have ultra-realistic ray-traced volumetric graphics with super high-fidelity physics. I'll gladly live with motion blur in exchange.

It's going to be done with image generation, as the article mentioned.
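The render-budget arithmetic if frame generation carries the load (the generation factor is just an assumption for illustration):

```python
# With frame generation, the GPU fully renders only a fraction of the
# displayed frames and synthesizes the rest.
target_hz = 1000
gen_factor = 4  # assumption: 1 rendered frame per 3 generated ones

rendered_fps = target_hz / gen_factor
print(f"{target_hz} Hz output at {gen_factor}x generation -> "
      f"only {rendered_fps:.0f} fully rendered fps")  # 250 fps
```

That leaves most of the GPU budget free for image quality, which is exactly the trade-off being debated here.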