Submitted by NotJoeMama727 t3_10pubxs in explainlikeimfive
Aevum1 t1_j6ma6aw wrote
Let's explain a few things.
First, what's a Hz? A hertz is one tick of a clock; it's the rhythm to which a process advances, the clock tick that fixes the rate at which it works.
Hz is measured as X per second, X being the number, so 60 Hz is something that happens 60 times a second, and your 5 GHz processor has a work rate of 5,000,000,000 clock "ticks" per second.
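To make that concrete, here's a tiny sketch (just using the numbers above) that converts a frequency in Hz into how long one "tick" lasts:

```python
# Frequency (Hz) is events per second, so one "tick" lasts 1/frequency seconds.
def tick_period_seconds(hz: float) -> float:
    return 1.0 / hz

print(tick_period_seconds(60))             # 60 Hz display: ~0.0167 s between refreshes
print(tick_period_seconds(5_000_000_000))  # 5 GHz CPU: 2e-10 s (0.2 ns) per clock tick
```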
To display video, a display works by showing still frames at a very fast rate: 25 fps in Europe and 30 fps in the US, mainly because the mains electricity is an AC current that provides a 50 or 60 Hz clock signal. This is a legacy component (a left-over) from when TV signals used the mains frequency as a clock signal.
The thing is that with digital video, every device has its own clock source (usually a quartz oscillator on the board), so there's no need for this now, and gaming-class video cards in particular can exceed the traditional output of 25, 30 or even 60 fps.
So you can use a higher refresh rate for a more immersive experience.
The problem is that the more work the GPU does, the more power it consumes, and the display has to redraw the whole screen for each frame, meaning resetting every pixel and recoloring it. If each redraw costs a fixed amount of energy, the panel draws roughly half as much redraw power at 30 fps as it does at 60 fps (backlighting not included, that's a constant power cost).
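As a rough sketch of that proportionality (the energy-per-redraw and backlight numbers below are made up purely for illustration):

```python
# Toy model: redraw power scales with refresh rate, backlight power is constant.
def panel_power_watts(refresh_hz: float, joules_per_redraw: float, backlight_watts: float) -> float:
    return refresh_hz * joules_per_redraw + backlight_watts

# Hypothetical numbers: ~0.017 J per redraw, a constant 3 W backlight.
print(panel_power_watts(30, 0.017, 3.0))  # 3.51 W
print(panel_power_watts(60, 0.017, 3.0))  # 4.02 W -- the redraw portion doubles, the backlight doesn't
```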
There's also interlaced video (that's why you see p or i next to resolutions). p is progressive scan, which draws the whole image every refresh, while interlaced draws only every other line of the screen each pass, but since it's so fast you don't notice.
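A minimal sketch of the difference, treating a frame as a plain list of scanlines (purely illustrative, not how a real display scaler works):

```python
# A "frame" here is just a list of scanlines, numbered top to bottom.
frame = [f"line {i}" for i in range(6)]

# Progressive (the "p" in 1080p): every scanline is sent on every refresh.
progressive_pass = frame

# Interlaced (the "i" in 1080i): alternate fields of even and odd scanlines.
even_field = frame[0::2]  # lines 0, 2, 4 on this pass...
odd_field = frame[1::2]   # ...lines 1, 3, 5 on the next pass

print(progressive_pass)  # whole image each refresh
print(even_field)
print(odd_field)
```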
And as a final note, there's V-sync.
V-sync, aka vertical sync, is a technology which locks the video card's frame output to the monitor's fixed refresh rate, so if the monitor has a 60 Hz refresh, it will not let the video card present more than 60 frames per second. More advanced technologies like G-Sync and FreeSync instead adapt the monitor's refresh to the video card's output, up to the monitor's maximum. Both exist mainly to avoid screen tearing, which is when the video card swaps in a new frame while the monitor is still drawing the previous one, so you see parts of two different frames on screen at once.
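To show the V-sync idea, here's a toy frame-pacing loop that never presents faster than the monitor's refresh interval; real drivers do this against the hardware's vertical blanking signal, this is just the concept:

```python
import time

REFRESH_HZ = 60
FRAME_INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms per refresh at 60 Hz

def render_frame() -> None:
    """Stand-in for the GPU rendering one frame."""
    pass

# With "V-sync on", frames are never presented faster than the monitor refreshes,
# so a GPU that could render 200 fps is still held to 60 presented frames per second.
for _ in range(5):
    start = time.monotonic()
    render_frame()
    elapsed = time.monotonic() - start
    if elapsed < FRAME_INTERVAL:
        time.sleep(FRAME_INTERVAL - elapsed)  # wait for the next refresh "tick"
```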
Bensemus t1_j6ntwih wrote
> but since it's so fast you don't notice.
Ho boy, is interlaced ever noticeable. When it first came out it was an upgrade, but now if people watch interlaced content they will really notice how poor moving images look.
NickyXIII t1_j6oqtl2 wrote
It's much less noticeable in motion on an analog interlaced display than on a modern digital progressive display. Interlaced is absolutely worse, but that is exactly why, once we could logistically make progressive happen, we did.