The refresh rate, also known as the vertical refresh rate or vertical scan rate (terminology originating with cathode-ray tubes, or CRTs), is the number of times per second that a raster-based display device displays a new image. It is independent of frame rate, which describes how many images are stored or generated every second by the device driving the display. On CRT displays, higher refresh rates produce less flickering, thereby reducing eye strain. In other technologies, such as liquid-crystal displays, the refresh rate affects only how often the image can potentially be updated.[1]

Non-raster displays may not have a characteristic refresh rate. Vector displays, for instance, do not trace the entire screen, only the lines comprising the displayed image, so refresh speed may vary with the size and complexity of the image data.[2] For computer programs or telemetry, the term is sometimes applied to how frequently a datum is updated with a new external value from another source (for example, a shared public spreadsheet or hardware feed).

Physical factors

While all raster display devices have a characteristic refresh rate, the physical implementation differs between technologies.

Cathode-ray tubes

Electron beam in the process of refreshing an image on a CRT

Raster-scan CRTs by their nature must refresh the screen since their phosphors will fade and the image will disappear quickly unless refreshed regularly.

In a CRT, the vertical scan rate is the number of times per second that the electron beam returns to the upper left corner of the screen to begin drawing a new frame.[3] It is controlled by the vertical blanking signal generated by the video controller, and is partially limited by the monitor's maximum horizontal scan rate.

The refresh rate can be calculated from the horizontal scan rate by dividing the scanning frequency by the number of horizontal lines, with an allowance for the time the beam needs to return to the top; by convention, this overhead is a 1.05× multiplier.[4] For instance, a monitor with a horizontal scanning frequency of 96 kHz at a resolution of 1280 × 1024 yields a refresh rate of 96,000 ÷ (1024 × 1.05) ≈ 89 Hz (rounded down).
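As an illustrative sketch, the calculation above can be written out in code (the function name and parameters here are hypothetical, not any standard API):

```python
def vertical_refresh_hz(h_scan_hz: float, v_lines: int, overhead: float = 1.05) -> int:
    """Approximate a CRT's vertical refresh rate (rounded down, as in the text).

    h_scan_hz -- horizontal scanning frequency in Hz (e.g. 96_000 for 96 kHz)
    v_lines   -- vertical resolution (number of horizontal lines)
    overhead  -- conventional 1.05x allowance for vertical retrace time
    """
    return int(h_scan_hz / (v_lines * overhead))

# The example from the text: 96 kHz horizontal scan at 1280 x 1024
print(vertical_refresh_hz(96_000, 1024))  # 89
```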

CRT refresh rates have historically been an important factor in video game programming. On early video game systems, the only time available for computation was the vertical blanking interval, during which the beam returns to the upper left corner of the screen and no image is being drawn.[5] Even in modern games, it remains important to avoid altering the computer's video buffer except during the vertical retrace, to prevent flickering graphics or screen tearing.

Liquid-crystal displays

Unlike CRTs, where the image fades unless refreshed, the pixels of a liquid-crystal display retain their state for as long as power is provided, so there is no intrinsic flicker regardless of refresh rate. However, the refresh rate still determines the highest frame rate that can be displayed, and although the screen is never actually blanked, the vertical blanking interval remains a period in each refresh cycle when the screen is not being updated, during which the image data in the host system's frame buffer can be changed. Vsync options eliminate screen tearing by restricting frame buffer updates to this interval, so that each displayed frame is drawn whole.

Computer displays

A video of a CPU fan rotating at 0, 300 and 1300 revolutions per minute, recorded at 60 frames per second

On smaller CRT monitors (up to about 15 in or 38 cm), few people notice any discomfort in the 60–72 Hz range. On larger CRT monitors (17 in or 43 cm or larger), most people experience mild discomfort unless the refresh rate is set to 72 Hz or higher. A rate of 100 Hz is comfortable at almost any size. However, this does not apply to LCD monitors. The closest equivalent to a refresh rate on an LCD monitor is its frame rate, which is often locked at 60 fps. But this is rarely a problem, because the only part of an LCD monitor that could produce CRT-like flicker, its backlight, typically operates at a minimum of around 200 Hz.

Different operating systems set the default refresh rate differently. Microsoft Windows 95 and Windows 98 (First and Second Editions) set the refresh rate to the highest rate they believe the display supports. Windows NT-based operating systems, such as Windows 2000 and its descendants Windows XP, Windows Vista, and Windows 7, set the default refresh rate to a conservative value, usually 60 Hz. Some fullscreen applications, including many games, allow the user to reconfigure the refresh rate before entering fullscreen mode, but most default to a conservative resolution and refresh rate and allow the user to increase the settings afterwards in the options.[citation needed]

Old monitors could be damaged if a user set the video card to a refresh rate higher than the highest rate the monitor supported. Some monitors instead display a notice that the video signal uses an unsupported refresh rate.

Dynamic refresh rate

Some LCDs support adapting their refresh rate to the current frame rate delivered by the graphics card. Two technologies that allow this are FreeSync and G-Sync.

Stereo displays

When LCD shutter glasses are used for stereo 3D displays, the effective refresh rate is halved, because each eye needs a separate picture. For this reason, a display capable of at least 120 Hz is usually recommended, since half of that rate still gives each eye 60 Hz. Higher refresh rates improve image stability further: 144 Hz stereo corresponds to 72 Hz non-stereo, and 180 Hz stereo to 90 Hz non-stereo. Most low-end computer graphics cards and monitors cannot handle these high refresh rates, especially at higher resolutions.

On LCD monitors, pixel brightness changes are much slower than those of CRT or plasma phosphors. Typically, LCD pixel brightness changes faster when voltage is applied than when it is removed, resulting in an asymmetric pixel response time. With 3D shutter glasses, this can result in blurry smearing of the display and poor depth perception, because the previous image frame does not fade to black fast enough as the next frame is drawn.[citation needed]

Televisions

Thumb
This GIF animation shows a rudimentary comparison of how motion varies at 4 Hz, 12 Hz, and 24 Hz refresh rates. The entire sequence has a frame rate of 24 Hz.[6]

The development of television in the 1930s was shaped by a number of technical limitations. The AC power line frequency was used for the vertical refresh rate for two reasons. The first was that the television's vacuum tubes were susceptible to interference from the unit's power supply, including residual ripple, which could cause drifting horizontal bars (hum bars). Using the same frequency reduced this effect and made any remaining interference stationary on the screen, and therefore less obtrusive. The second was that television studios used AC lamps, and filming at a different frequency would cause strobing.[7][8][9] Producers thus had little choice but to run sets at 60 Hz in America and 50 Hz in Europe. These rates formed the basis for the systems used today: 60 Hz System M (almost always used with NTSC color coding) and 50 Hz System B/G (almost always used with PAL or SECAM color coding). This accident of history gave European sets higher resolution in exchange for lower frame rates: compare System M (704 × 480 at 30i) with System B/G (704 × 576 at 25i). However, the lower refresh rate of 50 Hz introduces more flicker, so sets that use digital technology to double the refresh rate to 100 Hz are now very popular. (See Broadcast television systems.)

Another difference between 50 Hz and 60 Hz standards is the way motion pictures (film sources, as opposed to video camera sources) are transferred or presented. 35 mm film is typically shot at 24 frames per second (fps). For 50 Hz PAL, film sources can be transferred simply by accelerating the film by 4%. The resulting picture is smooth; however, there is a small upward shift in the pitch of the audio. NTSC sets display both 24 fps and 25 fps material without any speed shifting by using a technique called 3:2 pulldown, but at the expense of introducing unsmooth playback in the form of telecine judder.
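A quick calculation, as a sketch, shows why the roughly 4% PAL speed-up also shifts the audio pitch (the variable names are illustrative):

```python
import math

FILM_FPS = 24.0   # typical 35 mm film rate
PAL_FPS = 25.0    # PAL frame rate (50 Hz interlaced)

# Playing 24 fps film at 25 fps speeds everything up by 25/24 - 1, about 4.2%
speedup = PAL_FPS / FILM_FPS - 1.0

# Unless corrected, the audio pitch rises by the same ratio; in musical terms
# that is 12 * log2(25/24), roughly 0.71 of a semitone
pitch_shift_semitones = 12 * math.log2(PAL_FPS / FILM_FPS)

print(f"speed-up: {speedup:.1%}")                        # 4.2%
print(f"pitch shift: {pitch_shift_semitones:.2f} semitones")
```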

Similar to some computer monitors and some DVDs, analog television systems use interlacing, which decreases the apparent flicker by painting first the odd lines and then the even lines (these are known as fields). This doubles the refresh rate compared to a progressive-scan image at the same frame rate. Interlacing works well for video cameras, where each field results from a separate exposure: the effective exposure rate doubles, from 25 to 50 per second. The dynamics of a CRT are ideally suited to this approach: fast scenes benefit from the 50 Hz refresh, since the earlier field has largely decayed away by the time the new field is written, while static images benefit from improved resolution, as both fields are integrated by the eye. Modern CRT-based televisions may be made flicker-free in the form of 100 Hz technology.
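The odd/even field split described above can be sketched in a few lines (a toy example, not broadcast code):

```python
# Split a progressive frame (a list of scan lines) into two interlaced fields.
frame = [f"line {i}" for i in range(1, 7)]  # a toy six-line frame, lines 1-6

odd_field = frame[0::2]   # lines 1, 3, 5 -- painted first
even_field = frame[1::2]  # lines 2, 4, 6 -- painted on the next field

print(odd_field)   # ['line 1', 'line 3', 'line 5']
print(even_field)  # ['line 2', 'line 4', 'line 6']
```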

Many high-end LCD televisions now have a 120 or 240 Hz (current and former NTSC countries) or 100 or 200 Hz (PAL/SECAM countries) refresh rate. The rate of 120 was chosen as the least common multiple of 24 fps (cinema) and 30 fps (NTSC TV), and allows for less distortion when movies are viewed due to the elimination of telecine (3:2 pulldown). For PAL at 25 fps, 100 or 200 Hz is used as a fractional compromise of the least common multiple of 600 (24 × 25). These higher refresh rates are most effective from a 24p-source video output (e.g. Blu-ray Disc), and/or scenes of fast motion.[10]
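The least-common-multiple reasoning above can be checked directly (a minimal sketch using Python's standard library):

```python
from math import lcm  # available in Python 3.9+

# 120 Hz is the least common multiple of 24 fps (cinema) and 30 fps (NTSC TV),
# so both sources divide evenly into it and 3:2 pulldown is unnecessary.
print(lcm(24, 30))  # 120

# For 24 fps film and 25 fps PAL the LCM is 600; 100 Hz and 200 Hz sets are
# fractional compromises of that figure.
print(lcm(24, 25))  # 600
```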

Displaying movie content on a TV

As movies are usually filmed at a rate of 24 frames per second, while television sets operate at different rates, some conversion is necessary. Different techniques exist to give the viewer an optimal experience.

The combination of content production, playback device, and display device processing may also introduce unnecessary artifacts. A display device producing a fixed 60 fps rate cannot display a 24 fps movie at an even, judder-free rate; usually a 3:2 pulldown is used, giving slightly uneven movement.

While common multisync CRT computer monitors have been capable of running at even multiples of 24 Hz since the early 1990s, recent "120 Hz" LCDs have been produced for the purpose of having smoother, more fluid motion, depending upon the source material, and any subsequent processing done to the signal. In the case of material shot on video, improvements in smoothness just from having a higher refresh rate may be barely noticeable.[11]

In the case of filmed material, as 120 is an even multiple of 24, it is possible to present a 24 fps sequence without judder on a well-designed 120 Hz display (i.e., so-called 5-5 pulldown). If the 120 Hz rate is produced by frame-doubling a 60 fps 3:2 pulldown signal, the uneven motion could still be visible (i.e., so-called 6-4 pulldown).
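The pulldown cadences mentioned here (3:2, 5-5, and 6-4) can be sketched as simple frame-repetition patterns (the `pulldown` helper is illustrative, not a real API):

```python
def pulldown(frames, pattern):
    """Repeat each source frame per the given cadence, cycling the pattern.

    pattern (3, 2): classic 3:2 pulldown, 24 fps -> 60 Hz (uneven)
    pattern (5,):   5-5 pulldown, 24 fps -> 120 Hz (even, judder-free)
    pattern (6, 4): frame-doubled 3:2, i.e. 6-4 pulldown -> 120 Hz (still uneven)
    """
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * pattern[i % len(pattern)])
    return out

film = ["A", "B", "C", "D"]  # four consecutive 24 fps film frames

print(pulldown(film, (3, 2)))  # 10 display frames: A A A B B C C C D D
print(pulldown(film, (5,)))    # 20 display frames, every frame shown equally long
print(pulldown(film, (6, 4)))  # 20 display frames, but A lingers longer than B
```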

Additionally, displays with motion interpolation can present material with synthetically created smoothness; this has an even larger effect on filmed material.

"50 Hz" TV sets (when fed with "50 Hz" content) usually play movies slightly faster than normal, avoiding any problems with uneven pulldown.

See also

References
