On 07/01/2021 07:40, Ahem A Rivet's Shot wrote:
> No, but 50Hz interlaced monitors (stripped TVs really) are and
> removing the interlace was considered enough to make it flicker free and
> "professional" circa 1980 - I sometimes turned the interlace back on and
> pushed the resolution up. In more recent times 60Hz was said to be needed
> for flicker free and now it's 70Hz I'm seeing and high end TVs are currently
> advertising 120Hz.
Several different issues there.
Old CRT TVs worked with an interlaced signal, so they needed
long-persistence phosphors so that each field of alternating scan lines
would still be visible when the next field was displayed. UK PAL TVs
were set up for 50Hz fields, so had longer-persistence phosphors than
US NTSC sets at 60Hz.
CRT monitors (generally) worked with non-interlaced signals, so had
low-persistence phosphors, as all the scan lines were updated every
frame. Monitors were invariably designed for the US market, so had
phosphor persistence suited to a minimum 60Hz frame rate, and would
look terrible if driven at 50Hz.
As monitor resolutions and sizes increased, frame rates also rose to
70Hz or 75Hz to reduce flicker and the smearing of screen updates.
Now, with LCD, LCD/LED and OLED technologies, the screen does not
flicker at the update rate. Panels used to have a 60Hz update rate to
match US content (but would be fine for 50Hz), but active 3D drove the
adoption of 120Hz or greater panels, so that the alternating left and
right views could each be shown at 60Hz.
3D has largely been dropped now, but 120Hz or greater panels still
exist, so they can either be driven at a higher frame rate by games
consoles, or have additional intermediate frames generated for lower
frame rate content, in both cases to make animation smoother.
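The arithmetic behind the 120Hz figure is simple enough; a minimal
Python sketch (the repeat counts are just 120 divided by the source
frame rate):

# Common source frame rates divide evenly into 120Hz, so each source
# frame can be shown a whole number of times. Motion interpolation
# replaces some of those repeats with synthesised in-between frames;
# a games console can instead drive the panel at the full 120fps.
PANEL_HZ = 120
for fps in (24, 30, 60, 120):
    repeats = PANEL_HZ // fps
    print(f"{fps:3}fps content: each frame shown {repeats}x "
          f"on a {PANEL_HZ}Hz panel")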
---druck