• TAG@lemmy.world · 1 year ago

    Serious question, why not? Do they produce some harmful flicker or something?

    • PapstJL4U@lemmy.world · 1 year ago

      Burn-in with static UI elements. The idea is that regularly showing very different images reduces the risk.

    • Some screens have pixel shifting to mitigate this. Also, burn-in doesn’t happen as quickly as you might think.

        Now, you wouldn’t want to go using an OLED for a billboard.
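
        Pixel shifting as shipped in real TVs is firmware-level, but the idea can be sketched in a few lines. This is a hypothetical illustration (the function name, `interval`, and `max_shift` are made-up parameters, not any vendor's API): every so many frames, nudge the whole static UI layer by a few pixels so no subpixel shows the same static content indefinitely.

        ```python
        import random

        def pixel_shift_offset(frame_count, interval=600, max_shift=3):
            """Return an (x, y) offset for static UI elements.

            Every `interval` frames, shift the UI layer by up to
            `max_shift` pixels on each axis. Seeding from the shift
            period keeps the offset stable between shifts and moves
            all elements together.
            """
            period = frame_count // interval
            rng = random.Random(period)  # deterministic per period
            return (rng.randint(-max_shift, max_shift),
                    rng.randint(-max_shift, max_shift))
        ```

        The renderer would add this offset to the UI layer's position each frame; because the offset only changes once per period, the shift is imperceptible in normal viewing.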

        • Refurbished Refurbisher@lemmy.sdf.org · 1 year ago (edited)

          That’s actually a different issue. With CRTs and plasma, burn-in would literally burn an image into the screen’s phosphors.

          OLED pixels slowly wear out with continued use, usually starting with the blue subpixels. The result looks similar but isn’t quite the same (for example, the image persistence is only visible while the screen is displaying an image).

          LCDs have neither issue, but sometimes the crystals can get stuck in a specific orientation, leading to (usually temporary) image retention. It gets worse in very cold weather, but it’s rare.

        • BolexForSoup@kbin.social · 1 year ago

          It would take thousands of hours with literally no image movement or change for a CRT or LCD to get burn-in. Yes, technically every TV or monitor is capable of getting it, but in practice they really don’t. That’s why the only examples of CRT burn-in you generally find are on hospital monitors and the like.