ayane_m | 4 years ago

The effect largely goes away once the frequency exceeds 4 kHz, as even the quickest saccades would see a blur rather than a flicker. Putting the flicker above 20 kHz would be ideal: that is well beyond the visible range, and any noise generated by the circuit would be inaudible too. The real question is why higher frequencies aren't used for driving LEDs. They have very low cycling latency, so it's a no-brainer.
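
As a rough sanity check on the "very low cycling latency" point, here is a back-of-the-envelope comparison (the ~100 ns LED rise time is an assumed order of magnitude, not a datasheet value):

    #include <stdio.h>

    /* Compare the PWM period at candidate flicker frequencies against a
       typical LED's optical rise time (assumed ~100 ns; real parts vary). */
    int main(void) {
        const double freqs_hz[] = {100.0, 4000.0, 20000.0};
        const double led_rise_s = 100e-9;   /* assumed order of magnitude */
        for (int i = 0; i < 3; i++) {
            double period_s = 1.0 / freqs_hz[i];
            printf("%7.0f Hz -> period %8.1f us, ~%.0fx the assumed rise time\n",
                   freqs_hz[i], period_s * 1e6, period_s / led_rise_s);
        }
        return 0;
    }

Even at 20 kHz the period is still hundreds of times longer than the LED's own switching time, so whatever the limit is, it isn't the emitter itself.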

daniel_reetz | 4 years ago

In the mini-LED industry the clocks are often much higher. That said, you are limited by scan (how many LEDs are driven per driver chip), because the clock is divided among the driven LEDs.

Consider that raising the PWM frequency doesn't automatically solve the problems you raise. 20 kHz PWM can still have audible harmonics, for example. In the mini-LED industry there are actually spread-spectrum/randomized PWM approaches to address this, and to help with EMI.
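
For illustration, a minimal sketch of the period-dithering idea (not any vendor's actual algorithm; the 20 kHz nominal, the +/-10% jitter range, and the timer interface are all assumptions):

    #include <stdint.h>
    #include <stdlib.h>

    /* Spread-spectrum PWM sketch: keep the duty ratio fixed, but jitter the
       period around a nominal value so the acoustic/EMI energy is spread
       over a band rather than concentrated at one tone. */
    typedef struct { uint32_t period_ticks; uint32_t on_ticks; } pwm_cycle;

    pwm_cycle next_cycle(uint32_t nominal_ticks, uint32_t duty_q16) {
        /* random offset in roughly [-10%, +10%] of the nominal period */
        int32_t jitter = (int32_t)(rand() % (nominal_ticks / 5 + 1))
                         - (int32_t)(nominal_ticks / 10);
        pwm_cycle c;
        c.period_ticks = nominal_ticks + jitter;
        /* scale on-time to the new period so average brightness is unchanged */
        c.on_ticks = (uint32_t)(((uint64_t)c.period_ticks * duty_q16) >> 16);
        return c;
    }

Each cycle the timer would be reloaded with period_ticks and on_ticks; because the duty ratio is preserved, the average light output stays the same while the tone/EMI energy is smeared across a band instead of sitting at a single frequency.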

watersb | 4 years ago

> 20 kHz PWM can still have audible harmonics... there are actually spread-spectrum/randomized PWM approaches to address this, and to help with EMI.

!! Great... now I have to consider TEMPEST shielding for all the displays; I had thought once CRTs went away, we'd be safe...

jiggawatts | 4 years ago

> They have very low cycling latency, so it's a no-brainer.

I read somewhere that running the PWM much faster makes it difficult to control the brightness accurately. With HDR content already requiring 10 bits, anything that interferes with that is a problem, and future panels might require 12 bits or more.
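
For a sense of the numbers, assuming plain counter-based PWM where every brightness step costs one counter tick (real drivers use tricks such as scrambled PWM, and the 16-way scan figure is just an example):

    /* Counter-based PWM needs one counter tick per brightness step, so
       required clock ~= pwm_freq * 2^bits * scan_lines.
       20 kHz * 2^10        ~=  20.5 MHz   (10-bit, one LED per output)
       20 kHz * 2^12        ~=  81.9 MHz   (12-bit)
       20 kHz * 2^12 * 16   ~=   1.3 GHz   (12-bit with 16-way scan) */
    double required_clock_hz(double pwm_hz, int bits, int scan_lines) {
        return pwm_hz * (double)(1ULL << bits) * (double)scan_lines;
    }

So PWM frequency, grayscale depth, and scan count all compete for the same counter clock, which is one concrete reason the flicker isn't simply pushed well above audibility.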