The effect largely goes away if the frequency exceeds 4 kHz, as even the quickest saccades would see a blur rather than a flicker. Putting the flicker above 20 kHz would be ideal: that way it is well beyond the visual range, and any noise generated by the circuit would also be inaudible. The real question is why higher frequencies aren't used for driving LEDs. They have very low switching latency, so it seems like a no-brainer.
daniel_reetz|4 years ago
Consider that raising the PWM frequency doesn't automatically solve the problems you raise: 20 kHz PWM can still have audible harmonics, for example. In the mini-LED industry there are actually spread-spectrum/randomized PWM approaches to address this, and to help with EMI.
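To illustrate the idea (a minimal sketch, not any vendor's actual scheme): randomized-period PWM dithers each cycle's period around the nominal value, so the emitted energy spreads across a band of frequencies instead of concentrating at one tone and its harmonics. The function name, default 50 µs base period, and 10% jitter are all hypothetical choices for illustration.

```python
import random

def spread_spectrum_pwm(duty, base_period_us=50.0, jitter_frac=0.1, n_cycles=8):
    """Sketch of randomized-period PWM.

    Each cycle's period is dithered uniformly within +/- jitter_frac of
    the nominal base period; the on-time scales with the period, so the
    duty cycle of every individual cycle stays at the commanded value.
    Returns a list of (on_time_us, off_time_us) pairs.
    """
    cycles = []
    for _ in range(n_cycles):
        period = base_period_us * (1.0 + random.uniform(-jitter_frac, jitter_frac))
        on = duty * period
        cycles.append((on, period - on))
    return cycles

# The average duty over the whole train equals the commanded duty,
# even though no two periods are identical.
cycles = spread_spectrum_pwm(0.25)
avg_duty = sum(on for on, off in cycles) / sum(on + off for on, off in cycles)
```

Because the spectral energy is smeared rather than concentrated, both the audible tone (from magnetostriction or ceramic-capacitor "singing") and narrowband EMI peaks drop, at the cost of a slightly noisier instantaneous brightness.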
watersb|4 years ago
!! Great... now I have to consider TEMPEST shielding for all the displays; I had thought once CRTs went away, we'd be safe...
jiggawatts|4 years ago
I read somewhere that raising the PWM frequency makes it difficult to accurately control the brightness. With HDR content already requiring 10 bits of depth, anything that interferes with that is a problem, and future panels might require 12 bits or more.
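The tradeoff mentioned above can be made concrete with a back-of-the-envelope sketch. Assuming a simple counter-based PWM peripheral (the 100 MHz driver clock below is a hypothetical figure, not from any particular part), the counter must step through 2^bits levels per PWM period, so PWM frequency and duty-cycle resolution trade off directly:

```python
import math

def max_pwm_bits(clock_hz, pwm_hz):
    """Maximum whole bits of duty-cycle resolution for a counter-based
    PWM: the counter ticks at clock_hz and must count 2**bits steps per
    PWM period, so pwm_hz = clock_hz / 2**bits. Raising pwm_hz with a
    fixed clock costs resolution bits.
    """
    return int(math.log2(clock_hz / pwm_hz))

# With a hypothetical 100 MHz clock:
#   4 kHz PWM leaves room for 14 bits of duty resolution,
#   20 kHz PWM only 12 bits,
# so 10-bit HDR still fits at 20 kHz, but 12-bit panels would leave
# essentially no headroom for dithering or calibration.
```

This is why simply cranking the frequency up isn't free: at a fixed counter clock, every doubling of the PWM frequency costs one bit of brightness resolution.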