joefourier | 20 days ago

The reason the nearest neighbour interpolation can sound better is that the aliasing fills the higher frequencies of the audio with a mirror image of the lower frequencies. While humans are less sensitive to higher frequencies, you still expect them to be there, so some people prefer the "fake" detail from aliasing to it being outright missing in a more accurate sample interpolation.

It's basically doing an accidental and low-quality form of spectral band replication: https://en.wikipedia.org/wiki/Spectral_band_replication which is used in modern codecs.
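A quick numerical sketch of the effect described above (my own illustration, not from the thread): nearest-neighbour upsampling is a zero-order hold, i.e. each sample is simply repeated. Feeding a 1 kHz tone at 8 kHz through a 4x repeat and looking at the spectrum shows images of the tone mirrored around multiples of the original sample rate (7 kHz, 9 kHz, 15 kHz), which is the "fake" high-frequency detail being discussed.

```python
import numpy as np

sr = 8000        # original sample rate (Hz)
factor = 4       # upsample 4x -> 32 kHz output rate
f = 1000.0       # test tone (Hz), chosen to land exactly on an FFT bin
n = 8192

t = np.arange(n) / sr
x = np.sin(2 * np.pi * f * t)

# Nearest-neighbour (zero-order hold) "interpolation": repeat each sample.
y = np.repeat(x, factor)

# Spectrum of the upsampled signal.
spec = np.abs(np.fft.rfft(y))
freqs = np.fft.rfftfreq(len(y), d=1.0 / (sr * factor))

# Keep frequencies whose magnitude is at least 5% of the main peak.
# Besides the original 1 kHz tone, images appear at sr +/- f and 2*sr - f,
# attenuated by the sinc rolloff of the zero-order hold.
peaks = sorted(freqs[spec > spec.max() * 0.05])
print(peaks)  # 1000, 7000, 9000 and 15000 Hz
```

A proper band-limited interpolator would suppress everything above 4 kHz here; the repeat leaves those images in, which is the accidental low-quality "spectral band replication" the comment refers to.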

Sesse__|20 days ago

It's actually the other way round: Aliasing fills the lower frequencies with a mirror image of the higher frequencies. So where do the higher frequencies come from? From the upsampling that happens before the aliasing. _That_ makes the higher frequencies contain (non-mirrored!) copies of the lower frequencies. :-)

Sesse__|18 days ago

Just so that my wrongness isn't there for posterity: This is wrong for a real-valued signal (which is what we're discussing here). I had forgotten about the negative frequencies. So there _is_ a mirror coming from the upsampling. Sorry. :-)

joefourier|20 days ago

Oh yes, you're correct. Imaging would be the correct term for what's happening, I think (aliasing is high -> low and imaging is low -> high)?