Because upsampling requires interpolation. I'm using distortion in the abstract/academic sense here.
Any signal passed through a non-ideal (or nontrivial, e.g. a gain) system will exhibit distortion: harmonic distortion is new energy appearing at frequencies not present in the input, frequency distortion is a change in the relative magnitudes of frequency components between input and output, and phase distortion is the same thing for phase.
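To make the harmonic-distortion case concrete, here's a quick numpy sketch (the tanh nonlinearity and all parameter values are arbitrary stand-ins, not anything from a real amp model): a pure sine through a memoryless nonlinearity gains energy at harmonics that simply weren't in the input.

```python
import numpy as np

fs, f0, n = 48_000, 1_000, 4800  # chosen so f0 lands exactly on an FFT bin
t = np.arange(n) / fs
x = 0.8 * np.sin(2 * np.pi * f0 * t)

# A memoryless nonlinearity (tanh, a common soft-clipper stand-in).
y = np.tanh(x)

# Windowed spectrum, normalized to the fundamental.
spec = np.abs(np.fft.rfft(y * np.hanning(n)))
spec /= spec.max()

def bin_of(f):
    return int(round(f * n / fs))

# New energy appears at odd harmonics (3*f0, 5*f0, ...) of the input tone.
h3 = 20 * np.log10(spec[bin_of(3 * f0)])
print(f"3rd harmonic level: {h3:.1f} dB relative to the fundamental")
```

The input spectrum has a single line at f0; the output has measurable lines at 3*f0, 5*f0, and so on, which is exactly the "new energy at different frequencies" sense of harmonic distortion.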
If you use a polynomial interpolator there will be harmonic distortion. If you use an interpolation filter there will be frequency distortion, which is highly correlated with the phase distortion.
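A quick numpy sketch of the polynomial case (linear interpolation is the first-order polynomial interpolator; all values here are arbitrary): 2x upsampling a pure sine by linear interpolation leaves attenuated-but-present image energy at frequencies the input never contained.

```python
import numpy as np

fs, f0, n = 48_000, 3_000, 4096
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)

# 2x upsample by linear interpolation (a first-order polynomial interpolator).
t_up = np.arange(2 * n) / (2 * fs)
x_lin = np.interp(t_up, t, x)

# Windowed spectrum at the new rate, normalized to the fundamental.
spec = np.abs(np.fft.rfft(x_lin * np.hanning(2 * n)))
spec /= spec.max()

# Mask out the fundamental's main lobe, then find the largest spurious
# component -- the image around the old sample rate (48k - 3k = 45 kHz)
# that linear interpolation only partially suppresses.
peak_bin = int(round(f0 * 2 * n / (2 * fs)))
mask = np.ones_like(spec, dtype=bool)
mask[peak_bin - 8 : peak_bin + 9] = False
spurious_db = 20 * np.log10(spec[mask].max())
print(f"largest spurious component: {spurious_db:.1f} dB re fundamental")
```

The input is a single tone at 3 kHz; after linear interpolation the output spectrum has a clearly measurable component near 45 kHz, i.e. new energy at a frequency not present in the input.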
What that distortion is, and whether it is perceptible, depends on your design and constraints. But the key element at play here is that oversampling for nonlinear system modeling, which deliberately introduces harmonic distortion after the interpolation, can amplify frequency (and therefore phase) distortion, so your constraints are tighter than they normally are.
If you want an example, take the ideal interpolator, which is a sinc function. Because a true sinc is non-causal you cannot implement it, so lossless interpolation is not achievable. There will always be some loss of information; the design problem is trading off that loss against the resources required. We can do pretty damn good upsampling today, however.
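A minimal sketch of that tradeoff (my own toy implementation, not from any particular library): truncate and window the ideal sinc to make it realizable, and the reconstruction error is small but nonzero, shrinking as you spend more taps.

```python
import numpy as np

def windowed_sinc_upsample_2x(x, taps_per_side=16):
    """2x upsample with a Hann-windowed truncated sinc kernel.

    The ideal sinc is infinite and non-causal; truncating and windowing
    makes it realizable at the cost of some passband/stopband error.
    More taps means closer to ideal, at higher cost.
    """
    # Kernel sampled at the output (2x) rate, cutoff at the input Nyquist.
    m = np.arange(-2 * taps_per_side, 2 * taps_per_side + 1)
    h = np.sinc(m / 2.0) * np.hanning(len(m) + 2)[1:-1]
    # Zero-stuff to the new rate, then filter.
    up = np.zeros(2 * len(x))
    up[::2] = x
    return np.convolve(up, h, mode="same")

fs, f0, n = 48_000, 1_000, 4096
x = np.sin(2 * np.pi * f0 * np.arange(n) / fs)
y = windowed_sinc_upsample_2x(x)

# Compare against the true signal sampled at the doubled rate
# (ignoring the edges, where the truncated kernel runs off the data).
ref = np.sin(2 * np.pi * f0 * np.arange(2 * n) / (2 * fs))
err = np.max(np.abs((y - ref)[200:-200]))
print(f"peak reconstruction error: {err:.2e}")
```

The error never reaches exactly zero for a finite kernel; that residual is the "loss of information" being traded against filter length.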
One thing that might work in our favor here is that for this particular application -- simulating a guitar amplifier -- we are only dealing with the subset of behaviors that are physically realizable in an electronic circuit.
unlinked_dll|6 years ago
analog31|6 years ago