taneq|3 years ago
Thanks! Even this still defines noise only in terms of "interference", and it takes a bit of digging to infer that it's something to do with timing and interrupts, so it's probably jitter in execution time rather than RF interference or electrical noise. I'm no kernel hacker, but I've spent a lot of time close to the hardware and I've never heard jitter referred to as "noise".

javier_e06|3 years ago
I agree. In statistical analysis or signal processing, noise is described as any energy/data present in a measurement other than the energy/data being observed. I suppose any data-collecting system can call "noise" whatever it filters out of the quantity being measured.

gorkish|3 years ago
Windows folks may be familiar with LatencyMon or DPC Latency Checker, which do something similar on Windows to find misbehaving hardware or drivers that can cause problems for real-time/pseudo-real-time applications like AV capture/mixing.

jmclnx|3 years ago
I read this, but I don't get the point of osnoise. I was thinking it would be used for better random number generation, but the doc does not seem to confirm that.

monocasa|3 years ago
It's about measuring jitter caused by the OS. When you have a large message-passing system like you see in HPC environments, delays in processing exactly the right messages add up to huge losses of utilization because of the dependency graph of the work to be done. In the past I've even seen national labs write their own Linux-syscall-compatible kernels to combat jitter. For example: https://github.com/HobbesOSR/kitten

jabl|3 years ago
It's about jitter caused by the OS, or in this case the HW (e.g. due to SMM or similar shenanigans), which can be problematic for real-time or HPC applications.
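To make the "jitter as noise" idea concrete, here is a minimal userspace sketch of the measurement technique the comments describe. It is my own illustration, not the kernel's implementation: the osnoise tracer works in-kernel, but the core idea is the same busy loop that reads the clock and treats any gap between consecutive reads above a threshold as time "stolen" by the OS or hardware (interrupts, preemption, SMIs).

```python
import time

def measure_os_noise(duration_s=0.5, threshold_ns=5_000):
    """Busy-loop reading a monotonic clock for `duration_s` seconds.

    Any gap between two consecutive reads larger than `threshold_ns`
    is recorded as a "noise" event: the loop itself takes almost no
    time, so a large gap means something else (interrupt handler,
    another task, firmware) ran on the CPU. Returns the list of gap
    durations in nanoseconds.
    """
    gaps = []
    end = time.monotonic_ns() + int(duration_s * 1e9)
    last = time.monotonic_ns()
    while last < end:
        now = time.monotonic_ns()
        if now - last > threshold_ns:
            gaps.append(now - last)
        last = now
    return gaps

if __name__ == "__main__":
    gaps = measure_os_noise()
    print(f"{len(gaps)} noise events, total {sum(gaps) / 1e6:.3f} ms")
```

The real tracer runs this loop in kernel context per CPU and attributes each gap to its source; on a kernel built with the osnoise tracer, it is enabled with `echo osnoise > /sys/kernel/tracing/current_tracer` and read back from the `trace` file.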