2011-07-26 14:20:25 UTC
Surely an oscilloscope or spectrum analyser has to do its measurement over a certain interval of time, during which the error will average out to zero and become impossible to measure?!
So I have +ve and -ve errors that average out to zero, but there's still going to be a standard deviation for all those errors.
Am I missing something?
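
To put numbers on what I mean, here's a rough Python sketch (the zero-mean Gaussian noise model and the 1 mV figure are just assumptions I picked for illustration): averaging pushes the mean of the errors towards zero, but the standard deviation / RMS of those same errors stays put.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical zero-mean measurement errors: additive Gaussian noise
# with an assumed true RMS of 1 mV (value chosen purely for illustration).
sigma = 1e-3                                    # 1 mV RMS
errors = rng.normal(loc=0.0, scale=sigma, size=100_000)

# Averaging over the measurement interval drives the mean towards zero...
print(f"mean of errors: {errors.mean():+.2e} V")            # ~0 V

# ...but the spread does not average away: std and RMS stay near 1 mV.
print(f"std of errors:  {errors.std():.2e} V")              # ~1e-3 V
print(f"RMS of errors:  {np.sqrt(np.mean(errors**2)):.2e} V")
```

So as far as I can see, the +ve/-ve cancellation only kills the mean, not the spread.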