Jitter is a fascinating concept, with applications in digital imagery and quantum mechanics. The word is a corruption of the Scotticism "chitter", which is an onomatopoeic rendering of the noise that your teeth make when you shiver (another offshoot is "chatter"). So jittering is a jerky jumping between positions that surround a central averaged point, and "having the jitters" means being nervously jumpy, or having the shakes for some other reason (e.g. drug or alcohol withdrawal; see also the origins of the word jitterbug). In digital measuring systems, it's the tendency for background noise to make measurements jump about between adjacent states when the real signal value is close to a quantisation threshold.
At first sight, jitter looks like an engineering annoyance. If you feed a slowly-changing analogue signal into a digitiser you might expect the correct result to be a "simple" stepped waveform, but if the signal is noisy, and the signal level happens to be near a digital crossing-point, then that noise can make the output "jitter" back and forth between the two nearest states. A small amount of noise, well below the size of a single quantisation step, gets amplified into one bit's worth of noise on the digital data stream as the output flips between those two states.
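Here's a minimal sketch of that effect using NumPy – the signal level, noise amount and quantiser step are my own illustrative assumptions, not figures from any real converter:

import numpy as np

rng = np.random.default_rng(0)

signal = np.full(16, 9.4)                      # a DC level sitting between codes 9 and 10
quiet = np.round(signal)                       # no noise: a steady stream of 9s
noisy = np.round(signal + rng.normal(0.0, 0.2, signal.size))

print(quiet.astype(int))                       # all 9s – the sub-step detail is invisible
print(noisy.astype(int))                       # a mix of 9s and 10s: the bottom bit "jitters"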
So early audio engineers would try to filter this sort of noise out of the signal before quantisation. However, they later realised that the effect was useful, and that the jittering actually carried valuable additional information. If you had a digitiser that could only output a stream of eight-bit numbers, and you needed that stream to run at a certain rate, you could run the hardware at a multiple of the required rate, and deliberately inject low-level, high-frequency noise into the signal, causing the lowest bit to dance around at the higher clock rate. If the original signal level lay exactly between two digital levels, the random jitter would tend to make the output jump between those two levels with a ratio of roughly 50:50. If the signal voltage was slightly higher, then the added noise would tend to make the sampling process flip to the "higher" state more often than the "lower" state. If the original input signal was lower than the 50:50 mark, the noise wouldn't reach the higher threshold quite so often, and the "jittered" datastream would have more low bits than high bits. So the ratio between "high" and "low" bit-noise told us approximately where the original signal level lay, with sub-bit accuracy.
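A rough illustration of that ratio – the 9.7 level and the flat, one-step-wide dither are assumptions of mine for the sake of the example:

import numpy as np

rng = np.random.default_rng(1)

level = 9.7                                    # true level, sitting between codes 9 and 10
dither = rng.uniform(-0.5, 0.5, 10000)         # injected noise spanning one quantiser step
codes = np.round(level + dither).astype(int)   # the stream of "jittered" output codes

print(np.mean(codes == 10))                    # roughly 0.7 – the 10s outnumber the 9s about 70:30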
This generated the apparently paradoxical result that we could make more accurate measurements by adding random noise to the signal that we wanted to measure! Although each individual sample would tend to be less reliable than it would have been without the noise source, averaging a group of adjacent samples together recreated a statistical approximation of the original signal voltage, at a higher resolution than the physical bit-resolution of the sampling device. All you had to do was run the sampling process at a higher rate than you actually wanted, then smooth the data to create a datastream at the right frequency, and the averaging process would give you extra digits of resolution after the "point".
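Putting the two halves together – oversample with dither, then average back down – in a short sketch (the 8x oversampling factor and the flat dither are again my own assumptions):

import numpy as np

rng = np.random.default_rng(2)

oversample = 8
for level in (9.25, 9.5, 9.75):                # sub-bit DC levels to try to recover
    dither = rng.uniform(-0.5, 0.5, oversample * 1000)
    codes = np.round(level + dither)           # integer codes at the fast clock rate
    blocks = codes.reshape(-1, oversample)     # group back into output-rate samples
    print(level, blocks.mean())                # the long-run average sits close to the true level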
So if you sampled a "jittery" DC signal, and measured "9, 10, 9, 10, 10, 9, 10, 10", then your averaged value for the eight samples would be 9.625, and you'd estimate that the original signal level lay somewhere just over nine-and-a-half.
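The same averaging, written out as a two-line check:

samples = [9, 10, 9, 10, 10, 9, 10, 10]
print(sum(samples) / len(samples))             # 9.625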
Jitter allowed us to squeeze more data through a given quantised information gateway by using spare bandwidth, and passing the additional information as statistical trends carried on the back of an overlaid noise signal. It was transferring the additional resolution information through the gateway by shunting it out of the "resolution" domain and into a statistical domain. You didn't have to use random noise to "tickle" the sampling hardware – with more sophisticated electronics you could use a high-frequency ramp-wave signal to make the process a little more orderly – but noise worked, too.
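A sketch of that more orderly, ramp-based variant (the eight-sample ramp period is an illustrative assumption; the ramp sweeps through exactly one quantiser step per block):

import numpy as np

oversample = 8
ramp = (np.arange(oversample) + 0.5) / oversample - 0.5   # -0.4375 ... +0.4375, one step wide

level = 9.625                                  # a sub-bit DC level
codes = np.round(level + ramp)                 # one block of ramp-dithered codes

print(codes.astype(int))                       # [9 9 9 10 10 10 10 10]
print(codes.mean())                            # exactly 9.625 for this block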
So jitter lets us make measurements that at first sight appear to break the laws of physics. No laws are really being broken (because we aren't exceeding the total information bandwidth of the gateway), but there are some useful similarities here with parts of quantum mechanics – we're dealing with a counterintuitive effect, where apparently random and unpredictable individual events and fluctuations in our measurements somehow manage to combine to recreate a more classical-looking signal at larger scales. Even with a theoretically random noise source with a polite statistical distribution tickling the detector thresholds, the resulting noise in the digitised signal still carries statistical correlations that encode real and useful information about what's happening under the quantisation threshold.
Once you know a little bit about digital audio processing tricks, some of the supposedly "spooky" aspects of quantum mechanics start to look a little more familiar.