Allan variance and its variants have proven useful outside the scope of timekeeping: they are a set of improved statistical tools to use whenever the noise processes are not unconditionally stable, so that a derivative exists.
These noise forms become a challenge for traditional statistical tools such as standard deviation, as the estimator will not converge.[4] While the two-sample variance does not allow all types of noise to be distinguished, it provides a means to meaningfully separate many noise forms for time series of phase or frequency measurements between two or more oscillators.
These noise forms have the effect that the standard variance estimator does not converge when processing time-error samples.
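This non-convergence can be illustrated numerically. The sketch below (pure Python, with illustrative function names, not any standard API) compares the ordinary sample variance with the two-sample (Allan) variance at τ = τ0 on simulated random-walk frequency noise:

```python
import random

def sample_variance(y):
    """Ordinary (standard) variance estimator."""
    m = sum(y) / len(y)
    return sum((v - m) ** 2 for v in y) / (len(y) - 1)

def allan_variance(y):
    """Two-sample (Allan) variance of a fractional-frequency series at tau = tau0."""
    d = [y[i + 1] - y[i] for i in range(len(y) - 1)]
    return sum(v * v for v in d) / (2 * len(d))

random.seed(1)
# Random-walk FM noise: the frequency is a cumulative sum of white noise,
# so it has no stationary mean. The ordinary variance keeps growing with
# record length, while the Allan variance settles near 0.5 (half the
# variance of the unit white-noise steps).
walk, acc = [], 0.0
for _ in range(20000):
    acc += random.gauss(0.0, 1.0)
    walk.append(acc)

print(sample_variance(walk[:1000]), sample_variance(walk))
print(allan_variance(walk[:1000]), allan_variance(walk))
```

Lengthening the record makes the standard-variance estimate grow rather than converge, while the Allan-variance estimate stays put, which is exactly the behaviour described above.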
Since y(t) is the derivative of x(t), we can without loss of generality rewrite it as

    σ_y²(τ) = (1/2) ⟨ (x(t + 2τ) − 2x(t + τ) + x(t))² / τ² ⟩.

This definition is based on the statistical expected value, integrating over infinite time.
A first simple estimator would be to directly translate the definition into

    σ_y²(τ0, M) = (1 / (2(M − 1))) Σ_{i=1}^{M−1} (ȳ_{i+1} − ȳ_i)²,

or, for the time series of phase samples,

    σ_y²(τ0, N) = (1 / (2(N − 2)τ0²)) Σ_{i=1}^{N−2} (x_{i+2} − 2x_{i+1} + x_i)².

These formulas, however, only provide the calculation for the τ = τ0 case.
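A direct transcription of the τ = τ0 phase-sample estimator might look like the following sketch (the function name is illustrative, not a standard API):

```python
def avar_tau0(x, tau0):
    """Allan-variance estimate at tau = tau0 from phase (time-error)
    samples x taken at interval tau0, via second differences of x."""
    N = len(x)
    if N < 3:
        raise ValueError("need at least three phase samples")
    s = sum((x[i + 2] - 2 * x[i + 1] + x[i]) ** 2 for i in range(N - 2))
    return s / (2 * (N - 2) * tau0 ** 2)

# Sanity check: a constant frequency offset is a linear phase ramp,
# whose second differences vanish, so the estimate is exactly zero.
ramp = [2.0 * i for i in range(100)]
print(avar_tau0(ramp, 1.0))
```

The second difference removes any constant phase and any constant frequency offset, which is why a pure ramp yields zero.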
These could be modified to introduce the new variable n, such that no new time-series would have to be generated; rather, the original time series could be reused for various values of n. The estimators become

    σ_y²(nτ0, N) = (1 / (2(M − 1)(nτ0)²)) Σ_{j=1}^{M−1} (x_{(j+1)n} − 2x_{jn} + x_{(j−1)n})²,  where M = ⌊(N − 1)/n⌋,

with τ = nτ0.
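The re-use of the original phase series for various n can be sketched as follows (again with an illustrative function name): only every n-th sample is read, so no decimated series needs to be stored.

```python
def avar_n(x, tau0, n):
    """Non-overlapped Allan-variance estimate at tau = n*tau0, re-using
    the original phase series x by taking every n-th sample."""
    m = (len(x) - 1) // n          # number of tau-long averaging intervals
    if m < 2:
        raise ValueError("series too short for this n")
    s = sum((x[(j + 1) * n] - 2 * x[j * n] + x[(j - 1) * n]) ** 2
            for j in range(1, m))
    return s / (2 * (m - 1) * (n * tau0) ** 2)

# Quadratic phase (a linear frequency drift): the estimate grows as tau^2,
# the expected signature of drift on an Allan-deviation plot.
x = [float(i * i) for i in range(101)]
print(avar_n(x, 1.0, 1), avar_n(x, 1.0, 2))
```

Running the same series through several values of n is how a full σ_y(τ) curve is built from one measurement record.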
The confidence interval depends on the number of observations in the sample series, the dominant noise type, and the estimator being used.
[14][15] The Allan variance is unable to distinguish between white phase modulation (WPM) and flicker phase modulation (FPM), but is able to resolve the other power-law noise types.
Alternatively,[4][16] the μ value of the dominant noise form may be inferred from the measurements using the bias functions.
For telecommunication needs, such methods have been required in order to ensure comparability of measurements and allow some freedom for vendors to do different implementations.
Further development of the Allan variance was performed to let the hardware bandwidth be reduced by software means. This development of a software bandwidth allowed addressing the remaining noise, and the method is now referred to as the modified Allan variance.
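A common phase-data form of the modified Allan variance can be sketched as below; this is a pure-Python illustration, not a reference implementation. The inner sum over a window of n samples is what realizes the software bandwidth:

```python
def mvar(x, tau0, n):
    """Modified Allan variance at tau = n*tau0 from phase samples x.
    The inner sum averages n adjacent second differences, acting as an
    additional software bandwidth filter; for n = 1 it reduces to the
    (overlapping) Allan variance."""
    N = len(x)
    if N < 3 * n + 1:
        raise ValueError("series too short for this n")
    total = 0.0
    for j in range(N - 3 * n + 1):
        inner = sum(x[i + 2 * n] - 2 * x[i + n] + x[i]
                    for i in range(j, j + n))
        total += inner ** 2
    return total / (2.0 * n ** 4 * tau0 ** 2 * (N - 3 * n + 1))
```

Because the software filter narrows as n grows, the modified Allan variance separates white phase modulation from flicker phase modulation, which the plain Allan variance cannot do.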
Such dead time introduces systematic measurement biases, which need to be compensated for in order to get proper results. Dead-time effects have such an impact on the produced result that much study has been done in order to quantify their properties properly.
A large number of conversion constants, bias corrections and confidence intervals depend on the dominant noise type.
The speed at which the time-interval counter can complete the measurement, output the result and re-arm for the next trigger limits the trigger frequency.
Drift can severely limit measurements, so the oscillators must be allowed to stabilize by being powered on for a long enough time beforehand.
Limiting factors involve single-shot resolution, trigger jitter, speed of measurements and stability of reference clock.
The articles and panel discussions concur on the existence of the frequency flicker noise and the wish to achieve a common definition for both short-term and long-term stability.
Important papers, including those of David Allan,[4] James A. Barnes,[23] L. S. Cutler and C. L. Searle[2] and D. B. Leeson,[3] appeared in the IEEE Proceedings on Frequency Stability and helped shape the field.
David Allan's article analyses the classical M-sample variance of frequency, tackling the issue of dead-time between measurements along with an initial bias function.
The choice of such parametrisation allows good handling of some noise forms and getting comparable measurements; it is essentially the least common denominator with the aid of the bias functions B1 and B2.
Howe, Allan and Barnes presented the analysis of confidence intervals, degrees of freedom, and the established estimators.
It gives the first overview of the field, stating the problems, defining the basic supporting definitions, and getting into the Allan variance, the bias functions B1 and B2, and the conversion of time-domain measures.
A classical reference is the NBS Monograph 140[24] from 1974, which in chapter 8 has "Statistics of Time and Frequency Data Analysis".
[25] This is the extended variant of NBS Technical Note 394, and its additions essentially concern measurement techniques and the practical processing of values.
The NIST Special Publication 1065 "Handbook of Frequency Stability Analysis" by W. J. Riley[15] is recommended reading for anyone wanting to pursue the field. It is rich in references and also covers a wide range of measures, biases and related functions that a modern analyst should have available.
[28] A guest editor for that issue will be David's former colleague at NIST, Judah Levine, who is the most recent recipient of the I. I. Rabi Award.