The Allan variance of signal and reference frequencies is measured by a least-squares fit of the output of two analog-to-digital converters to ideal sine waves. The difference in the fitted phase of the two channels generates the timing data needed for the Allan variance. The fits are performed at the signal frequency (≈10 MHz) without the use of heterodyning. Experimental data from a modified digital oscilloscope yield a residual Allan deviation of 3 × 10⁻¹³/τ, where τ is the observation time in seconds. This corresponds to a standard deviation in time of <300 fs, or 20 μrad in phase. The experimental results are supported by statistical theory and Monte Carlo simulations, which suggest that optimized devices may perform one to two orders of magnitude better.
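The measurement chain described above (ADC record → least-squares sine fit at the known signal frequency → per-channel phase → phase difference → Allan variance) can be sketched numerically. The following Python simulation is a minimal illustration, not the authors' implementation: the sample rate, record length, and noise level are assumptions made for this sketch, and the three-parameter sine fit and second-difference Allan estimator are the standard textbook forms.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

F_SIG = 10e6            # signal frequency, ~10 MHz (from the abstract)
F_ADC = 80e6            # ADC sample rate (assumed, not stated in the abstract)
N_REC = 800             # samples per fit record (assumed)
N_FITS = 2000           # number of consecutive records to simulate
TAU0 = N_REC / F_ADC    # spacing of phase estimates: one per record

def fit_phase(y, t):
    """Three-parameter least-squares fit of one ADC record to an ideal
    sine at the known frequency: y ~ a*sin(wt) + b*cos(wt) + c.
    Returns the fitted phase in radians (phase of R*sin(wt + phi))."""
    w = 2.0 * np.pi * F_SIG
    M = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    a, b, _ = np.linalg.lstsq(M, y, rcond=None)[0]
    return np.arctan2(b, a)

def allan_dev(x, tau0, m):
    """Allan deviation at tau = m*tau0 from time-error data x (seconds)
    spaced tau0 apart, via second differences of the decimated series."""
    xm = x[::m]
    d2 = xm[2:] - 2.0 * xm[1:-1] + xm[:-2]
    return np.sqrt(np.mean(d2 ** 2) / (2.0 * (m * tau0) ** 2))

# Two channels digitize the same 10 MHz tone; each channel adds its own
# noise (quantization and thermal noise, lumped into one Gaussian term).
t_rec = np.arange(N_REC) / F_ADC
phases = np.empty((2, N_FITS))
for k in range(N_FITS):
    t = k * TAU0 + t_rec
    clean = np.sin(2.0 * np.pi * F_SIG * t)
    for ch in range(2):
        y = clean + 1e-3 * rng.standard_normal(N_REC)
        phases[ch, k] = fit_phase(y, t)

# Phase difference of the two fits -> time-difference data for the
# Allan variance; no heterodyning, the fit runs at the signal frequency.
x = (phases[1] - phases[0]) / (2.0 * np.pi * F_SIG)

for m in (1, 10, 100):
    print(f"tau = {m * TAU0:.1e} s   sigma_y(tau) = {allan_dev(x, TAU0, m):.2e}")
```

With white additive noise on the samples, the fitted-phase error is white phase noise, so the simulated deviation falls as 1/τ, consistent with the 1/τ dependence quoted in the abstract.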
DOI: 10.1063/1.5010140