The minimum detectable phase shift reported in recent experiments with new linear spectrum-analysis techniques for optical interferometric vibration detection is established as a direct consequence of the 1/f noise voltage in the system components. The dynamic range and inaccuracy predicted by the simple theoretical model presented here are in good agreement with experimental measurements. The conclusions of the analysis are compared with experimental reports of heterodyne, shot-noise-limited optical systems. With this model, the generic class of spectrum-analysis techniques can be analyzed and its members weighed against one another to assess the effect of noise. The analysis applies to optical interferometry in general, although the experiments specifically involved fiber-optic modulators.
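To make the noise-floor claim concrete, here is a minimal back-of-the-envelope sketch, not taken from the paper itself; the quadrature bias and the symbols $V_0$, $S_0$, $f_1$, $f_2$ are assumptions introduced for illustration. For a two-beam interferometer biased at quadrature, a small phase shift $\delta\phi$ produces an output voltage swing $\delta V \approx V_0\,\delta\phi$, where $V_0$ is the fringe amplitude. If the dominant noise is a 1/f voltage with spectral density $S_v(f) = S_0/f$, the rms noise voltage in an analysis band $[f_1, f_2]$ is

$$ v_n = \sqrt{\int_{f_1}^{f_2} \frac{S_0}{f}\,df} = \sqrt{S_0 \ln(f_2/f_1)}, $$

so the minimum detectable phase shift at unity signal-to-noise ratio is roughly

$$ \delta\phi_{\min} \approx \frac{v_n}{V_0} = \frac{\sqrt{S_0 \ln(f_2/f_1)}}{V_0}. $$

Unlike a white, shot-noise-limited floor, which improves as $\sqrt{B}$ as the analysis bandwidth $B$ is narrowed, a 1/f floor shrinks only logarithmically, which is why the 1/f noise voltage, rather than shot noise, can set the minimum detectable phase shift in such systems.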
DOI: 10.1364/AO.31.005997