Type: Article
Publication Date: 2011-11-02
Citations: 31
DOI: https://doi.org/10.1103/physrevd.84.102001
A fundamental limit to the sensitivity of optical interferometers is imposed by Brownian thermal fluctuations of the mirrors' surfaces. This thermal noise can be reduced by using larger beams, which "average out" the random fluctuations of the surfaces. It has previously been proposed that wider, higher-order Laguerre-Gaussian modes can be used to exploit this effect. In this paper, we show that susceptibility to spatial imperfections of the mirror surfaces limits the effectiveness of this approach in interferometers used for gravitational-wave detection. Possible methods of reducing this susceptibility are also discussed.
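As an illustrative sketch of the averaging effect described above, the snippet below computes radial intensity profiles of the fundamental LG00 mode and a higher-order mode such as LG33, using the standard normalized Laguerre-Gauss amplitude. The helper name lg_intensity, the choice of mode indices, and the beam radius are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from math import factorial
from scipy.special import genlaguerre

def lg_intensity(r, p, l, w):
    """Radial intensity |u_{p,l}(r)|^2 of a Laguerre-Gauss mode with
    radial index p, azimuthal index l, and beam radius w, using the
    standard normalized LG amplitude (total power integrates to 1)."""
    x = 2.0 * r**2 / w**2
    norm = 2.0 * factorial(p) / (np.pi * factorial(p + abs(l)) * w**2)
    return norm * x**abs(l) * genlaguerre(p, abs(l))(x)**2 * np.exp(-x)

# Compare the fundamental mode with a wider higher-order mode
# (mode choice here is illustrative, not prescribed by the paper).
r = np.linspace(0.0, 3.0, 300)      # radius in units of the beam radius w
I_00 = lg_intensity(r, 0, 0, 1.0)   # fundamental LG00 mode
I_33 = lg_intensity(r, 3, 3, 1.0)   # higher-order LG33 mode
# The higher-order mode spreads its power over a much larger annular
# area of the mirror, which is what "averages out" the local thermal
# fluctuations of the surface.
```

The wider intensity distribution samples a larger area of the mirror surface, which produces the thermal-noise averaging the abstract refers to.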