Speaker
Description
Squeezed light is critical in gravitational-wave detection for reaching sensitivities below the standard quantum limit. The success of future detectors will rely on achieving far greater squeezing levels, with an ultimate goal of 10 dB of quantum noise reduction. Even as squeezer technology matures, the internal losses of current detectors remain too large to support such high levels of squeezing. The largest source of internal loss, mode mismatch between the coupled laser cavities, arises from practical and largely irreducible limitations in the fabrication and positioning of optics. We demonstrate that statistical and machine-learning techniques can be used to optimize coupled-cavity interferometer design for maximum squeezing performance. As an example, we optimize the LIGO A+ design by minimizing its sensitivity to common errors in the positions and radii of curvature of the signal recycling cavity optics. In a head-to-head comparison with the nominal A+ design, we find that in 50% of trials the optimally error-tolerant design achieves a 43% larger shot-noise reduction factor for the same level of injected squeezing.
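To make the statistical side of the approach concrete, below is a minimal Python sketch of the kind of Monte-Carlo robust-design loop the abstract describes. Everything in it is illustrative rather than the actual A+ optimization: the error model, the numerical sensitivities, and the choice of injected-beam waist size and position as design variables are assumptions for this example. The sketch draws random perturbations of the cavity eigenmode, scores each candidate design by the median squeezing it would deliver over that error ensemble, and then optimizes that median.

```python
"""Monte-Carlo robust-design sketch (toy model, not the authors'
actual pipeline). We choose the injected beam's waist size and
position so that the *median* mode-matching efficiency, and hence the
measured squeezing, stays high when the cavity eigenmode wanders due
to errors in mirror positions and radii of curvature."""

import numpy as np
from scipy.optimize import minimize

LAMBDA = 1064e-9   # laser wavelength [m]
W0_NOM = 2.1e-3    # nominal cavity eigenmode waist [m] (illustrative)
INJ_DB = 10.0      # injected squeezing [dB]

rng = np.random.default_rng(0)

def coupling(w1, w2, dz):
    """Fundamental-mode power coupling between two coaxial Gaussian
    beams with waists w1, w2 and waist positions separated by dz
    (standard TEM00 overlap formula)."""
    return 4.0 / ((w1 / w2 + w2 / w1) ** 2
                  + (LAMBDA * dz / (np.pi * w1 * w2)) ** 2)

def measured_squeezing_db(eta, inj_db=INJ_DB):
    """Squeezing observed after a lossy channel of efficiency eta."""
    v = eta * 10 ** (-inj_db / 10) + (1 - eta)
    return -10 * np.log10(v)

def sample_cavity_modes(n):
    """Toy error model: optic position and radius-of-curvature errors
    jitter the eigenmode's waist size (~2% rms) and waist location
    (~0.2 m rms). These sensitivities are made-up placeholders."""
    w_cav = W0_NOM * (1 + 0.02 * rng.standard_normal(n))
    z_cav = 0.2 * rng.standard_normal(n)
    return w_cav, z_cav

def objective(x, w_cav, z_cav):
    """Negative median squeezing over the Monte-Carlo error ensemble."""
    w_in, z_in = x
    eta = coupling(w_in, w_cav, z_in - z_cav)
    return -np.median(measured_squeezing_db(eta))

w_cav, z_cav = sample_cavity_modes(2000)
res = minimize(objective, x0=[W0_NOM, 0.0], method="Nelder-Mead",
               args=(w_cav, z_cav))
w_opt, z_opt = res.x
print(f"robust waist {w_opt * 1e3:.3f} mm, position {z_opt:.3f} m, "
      f"median squeezing {-res.fun:.2f} dB")
```

In the real problem the design variables would be the signal recycling cavity optics themselves (their positions and radii of curvature) and the error ensemble would come from realistic fabrication and placement tolerances, but the structure of the optimization, sampling errors and optimizing a robust statistic of the resulting squeezing, is the same.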