Description
Quantum noise is now a central limitation in laser-interferometric gravitational-wave detectors, making it essential to identify the ultimate sensitivity allowed by quantum mechanics. Because these detectors estimate an entire gravitational-wave waveform rather than a single parameter, the relevant quantum limit is fundamentally a multiparameter one. While optimal measurements are understood in idealized lossless models, the corresponding limit for realistic detectors with optical loss and other imperfections has remained unclear.
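For orientation, the bound in question is the standard multiparameter quantum Cramér–Rao bound (stated here as background, with notation introduced for this summary rather than taken from the talk): for a waveform discretized into parameters \theta encoded in the detector's output state \rho_\theta, any unbiased estimate satisfies

    % Multiparameter quantum Cramér–Rao bound (standard form).
    % Cov is the estimator covariance matrix, F_Q the quantum Fisher
    % information matrix, and the L_i are the symmetric logarithmic
    % derivatives defined by the rightmost relation.
    \operatorname{Cov}(\hat{\theta}) \;\succeq\; F_Q^{-1},
    \qquad
    [F_Q]_{ij} = \tfrac{1}{2}\operatorname{Tr}\!\big(\rho_\theta\,\{L_i, L_j\}\big),
    \qquad
    \partial_{\theta_i}\rho_\theta = \tfrac{1}{2}\big(L_i\rho_\theta + \rho_\theta L_i\big).

Unlike in the single-parameter case, this matrix bound is not attainable in general, which is why waveform estimation poses a genuinely multiparameter problem.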
In this talk, I will present a compact closed-form expression for the attainable quantum limit of tuned interferometers that includes the main nonidealities relevant to current instruments: optical loss, mode mismatch, and propagation-induced degradation of squeezed light. Applying the result to a realistic LIGO model shows that, over a finite frequency band, the commonly used quantum Cramér–Rao bound is not attainable and differs from the true achievable limit. I will also show how this limit can be reached in practice within a narrow band using multitone local-oscillator readout, and that such a scheme offers a significant astrophysical advantage over the homodyne detection currently in use.
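As background on why attainability can fail (again standard estimation theory, stated for orientation rather than as part of the new result): a single measurement can saturate the matrix bound above only when the weak-commutativity condition holds,

    % Weak-commutativity (compatibility) condition for QCRB attainability;
    % for a pure state |psi_theta> it reduces to the second expression.
    \operatorname{Tr}\!\big(\rho_\theta\,[L_i, L_j]\big) = 0 \quad \text{for all } i, j,
    \qquad\text{equivalently, for pure states,}\quad
    \operatorname{Im}\langle \partial_i \psi_\theta | \partial_j \psi_\theta \rangle = 0.

When imperfections such as loss or mode mismatch violate this condition over part of the band, the attainable sensitivity is instead governed by tighter bounds (for example, the Holevo Cramér–Rao bound), which is consistent with the gap between the QCRB and the achievable limit described above.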
Our theoretical results identify the optimal quantum measurement for realistic squeezed-light gravitational-wave detectors and resolve the longstanding question of the fundamental multiparameter quantum limit in current and next-generation interferometers.