Wednesday, July 1, 2020

Quantum fluctuations have been shown to affect macroscopic objects

In the hands of skilled experimentalists, light can be used as a probe for extremely precise measurements. However, the quantum nature of light places an intrinsic limit on the precision of such measurements. Writing in Nature, Yu et al.1 report that this limit has been overcome in experiments carried out using the Laser Interferometer Gravitational-Wave Observatory (LIGO) at Livingston, Louisiana. Moreover, the authors report the measurement of the effects of quantum fluctuations on macroscopic, kilogram-mass objects at room temperature. This is remarkable, because such fluctuations occur at size scales that are comparable to the dimensions of elementary particles.

Exceptionally sensitive detectors known as interferometers are used to measure the small distance variations induced by gravitational waves, which are produced by some of the most catastrophic events in the Universe. In the LIGO interferometer, mirrors are mounted on kilogram-mass test objects at either end of two 4-kilometre-long arms; each pair of mirrors forms a system called an optical cavity. To attenuate external noise, the test masses are suspended on pendulums, which can oscillate only at frequencies much lower than those of the gravitational signals they are used to detect. Laser light is split into two beams, which are each sent down a different arm and reflected between the mirrors of the cavity. When the beams leave the cavities, they are recombined to produce an interference pattern, which is then analysed for evidence of gravitational waves.
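
As a rough illustration of the read-out principle, here is a minimal sketch of an idealized Michelson interferometer (it ignores LIGO’s arm cavities, power recycling and control scheme, and the numbers are illustrative assumptions): the power at the output port depends only on the differential phase accumulated between the two arms.

```python
import numpy as np

# Idealized Michelson read-out (a sketch, not LIGO's actual scheme):
# output-port power as a function of the differential arm-length change.

wavelength = 1064e-9   # laser wavelength (m), the value used by LIGO
input_power = 1.0      # normalized input power

def output_power(delta_L):
    """Output power for a differential arm-length change delta_L (m)."""
    phi = 4 * np.pi * delta_L / wavelength  # round-trip differential phase
    return input_power * np.sin(phi / 2) ** 2

# A gravitational wave changes the differential arm length by ~1e-18 m,
# a phase shift far smaller than one fringe, so quantum noise matters.
print(output_power(0.0), output_power(wavelength / 4))  # 0.0 (dark) to 1.0 (bright)
```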

Light is electromagnetic radiation, and the lowest-energy quantum state of the electromagnetic light field is known as the vacuum. Despite its name, this vacuum is not completely empty. It contains quantum fluctuations that produce uncertainties in measurements of the amplitude and phase of light waves (in the case of a sinusoidal wave, the phase describes the shift of the waveform away from the minimum amplitude that corresponds to the start of the wave cycle). These uncertainties are quantified by Heisenberg’s uncertainty principle.
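
In one common convention (a notational assumption, not something spelled out in the article), the amplitude and phase quadratures $X_1$ and $X_2$ of the field satisfy

\[
\Delta X_1 \, \Delta X_2 \;\ge\; \tfrac{1}{4},
\]

and the vacuum saturates this bound with $\Delta X_1 = \Delta X_2 = \tfrac{1}{2}$: its fluctuations are as small, and as evenly shared between amplitude and phase, as quantum mechanics allows.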

Vacuum fluctuations cause noisy readouts in precision measurements made using light. Fluctuations in the measurements of the phase of light produce a phenomenon known as shot noise, whereas fluctuations in measurements of the amplitude of light produce radiation-pressure noise. The combination of these two is called quantum noise, and it limits the precision of measurements of tiny forces and displacements. The highest precision of any measurement that can be achieved using naturally occurring quantum states is called the standard quantum limit (SQL).
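
To see schematically why such a limit exists, consider the textbook power trade-off (a toy model in arbitrary units, not the real LIGO noise budget): shot noise falls as laser power rises, radiation-pressure noise grows with it, and their sum can never drop below a power-independent floor.

```python
import numpy as np

# Toy model in arbitrary units (hbar = mass = frequency = 1):
# shot noise scales as 1/P, radiation-pressure noise as P.

def quantum_noise(P):
    shot = 1.0 / P          # phase read-out noise, falls with power
    radiation_pressure = P  # back-action on the mirrors, grows with power
    return shot + radiation_pressure

powers = np.logspace(-2, 2, 201)  # scan over laser power
total = quantum_noise(powers)

# Whatever power is chosen, the sum never drops below 2, the SQL of
# this toy model, reached at the optimal power P = 1.
print(f"minimum over power: {total.min():.3f} (toy SQL = 2)")
```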

The SQL is a direct consequence of the Heisenberg uncertainty principle, which states that it is not possible to measure the position and momentum of an object simultaneously with unlimited precision. An electromagnetic field can be described mathematically as a set of two oscillating components: one related to the amplitude of the wave, the other to its phase. The fluctuations of these two components also obey the Heisenberg uncertainty principle. However, the precision of measurements of amplitude and phase can be greatly improved if the uncertainties of the two components are correlated with each other (Fig. 1). Such correlations arise spontaneously when light travels in suspended interferometers, such as the one used by LIGO. Suspended interferometers measure the phase of the output field of light waves, which is affected by both the amplitude and the phase fluctuations of the input vacuum field. This correlation is called the ponderomotive effect2. The detection response of the instrument is frequency dependent: the effects of amplitude fluctuations are more evident in the low-frequency part of the detection band, whereas phase fluctuations are more evident at high frequencies.
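
A minimal numerical sketch of this mechanism, in the standard input–output description of interferometer quantum noise (the coupling strength K and the unit-variance convention are assumptions of the sketch, not values from the article): the cavity maps the input quadratures (a1, a2) to output quadratures b1 = a1 and b2 = a2 − K·a1, because amplitude fluctuations push the mirrors and reappear as phase fluctuations.

```python
import numpy as np

# Input-output sketch of the ponderomotive effect: amplitude noise (a1)
# is imprinted on the output phase quadrature (b2 = a2 - K * a1),
# correlating the two. K is larger at low signal frequencies.

def output_covariance(K):
    """Covariance of (b1, b2) for uncorrelated, unit-variance vacuum input."""
    return np.array([[1.0, -K],
                     [-K, 1.0 + K**2]])

for K in (0.0, 0.5, 2.0):
    # The smallest eigenvalue is the variance of the maximally squeezed
    # (rotated) quadrature; it falls below the vacuum value of 1 once K > 0.
    v_min = np.linalg.eigvalsh(output_covariance(K))[0]
    print(f"K = {K}: minimum quadrature variance = {v_min:.3f}")
```

The determinant of this covariance matrix stays equal to 1, so the output is a minimum-uncertainty squeezed state: the reduction of one quadrature is exactly paid for by the growth of the other.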


Figure 1 | Light squeezing due to the ponderomotive effect in one arm of a gravitational-wave detector. Gravitational-wave detectors contain optical cavities, which consist of mirrors suspended from pendulums and separated by distances of several kilometres. Light enters the cavity in an ‘unsqueezed’ state — that is, quantum fluctuations related to the phase and amplitude of light (uncertainties in the probability distribution of measurements) do not correlate with each other. The oscillating movement of the mirrors, induced by the radiation pressure of circulating light, causes a phase shift of light trapped in the cavity, and generates quantum correlations between the amplitude and phase (termed the ponderomotive effect). Light exiting the cavity is therefore squeezed; in this example, the phase uncertainty has been reduced, whereas the amplitude uncertainty has increased. At a different observation frequency of the signal, light might be squeezed the other way — with increased phase uncertainty and decreased amplitude uncertainty. Yu et al.1 show that this effect can be used to increase the precision of measurements made by a gravitational-wave detector, thereby surpassing an intrinsic limit on precision (the standard quantum limit). The authors also show that radiation-pressure noise — the minuscule variation of the force exerted on the kilogram-scale mirrors by light trapped in the cavity — contributes to the motion of the suspended mirrors.

Light that has correlations between the uncertainties of its amplitude and phase is said to be ‘squeezed’. The Heisenberg principle still holds for squeezed light states, but when one of the uncertainties is reduced, the other is increased. Squeezed light can be used in experiments to reduce the uncertainty of one of the correlated parameters. A special case of squeezed light, known as the squeezed vacuum, forms when the average amplitude of the light is zero.
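
Numerically, the trade-off looks like this (a toy sketch using the quadrature convention assumed earlier; the squeezing strength r is an arbitrary illustrative choice):

```python
import numpy as np

# Squeezed vacuum: one quadrature variance shrinks by exp(-2r), the
# other grows by exp(+2r); the Heisenberg product is unchanged.

r = 0.5                                         # squeezing parameter (illustrative)
var_vacuum = 0.25                               # vacuum variance of each quadrature
var_squeezed = var_vacuum * np.exp(-2 * r)      # reduced (e.g. phase) quadrature
var_antisqueezed = var_vacuum * np.exp(+2 * r)  # inflated (e.g. amplitude) quadrature

# The product of uncertainties still saturates the bound of 1/4:
print(np.sqrt(var_squeezed * var_antisqueezed))  # 0.25
```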

Phase-squeezed light, in which the uncertainty associated with the phase is squeezed, has been used to reduce shot noise for both LIGO3 and Virgo, the gravitational-wave detector located in Cascina, Italy4. And the ponderomotive effect has previously been demonstrated using the mechanical motion of pico- to microgram-scale mirrors in laboratory experiments5,6. Yu et al. now confirm that the ponderomotive effect occurs in the optical cavities of the LIGO interferometer, and have investigated whether it can be used in combination with squeezed-vacuum states to reduce quantum noise below the SQL in measurements of mirror position in the cavities.

The authors measured the noise in the LIGO interferometer under two sets of experimental conditions: one in which squeezed-vacuum states were injected into the output port of the interferometer, and another in which squeezed-vacuum states were not injected. They then plotted sensitivity curves for the data, which chart the noise level in the detector and define the minimum gravitational signal that can be detected as a function of the signal’s frequency. This revealed that, once classical (non-quantum) noise had been subtracted from their data, the uncertainties in the phases of the laser beam and in the positions of the mirrors produce a combined quantum noise below the SQL. Yu and colleagues have therefore demonstrated two fundamental points: that quantum fluctuations of light exert a measurable force on macroscopic objects (the 40-kg mirrors); and that the quantum noise corresponding to these disturbances can be reduced to below the SQL.
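
Schematically, this analysis step amounts to a subtraction of power spectra (the numbers below are hypothetical placeholders chosen for illustration, not LIGO data):

```python
import numpy as np

# Hypothetical spectra, arbitrary units: subtract the independently
# modelled classical noise from the measured total; the remainder is
# the quantum-noise contribution, to be compared against the SQL.

freqs = np.linspace(30, 140, 5)                       # band of interest (Hz)
total_psd = np.array([5.0, 3.2, 2.1, 1.7, 1.5])       # measured noise
classical_psd = np.array([4.2, 2.5, 1.5, 1.2, 1.1])   # modelled classical noise

quantum_psd = total_psd - classical_psd
sql_level = 0.9                                       # SQL, same arbitrary units

for f, q in zip(freqs, quantum_psd):
    verdict = "below" if q < sql_level else "above"
    print(f"{f:6.1f} Hz: quantum noise {q:.2f} ({verdict} SQL {sql_level})")
```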

One of the main difficulties for these kinds of measurement is thermal fluctuation, which can drive mirror motion and is one of the main sources of noise for gravitational-wave detectors. Cryogenic conditions have therefore been needed in some previously reported experiments7,8 to reduce quantum noise to below the SQL. Impressively, Yu and co-workers’ measurements were made at room temperature.

Yu et al. are the first to have proved experimentally that a quantum non-demolition technique — a method in which a measurement of a quantum system is performed repeatedly without perturbing it9 — works in gravitational-wave detectors. At present, such detectors use phase-squeezed vacuum states to reduce shot noise, without considering the correlations that are introduced by the interferometer mirrors. This approach improves sensitivity only for gravitational signals at frequencies above 100 hertz, up to the limit of the detection band6. By contrast, Yu and colleagues’ technique potentially enables a broadband improvement in sensitivity. However, further work will be needed to reduce the classical noise in the interferometer.

Once better sensitivity has been achieved, more gravitational waves could be detected than is possible at present. Future work in noise suppression will therefore take us towards an exciting era of sub-SQL performance of gravitational-wave detectors.



