Optical lattice clock could change the definition of the second
Tucked away in a vault in France is a weight made of platinum and iridium, diligently cared for by a team of highly trained specialists. Its sole purpose is simply to be: to sit and remain unchanging as the final authority on the meaning of the kilogram. As the standard unit of mass in metric measurement, the kilogram prototype is an artifact of great importance, but extremely accurate measurements of mass ultimately affect only a small number of specialists, researchers in a select array of fields. Time, though? Accurate measurement of time, standardized all around the world, is of use to everyone, every day.
The devices that set that standard for time, in units of seconds, are already painstakingly precise. Atomic clocks, with a fractional uncertainty of about 3 × 10⁻¹⁶, are even corrected for the minuscule relativistic effect of time dilation due to altitude. You’d think that would be the end of the issue, that we’d finally have a reliable enough definition of the second, but no! This week, a paper in Nature Communications put forth an alternative method of standardizing the second, one which reduces the variability between clocks and has implications for increasingly precise experiments.
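How small is 3 × 10⁻¹⁶? Small enough that gravity’s effect on time shows up. In the weak-field approximation, raising a clock by a height h speeds up its ticking by a fraction of roughly gh/c². Here’s a back-of-the-envelope sketch of that correction (our own arithmetic, using only textbook constants, not figures from the paper):

```python
# Back-of-the-envelope: fractional frequency shift from gravitational
# time dilation, delta_f / f ~= g * h / c**2 (weak-field approximation).
g = 9.81       # gravitational acceleration, m/s^2
c = 2.998e8    # speed of light, m/s

for h in (1, 100, 10_000):  # height differences in metres
    shift = g * h / c**2
    print(f"Raising a clock by {h:>6} m speeds it up by ~{shift:.1e}")

# Raising a clock by      1 m speeds it up by ~1.1e-16
# Raising a clock by    100 m speeds it up by ~1.1e-14
# Raising a clock by  10000 m speeds it up by ~1.1e-12
```

Move a clock one metre higher and its rate shifts by about as much as the entire stated uncertainty, which is why the altitude correction can’t be skipped.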
The current standard atomic clock uses the natural oscillations of caesium atoms essentially like a grandfather clock uses a pendulum. Those oscillations can be measured more reliably when the atoms are cooled, since slower-moving atoms smear the signal less, so modern versions cool their caesium samples to near absolute zero and probe that sample with a periodically pumped maser, or microwave laser. However, microwaves are relatively fat as radiation goes, and a higher-frequency source, such as light in the visible spectrum, could produce more accurate results. Though the utility of optical frequencies has been recognized for some time, the technological difficulty of directly and reliably measuring them has traditionally held them back from use in timepieces.
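To put numbers on “fat”: a clock’s resolution is ultimately limited by how finely its oscillator slices up a second. Caesium’s microwave transition ticks about 9.2 billion times per second (that exact figure defines the SI second), while strontium’s optical transition runs at roughly 429 THz; the comparison below uses that approximate value purely for scale:

```python
# Why visible light helps: each "tick" is one oscillation of the probed
# transition, so a higher frequency slices a second into finer pieces.
f_caesium = 9_192_631_770   # Hz, caesium hyperfine (microwave) transition;
                            # this figure defines the SI second
f_strontium = 4.29e14       # Hz, strontium optical transition (~429 THz,
                            # approximate value, quoted here for scale)

print(f"Caesium ticks per second:   {f_caesium:.2e}")
print(f"Strontium ticks per second: {f_strontium:.2e}")
print(f"Ratio: roughly {f_strontium / f_caesium:,.0f}x finer subdivision")
# Ratio: roughly 46,675x finer subdivision
```

Tens of thousands of times more ticks per second means a correspondingly finer subdivision of each second, which is the core appeal of an optical standard.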
In their paper, the researchers present evidence that their optical clocks, which use strontium samples rather than caesium, set a standard almost three times more accurate than the traditional atomic clock.
Currently, our standards drift by an average of one second every 100 million years. Theirs, they claim, would stretch that to one second in every 300 million years. In addition, comparing two optical clocks to each other showed that their results are also very stable, not just keeping time over long periods but counting each individual second as having an extremely reliable duration.
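Those headline figures follow directly from the fractional uncertainties, as a quick sanity check shows (our own arithmetic, not a calculation from the paper):

```python
# Sanity check: how does a fractional uncertainty translate into
# "one second per N years"? Fractional error * elapsed time = drift.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.16e7 seconds

def years_to_drift_one_second(fractional_uncertainty):
    """Years until the accumulated error reaches one second."""
    return 1.0 / (fractional_uncertainty * SECONDS_PER_YEAR)

print(f"{years_to_drift_one_second(3e-16):.1e} years")  # ~1.1e8, i.e. ~100 million
print(f"{years_to_drift_one_second(1e-16):.1e} years")  # ~3.2e8, i.e. ~300 million
```

An uncertainty near 3 × 10⁻¹⁶ accumulates to a second in roughly 100 million years; pushing it toward 1 × 10⁻¹⁶ buys the claimed 300 million.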
Is that a meaningful change? Atomic clocks set the carrier frequency for many broadcasting stations, and their precision allows global positioning satellites to distinguish tiny differences in how long a signal takes to arrive. Still, as with anything that delves this deep into the land of precision, it’s mostly useful with respect to pure research; ultra-reliable time standards will become more and more necessary as we start firing beams of quantum particles through the Earth and expecting to collect viable data at both ends.
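The GPS case makes the stakes concrete: radio signals cover about 30 centimetres per nanosecond, so clock error converts directly into position error. A rough illustration, with numbers chosen purely for scale:

```python
# A radio signal travels at the speed of light, so timing error maps
# straight onto distance error: error_m = c * error_seconds.
c = 2.998e8  # speed of light, m/s

for timing_error_ns in (100, 10, 1):
    position_error_m = c * timing_error_ns * 1e-9
    print(f"{timing_error_ns:>4} ns of clock error -> ~{position_error_m:.1f} m of position error")

#  100 ns -> ~30 m;  10 ns -> ~3 m;  1 ns -> ~0.3 m
```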
Precision is necessary if we want to measure so-called “teleportation” of information. It’s especially important for coordinating with increasingly distant actors, be they satellites, rovers, or our first extra-terrestrial astronauts.
The importance of that stability shows up in an even more precise type of clock, the ion clock, which uses individual ions rather than whole samples of its core material. This technology could be many times more accurate even than these optical lattice clocks, but since it uses such a finicky internal standard, it’s much more difficult to standardize. And ultimately, time only matters if more than one person can agree on its value.