Keeping Time By The Atom
ON THE AFTERNOON OF JANUARY 6, 1949, journalists and scientists from around the nation crowded into the lecture hall at the National Bureau of Standards (NBS), in Washington, D.C., to see the world’s first atomic clock. It looked like a metal cabinet seven feet tall, with a maze of tubes, knobs, and gauges across the front. On top sat a conventional, round face, with hour, minute, and second hands indicating the time. There was no ticking, only a low electronic hum.
“We were all excited,” remembers Wilbert Snyder, who was a young scientist at NBS. “We knew the clock would open a new world of physics.” The clock performed the same two functions as any other clock: it measured passing intervals by counting regular cycles, and it thereby told time. But while a conventional clock might count the swings of a pendulum, at two per second, this clock counted the frequency of radio waves required to change the energy state of the nitrogen atom in the ammonia molecule, at almost twenty-four billion per second. The resonant frequencies of atoms never vary, so the atomic clock could keep time with undreamed-of accuracy.
What’s more, the clock challenged the earth-sun clock as a primary time standard, for it promised to be more consistent than the rotation and orbit of the earth. “For the first time in 4,000 years, telling time by the heavens has been dethroned,” one news report proclaimed. The New York Times called the clock “as near perfect as anything on this imperfect earth can be.”
The public had to be assured that despite its name, the clock was not radioactive, nor was it in any way related to the atomic weaponry developed earlier in the 1940s. “We don’t do fusion and fission,” an NBS scientist explained. “We just count atomic vibrations.”
The clock had been developed over an eighteen-month period by a three-man team of NBS scientists led by a thirty-five-year-old physicist, Harold Lyons. Determined to become a physicist, he had enrolled at the University of Buffalo when he was sixteen; by 1948 he had acquired a series of degrees, including a Ph.D. from the University of Michigan, and was head of the Microwave Standards Section at NBS. The other two physicists on his team were Benjamin F. Husten and Emory D. Heberling.
Dr. Lyons, now retired after a scientific career that spanned a half-century, recalls that the team started out in the “right place” to invent the clock: “We were in the time-and-frequency game.” The National Bureau of Standards, a division of the Commerce Department, is charged with determining exact measurements and standards for mass, length, temperature, and time. The bureau assures American consumers that one gallon of gasoline, or one foot of lumber, is always the same, that zero degrees is always zero degrees, and that the second is identical for all of us. It was NBS’s mission to measure and keep track of time as accurately as possible that led to the atomic clock.
Before the atomic clock, time was calculated according to the earth’s rotation and its orbit around the sun. One full spin marked a twenty-four-hour day; one journey around the sun measured a year. In 1820 the basic unit of time—the second—was defined by French scientists as an 86,400th part of the mean solar day, which in turn equaled one 365¼th part of the mean solar year—the period between two vernal equinoxes. But the earth takes an erratic spin. As it makes its elliptical orbit around the sun, it speeds up and slows down. It also wobbles on its axis, deflected by tides, heavy snows, and a sloshing molten core. As a result, one day can be a thousandth of a second longer or shorter than the one before. Since the earth is generally slowing down, most days are longer.
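The arithmetic behind that 1820 definition, and the scale of the earth's irregularity, can be checked in a few lines. A minimal sketch in Python, using only the figures quoted above:

```python
# The 1820 definition: one second = 1/86,400 of a mean solar day,
# and one day = 1/365.25 of a mean solar year.
SECONDS_PER_DAY = 24 * 60 * 60            # 86,400
DAYS_PER_YEAR = 365.25

print(SECONDS_PER_DAY)                    # 86400
print(SECONDS_PER_DAY * DAYS_PER_YEAR)    # ~31.6 million seconds per year

# If each day ran a thousandth of a second long, the error would compound:
print(0.001 * DAYS_PER_YEAR)              # ~0.37 second of drift per year
```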
Throughout the ages all clocks had been set by the erratic celestial master clock, and man-made clocks themselves had rarely been very precise. By the 1920s the pendulum clock had seen enough improvement to keep time to within ten seconds a year. But while inaccurate clocks mattered little to agricultural societies, America entered the twentieth century with emerging technologies that demanded ever more precise chronometry. One was radio broadcasting. Pioneer radio stations struggled to stay on assigned frequencies and often drifted into one another. Since frequency is the number of cycles per second, this was a question of timekeeping. In the early 1920s NBS was given the task of assigning radio frequencies and establishing standards for them.
Searching for something that would vibrate at a constant rate, NBS physicists turned to the quartz crystal. As Dr. Lyons puts it, “Squeeze it and it starts vibrating.” The “squeezing” was done with an electrical current, which set the crystal vibrating at a rate determined by the size and cut of the crystal. Most crystals were made to vibrate at about a hundred thousand beats per second. The vibrations could easily be translated into electromagnetic waves and then counted. (Ever since the German scientist Heinrich Hertz first demonstrated the existence of electromagnetic waves, in 1886, laboratories around the world had been designing and improving different ways of measuring them.) By the 1920s scientists could measure wavelengths of electricity traveling along two parallel wires called Lecher wires. They could then compute frequency by simple arithmetic: wavelength times frequency equals the speed of light.
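The closing formula is the whole trick: measure a wavelength, divide it into the speed of light, and you have a frequency. A small illustration (the wavelengths below are made-up examples, not historical measurements):

```python
# frequency = speed of light / wavelength, per the relation
# wavelength * frequency = speed of light.
C = 299_792_458.0                  # speed of light in meters per second

def frequency_hz(wavelength_m: float) -> float:
    return C / wavelength_m

print(frequency_hz(3.0))           # ~1.0e8 Hz: a 3-meter wave is about 100 MHz
print(frequency_hz(0.0125))        # ~2.4e10 Hz: centimeter waves reach the
                                   # microwave range of the ammonia resonance
```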
By the late 1920s quartz crystals were the standard measure of frequency, supplanting more primitive standards, such as electrically driven tuning forks. In 1928 Dr. W. A. Marrison, at the Bell Telephone Laboratories, built the first quartz-crystal clock by linking the crystal to new electronic technology. A key component was the frequency divider, a device developed by Bell earlier that year. After the crystal’s vibrations were converted to cycles of alternating current, the divider worked like a train of gears, cutting down the current’s frequency by precise fractions to sixty cycles a second, at which rate the electricity was used to run the clock. The same technology, miniaturized, is used in the common quartz-crystal wristwatch of today.
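The divide-down logic of such a chain survives essentially unchanged in the wristwatch. A toy model, using the modern watch-crystal figure of 32,768 Hz (which is 2 to the 15th power) rather than Marrison's values:

```python
# A frequency divider works like a train of gears: each stage emits one
# output pulse for every two input pulses, halving the frequency.
def divide_down(freq_hz: float, stages: int) -> float:
    for _ in range(stages):
        freq_hz /= 2.0             # one divide-by-two stage
    return freq_hz

# A 32,768 Hz watch crystal reaches one pulse per second after 15 stages.
print(divide_down(32_768, 15))     # 1.0 Hz: one tick per second
```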
The quartz-crystal clock represented a major advance in timekeeping, gaining or losing only one second in three years. By 1929 NBS had built a bank of four quartz-crystal clocks whose average rate set the nation’s frequency standard for thirty years. The clocks were hardly problem-free, however. “Depending on the size, cut, and mounting of the crystals, each clock ran at a different rate—minuscule, but different nonetheless,” says Dr. Lyons. Temperature, humidity, and the age of the crystal all affected the vibration rate; to maintain a constant environment, NBS buried its clocks in vaults twenty-five feet underground. All this meant that the clock was not a primary time standard: it could not be exactly reproduced, and it had to be reset by the earth-sun clock. Nevertheless, it later became a part of the primary standard, the atomic clock.
The invention of the atomic clock depended on decades of breakthroughs in the mapping of the tiny world of the atom and in the nascent technology of microwaves. With the emergence of quantum theory early in the century, physicists gradually began to understand the structure and properties of atoms. The work of Max Planck and others revealed that atoms absorbed and emitted energy only at certain frequencies. In painstakingly complex experiments, the scientists bombarded certain atoms with electromagnetic waves that matched the atoms’ natural frequencies, causing them to change energy states. Those that emitted energy started absorbing it, and vice versa. As the atoms changed states, they produced detectable lines on a spectroscope.
These experiments took place at low frequencies, between one and ten million hertz, until in 1934 C. E. Cleeton and N. H. Williams, at the University of Michigan, achieved the same results with atoms whose resonances lay in the high-frequency microwave range. Using a magnetron—a vacuum tube that generates microwaves—they shot radio waves into ammonia gas. At about twenty-four billion hertz, the waves’ frequency reached the resonant frequency of the nitrogen atom in the ammonia molecule, causing the atom to absorb the energy and bob up and down through the plane of its three hydrogen atoms.
The experiment caused worldwide excitement. Among other things, it proved the feasibility of detecting energy transitions in the high-frequency microwave range. More than a decade later atomic clocks would use such oscillations to attain their great accuracy. Lyons was a student of Williams’s and recalls, “I was ready for the ammonia clock when the time came because I knew about this experiment.”
In 1934, however, nobody was thinking about building an atomic clock. The technology for generating microwaves was in its infancy, and the magnetron was still experimental. Not until 1939 did two brothers, Russell and Sigurd Varian, invent the first practical klystron, a truly sophisticated device for generating microwaves. And techniques for transmitting, receiving, and detecting microwaves awaited the development of radar.
In 1940 the federal government set up the Radiation Laboratory at the Massachusetts Institute of Technology to design and develop microwave radar transmitters, receivers, and detectors compact enough to fit in aircraft (see “The Road to Radar,” Invention & Technology, Spring 1987). In connection with this work, the Joint Chiefs of Staff ordered NBS in 1944 to establish standards for frequency in the microwave range. Lyons became the chief of the new section. “Our job was to make sure the transmitters and receivers developed at the lab could be exactly tuned to the same wavelength,” he explains. Lyons’s three years of work on this problem gave him new techniques and expertise that he later brought to the design of the atomic clock.
When the war ended, Lyons and other scientists around the nation went “back to work on the ammonia molecule.” Now they had dependable klystrons to generate microwaves at twenty-four billion cycles per second. By 1947 they could synchronize the klystron’s oscillations to the absorption frequency of the ammonia molecule’s nitrogen atom. “As soon as I read about these experiments, I knew I could make a better clock,” Lyons says. “All I had to do was synchronize the quartz clock using the ammonia resonance to keep it from drifting.”
Early in 1947 Lyons, Husten, and Heberling set to work. They decided to stick with ammonia, since experiments had already shown that microwaves could cause it to flip energy states. The equipment for generating microwaves was available, and so was the necessary electronic technology, developed in laboratories over the previous two decades. Still, the team had to redesign and adapt such electronic gear as frequency multipliers (circuitry capable of multiplying lower frequencies into higher ones), discriminators (to detect varying frequencies), and frequency dividers.
On August 12, 1948, the first atomic clock took a trial run. It worked like this: The quartz crystal’s vibrations were converted to current, and then frequency multipliers increased them to about twenty-four billion hertz. The microwaves produced at that frequency were shot into the ammonia gas contained in a thirty-foot-long copper wave guide coiled around the clock’s face. If the microwave frequency matched the atomic frequency, nitrogen atoms absorbed the microwaves; if the microwaves were off frequency, they passed through the gas and hit a detector, setting off an electric current that automatically adjusted the microwave frequency. That in turn kept the crystal vibrating at a uniform rate. With the crystal and atom in lockstep, the frequency divider cut down the crystal’s vibrations to the fifty-cycle current that powered and regulated what reporters called the “world’s most accurate electric clock.” Calculations showed the clock would gain or lose only one second in eight months.
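The loop is easier to see in miniature. The sketch below is a schematic of that feedback cycle, not NBS's circuitry; the multiplication factor and servo gain are invented for illustration:

```python
# Feedback loop of the ammonia clock, schematically: multiply the crystal
# frequency up to the microwave band, compare against the ammonia line,
# and feed the error back to the crystal.
AMMONIA_HZ = 23_870_129_000.0      # ammonia resonance, roughly 24 billion hertz
MULTIPLIER = 238_701.29            # multiplication factor (invented so the
                                   # on-resonance crystal rate is exactly 100 kHz)
GAIN = 0.5                         # fraction of the error corrected each pass

crystal_hz = 100_000.003           # crystal drifting slightly high
for step in range(8):
    microwave_hz = crystal_hz * MULTIPLIER        # frequency multipliers
    error_hz = microwave_hz - AMMONIA_HZ          # what the detector senses
    crystal_hz -= GAIN * error_hz / MULTIPLIER    # correction back to crystal
    print(f"pass {step}: crystal offset {crystal_hz - 100_000.0:+.9f} Hz")
```

Each pass halves the crystal's offset, so after a handful of cycles the crystal is locked to the atomic line.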
The public announcement in January launched the atomic clock into national headlines. ATOMS RUN NEW SUPER CLOCK, proclaimed one. Another read: BEST CLOCK: THE ATOM. Lyons was interviewed on national radio by Edward R. Murrow, and he and the NBS director Edward U. Condon discussed the clock on Voice of America radio. Critics argued that the traditional quartz-crystal clock, which gained or lost only one second in three years, was still more accurate. Not the case, countered Lyons. Each crystal vibrates at its own rate, while atomic frequencies are constant. Inaccuracies in the atomic clock resulted from the methods used to latch onto the frequency; as these methods improved, Lyons predicted, the clock’s accuracy would jump to one second in three hundred years.
But not with the ammonia clock. The molecules collided with one another and with the walls of the wave guide, and their motion introduced a Doppler effect: the frequency each molecule saw varied with its speed and direction relative to the microwaves, smearing the resonance. Even while the first clock was still being built, Lyons and his team began looking for a better way to latch onto atomic vibrations. One possibility was atomic beams. At Columbia University in the 1930s, I. I. Rabi had perfected a method of shooting a single beam of atoms into a vacuum and, with magnets, separating them into two streams according to their energy state, and measured the frequencies at which the atoms changed states. In 1944 this work brought him a Nobel Prize.
In 1948 Lyons began visiting Columbia to see about adapting the beam technique to the atomic clock. “With the beam we wouldn’t have to worry about collisions,” he recalls, “and we could eliminate most of the Doppler by shooting the microwaves at right angles to the atoms.” There was a hitch, however. Rabi had used low-frequency atoms; the clock would require high-frequency atoms for accuracy. Polykarp Kusch, a Columbia professor who worked with Rabi, suggested cesium, a silvery alkali metal with a resonant frequency of about ten billion hertz.
Now Lyons formed a new atomic-clock team at NBS, joining with the physicists Jesse Sherwood and R. H. McCracken and several engineers and technicians. Kusch served as a consultant because, in Lyons’s words, “we weren’t beam people.” Because the main buildings were overcrowded, the team built the new clock in a small wooden building on the NBS grounds. The project took three years.
When it was completed, the second-generation atomic clock consisted mainly of a metal vacuum chamber almost two feet in diameter, surrounded by coils. “Everything was experimental,” Lyons explains, “so we made the chamber big enough to change parts inside it if we had to.” The cesium clock worked on a different principle from the ammonia clock. Instead of detecting the amount of microwaves absorbed, it worked by detecting the atoms themselves. First the cesium was vaporized in a small oven and shot into the vacuum in an orderly beam. Next it was divided by magnets into two streams. The atoms in one of those streams, all in the same energy state, were then bathed in a radio signal near the atoms’ resonant frequency; again the signal was achieved by multiplying a signal from the vibrations of the quartz crystal. The closer the radio signal matched the cesium’s resonant frequency, the more atoms flipped energy states. The atoms were then divided again by magnets, and all those that had flipped were sent to a detector. The signal from the detector was used to lock in the quartz crystal’s frequency of vibration. Now the atomic clock reached the accuracy Lyons had predicted: one second in three hundred years. And it accurately divided the second into billionths, or nanoseconds.
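A schematic of that detection principle: the count of flipped atoms peaks when the probing signal sits exactly on resonance, so the servo can sample either side of its current frequency and steer toward whichever side flips more atoms. The line shape, width, and step size below are invented for illustration:

```python
CESIUM_HZ = 9_192_631_770.0        # cesium resonant frequency
LINE_WIDTH = 300.0                 # width of the response curve, in Hz (invented)

def flipped_atoms(freq_hz: float) -> float:
    # Bell-shaped response: maximal at resonance, falling off with detuning.
    detuning = (freq_hz - CESIUM_HZ) / LINE_WIDTH
    return 1.0 / (1.0 + detuning * detuning)

guess_hz = CESIUM_HZ + 1_000.0     # start 1 kHz off resonance
for _ in range(40):
    step = 50.0
    # Steer toward the side of the curve where more atoms flip.
    if flipped_atoms(guess_hz + step) > flipped_atoms(guess_hz - step):
        guess_hz += step
    else:
        guess_hz -= step
print(f"locked within {abs(guess_hz - CESIUM_HZ):.0f} Hz of resonance")
```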
Within two years the National Company of Boston had introduced the first commercial cesium clock, the Atomichron. Before long atomic clocks began appearing in other countries whose scientists had visited the NBS labs; Switzerland, Britain, and Japan all built clocks patterned on those developed by Lyons and his team. Lyons remarks: “When we used ammonia, they tried ammonia. When we obtained better accuracy with cesium, they switched to cesium.” In 1954 the first cesium clock, NBS 1, was disassembled and moved to a new NBS laboratory at Boulder, Colorado. Still experimental, the clock had not yet replaced the four old quartz-crystal clocks as the nation’s frequency standard.
In 1960 scientists at the United States Naval Observatory and the British National Physical Laboratory linked the cesium atom’s oscillations to the earth’s spin. On the basis of calculations over a three-year period, British scientists determined that the cesium atom’s resonant frequency was 9,192,631,770 cycles per astronomical second. While this important connection was being made, a second cesium clock, NBS 2, was built, reaching an accuracy rate that seemed unbelievable—one second in twenty-five hundred years. On January 1, 1960, NBS 2 became the nation’s frequency standard, finally replacing the quartz-crystal clocks. And in 1967 the cesium clock even replaced the earth-sun clock in the definition of time. The General Conference on Weights and Measures ruled that henceforth one second would equal the “duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom.” As one journalist remarked, “The world now marks the passing of time by atoms, not the stars.”
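Under that definition, keeping time is literally a counting problem. A minimal sketch:

```python
# One second is, by definition, 9,192,631,770 cesium cycles; elapsed time
# is whatever the cycle counter reads, divided by that number.
CYCLES_PER_SECOND = 9_192_631_770

def elapsed_seconds(cycles_counted: int) -> float:
    return cycles_counted / CYCLES_PER_SECOND

print(elapsed_seconds(9_192_631_770))   # 1.0 second
print(elapsed_seconds(9_193))           # ~1 microsecond's worth of cycles
```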
Over the next decade NBS developed a succession of cesium clocks, each more accurate than the last, thanks to more sophisticated electronics and better ways of latching onto the atomic frequencies. By 1970 the clock could keep to within one second in six thousand years. Overseeing much of this development was a young physicist, Dr. James Barnes, who had earned a Ph.D. at the University of Colorado with a dissertation on atomic timekeeping.
By 1972 the atomic clock was marking off seconds a hundred thousand times more uniformly than the earth, and the two were slowly getting out of sync. “This created a major problem,” Barnes explains. “We can’t have high noon when it’s dark.” The solution: Divide the two clock functions. Atomic clocks would reckon frequency, or the uniform passing of time, while the astronomical clock would tell the time of day, or the exact location of the earth at any given instant. At the Naval Observatory Dr. Gernot Winkler hit upon a way to coordinate the two functions. Another international agreement—this one took three years to hammer out—stipulated that atomic time should be kept to within 0.7 second of astronomical time. The compromise is called Universal Coordinated Time.
The job of collecting and reconciling the time kept by atomic clocks and observatories around the world fell to the International Bureau of Time (BIH) in Paris, which had begun synchronizing time across national boundaries in 1919. Since 1972 the BIH has decided whether to add or subtract a “leap” second once or twice a year to keep atomic time in sync with the earth. The two leap seconds added in 1972, a leap year, made it the longest year since 46 B.C., when Julius Caesar introduced the Julian calendar by adding eighty-five days.
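The bookkeeping behind a leap-second decision is simple in outline. A schematic sketch using the 0.7-second tolerance quoted above; the actual BIH procedure rests on continuous astronomical measurements, not a one-line rule:

```python
TOLERANCE_S = 0.7      # agreed limit between atomic and astronomical time

def leap_second_adjustment(atomic_minus_astro_s: float) -> int:
    """Return +1 to insert a leap second, -1 to drop one, 0 to do nothing."""
    if atomic_minus_astro_s > TOLERANCE_S:
        return +1      # earth lagging behind atomic time: add a second
    if atomic_minus_astro_s < -TOLERANCE_S:
        return -1      # earth running fast: drop a second
    return 0

print(leap_second_adjustment(0.75))   # +1: time for a leap second
print(leap_second_adjustment(0.30))   # 0: still within tolerance
```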
With the world depending on atomic clocks, a way had to be found to synchronize laboratory clocks in different locations. In the 1960s Hewlett-Packard developed a line of 180-pound cesium clocks known as “flying clocks.” These portable frequency standards synchronized timekeeping facilities around the world. In the early seventies scientists carried the clocks several times each year between NBS labs in Boulder, the Naval Observatory in Washington, D.C., and the BIH in Paris. The Federal Aviation Administration required the atomic clocks to travel first-class on airlines, and as Barnes explains, “We learned early on to call it an electronic clock. If we said we were carrying an atomic clock, the airline people said, ‘Not on our planes.’”
In 1975 flying clocks revealed that atomic clocks built halfway around the world from one another were keeping the same time to within two-millionths of a second. Today clocks in different locations can be synchronized by atomic clocks that circle the globe in satellites, although the Pentagon still sends traveling clocks to military installations.
Marking the exact rate at which time passes is only part of the job of NBS’s Time and Frequency Division. NBS also disseminates the time publicly, by broadcasting time and frequency signals on what has been called “the most monotonous and the most important of all radio programs.” Three NBS radio stations carry the broadcasts: WWV, which started broadcasting frequency signals in 1923 and is located near Fort Collins, Colorado; WWVB, also near Fort Collins; and WWVH, in Kauai, Hawaii. NBS also promulgates the time by phone. In an average week forty-eight thousand people dial (303) 499-7111 for the time-of-day announcement. NBS time and frequency signals are also beamed from two National Oceanic and Atmospheric Administration weather satellites, and the United States Naval Observatory operates its own bank of atomic clocks.
Today’s national frequency standard, NBS 6, a twenty-foot-long steel tube on a metal bench, would not miss a second in three hundred thousand years. That translates to a few millionths of a second in a year. The quest for greater accuracy continues, and NBS 7 is expected on-line sometime this year. According to Dr. Donald Sullivan, the chief national timekeeper, it will push the cesium clock to its limit. Meanwhile, research is under way for the next generation of atomic clocks, which will use the mercury ion for accuracy that Sullivan predicts will eventually lead to a new definition of the second.
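The arithmetic behind such claims is a simple conversion. A quick check of the NBS 6 figure:

```python
# "One second in 300,000 years" translated into drift per year and into
# fractional frequency accuracy.
SECONDS_PER_YEAR = 365.25 * 86_400

drift_per_year_s = 1.0 / 300_000
print(drift_per_year_s)                       # ~3.3e-6: a few millionths of
                                              # a second per year, as stated
print(1.0 / (300_000 * SECONDS_PER_YEAR))     # fractional accuracy, ~1e-13
```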
How much more accuracy is needed? “Each time we have increased accuracy,” Barnes says, “science and technology have found a new way to use it.” Researchers detect atomic particles whose entire lives are measured in fractions of nanoseconds; geologists measure minute variations in the earth’s crustal plates, gaining insight into why earthquakes occur; astronomers study the almost undetectable motions of stars and interstellar objects; and physicists have figured the speed of light with one hundred times the accuracy of the classic textbook approximation.
In 1971 atomic clocks even demonstrated an effect of relativity, which holds that a moving clock will record less time than a stationary clock. In an experiment conducted by the Naval Observatory, cesium clocks were flown around the world. As Einstein had predicted, the flying clocks registered fewer or more nanoseconds—depending on their direction of travel—than an earthbound clock. Only clocks counting in nanoseconds could have performed such a feat.
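The size of the kinematic effect can be estimated from special relativity, which predicts a fractional slowing of roughly v^2 / (2c^2) for speeds far below light. The jet speed and flight time below are round illustrative numbers, and the real experiment also had to account for gravitational and earth-rotation effects, which this sketch ignores:

```python
# Kinematic time dilation, to leading order: a clock moving at speed v
# runs slow by a fraction of about v^2 / (2 c^2).
C = 299_792_458.0          # speed of light, m/s
v = 250.0                  # airliner cruise speed, m/s (illustrative)
flight_s = 45 * 3_600      # ~45 hours aloft circling the globe (illustrative)

fractional_slowing = v**2 / (2 * C**2)
lost_ns = fractional_slowing * flight_s * 1e9
print(f"about {lost_ns:.0f} nanoseconds")   # tens of nanoseconds: invisible to
                                            # any clock not counting nanoseconds
```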
The unvarying regularity of atomic clocks now enables radio and television stations to lock into far more exact frequencies, leaving new room on the nation’s airwaves. Television cameras and sets, synchronized to the same frequencies, produce clear sound and pictures. Telecommunications equipment handles millions of electrical impulses per second, kept ungarbled and on beat by atomic clocks. Electric-power companies generate electricity at precisely the same sixty-cycle rate, making it possible to safely barter electric power across the continent. In addition, that constant rate keeps household electric clocks on time.
Atomic clocks have also brought new accuracy to navigation. Since the eighteenth-century invention of the chronometer, navigators have relied on timekeeping to plot their paths, comparing time aboard ship with time at Greenwich, England. Today ships sail much more precise routes, submarines plot exact locations underwater, and airliners stay on correct paths to airports thousands of miles from home. Because radio navigation signals travel at the speed of light, an error of two-millionths of a second can put a plane off course by a third of a mile. Spacecraft are launched, maneuvered, and returned to earth by atomic timing, and space centers fix the locations of satellites and rockets to within a few feet in the vast expanse of space. In the world of manufacturing, automatic, standardized frequencies are factory-set on everything from camera shutters to computer data processing.
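That navigation figure checks out once the speed of light enters. A one-line verification:

```python
# A radio navigation fix converts a timing error into a position error
# at the speed of light.
C = 299_792_458.0              # meters per second
METERS_PER_MILE = 1_609.344

error_miles = 2e-6 * C / METERS_PER_MILE
print(f"{error_miles:.2f} miles")    # ~0.37: about a third of a mile
```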
While the earth continues to wobble and slow, atomic clocks never skip a beat. For all practical purposes the second is now the same to everyone, in every place, for all time.