
A Solution for Almost Everything: 50 Years of the Laser

In perhaps the most famous scene of any Bond film, secret agent 007 lies strapped to a table with his legs spread. Archvillain Auric Goldfinger directs an industrial laser toward Bond’s manhood, and slowly the thick red beam surgically cuts the table in half. The secret agent calmly convinces his foe to shut off the laser in the nick of time.

The scene as described in Ian Fleming’s 1959 book Goldfinger features a large circular saw, not a laser, because the laser was not invented until the following year. The vast majority of Americans who flocked to the film in 1964 didn’t even know that such a device could exist outside of science fiction films and comic books. But this remarkable invention was already hard at work remaking the world, even if it would take a few more years for awareness to percolate to the average person.

Fifty years after the world’s first laser was built in May 1960, few aspects of daily life don’t involve a laser, whether it’s talking on the phone, surfing the Internet, buying groceries, or even playing with a cat. Although less celebrated, the laser ranks with the airplane, electronic communications, and the personal computer as one of the most influential inventions of the 20th century, even if its exact origins are still disputed.

Although indispensable in modern life, lasers were first met with a collective shrug. “When lasers were first made, their characteristics and performance seemed so unusual that some scientists not in the field joked, ‘That’s interesting, but what good is it?’” writes Charles H. Townes, the physicist who invented the maser (the laser’s predecessor) and, with colleague Arthur Schawlow, conceived the idea of amplifying visible light the same way. Townes would win the Nobel Prize for his work in the same year that the film Goldfinger was released, but other scientists continued to describe the laser as “a solution in search of a problem.”

“What that line really reflects is that it was not something invented to solve some specific problem,” notes Jeff Hecht, the historian of science who has written Beam: The Race to Make the Laser and several other related books. “It was not invented to fit a specific set of application requirements. But once you had it, it made a lot of things possible.”

As with many other technologies, the basic concept of the laser seems quite obvious in retrospect, but it languished in the backwaters of speculation and theory for decades before it found practical applications. The word “laser” is an acronym for “light amplification by stimulated emission of radiation.” A laser, says Hecht, is basically “a very well-controlled lightbulb.”

That control comes from the stimulated emission of radiation, a phenomenon first described by Albert Einstein in a 1916 paper. By the early 20th century, physicists already understood that atoms can occupy different energy levels, or states, and that they jump to higher levels when stimulated by external energy. Atoms naturally tend to settle at their lowest energy level, so “excited” atoms almost immediately reemit the extra energy as a discrete packet known as a photon, the basic unit of light, in a process known as spontaneous emission. Einstein realized that if an already excited atom were struck by a photon of just the right energy, it would be prompted to release a second, identical photon, so that two photons leave where one arrived, a process he called stimulated emission.
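The energy bookkeeping in that picture is simple enough to sketch. The short Python snippet below is not from the article; it is a minimal illustration that converts an assumed energy gap between two atomic levels into the frequency and wavelength of the emitted photon via E = hf. The 1.79-electron-volt gap is an illustrative choice, roughly the transition behind the deep-red light of early ruby lasers.

```python
# Minimal sketch (not from the article): relate an atomic energy gap to the
# photon it produces, using E = h*f. The 1.79 eV gap below is an assumed,
# illustrative value, not a quantity taken from the text.

PLANCK_H = 6.62607015e-34       # Planck constant, joule-seconds
SPEED_OF_LIGHT = 2.99792458e8   # metres per second
EV_TO_JOULE = 1.602176634e-19   # joules per electron-volt

def photon_from_transition(energy_gap_ev: float) -> tuple[float, float]:
    """Return (frequency in Hz, wavelength in nm) of the photon emitted
    when an excited atom drops across a gap of energy_gap_ev."""
    energy_j = energy_gap_ev * EV_TO_JOULE
    frequency = energy_j / PLANCK_H                # E = h*f  =>  f = E/h
    wavelength_nm = SPEED_OF_LIGHT / frequency * 1e9
    return frequency, wavelength_nm

if __name__ == "__main__":
    f, wl = photon_from_transition(1.79)           # assumed ~1.79 eV gap
    print(f"frequency ~ {f:.3e} Hz, wavelength ~ {wl:.0f} nm")  # ~693 nm, deep red
```

The same arithmetic explains why stimulated emission yields such uniform light: a passing photon of exactly this energy triggers the release of an identical twin, at the same frequency and wavelength.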

As the basics of quantum physics emerged over the following decade, some scientists theorized that stimulated emission could be used to amplify electromagnetic signals. Any practical applications of this idea, however, remained elusive, in part because of the wide gulf that then separated physicists, whether theorists or experimenters, from engineers. “I believe this is one of the cases where the scientific and engineering communities became locked into a particular set of ideas which, while fruitful in other ways, kept quantum electronics and lasers definitely outside the common paths of thought,” Townes wrote in a 2000 journal article.

World War II brought everyone together to turn the theoretical into the practical, from the atomic bomb to radar technology. “Working under intense time pressures, they generated massive advances in electronics . . . and other technologies,” wrote Stanford University electronics engineer and laser expert Anthony E. Siegman, “and at the same time educated themselves in the techniques and the capabilities of these new technologies. As the war ended, many of these scientists returned to their home laboratories, carrying with them not only these new skills and concepts but in many cases pieces of their wartime apparatus . . . which they were eager to apply to more basic scientific pursuits.”

One of these was Charles Townes. After working on radar and microwave technology for Bell Labs during the war, he moved to Columbia University in 1948 and developed microwave spectroscopy for the study of atoms and molecules. On a spring morning in 1951, while sitting on a park bench in Washington, D.C., where he and his colleague and brother-in-law Arthur Schawlow were attending an annual meeting of the American Physical Society, Townes had a eureka moment.