From Death Comes Life
WORLD WAR II HAS BEEN OVER for half a century, but the medical advances it spawned continue to save lives
LT. ROBERT HARDAWAY III PEERED INTO HIS BOXES OF NEWLY procured medical equipment and sighed with disappointment. It was April 1941, and he had just reported for duty as a surgeon in the 8th Field Artillery Regiment of the Hawaiian Division. Standing in the Schofield Barracks, the Texan doctor uncrated his supplies, neatly wrapped in yellowed newspapers that bore dates from 1918. The rolls and squares of cotton bandages had crumbled in their packages due to terminal dry rot. Medicinal salves, put up in tins like shoe polish, had dried into hard, fissured cakes that shrank from the sides of the containers. “The only usable item was an amputation saw,” he recalled with dismay.
Those crates spoke volumes about our state of preparedness for the war that would be sprung upon Dr. Hardaway and the country at Pearl Harbor. They also revealed a lot about the state of medical practice at the time. As the twentieth century progressed, doctors were trying to go beyond mere supportive or symptomatic treatments, in which diseases were allowed to run their fitful courses and massive injuries were often treated with morphine alone. By subjecting the healing professions to extreme demands that would have been inconceivable only a few years before, World War II would propel medicine into a new era of aggressive cure and prevention.
If it were not for the war, medical science in the United States might well be fifteen years behind where it is now. The legacy of this prodigious research effort, much of it conducted right on the battlefield, can be seen to this day in numerous fields of medicine.
Blood
Blood clots into a complicated gel within three to eleven minutes after emerging from the body. This life-preserving property had to be overcome to get blood into a bottle and the bottle to a wounded soldier. The problem was worked out in time for World War I with the discovery of the anticoagulant power of citrate solutions, and citrated blood was administered on a large scale to European war casualties. A bottle of it would last on the shelf for only a few days, but this limitation fit in with the style of fighting that was going on at the time. Huge opposing armies remained in almost permanent stationary contact, in something of a gigantic siege, along a line of trenches that stretched from Switzerland to the North Sea. The battlefront never strayed too many hours from the blood supply—which was mostly other soldiers.
But World War II battles were often fast-moving and far-ranging; more than anything else, World War II was a war of logistics. It was hard enough just to supply rapidly advancing troops with food that wasn’t spoiled. (The infamous “packaged rations” solved the food-preservation problem, bequeathing us instant mashed potatoes as well.) Blood couldn’t follow the troops very well, because it was a living liquid tissue that started to die as soon as it left the donor.
Adding dextrose to the citrate nourished the red blood cells with sugar and almost doubled their survival time, to a week or so. This still didn’t allow enough time for transport to much of the Pacific. The European theater was going to be a problem too, because not a single general expected a repeat of the previous conflict’s static warfare. Battles were going to spill swiftly from the Normandy beaches toward Paris and Berlin, with thousands of casualties desperately needing blood several weeks away from donors in Detroit, Atlanta, or Phoenix.
In England, John F. Loutit and Patrick L. Mollison worked on a different aspect of the problem in 1942: trying to prevent the caramelization of the citrate-dextrose solutions that occurred during sterilization with heat. If they acidified the solution, they found, caramelization would not occur. And much to their surprise, the blood remained useful for an astonishing twenty-one days, more than enough time to airlift it thousands of miles. Red blood cells banked in this acidified citrate-dextrose (ACD) solution didn’t swell and burst, and they maintained their torus shapes, so they wouldn’t clump together and sludge in small blood vessels or get scavenged in the spleen. Three weeks after donation, 70 percent of the red cells in a bottle were still viable enough to carry oxygen to wounded tissues. Loutit and Mollison’s ACD solution could not be improved on for almost thirty years, until the addition of phosphate increased the shelf life of blood by another week.
Since the 1920s medical researchers had known that shock due to blood loss was a reversible stage of dying that doctors could prevent with a transfusion. Yet in April 1943 the surgeon general of the Army refused to send whole blood overseas because, among other reasons, shipping space was so scarce. (Gen. Charles Gross, chief of transportation, often said that if he diverted any tonnage from bringing ammunition in, he could then use all of it for moving the wounded out.) But five days later Dr. Edward Churchill, the top surgical consultant in the North Africa theater of operations, convincingly argued the importance of whole blood (rather than just plasma, which is easier to store and handle) in the treatment of the wounded. The American Red Cross jumped on the ACD-solution discovery and began the effort of collecting half a million bottles of whole blood over the last two years of the war—inventing modern blood banking in the process.
Penicillin
Penicillin didn’t impress anyone early on—including its discoverer, Sir Alexander Fleming, who discounted its ultimate usefulness. Even though it could deactivate staphylococcus and streptococcus germs, penicillin was powerless against many of the deadly bacteria that devastated entire populations with diseases like cholera or bubonic plague. The first patient treated with it, a gravely ill policeman with a fulminant bone infection that had disseminated to clog his lungs and blind an eye, died after initially rallying, because there wasn’t enough of the drug available to finish the treatment (though desperate doctors even resorted to extracting it from his urine).
At the outset of hostilities the United States Army wasn’t overly impressed either. Dr. Chester Keefer of Boston insisted that penicillin’s most useful role would be to treat the inevitable venereal diseases of soldiers on the march. Even in low doses it could cure gonorrhea in forty-eight hours and inactivate syphilis in just eight days. Infected soldiers could go back to full-duty status promptly. Providing the Army with penicillin for VD was felt to be tantamount to adding an entire fighting division to the war effort.
This emphasis on venereal diseases came about because, as one wartime report conceded with resignation, “coitus will retain much of its popularity in spite of its hazards.” Gen. Dwight Eisenhower, in command at Caserta, Italy, understood this basic fact of military life. Penicillin was just starting to reach the area in 1944, and a British doctor stationed there argued that a good use for it would be to treat the “ladies” of Naples. Eisenhower felt obligated to reply, “Impossible. We will be lynched by the mothers of America and Britain.” The doctor, of course, proceeded to administer the new drug to these professionals and significantly reduced the incidence of syphilis and gonorrhea in the Allied armies. Eisenhower later awarded him the Legion of Merit.
Penicillin was seen mostly as a VD cure because war wounds were expected to become infected anyway—so much so that whole medical units sprang up bearing the sad name of Section of Septic Surgery. The standard care for the treatment of war wounds in 1941 was to pare away devitalized and irreparably damaged tissues and then suture the wound several days later once the infection or the threat of it had passed. But the patients from the Pearl Harbor attack in Dr. Hardaway’s septic-surgery ward didn’t get nearly as many infections as those in other wards, in large part because of his practice of sprinkling a powdered sulfa drug, sulfanilamide, into them—from a salt shaker, of all things. Clearly, treating wounds with antibiotics like penicillin deserved a chance.
But growing penicillin was still a tedious process that yielded only tiny amounts for investigative use. The vast quantities needed for war required a mammoth commercial effort to produce. The Department of Agriculture referred enthusiastic scientists to its Northern Regional Research Laboratory in Peoria, Illinois. Workers there soon isolated especially active strains of the penicillium mold that synthesized larger amounts of the drug, including one that a sharp-eyed researcher noticed growing on an overripe cantaloupe at a fruit market. They also developed a process to grow the mold in suspension in deep tanks, rather than just as a thin film on a liquid’s surface. And they figured out that corn-steep liquor, a by-product of the manufacture of cornstarch that was easily available in the Peoria area, would generate ten times as much penicillium when added to the culture medium.
Alfred Newton Richards, who headed the U.S. government’s Committee on Medical Research, was impressed with the lifesaving potential of penicillin and set out to convince the Pfizer, Squibb, and Merck companies to combine their research findings and experiences. At the time of the Battle of Midway, June 1942, there was only enough penicillin in the entire country to treat about a hundred patients. When the Normandy invasion got under way, in June 1944, the United States had enough for all major casualties. By V-J Day, August 1945, the military had more than it could use, and penicillin began to be sold freely on the civilian market.
Heart and Lung Surgery
Projectile weapons became dominant at the Battle of Agincourt in 1415, where the arrows of English archers cut down thousands of armored Frenchmen bearing swords and axes. Warfare would never be the same again, and penetrating wounds of the chest from arrows, bullets, or shrapnel often carried a death sentence.
A few bullet wounds of the heart, surprisingly, were not fatal; sometimes bullets even dropped into one of the heart’s chambers, bouncing around inside with each contraction. The danger with this kind of foreign body was that the heart could pump it out into the general circulation as a sort of wandering metallic blood clot that would wedge itself in a large vessel. Such foreign bodies had to be removed.
But centuries before, Aristotle had convinced surgeons not to tamper with the heart: “The heart alone of all viscera cannot withstand serious injury.” Even Theodor Billroth, the courageous Austrian surgical genius and innovator, is said to have insisted that any man who would operate on the heart should lose the respect of his colleagues. Right up to World War II surgeons were reluctant to retract the heart with instruments or touch it with their hands.
Dwight Harken, a Boston chest surgeon, arrived in the European theater of operations in 1942. His laboratory investigations back home had convinced him that the conventional wisdom was probably wrong. “When I saw that largely mechanical heart with a muscle power source and uniflow valves,” he said, “… it seemed incomprehensible that we surgeons, who are considerably mechanically oriented, should not attack this significantly mechanical organ.” The heart had to be sturdier than any other organ; therefore it should easily stand up to surgical intervention.
Harken quickly circumvented a lot of red tape and persuaded his superiors to set up a special cardiac wound clinic to test his theories. He would eventually become bold enough to plunge his fingers into the heart to retrieve enemy bullets—thirteen times in all. The operations were consistently successful. Cardiac surgery was born of bullets.
Meanwhile, in 1943, Capt. Lyman Brewer, a chief of septic surgery, was busy in the Mediterranean theater of operations persuading his superiors to set up the world’s first thoracic-surgery center. Despite their official skepticism, he established the 53d Station Hospital about thirty miles from Bizerte, Tunisia, to treat the wounded from the Sicilian campaign.
The mortality rate from penetrating wounds of the chest had been about 62.5 percent in the Civil War; during World War I it was a still-frightening 24.6 percent. Brewer and his colleagues Thomas Burford and Paul Samson figured out in hostile desert conditions how to peel rigid constricting scar tissue from wounded but otherwise elastic lungs, to allow the return of the normal physiological functions of inflation and deflation. They also quickly adopted penicillin to treat and even prevent infections. Most important, Brewer recognized that traumatized lungs filled up with fluid—even, curiously, when only the brain or abdominal organs were wounded. Brewer had discovered respiratory distress syndrome (RDS) in that unlikely laboratory of a desert. After the team worked out a way to recognize and treat RDS, mortality from chest wounds dropped to an almost unbelievable 9.4 percent.
Harken and Brewer had carried their understanding of the physiology of the heart and lungs to the battlefield. All of Harken’s thirteen patients survived their cardiac bullets, and none of Brewer’s first hundred patients died. Perhaps their most profound contribution was the realization that surgery was no longer solely an extirpative exercise, aimed at removing and discarding broken, tumorous, or infected organs. Surgery was becoming physiologic; organs could be rehabilitated and rejuvenated. Operating rooms were becoming places of real hope.
Crush Syndrome
With Germany having already subdued most of Europe, the Luftwaffe began the full-scale bombing of London on September 7, 1940. One night at what is now Hammersmith Hospital in West London, Eric G. L. Bywaters and his team of shock doctors received two bombing casualties who, curiously, had suffered no lacerations, burns, or fractures—no apparent external injuries at all. Bombs had hit their houses and buried them in the debris; when they were dug out, rescue doctors pronounced them fine and put them aside with little subsequent attention. A few hours later, though, they suddenly collapsed with low blood pressure and were rushed to Hammersmith, where their kidneys failed to make urine properly and they died of uremia.
While the raids continued, casualty teams around London reported similar cases. The common factor was that all the victims had been buried and compressed under debris for several hours. Bywaters’s scientific curiosity was piqued. “I and such others as we could muster set off, usually in the dead of night… through roads without lighting or signposts, piloted by gallant girls from the services. We would arrive at the bombed town with our bottles of citrate, lactate, and glucose-saline, with our syringes and specimen jars, and our hand spectroscope, ready to document and treat any buried victims.”
Bywaters and his colleagues studied the victims’ urine for biochemical clues and performed autopsies on them too, often taking shelter under the autopsy table as more bombs fell. They eventually defined a new disease, which they called crush syndrome. Compressed muscle, deprived of oxygen-rich blood, would die and subsequently burst. The bursting necrotic muscle cells would leak one of their structural components, myoglobin, into the victim’s bloodstream. The myoglobin would travel to the kidneys, where it precipitated in the millions of ultrafiltering units called glomeruli and plugged them with a sludge so thick that no further urine could squeeze through. The remedy was to administer intravenous fluids to combat the shock and alkali to prevent the acidic precipitation of myoglobin—the same treatment used during the recent San Francisco and Kobe earthquakes.
As the war went on, Bywaters discovered that crush syndrome came in varieties. For example, in people intoxicated with alcohol or barbiturates who pass out, the very weight of the limp and unconscious body compresses and kills large amounts of its own muscle, and the deadly leakage of myoglobin begins. Today’s textbooks recognize at least eighty-six variants, including muscle damage from carbon monoxide poisoning or prolonged unconsciousness in the snow, the bite of the Malayan sea snake, and too-vigorous fraternity hazings.
Rehabilitation
When U.S. Maj. Gen. Norman Kirk walked through a British recuperation camp hidden in the dunes of North Africa in the spring of 1943, the scene astounded him. Convalescing soldiers weren’t lounging on cots, reading in the shade, or killing time playing cards until another meal was ready. Instead the physical-training sergeant had taken them out of their tents and into the sun, many with their casts still on, to run, do calisthenics, or stretch spastic muscles and sore tendons. He was reconditioning them to be just as strong and fit as when they took the bullet or the shrapnel. What’s more, they were going back to active duty—to combat, if need be—tough and ready, because fighting men were just as scarce a resource as the other ingredients of warfare.
Kirk learned from that trip what the U.S. Army needed: about a hundred physical-medicine doctors, fifteen hundred physical therapists, and twenty physical-reconditioning hospitals. The problem was that there were only about a dozen such doctors in the entire country at the time and just seventy-five physical therapists on active duty in the Army. Physical medicine hadn’t even been invented yet as a legitimate specialty. The Army had to train hundreds of people in a medical discipline that had been seriously neglected before the war—that didn’t really exist, in fact, except as an arts-and-crafts diversion for injured and maimed patients who were considered “chronic” or “crippled.”
Kirk did all of it; he improvised an entire new branch of medicine. Ultimately, in the European theater alone, 375,000 of the 598,000 wounded troops were able to return to duty. Moreover, Kirk’s program of “active convalescence” showed that the serviceman who was too wounded to be reconditioned for combat could still be rehabilitated to function optimally in civilian life. Amputees quickly became proficient with their prostheses because therapists engaged them in intense goal-oriented therapy instead of merely allowing them to work lackadaisically toward becoming slightly less disabled. Surgeons, urologists, and physical-medicine doctors teamed up to rehabilitate paraplegics from wasted, helpless, bedbound individuals into determined men with strong arms and hard, active chest and abdominal muscles. Rehabilitation was a novel idea then—and a spectacular one.
Physical medicine, under the stimulus of war, leaped from craft to science with the recognition that it was really a form of applied biophysics. The Mayo Clinic joined with MIT to train physical-medicine doctors in bioengineering, electronics, and instrumentation. Such developments redefined medicine as not just the science of survival and pain relief but the science of the restoration of function as well.
Rehabilitation and reconditioning generated nationwide enthusiasm. The American Medical Association convened a symposium in 1944 called “The Abuse of Rest in the Treatment of Disease” and followed it in 1946 with one called “Early Ambulation,” after a wartime study had shown that getting people walking after major surgery decreased the incidence of phlebitis and pulmonary embolism. The devastating polio epidemics of the late 1940s and early 1950s provided the first civilian opportunity for physical-medicine practitioners to show what they had learned in the war. The usual treatment for polio, consisting of prolonged immobilization and rest, was scrapped in favor of early active training of feeble muscles until the Salk vaccine, introduced in 1955, and later the Sabin vaccine put an end to the scourge.
Tetanus
The epidemic of Fourth of July fireworks accidents at the turn of the century illustrates why war and tetanus used to be inseparable. Exploding fireworks propelled tetanus bacteria, which are commonly found in soil, deep into shredded and devitalized tissue. The microenvironment in these mangled wounds was short on oxygen, since the blast had disrupted the local circulatory system. Tetanus bacteria found this a hospitable situation because they thrive where oxygen is scarce. As tissues died, the bacteria flourished and produced a toxin that migrated to the spinal cord. The toxin electrified muscles into deadly spasms that locked the jaw, tightened the chest, and stiffened the heart. In 1903 alone, 406 fireworks victims died of tetanus.
Doctors had recently developed an effective cure. Injecting a sublethal amount of tetanus toxin into an animal would elicit antibodies, and the animal’s serum (now called antitoxin) could in turn be injected into a wounded person at risk for tetanus. Civilian tetanus deaths from fireworks dropped to zero by the beginning of World War I. For military purposes, however, the antitoxin treatment was less helpful, because in the trenches it often could not be administered soon enough after wounding.
World War II provided the opportunity for a giant experimental trial of a substance that many considered better than antitoxin: toxoid vaccine, which consisted of the toxin itself, inactivated by formaldehyde or alum so it lost its ability to produce disease but not to elicit antibodies. Injections of antitoxin had provided a temporary supply of antibodies that lasted only a day or so—if they could be administered in time. Toxoid, on the other hand, turned a person into an antibody factory for years and put the biological equivalent of a memory chip into the immune system, so that antibody production could be switched on immediately whenever tetanus germs invaded the body.
It sounded good on paper, and it had been used effectively in horses since the 1920s, but thousands of people would have to be immunized with toxoid—and put at risk—to prove its efficacy. World War II recruits provided just such a test pool. The U.S. armed forces established a toxoid immunization program in 1941 and dropped antitoxin completely—a hefty gamble considering its relative success in World War I. The result of this experiment was that the U.S. Army recorded only twelve tetanus cases during the entire war, and only four of them involved men who had received toxoid injections. Toxoid immunization remains the backbone of modern preventive strategy.
Typhus and Malaria
It is curious that typhus has not captured the interest of historians the way bubonic plague has, because typhus has regularly accompanied the worst of humanity’s disasters and has been the inevitable and expected companion of war. Napoleon led half a million troops into Russia; almost none returned, and typhus had claimed many of them. Gen. Nathanael Greene of the Continental army came down with typhus in 1776, along with perhaps a third of his men, greatly hampering the Patriot cause in the Long Island campaign.
Typhus thrives where there is filth and disorder, because body lice are the vectors of the disease. The lice jump from soldier to soldier on the march or from soldier to civilian during an invasion. Typhus masquerades as a sort of influenza, with fever and chills at first, but then a purplish brown rash spreads from the shoulders to the palms and feet, the headache becomes excruciating, and delirium sets in.
To eliminate lice is to eliminate typhus, and the most effective method in the extremities of wartime is the use of a powerful pesticide. DDT’s power as an insecticide was discovered in Switzerland in 1939, but the compound was ultimately developed for large-scale use by the U.S. armed forces and the Department of Agriculture. It was dramatically effective. Public-health officers dusted the powdered form, known as AL63, into hair, clothing, and bedding and impregnated underwear with the liquid. (A typhus vaccine had recently been developed, but its efficacy was not clear at first, and it was not possible to vaccinate all members of the armed forces.)
DDT remained active for months, eliminating the need for repeat spraying or dusting. (This persistence would later be found to have terrible consequences farther along the food chain, causing DDT to be banned in many areas.) Medicine, entomology, and sanitary engineering emerged in World War II for the first time as a public-health team with enormous power to control insect-borne diseases. This wartime resource got called to temporary civilian duty in Naples in October 1943, when returning Italian troops brought typhus back from North Africa. The Allies dusted 3,265,786 people with DDT and pyrethrum, another insecticide, over the next six months, and the epidemic simply disappeared. Never before in history had any kind of human action stopped a typhus epidemic.
Another insect-borne disease that afflicted American troops was malaria. During the first stage of the New Guinea campaign, malaria brought down four times as many Allied fighting men as did Japanese weapons.
The anopheles mosquitoes were as hostile an enemy as existed. To fight them, American doctors relocated their laboratories into the malarious Pacific battlefields, devised a strict antimosquito protocol, and vanquished the mosquitoes and the malaria parasites they carried. Their utter triumph over malaria, in less than two years, qualified as one of the stellar victories in the Pacific.
The mosquito itself was public-health enemy number one. Troops dug drains to eradicate mosquito-breeding areas. Local inhabitants were no longer employed to perform manual labor, such as unloading ships, because they were walking reservoirs of malaria who could easily infect the troops. DDT spraying killed many of the mosquitoes, and nets, insect repellent, and protective clothing kept off those that escaped.
A breach of these simple measures was certain to meet with disastrous results. Unlike the Americans, Australian troops still came down with malaria at alarming rates. In the malarious Markham Valley of New Guinea’s Huon Peninsula, troops often threw away mosquito nets to lighten their pack loads and continued to wear the traditional Australian uniform of tropical shorts and short-sleeved shirts. The Australian 7th Division’s malaria-attack rate skyrocketed to 4,840 per 1,000 men per year, meaning that the average soldier suffered 4.84 attacks a year. Meanwhile, the American rate dropped to 251 per 1,000 by the end of 1943 and to 49 by the end of 1945.
For troops that did come down with the disease, atabrine was a powerful malaria suppressant. Unfortunately, at the start of the war no one knew the proper dosage. American medical experts set up field laboratories in the South Pacific and correlated malaria-attack rates with the differing atabrine doses used in recent campaigns. They eventually decided on a dose of one hundred milligrams a day while fighting in a malarious area and for a month after leaving. Gen. Douglas MacArthur sent the dose out as nothing less than a directive to his line officers in March 1944. Still, some troops were not taking atabrine because of occasional violent side effects (such as temporary insanity) and rumors that it turned the skin yellow or caused impotence.
All these measures were collectively known as malaria discipline, and for the regimen to be effective, compliance had to be complete. British Field Marshal Sir William Slim correctly observed that “more than half the battle against disease is fought, not by doctors, but by the regimental officers.” Slim ensured compliance by means of surprise checks. If he was dissatisfied, he would sack the commanding officer. “I only had to sack three,” he said. “By then, the rest had got my meaning.” For the first time in military history, a combat officer could be considered unfit to command for allowing his men to become ineffective through disease.
Malaria discipline did not end with the Japanese surrender, for more casualties can occur after a war than during it. Many scholars maintain that the great influenza pandemic after World War I was a delayed effect of that war, attributable to malnutrition, crowding, and social disruption. In 1945 the Australian government worried that returning troops might carry malaria home and spread it throughout the Australian continent. Sir Earle Page, a physician liaison from Australia’s parliament, observed how MacArthur implemented malaria discipline in New Guinea and rushed his findings back to Australia. Malaria never spread from the Pacific theater to the Australian mainland, and none of the civilian cases reported there in the last several years of the war were caused by returning troops.
America’s public-health experience with malaria control in the Pacific also defeated the mosquito in the rural South of the United States after World War II. The South had suffered about a million malaria cases annually as recently as the 1910s, but by 1955 malaria had virtually disappeared. The United States exported these strategies to Venezuela and British Guiana with equally astounding success.
Hippocrates called the battlefield a school for surgeons. Others have called war a perverse handmaiden to science. The doctors who fought in World War II did not celebrate progress through bloodshed, but they did recognize that doctors can be schooled by conflict, because the strategies of the soldier and the healer are the same.
A glance at the language used in any medical text reveals this sameness. Chemotherapy is a magic bullet; vitamin B12 is given as a shot; bacteria colonize the intestines; and the immune system, ever active on surveillance, can mobilize a flying squadron of antibodies to attack foreign invaders. With the Manhattan Project, World War II saw the technology of killing finally and decisively surpass the technology of healing. Yet while atomic weapons have never been used in war again, the medical advances that came out of World War II continue to save, prolong, and improve lives every day.