DNA Takes The Stand
How the genetic code became a revolutionary tool for law enforcement
In July 1986 residents of the English village of Narborough learned that someone had raped and murdered a 15-year-old schoolgirl named Dawn Ashworth. It was a second shock for the bucolic community, which lies about a hundred miles north of London. An equally harrowing crime had rattled the area less than three years earlier. Late in 1983 a passerby had found the violated body of Lynda Mann, also 15. News of the latest atrocity frightened and outraged local citizens and left police desperate.
Authorities focused their investigation on a slow-witted hospital porter of 17, who, under intense interrogation, confessed to killing Ashworth. He denied any involvement in Mann’s killing, but prosecutors were convinced he was lying. They needed to link him conclusively to both crimes.
Two years earlier, Dr. Alec Jeffreys, a professor of genetics at the nearby University of Leicester, had developed a way to identify unique chemical attributes in samples of deoxyribonucleic acid, the DNA that resides in the nucleus of every cell. His method had already helped resolve an immigration case, proving that a young boy from Ghana was indeed the son of a woman in Britain. Though DNA analysis had yet to be offered as criminal evidence, police asked Jeffreys to apply his method to the vexing Narborough case.
They waited for weeks while he performed the delicate steps of extracting DNA from sperm cells found in the girls’ bodies and from white blood cells drawn from the suspect. He added what were known as restriction enzymes, chemicals extracted from bacteria that precisely cut the strands of DNA at targeted locations. An electric current dragged the fragments slowly through a porous gel. The gel acted as a sieve, separating the short ones, which moved more quickly, from the longer ones.
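For readers who like to see the logic spelled out, the short Python sketch below caricatures these two steps: cutting a strand wherever a recognition motif occurs, then ordering the fragments by length, which is the job the gel performs. The sequence and motif are invented for the example; the real procedure is chemistry, not string handling.

```python
# Toy illustration of the two steps just described: a "restriction digest"
# that cuts a DNA string at a recognition motif, then a sort by length,
# standing in for the sieving action of the gel. Sequence and motif are
# invented for the example.

def digest(sequence: str, motif: str) -> list[str]:
    """Cut the sequence at every occurrence of the recognition motif."""
    return sequence.split(motif)

sample = "ATTCGAATTCGGGTACCAGAATTCTTAGGC"   # made-up stretch of DNA
fragments = digest(sample, "GAATTC")         # made-up six-base recognition motif

# Shorter fragments travel farther through the gel, so order them by size.
for fragment in sorted(fragments, key=len):
    print(len(fragment), "base pairs:", fragment)
```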
Next he transferred the fragments to a membrane in order to stabilize them. He chemically separated the two strands of the DNA double helix, using a strong alkali. The key step was to wash the resulting array with a solution containing a probe, a section of synthetic DNA formulated to bind with a specific targeted fragment. Each probe contained a radioactive isotope. When he laid the membrane on a sheet of X-ray film, the radiation darkened it in the places where the targeted fragments were concentrated, creating a pattern of fuzzy bars. By washing the membrane with one probe after another, Jeffreys created a series of bars that indicated the length of different types of fragments.
His method was based on “restriction fragment length polymorphism,” or RFLP (abbreviations abound in the field). The term referred to the fact that certain identifiable sections of DNA vary among individuals: they are shorter in some, longer in others. His technique was designed to isolate these variable sections (using the restriction enzymes), separate them by length (through electrophoresis), and then visually represent them (by means of radioactive probes). When he analyzed several of the variable sections, he was able to point with virtual certainty to a single individual. Yet at the end of his rigorous lab work, he thought he must have made a mistake.
“My first reaction was ‘Oh my God, there is something wrong with the technology,’” he said later. His test showed that one man had indeed raped both girls. But, in spite of his confession, the youth in custody was not the culprit.
Further testing confirmed the results, and the frustrated police had to release their suspect, who might well have been convicted had it not been for the DNA evidence. The police then took the unprecedented step of requesting blood from more than 4,000 local men. Samples from those whose blood type matched that of the evidence—about 10 percent—were subjected to Jeffreys’s DNA analysis. The laborious procedure took six months and ended in a cul-de-sac. None of the samples matched the profile of the murderer.
The break in the case came in August 1987, when a woman heard a colleague from the bakery where she worked admit that he had impersonated a co-worker in giving police a blood sample. When the woman reported the conversation to the police, they picked up the co-worker, Colin Pitchfork. Jeffreys’s tests determined that this man’s DNA profile matched the evidence from both murders. Pitchfork confessed and, in 1988, was sentenced to life in prison.
Jeffreys labeled his procedure “DNA fingerprinting” (it is now more commonly called DNA profiling or analysis). This first case highlighted the potential of the technique. It exonerated an innocent man and helped police make their case against a guilty one. The mass screening prefigured the data banks of DNA samples that have today come to include profiles of millions of persons. But because of the slow and cumbersome nature of the process, few imagined that within a decade DNA analysis would become the most important forensic tool ever invented. After what he termed an “entertaining diversion” into forensics, Jeffreys returned to his basic research, noting that “what comes now is technological refinement, and that’s not my job.” He was later knighted for his contribution to science.
Investigators have long searched for methods of identifying culprits beyond simple (and often unreliable) eyewitness testimony. Three systems of picking an individual from a population began to emerge in the late nineteenth century.
In the 1880s a French police official named Alphonse Bertillon took on the problem of reliably identifying individuals in order to track criminals and distinguish recidivists from casual offenders. He devised 11 measurements, including arm span, cheek width, sitting height, and right-ear length. Combined, they could identify a person with precision. The system relied on the fact that while two persons might share one physical trait, the odds of their having identical values for a range of measurements were slim. Add enough traits and the chances of a random match were virtually nonexistent. The approach was considered scientific because it was both precise and systematic, giving it an edge over verbal description or photography.
The weakness of Bertillon’s concept was its application. The measurements were awkward to take, and small errors diminished the system’s accuracy. Nevertheless, it was used by many U.S. police departments into the 1920s. By that time a simpler, faster, and potentially more useful method had arrived: the fingerprint.
It had long been known that the tips of the fingers contained patterns of ridges that varied from one person to another. In 1892 the English scientist Francis Galton determined that these prints were unique to individuals and permanent through a person’s life. What was more, secretions left on items a person touched could contain a record of his fingerprints and later be used to connect him with a crime scene. Galton devised a way of classifying the various patterns. His concept, as revised in 1897, is still used today.
Authorities in the United States first put fingerprint identification to work in 1910 to help convict a Chicago murderer. The Federal Bureau of Investigation began consolidating fingerprint records in 1924 and had accumulated 100 million examples by the end of World War II. For decades fingerprints reigned as the principal means of establishing identity.
The third forensic identifier, blood type, became available in 1901. The Austrian researcher Karl Landsteiner discovered that antigens in incompatible types of blood reacted to form clumps, which is why transfusions between such types failed. He labeled the types A, B, AB, and O. Each was a manifestation of a particular genetic quirk.
Police used blood-type evidence in criminal investigations to provide exclusionary evidence. That is, if a criminal left behind type A blood at the scene, suspects with the other blood types were ruled out. But simply possessing type A did not necessarily mean a suspect was guilty, since a large portion of the population shared that trait.
Over the years, investigators added other blood factors, along with distinctive serum proteins and enzymes, to aid forensic identification. Combined, they could increase discrimination. One person in a hundred might possess a given blood profile. For determining paternity, such evidence could be convincing. But in a criminal case, blood-type evidence could only exclude suspects; it was not specific enough to erase reasonable doubt.
DNA profiling had connections to all three of these older means of identification. Like blood typing, it was based on chemical analysis. Like fingerprinting, it relied on a distinct but purposeless physical feature that was unique to an individual. Like Bertillon’s method, it compared a range of attributes in order to diminish the likelihood of a random match. But by going to the source of human variability, DNA profiling held the potential to surpass all earlier methods of identifying a person or matching a suspect to evidence.
Even as British authorities were struggling to solve the murder of the two teenagers, detectives in Orlando, Florida, were on the trail of their own serial rapist. He first struck in May 1986 and had raped 23 women by the end of the year. In most cases he broke into a home, showed a knife, covered the victim’s face to deter identification, and stole some personal item like a driver’s license.
In February 1987 he raped a young mother while her two children slept in the next room. This time he left behind two fingerprints. In March police apprehended a prowler in the area who matched the prints. He was Tommie Lee Andrews, a 24-year-old warehouse worker.
The fingerprint evidence was likely to convict Andrews for that crime, but authorities wanted to connect him to at least one other rape. It would mean the difference between a few years in jail and life. However, only one victim could identify him, and her evidence didn’t guarantee a guilty verdict. Florida prosecutors turned to DNA profiling, its first use in a U.S. courtroom. Today DNA analysis is routine, but 20 years ago the scientists who set about connecting DNA drawn from Andrews’s blood with DNA from traces of semen found on the rape victim faced a daunting challenge: picking out, from inside two complex molecules, the tiny variations that were unique to an individual.
DNA, called by one researcher the king of molecules, had long resisted attempts to understand its nature. As early as 1869 the Swiss physician and chemist Friedrich Miescher had detected a slightly acidic compound in the nuclei of white blood cells taken from pus. The molecule’s function wasn’t pinned down until 1944, when Oswald Avery and his colleagues at New York’s Rockefeller Institute for Medical Research determined that DNA was the carrier of genetic information. But scientists continued to puzzle over how this unwieldy molecule transmitted all the instructions on which life depended. The breakthrough came in 1953, when the American researcher James Watson and his British colleague Francis Crick determined both the structure of DNA and the mechanism by which it carried information. An alphabet of simple chemical substructures, they found, was arranged into genetic words that regulated activity in the cell.
DNA was made up of a prodigiously long string of atoms. If the coils upon coils of DNA in a single cell were unwound, they would stretch six feet. We can picture the molecule as a twisted ladder, the famous double helix. Human DNA consists of 46 molecules arranged into 23 pairs of chromosomes (one of each pair is inherited from each parent). The total of six billion rungs on these ladders carries the coded information of life. Each rung consists of one of the four chemical structures referred to as A, T, G, and C, linked to a complementary structure on the other strand (A to T, G to C), forming a “base pair.” Only about 2 percent of the base pairs in a cell’s DNA—the genes—carry information. The rest of the molecule has no known function (though some function may yet be discovered) and is often referred to as noncoding or “junk” DNA.
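The pairing rule (A to T, G to C) is mechanical enough to express in a few lines of code. The short Python sketch below simply applies it, base by base, to spell out the strand that would face an invented example sequence.

```python
# Minimal sketch of the base-pairing rule described above: A pairs with T,
# and G pairs with C. The example sequence is invented for illustration.

PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complementary_strand(strand: str) -> str:
    """Spell out the base that faces each position on the other strand."""
    return "".join(PAIR[base] for base in strand)

print(complementary_strand("ATGGC"))   # prints TACCG
```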
In the early 1980s Jeffreys, looking for a way to map the gene that produced an oxygen-carrying protein, made a discovery that he admits was “quite accidental.” He began to examine stutters in the DNA sequence, areas in which a pattern of base pairs repeated itself, most of them in the “junk” sections of the molecule. He found that they varied among individuals in the number of times the sequence was reiterated. At a particular site, a pattern of 28 base pairs might be repeated 46 times in one individual, 127 times in another. Because every site occurs on each of a pair of chromosomes, a person will usually have two different values for the repeats. In September 1984 Jeffreys’s chemical manipulations succeeded: he had a way to separate segments of different lengths and visualize them on a radiograph.
“It was a ‘Eureka!’ moment,” he has said. Jeffreys had found a way to identify an aspect of the DNA molecule that was unique to an individual. “We could immediately see the potential for forensic investigation.”
It was a major discovery. Although DNA governs all the physical characteristics that make a person unique, turning it into a forensic tool was no easy matter because 99.9 percent of human DNA is the same for everyone. It was Jeffreys’s discovery of a way to identify and measure what became known as variable number tandem repeats (VNTR) that led to further progress.
One quality of DNA that did make it useful for forensic applications was the fact that the entire code of life was contained in the nucleus of almost every cell in the body. That meant that investigators could compare, say, a stain of semen with the blood of a suspect and look for a match.
In 1987 technicians at Lifecodes, a laboratory in New York, were attempting to do just that as they analyzed the DNA samples in the Tommie Lee Andrews case. They used chemical detergents and protein digesters to free the DNA molecules from the cells. After concentrating and purifying the DNA, they subjected it to the painstaking process of analysis.
What they were attempting was to cut from the long DNA molecules specific sections that contained patterns of tandem repeats. The repeating segment, for example, might be 20 base pairs long. One person could have 12 repetitions; another, 80. The segments were like boxcars on a train, and the technicians’ goal was to determine the length between engine and caboose.
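The arithmetic behind the boxcar picture is simple multiplication, as the sketch below shows. The repeat counts follow the hypothetical example in the text; the short stretch of flanking DNA at either end is an assumption added for illustration.

```python
# Toy arithmetic for the "boxcar" picture: the fragment gets longer with
# every extra copy of the repeat unit. The 12 and 80 repeats follow the
# example in the text; the 70 base pairs of flanking DNA are an assumption.

REPEAT_UNIT_BP = 20    # one boxcar: the repeating segment, in base pairs
FLANKING_BP = 70       # assumed engine and caboose at either end

for repeats in (12, 80):
    length = FLANKING_BP + repeats * REPEAT_UNIT_BP
    print(repeats, "repeats -> a fragment of roughly", length, "base pairs")
```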
The restriction enzyme sliced out the target segment from the DNA molecule (it also produced many other fragments that were irrelevant to the test). The electrophoresis process used agarose gel (derived from seaweed) as a sieve to make the long segments move more slowly than the short ones. Technicians transferred the fragments, now aligned by size, to a nylon membrane and fixed them in place. A strong alkali solution broke apart the fragile bonds that held the two strands together.
The DNA probes they used were synthesized sections of DNA complementary to the targeted area of repeats. They relied on the basic principle of DNA that the base A joins only with T and G only with C (the complement of the sequence GATC is CTAG). When technicians washed a solution containing the probes across the nylon membrane, the probes formed chemical bonds with the targeted sections that were their complements, but not with miscellaneous fragments of DNA. By using more than one probe, they were able to determine the length of several different repeating sites.
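In computing terms, a probe finds its target much the way a string search finds a substring: it sticks only where every one of its bases faces its complement. The Python sketch below, with invented sequences, illustrates the idea.

```python
# Sketch of probe hybridization as a string search: the probe binds only
# where each of its bases meets its complement (A with T, G with C). The
# fragment and probe sequences are invented for illustration.

PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    return "".join(PAIR[base] for base in strand)

def binding_sites(fragment: str, probe: str) -> list[int]:
    """Positions along the fragment where the probe would hybridize."""
    target = complement(probe)     # the probe sticks to its complement
    return [i for i in range(len(fragment) - len(target) + 1)
            if fragment[i:i + len(target)] == target]

fragment = "TTGACTAGCCGATCGGA"     # single-stranded DNA fixed to the membrane
probe = "GATC"                     # its complement, CTAG, appears once above
print(binding_sites(fragment, probe))   # prints [4]
```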
Technicians had “labeled” the probes by including a radioactive isotope in each. In the areas where they were concentrated, the probes attached to the target DNA darkened the X-ray film on which the membrane was laid. The radiograph thus obtained (it resembled the bar codes on grocery products) showed the positions of the targeted fragments, which corresponded with their relative lengths.
In the Tommie Lee Andrews case, the bars created by the evidence DNA and by that taken from the suspect’s blood were identical. The jury was charged with weighing a great deal of new science in reaching its verdict, but in February of 1988 it decided that Andrews was indeed a serial rapist. He received jail terms totaling 115 years.
In 1989 David Vasquez, who had pleaded guilty to a Virginia rape and murder in order to avoid a death sentence, became the first American exonerated on the basis of DNA profiling. In the meantime the FBI had established its own laboratory devoted to DNA. The age of forensic DNA, it seemed, was under way.
Yet the technique was still hampered by a number of drawbacks. RFLP profiling was hard work: meticulous, time-consuming, and labor-intensive. The expensive tests took weeks, sometimes months, to perform. The results required careful analysis by experts. Moreover, investigators needed a relatively large sample, a bloodstain the size of a dime, to extract enough DNA to complete the test.
All that was about to change. A discovery by a California biochemist was already in the process of revolutionizing forensic DNA analysis. It would soon shake up the entire field of molecular biology.
In 1983 Kary Mullis had been working with a team at the Cetus Corporation, a company exploring the emerging field of biotechnology, to discover the genetic roots of sickle cell anemia. One Friday evening, driving north from Berkeley toward his weekend cottage in Mendocino County, Mullis was struck by an idea so startling that he pulled his car off the road and scribbled it onto an envelope.
“I thought it was an illusion,” he later said. He understood that DNA is an awkward molecule to work with, “like an unwound and tangled audiotape on the floor of the car in the dark.” He knew that to do anything useful with it, scientists had to focus on manageable segments. The problem was that any one short segment of the molecule was so infinitesimally small that it was valueless. Unless …
He imagined a way to harness the natural ability of DNA to replicate itself in order to copy a segment of the molecule. Repeat the process over and over, and the number of copies would increase geometrically, first 2, then 4, 8, 16, and, after 20 cycles, a million. He had come up with a way to “make as many copies as I wanted of any DNA sequence I chose.”
He stayed up all night performing calculations. In the coming months he established the basic elements of what he called polymerase chain reaction (PCR), though the laboratory techniques would take years to perfect. He won the Nobel Prize in 1993 for an invention that was welcomed by biochemists even as it added to the crime-fighting arsenal of detectives around the world.
Mullis’s technique had three steps. First he heated a sample of DNA nearly to the boiling point of water in order to separate its two strands. Next he cooled the mixture and relied on “primers,” synthetic DNA fragments, to mark off the unique section of the chain he wanted to replicate. Like probes, the primers bonded with their complements along the length of the sample. They acted like the “find” feature on a computer, locating a particular sequence of bases.
The critical third step was to warm the solution slightly in order to encourage the polymerase enzyme to add free-floating G, A, T, and C base elements to the primers, thereby spelling out a complementary strand of DNA along the targeted area. When he heated the mixture again, the new strand broke off from its template, yielding two segments ready to be copied again. Repeatedly raising and lowering the temperature of this mixture—a cycle took about five minutes—kept the multiplication process going.
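The bookkeeping of the chain reaction can be followed in a few lines of code. The Python sketch below tallies the copies produced by an idealized run in which every cycle doubles the count; the temperatures in the comment are typical textbook values given only for orientation, not a protocol.

```python
# Idealized tally of the chain reaction: each thermal cycle (denature near
# 95 C, anneal primers near 55 C, extend near 72 C) doubles the number of
# copies of the targeted segment. Real yields eventually fall short of this.

copies = 1
for cycle in range(1, 31):
    copies *= 2                       # every template strand gets copied once
    if cycle in (1, 2, 3, 4, 20, 30):
        print(f"after cycle {cycle}: {copies:,} copies")
```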
When used in forensic analysis, PCR produced a sufficient amount of DNA after about 25 or 30 cycles. Technicians then proceeded to separate the fragments by length, using electrophoresis, as in the RFLP method. They also began to use fluorescent tags or dyes to visualize the fragments after electrophoresis, replacing the slower radioactive indicators.
PCR was faster, easier, and cheaper than the earlier method. According to the former Connecticut chief criminalist Henry Lee, PCR was a technology that “vastly expanded the ability of forensic scientists to type the DNA of an individual.”
PCR became even more useful in 1991, when Thomas Caskey, a researcher at the Baylor College of Medicine, in Houston, suggested the use of DNA segments much shorter than the variable number tandem repeats Jeffreys used. These short tandem repeats (STR) consisted of repetitions of patterns that comprised only three to five base pairs. A string of repetitions might be 50 base pairs long, while the sections targeted by Jeffreys were made up of thousands of base pairs.
As with RFLP, an individual would display two variants (known as alleles) for any given STR site, one inherited from each parent. Occasionally the number was the same for each, but usually it differed. So a PCR test of one site would create two bands. One might show 5 repeats, the other 12. This made for an easy comparison with the results of the same test run on an evidence sample.
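At a single STR site, then, a profile boils down to a pair of repeat counts, and the comparison with an evidence sample is a comparison of pairs. The Python sketch below uses the hypothetical 5 and 12 repeats mentioned above; because the two alleles carry no inherent order, the pairs are compared as unordered.

```python
# Sketch of comparing one STR site: each person shows two repeat counts,
# one inherited from each parent. The numbers follow the hypothetical
# example in the text.

def same_genotype(a: tuple, b: tuple) -> bool:
    """The two alleles form an unordered pair, so sort before comparing."""
    return sorted(a) == sorted(b)

evidence_site = (5, 12)    # repeats measured in the evidence sample
suspect_site = (12, 5)     # repeats measured in the suspect's blood

print(same_genotype(evidence_site, suspect_site))   # True: consistent at this site
```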
Because the fragments were short, they produced a clearer pattern when separated than did the longer fragments used in RFLP. Results were unambiguous and easier to interpret. The process also lent itself to automation, an important factor when investigators set about creating databases incorporating thousands of samples.
The most crucial advantage of PCR-based analysis was that investigators could study much smaller samples than they could using the RFLP technique. Starting with a minute quantity, only a few cells, they could “amplify” DNA from the material, creating as much as they needed. This meant that tiny blood specks, the saliva on a cigarette butt, or a minute deposit of skin cells on a ski mask could positively identify a suspect.
Technicians also found PCR ideal for handling old and degraded samples. The small DNA segments that constituted the short tandem repeats were less likely than larger ones to break apart as the DNA aged. This was important. DNA is a durable chemical, but heat, ultraviolet light, and other factors can cause it to disintegrate. Traces of evidence are seldom pristine, so detectives welcomed a method that could handle broken DNA. They knew there was no danger of a false finding. Extreme degradation might prevent a profile from being obtained, but it never created a different profile.
The advantages of the method meant that by the mid-1990s PCR had begun to replace the original RFLP technique in many laboratories.
PCR was just finding its forensic legs in February 1993, when a bomb exploded in the parking garage of the World Trade Center. The blast killed 6 people and injured more than 1,000. Four days later editors at The New York Times received a letter claiming responsibility for the attack.
One of the bombers, Mohammed Salameh, was arrested soon afterward when he went to collect a deposit for the rental truck that had carried the explosive. Authorities quickly focused on Salameh’s associate Nidal Ayyad. Included in the evidence that linked him to the bombing was a trace of saliva on the envelope flap of the letter received by the Times. The PCR technique used then was not as discriminating as it would become later, but experts estimated that only one in 50 people would match the profile found on the envelope, and Ayyad was one of them. The evidence helped the jury convict him.
Because the new form of DNA analysis relied on much shorter repeating segments and therefore lacked the discriminating power of Jeffreys’s original method, investigators began to target more STR sites along the molecule for amplification (thousands are known to exist). To do so, they simply added primers formulated to attach to different DNA regions. Two suspects might show the same pattern for one site, but the chances of a random match diminished exponentially with each site added. The FBI eventually settled on the 13 sites that now make up a standard DNA profile.
The main drawback of PCR was its sensitivity to contamination. If a few skin cells from a lab worker fell into a drop of blood that was to be analyzed by the RFLP method, their DNA would be lost in the much larger quantity of DNA from the sample itself. But if a contaminant was added to a tiny trace of evidence intended for PCR analysis, the process would amplify DNA from the contaminant as well as the evidence, resulting in a confusing or misleading outcome. PCR demanded scrupulous evidence handling and superclean laboratory techniques.
Another difficulty was pinning down exactly how common a particular pattern might be. DNA analysis never deals in certainties, only in probabilities. The results indicate that a particular person’s DNA pattern matches the pattern of an evidence sample. Statisticians then have to calculate the chances of a person chosen at random having a similar match. Is it one in a thousand, one in a million, or one in a billion?
These numbers are obtained by the product rule. Let’s say for simplicity that there are 10 possible variants in the number of repeats at each STR site (the actual number is usually higher). If the variants are randomly distributed, the chance of a match at one site is 1 in 10. The chance of matches at two sites is 1 in 100; at three, 1 in 1,000; at four, 1 in 10,000. Each additional site decreases the likelihood of a random match by a power of 10. Factoring all 13 STR sites into the equation, researchers now set the chance of a random match between two DNA samples at a staggering one in 575 trillion. When prosecutors explain such numbers to a jury, DNA evidence becomes very powerful.
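The multiplication itself is easy to carry out, as the Python sketch below does for the simplified assumption of 10 equally common variants per site. With the allele frequencies actually measured at the 13 standard sites, the same kind of calculation yields the one-in-575-trillion figure cited above.

```python
# Worked version of the simplified product rule above: with 10 equally
# likely variants per site, a random match at one site is a 1-in-10 chance,
# and independent sites multiply together.

VARIANTS_PER_SITE = 10    # simplifying assumption from the text

for sites in (1, 2, 3, 4, 13):
    one_in = VARIANTS_PER_SITE ** sites
    print(f"{sites} site(s): about 1 chance in {one_in:,} of a random match")
```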
In 1994 a case that would make forensic DNA profiling familiar to millions burst into the headlines. California prosecutors accused the retired football player Orenthal James Simpson of stabbing to death his former wife, Nicole Brown, and her friend Ronald Goldman. DNA became a central issue in the ensuing trial. The jury sat through what amounted to a crash course in DNA analysis, and the technique was mentioned more than 10,000 times in the transcript.
The analysis produced damning evidence against the accused. The blood of both victims stained a glove found outside Simpson’s home. Tests on a stain from Simpson’s Ford Bronco found that it contained Goldman’s blood. Blood on a sock found on the floor of Simpson’s bedroom was identified as Nicole Brown’s. It seemed that the new science had done an admirable job in unraveling the mystery.
But while the defense did not contest the admissibility of DNA evidence, Simpson’s “dream team” of lawyers was able to raise troubling questions about the techniques used to gather and process the evidence. Much of the testing relied on the original RFLP technique, which was subject to laboratory error and interpretation. The newer PCR method, on the other hand, was acutely susceptible to contamination. Technicians were notably careless in collecting and handling samples. One lab worker tested evidence soon after spilling reference blood drawn from Simpson. These procedural deficiencies, along with suggestions that Simpson had been framed by investigators, contributed to the jury’s verdict of not guilty.
Prosecutors and police realized that scrupulous evidence handling and rigorous laboratory methods were needed if DNA evidence was to survive scrutiny. “Based on the O. J. Simpson case,” the New York City police commissioner Howard Safir said in 1998, “we had to design a lab and procedures that are much less subject to challenge.”
In 1996 the federal government began to award grants to states through its DNA Laboratory Improvement Program. The funds went for upgrading and standardizing procedures, and they helped labs switch from RFLP to PCR technology. That same year the National Research Council gave its stamp of approval to DNA evidence, declaring that testing techniques and statistical data had “progressed to the point where the admissibility of properly collected and analyzed data should not be in doubt.” Juries soon came to routinely accept evidence based on DNA analysis as decisive.
In the late 1990s DNA profiling hit the news again, providing evidence in scandals involving two American Presidents. The first was almost 200 years old. As early as 1802 Thomas Jefferson’s enemies had accused him of fathering a child with his slave Sally Hemings. Proof one way or the other seemed impossible until 1998. An examination of the DNA of descendants of Thomas Jefferson’s uncle (Jefferson himself had no known male heirs) and of that of Hemings’s progeny suggested that Jefferson had sired at least one of her children.
Even as it was pointing the finger back into history, DNA evidence caught in a lie the then current occupant of the White House, Bill Clinton. Clinton had assured the nation that he had carried on no sexual relationship with “that woman,” the White House intern Monica Lewinsky. A DNA profile ordered by the special prosecutor, Kenneth Starr, proved otherwise, linking semen stains on Lewinsky’s blue dress to blood drawn from the President. The irrefutable evidence led to a shamefaced confession and laid the groundwork for the first impeachment of an American President in 130 years.
Because of its decisive ability to exclude a suspect, DNA has always been a powerful tool of exoneration. In 1981 Robert Clark, 20, was accused of rape, kidnapping, and armed robbery for a crime that took place in East Atlanta, Georgia. He proclaimed his innocence, even naming the man from whom he had bought the victim’s car, which Clark was assumed to have stolen. But the victim picked him out of a lineup. He was convicted and sentenced to two life terms plus 20 years.
In 2003 Clark contacted the Innocence Project, an organization that the lawyers Barry Scheck and Peter Neufeld had founded in 1992 at New York University’s Benjamin N. Cardozo School of Law. The nonprofit group provides post-conviction DNA testing in cases where there is a question of guilt.
Over the objection of prosecutors, Project lawyers arranged for DNA analysis in the Clark case. The test proved that Robert Clark was not the perpetrator of the crime for which he had been convicted. Instead, it implicated the very man Clark had named two decades earlier. In December 2005, after almost 25 years in prison, Clark walked free.
The role of DNA testing in exonerating the wrongfully convicted has been its most important contribution to justice. Every mistaken conviction perpetrates a double injustice: an innocent person locked up, a criminal unpunished. Robert Clark joined the 175 other defendants who have so far been cleared by the Innocence Project. This success has spawned many similar efforts. It has raised questions about the reliability of other forensic evidence, including confessions and eyewitness identifications. In 2003 the Illinois governor, George Ryan, commuted the sentences of all his state’s death-row inmates after 13 of them were found to be innocent by DNA testing and other means. Nationwide, DNA testing alone has cleared more than a dozen persons wrongfully condemned to death.
Back in 1977, years before DNA profiling was dreamed of, a robber held up a Richmond, Virginia, business called Shakey’s Pizza Parlor. The incident turned violent, and the thief murdered the shop’s owner. In the struggle the assailant left behind drops of his own blood. With little additional evidence and no suspects, police placed the case on the list of those they would probably never solve.
DNA profiling can link a suspect to a crime or exclude him. But what about those cases where there is no suspect? State and federal authorities have long kept fingerprint records of criminals and others so that if a print from an unknown person is found at a crime scene, they can try to find a match. They realized that a similar database containing DNA profiles would allow them to attempt to link evidence from a crime scene to the profiles of potential suspects.
In 1998 the FBI began a program known as the Combined DNA Index System, or CODIS, which consolidates DNA profiles from the states into a national registry. The criteria for collecting the samples vary by state; many require all convicted felons and prison inmates to submit a sample. The PCR technique assigns two figures, indicating the number of repeats, to each of the 13 DNA sites analyzed. One figure records the version of the site inherited from the mother, the other the version from the father. These 26 numbers represent a complete, compact profile of a person’s DNA, allowing for an easily searchable database.
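Those 26 numbers map naturally onto a small data structure, which is what makes the database searchable at scale. The Python sketch below illustrates the idea with invented site names and values; it is not the actual CODIS record format.

```python
# Illustrative sketch of a searchable DNA profile: 13 sites, two repeat
# counts at each, 26 numbers in all. Site names and values are invented
# and do not correspond to the real CODIS loci or file format.

def normalize(profile: dict) -> dict:
    """Put each site's pair of alleles in a fixed order for comparison."""
    return {site: tuple(sorted(pair)) for site, pair in profile.items()}

def matches(evidence: dict, candidate: dict) -> bool:
    return normalize(evidence) == normalize(candidate)

# A crime-scene profile and a tiny mock database of offender profiles.
evidence = {f"site_{n}": (7 + n % 5, 10 + n % 4) for n in range(1, 14)}
database = {
    "offender_0001": {f"site_{n}": (6, 9) for n in range(1, 14)},
    "offender_0002": dict(evidence),   # a planted match, for the example
}

hits = [name for name, profile in database.items() if matches(evidence, profile)]
print(hits)   # prints ['offender_0002']
```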
The system has enabled investigators to match crime scene evidence to the DNA of a potential suspect in more than 30,000 open cases. It has helped solve crimes that are decades old. By 2006 the FBI had collected 3.2 million profiles, along with 136,000 evidence samples from unsolved crimes.
Blood evidence from the Virginia pizza parlor murder was included in that state’s DNA database in 1996. Then, in 2004, police required a man arrested on a felony gun charge to provide a blood sample for DNA typing. The two samples matched. In February 2006 Benjamin Richard Johnson, now 61, was charged with the 29-year-old murder.
A DNA database raises questions of privacy and fairness. Whose DNA should be included? Only convicted criminals’? Everyone’s? What should happen to the DNA sample after it has been analyzed? If a person in a database is a near-perfect match, should his or her relatives, who might share a similar profile, become suspects?
Today DNA profiling does not provide any genetic information about a suspect. Like fingerprinting, it focuses on a unique but meaningless feature. But the rapid evolution of genetic science raises the possibility that analysts will soon be able to infer from DNA physical characteristics like eye and skin color, approximate height, propensity to illnesses, and ethnic background. Even a psychological profile is not out of the question. Is the person prone to anger or violent action? These possibilities will require society to grapple with novel ethical questions, but they also hold promise for effectively using science to bring wrongdoers to justice.
Riding the wave of discoveries in molecular biology and borrowing the tools developed to map the human genome, DNA profiling has progressed rapidly over the past decade. Mechanization has made all aspects of DNA analysis increasingly automatic. What originally took hours or days can now be done in minutes. Instead of amplifying and analyzing a single target section at a time, investigators can look at all 13 standard sites simultaneously. Instead of reading patterns of bars on film, they allow a laser detector to profile segments as the DNA fragments pass down a narrow tube and let computers analyze the information. Within a few years, police will be able to analyze evidence and produce a profile in minutes using hand-held devices right at the scene of a crime. By tapping into a computerized database, they may be able to implicate a specific individual almost immediately.
So powerful is DNA analysis that today a fingerprint may not only reveal a clue to a perpetrator’s identity in itself but also yield to investigators a few sloughed skin cells from which they can extract DNA, develop a profile, and use it to conclusively nail the culprit.
Jack Kelly, a frequent contributor to Invention & Technology, is the author of Gunpowder: Alchemy, Bombards & Pyrotechnics (Basic Books, 2005).