The images above depict an erroneous identification that was presented in the Scottish courts; they were kindly supplied by Ed German. Visit www.onin.com. Below is a comment from outside the forensic-science arena.
DO FINGERPRINTS LIE? by Michael Specter. The New Yorker, issue of 2002-05-27 (posted 2002-05-20).
The gold
standard of forensic evidence is now being challenged.
Late one afternoon in the spring of 1998, a police detective
named Shirley McKie stood by the sea on the southern coast of Scotland and thought about ending her life. A promising young
officer, the thirty-five-year-old McKie had become an outcast among her colleagues on the Strathclyde force. A year
earlier, she had been assigned to a murder case in which an old woman was stabbed through the right eye with a pair of sewing
scissors. Within hours of the killing, a team of forensic specialists had begun working their way through the victim's house.
Along with blood, hair, and fibres, the detectives found some unexpected evidence: one of the prints lifted from the room
where the murder took place apparently matched the left thumb of Detective McKie.
Crime scenes are often contaminated
by fingerprints belonging to police officers, and investigators quickly learn to eliminate them from the pool of suspects.
But McKie said that she had never entered the house. Four experts from the Scottish Criminal Record Office—the agency that
stores and identifies fingerprints for Scotland's police—insisted, however, that the print was hers. Though McKie held to
her story, even her father doubted her. "I love my daughter very much,'' Iain McKie, who served as a police officer in Scotland
for more than thirty years, told me earlier this year. "But when they said the print was Shirley's I have to admit I assumed
the worst. My entire career I had heard that fingerprints never lie."
Nobody actually suspected McKie of murder, and
in fact the victim's handyman, David Asbury, was charged with the crime. The sole physical evidence against him consisted
of two fingerprints—one of his, lifted from an unopened Christmas gift inside the house, and one of the victim's, found on
a biscuit tin in Asbury's home. The last thing prosecutors needed was for their own witness to raise questions in court about
the quality of the evidence. Yet McKie did just that—repeating under oath that she had never entered the house. Asbury was
convicted anyway, but Scottish prosecutors were enraged by McKie's testimony. As far as they were concerned, McKie had not
only lied; she had challenged one of the evidentiary pillars of the entire legal system. Despite their victory in the murder
trial, they charged McKie with perjury.
Desperate, she went to the public library and searched the Internet for somebody
who might help her. Among the names she came upon was that of Allan Bayle, a senior forensic official at New Scotland Yard
and perhaps the United Kingdom's foremost fingerprint expert. (It was Bayle's expertise and supporting evidence that helped
convict one of the principal Libyan suspects in the 1988 bombing of Pan Am Flight 103, over Lockerbie, Scotland.) He agreed
to review the prints, and what he saw astonished him. "It was obvious the fingerprint was not Shirley's,'' Bayle told me recently.
"It wasn't even a close call. She was identified on the left thumb, but that's not the hand the print was from. It's the right
forefinger. But how can you admit you are wrong about Shirley's print without opening yourself to doubt about the murder suspect,
too?" Bayle posted a comment on Onin.com, a Web site trafficked regularly by the world's fingerprint community. "I have looked
at the McKie case,'' he wrote. "The mark is not identical. I have shown this mark to many experts in the UK and they have
come to the same conclusions."
Bayle's assertion caused a furor. He was threatened with disciplinary action, shunned
by his colleagues, and, after a quarter century with the Metropolitan Police, driven from his job. But in the end McKie was
acquitted, and Bayle's statement helped challenge a system that had, until then, simply been taken for granted.
For
more than a century, the fingerprint has been regarded as an unassailable symbol of truth, particularly in the courtroom.
When a trained expert tells a judge and jury that prints found at a crime scene match those of the accused, his testimony
often decides the case. The Federal Bureau of Investigation's basic text on the subject is entitled "The Science of Fingerprints,''
and a science is what F.B.I. officials believe fingerprinting to be; their Web site states that "fingerprints offer an infallible
means of personal identification.'' The Bureau maintains a database that includes the fingerprints of more than forty-three
million Americans; it can be searched from precinct houses and properly equipped police cruisers across the country. Fingerprints
are regularly used to resolve disputes, prevent forgery, and certify the remains of the dead; they have helped send countless
people to prison. Until this year, fingerprint evidence had never successfully been challenged in any American courtroom.
Then,
on January 7th, U.S. District Court Judge Louis H. Pollak—a former dean of the law schools at Yale and at the University of
Pennsylvania—issued a ruling that limited the use of fingerprint evidence in a drug-related murder case now under way in Philadelphia.
He decided that there were not enough data showing that methods used by fingerprint analysts would pass the tests of scientific
rigor required by the Supreme Court, and noted the "alarmingly high" error rates on periodic proficiency exams. Although Judge
Pollak later decided to permit F.B.I. fingerprint experts to testify in this particular case, students of forensic science
felt his skepticism was justified. "We have seen forensic disciplines which focus on bite marks, hair analysis, and handwriting
increasingly questioned in the courts," Robert Epstein, who had argued for the exclusion of fingerprint testimony in the case,
told me. "But we have accepted fingerprinting uncritically for a hundred years.''
Epstein, an assistant federal public
defender in Philadelphia, was responsible for the first major court challenge to the discipline, in 1999, in U.S. v. Byron
Mitchell. In that case, Epstein showed that standards for examiners vary widely, and that errors on proficiency tests—which
are given irregularly and in a variety of forms—are far from rare. The critical evidence consisted of two fingerprint marks
lifted from a car used in a robbery. To prepare for the trial, F.B.I. officials had sent the prints to agencies in all fifty
states; roughly twenty per cent failed to identify them correctly. "After all this time, we still have no idea how well fingerprinting
really works,'' Epstein said. "The F.B.I. calls it a science. By what definition is it a science? Where are the data? Where
are the studies? We know that fingerprint examiners are not always right. But are they usually right or are they sometimes
right? That, I am afraid, we don't know. Are there a few people in prison who shouldn't be? Are there many? Nobody has ever
bothered to try and find out. Look closely at the great discipline of fingerprinting. It's not only not a science—it should
not even be admitted as evidence in an American court of law."
Fingerprints have been a source of fascination for thousands
of years. They were used as seals on legal contracts in ancient Babylonia, and have been found embossed on six-thousand-year-old
Chinese earthenware and pressed onto walls in the tomb of Tutankhamun. Hundreds of years ago, the outline of a hand with etchings
representing the ridge patterns on fingertips was scratched into slate rock beside Kejimkujik Lake, in Nova Scotia.
For
most of human history, using fingerprints to establish a person's identity was unnecessary. Until the nineteenth century,
people rarely left the villages in which they were born, and it was possible to live for years without setting eyes on a stranger.
With the rise of the Industrial Revolution, cities throughout Europe and America filled with migrants whose names and backgrounds
could not be easily verified by employers or landlords. As the sociologist Simon Cole made clear in "Suspect Identities,"
a recent history of fingerprinting, felons quickly learned to lie about their names, and the soaring rate of urban crime forced
police to search for a more exacting way to determine and keep track of identities. The first such system was devised in 1883
by a Parisian police clerk named Alphonse Bertillon. His method, called anthropometry, relied on an elaborate set of anatomical
measurements—such as head size, length of the left middle finger, face height—and features like scars and hair and eye color
to distinguish one person from another. Anthropometry proved useful, but fingerprinting, which was then coming into use in
Britain, held more promise. By the eighteen-sixties, Sir William J. Herschel, a British civil servant in India, had begun
to keep records of fingerprints and use them to resolve common contract disputes and petty frauds.
Fingerprinting did
not become indispensable, however, until 1869, when Britain stopped exiling criminals to Australia, and Parliament passed
the Habitual Criminals Act. This law required judges to take past offenses into account when determining the severity of a
sentence. But in order to include prior offenses in an evaluation one would need to know whether the convict had a previous
record, and many criminals simply used a different alias each time they were arrested. The discovery that no two people had
exactly the same pattern of ridge characteristics on their fingertips seemed to offer a solution. In 1880, Dr. Henry Faulds
published the first comments, in the scientific journal Nature, on the use of fingerprints to solve crimes. Soon afterward,
Charles Darwin's misanthropic cousin, Sir Francis Galton, an anthropologist and the founder of eugenics, designed a system
of numbering the ridges on the tips of fingers—now known as Galton points—which is still in use throughout the world. (Ultimately,
though, he saw fingerprints as a way to classify people by race.)
Nobody is sure exactly how Mark Twain learned about
fingerprints, but his novel "Pudd'nhead Wilson," published in 1894, planted them in the American imagination. The main character
in the book, a lawyer, earned the nickname Pudd'nhead in part because he spent so much time collecting "finger-marks"—which
was regarded as proof of his foolishness until he astounded his fellow-citizens by using the marks to solve a murder. If you
were to walk into a courtroom today and listen to the testimony of a typical forensic expert, you might hear a recitation
much like Pudd'nhead Wilson's:
Every human being carries with him from his cradle to his grave certain physical marks
which do not change their character, and by which he can always be identified—and that without shade of doubt or question.
These marks are his signature, his physiological autograph, so to speak, and this autograph cannot be counterfeited, nor can
he disguise it or hide it away, nor can it become illegible by the wear and the mutations of time. . . . This signature is
each man's very own. There is no duplicate of it among the swarming populations of the globe!
Some things have changed
since Pudd'nhead Wilson, of course. A few weeks ago, I visited the headquarters of the Integrated Automated Fingerprint Identification
Systems, the F.B.I.'s billion-dollar data center, just outside Clarksburg, West Virginia—a citadel of the American forensic
community. After driving past a series of shacks and double-wides and Bob Evans restaurants, you come upon a forest with a
vast, futuristic complex looming above the trees. (I.A.F.I.S. moved from more crowded quarters in the Hoover Building in 1995,
thanks to the influence of the state's senior senator, Robert C. Byrd.)
Clarksburg is home to the world's largest collection
of fingerprints; on an average day, forty thousand are fed into the system. The I.A.F.I.S. computers, which can process three
thousand searches a second, sort through the database in a variety of ways. For example, they compare complete sets of fingerprints
in the files with new arrivals—as when a suspect is held in custody and the police send his "ten-prints" to I.A.F.I.S. The
computer hunts for shared characteristics, and then attempts to match the prints to a record on file. "We identify about eight
thousand fugitives per month here,'' Billy P. Martin, the acting chief of the Identification and Investigative Services Section,
told me. Martin said that eleven per cent of job applicants whose fingerprints are entered into the system—they could be day-care
workers, casino staff, federal employees—turn out to have criminal records; as many as sixty per cent of the matches are repeat
offenders.
The center looks like a NASA control room, with dozens of people monitoring the encrypted network of fingerprint
machines sending in data from police stations throughout the country. The main computer floor is the size of two football
fields and contains sixty-two purple-and-gray "jukeboxes," each filled with two hundred compact disks containing fingerprints.
(There are three thousand sets on each CD.) When someone is arrested, his prints are initially searched against a state's
computer files. If the search finds nothing, the information is forwarded to the federal database in Clarksburg. To make a
match, the I.A.F.I.S. computer analyzes the many points on the ridges of every fingerprint it receives, starting with the
thumb and working toward the pinkie; only when the data produce prints that match (or several prints that seem similar) is
the original print forwarded to an analyst for comparison.
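The routing just described (state files first, then the federal database in Clarksburg, then a human examiner) can be summarized in a short sketch. The sketch below is illustrative only: the function names, the candidate structure, and the similarity cutoff are assumptions of mine, not the Bureau's actual software.

from dataclasses import dataclass

@dataclass
class Candidate:
    record_id: str
    similarity: float  # score assigned by the matching algorithm

def search_database(prints, database, cutoff=0.97):
    """Return candidate records whose similarity clears a hypothetical cutoff."""
    return [c for c in database if c.similarity >= cutoff]

def route_ten_print_search(prints, state_db, federal_db):
    # Step 1: the arresting state's own computer files are searched first.
    candidates = search_database(prints, state_db)
    if not candidates:
        # Step 2: if the state search finds nothing, the information is
        # forwarded to the federal database.
        candidates = search_database(prints, federal_db)
    if candidates:
        # Only when the computer produces a matching print (or several that
        # seem similar) is the original forwarded to a human analyst.
        return "forward_to_analyst", candidates
    return "no_match_on_file", []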
"We used to go to a file cabinet, pull out paper cards.
If it was all loops—which is the most common type of print—you could spend an hour,'' Martin said. "Now a computer algorithm
does it in seconds. The system searches the electronic image against the database and pulls up the image onto the screen.
The accuracy rate on first run is 99.97 per cent.'' Still, this would mean that the I.A.F.I.S. computers make three hundred
mistakes in every million searches. That is where trained examiners come in. The patterns on fingertips are more like topographical
maps or handwriting than, say, bar codes. They can be so similar that even the most sophisticated computer program can't tell
them apart; it takes a trained human eye to detect the subtle differences.
I sat with one of the examiners in a dim,
nearly silent room lined with what seemed like an endless series of cubicles. At each station, someone was staring at a monitor
with two huge fingerprints on it. No two people—not even identical twins—have ever been shown to share fingerprints. The friction
ridges that cover the skin on your hands and feet are formed by the seventeenth week in the womb; at birth they have become
so deep that nothing can alter them, not even surgery. Look at your fingertips: the patterns resemble finely detailed maps
of the bypasses and exit ramps on modern roads. Experts use the nomenclature of the highway to describe them: there are spurs,
bifurcations, and crossovers. Some people have fingertips that are dominated by "loops," others by "tented arches" or small
circles that examiners call "lakes," or smaller ones still, called "dots." Collectively, these details are referred to as
minutiae—an average human fingerprint may contain as many as a hundred and fifty minutia points. To identify fingerprints,
an expert must compare these points individually, until enough of them correspond that he or she feels confident of a match.
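In rough terms, the comparison amounts to counting corresponding features until a threshold of confidence is reached, which is the idea behind the "Galton point" standards discussed later in this article. The sketch below only illustrates that idea: real examiners weigh the type, position, orientation, and surrounding ridge flow of each feature, and the coordinate tolerance and point threshold here are invented.

import math

# A minutia represented as (x, y, kind), e.g. (12.0, 30.5, "bifurcation").

def points_in_agreement(latent, known, tolerance=2.0):
    """Count latent minutiae that have a nearby feature of the same kind in the known print."""
    count = 0
    for (x1, y1, kind1) in latent:
        for (x2, y2, kind2) in known:
            if kind1 == kind2 and math.hypot(x1 - x2, y1 - y2) <= tolerance:
                count += 1
                break
    return count

def meets_point_standard(latent, known, required_points=12):
    """Apply a fixed-point rule, such as the twelve-point standard some countries use."""
    return points_in_agreement(latent, known) >= required_points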
When
fingerprints are properly recorded (inked, then rolled, finger by finger, onto a flat surface, or scanned into a machine that
captures and stores each finger as a digital image), identification works almost flawlessly. The trouble is that investigators
in the field rarely see the pristine prints that can be quickly analyzed by a computer; most of the prints introduced at criminal
trials are fragments known as "latent prints." Crime scenes are messy, and the average fingerprint taken from them represents
only a fraction of a full fingertip—about twenty per cent. They are frequently distorted and hard to read, having been lifted
from a grainy table or a bloodstained floor. "It is one thing to say that fingerprints are unique and quite another to suggest
that a partial latent print, often covered in blood or taken from an obscure surface, is unique, identical, or easy to identify,''
Barry Scheck told me. In the past decade, Scheck, who directs the Innocence Project, has used DNA evidence to exonerate more
than a hundred prisoners, some of them on death row. "We have always been told that fingerprint evidence is the gold standard
of forensic science. If you have a print, you have your man. But it is not an objective decision. It is inexact, partial,
and open to all sorts of critics.''
Police use several methods to discover latent fingerprints. First, they shine a
flashlight or a laser along the clean, solid surfaces on which a print may have been left by the perspiration and oil on a
fingertip. When a print is discovered, detectives use a brush and powder to mark it, much as they did in the nineteenth century;
the powder clings to the perspiration. (The method works best on smooth surfaces, like glass.) The print is then photographed
and lifted with tape.
The technology for retrieving partial and obscure fingerprints keeps improving. On a recent episode
of the television program "C.S.I.," you might have seen detectives using a technique called superglue fuming to reveal the
outline of a face on a plastic bag—an unconventional use of a common practice. In order to find difficult prints on an irregular
surface, such as the human body, crime-scene investigators blow fumes of superglue over it. As the fumes adhere to the surface,
the ridges of any fingerprint left there turn white and come clearly into view. Another common method involves ninhydrin,
which works like invisible ink: when you douse paper with it, the chemical brings out any sweat that may have been left by
fingertips. Ninhydrin is particularly useful with old prints or those covered in blood.
F.B.I. fingerprint examiners
have a variety of computer tools—a sort of specialized version of Photoshop—to help them compare rolled prints with those
in their system. In front of me, an I.A.F.I.S. examiner stared at his computer screen as a training instructor, Charles W.
Jones, Jr., explained the process. "He is looking for ridges that form dots,'' Jones said. "Bifurcations. Usually they look
for six or seven of those.'' The examiners work around the clock, in three shifts, and are required to evaluate at least thirty
prints an hour. They know nothing about the people attached to the fingers on their screens; the prints could be those of
a rapist, a serial killer, Osama bin Laden, a woman applying for a job in the Secret Service, or a bus driver from Queens.
("Yesterday I did fifty-one for a couple hours in a row,'' an examiner told me proudly.)
At the bottom of the screen
there are three buttons—"Ident," "Unable," and "Non-Ident"—and the examiner must click on one of them. If he identifies a
finger, the print goes to a second analyst. If the two examiners independently reach the same conclusion, the fingerprint
is considered to have been identified. If not, it gets forwarded to an analyst with more experience. "We have a pretty good
fail-safe system,'' Jones said. "Computers help immensely. But in the end they can't pull the trigger. That's our job.''
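The decision flow Jones describes, in which two independent "Ident" decisions confirm an identification and anything less either ends the comparison or escalates it, can be sketched as follows. The enumeration mirrors the three on-screen buttons; everything else is an assumption made for illustration.

from enum import Enum

class Decision(Enum):
    IDENT = "Ident"
    UNABLE = "Unable"
    NON_IDENT = "Non-Ident"

def route(first, second):
    """Outcome of the two-examiner check described above."""
    if first != Decision.IDENT:
        # The first examiner could not identify the print; it never reaches a second analyst.
        return first.value
    if second == Decision.IDENT:
        # Two examiners independently reached the same conclusion: identified.
        return "identified"
    # Disagreement sends the print to an analyst with more experience.
    return "forward to more experienced analyst"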
Only
a human being can make critical decisions about identity, and yet the talent, training, and experience of examiners vary widely.
"The current identification system . . . is only as genuine as the knowledge, experience, and ability of the specialist carrying
out the comparison,'' David R. Ashbaugh, a staff sergeant with the Royal Canadian Mounted Police, writes, in "Quantitative-Qualitative
Friction Ridge Analysis," which is considered the Bible of the field. And although fingerprint analysis has been in use for
decades, there has never been any consensus about professional standards. How many distinct characteristics are necessary
to prove that a latent fingerprint comes from a specific person? The answer is different in New York, California, and London.
In certain states, and in many countries, fingerprint examiners must show that prints share a set number of Galton points
before they can say they have made an identification. Australia and France require at least twelve matching Galton points;
in Italy, the number is sixteen. In America, standards vary, even within a state. The F.B.I. doesn't require a minimum number
of points; all such regulations were dropped fifty years ago, because, according to Stephen B. Meagher, the chief of the Bureau's
latent-print unit, the F.B.I. believes that making an identification using Galton points alone can cause errors.
Meagher
says that fingerprint analysis is an objective science; Robert Epstein, the Philadelphia attorney who has led the fight against
presenting fingerprint evidence in court, says it is not a science at all. Neither is exactly right. Examining the many contours
of a human finger is not as objective as measuring someone's temperature or weight, or developing a new vaccine. But it's
not guesswork, either. It involves, inevitably, human judgment, and most people agree that when it is done well it is highly
accurate. The difficulty is in determining whether it has been done well.
Scientific methodology is based on generating
hypotheses and testing them to see if they make sense; in laboratories throughout the world, researchers spend at least as
much time trying to disprove a theory as they do trying to prove it. Eventually, those ideas that don't prove false are accepted.
But fingerprinting was developed by the police, not by scientists, and it has never been subjected to rigorous analysis—you
cannot go to Harvard, Berkeley, or Oxford and talk to the scholar working on fingerprint research. Yet by the early twentieth
century fingerprinting had become so widely accepted in American courts that further research no longer seemed necessary,
and none of any significance has been completed.
David L. Faigman, who teaches at the Hastings College of the Law and
is an editor of the annually revised forensic text "Modern Scientific Evidence,'' has spent most of his career campaigning
to increase the scientific literacy of judges and juries. Faigman likens the acceptance of fingerprint evidence to the way
leeches were once assumed to be of great medical value. "Leeches were used for centuries,'' he told me. "It was especially
common for the treatment of pneumonia and it was considered an effective therapy. It wasn't till late in the nineteenth century
that they did the clinical tests to show that leeches did not help for pneumonia, and they may have actually hurt. Fingerprinting
is like that in at least one crucial way: it is something we assume works but something we have never properly tested. Until
we test our beliefs, we can't say for sure if we have leeches or we have aspirin"—an effective remedy that was used before
it was understood. "One of the things that science teaches us is that you can't know the answers until you ask the questions.''
The
discussion of fingerprinting is only the most visible element in a much larger debate about how forensic science fits into
the legal system. For years, any sophisticated attorney was certain to call upon expert witnesses—doctors, psychiatrists,
Bruno Magli shoe salesmen—to assert whatever might help his case. And studies have shown that juries are in fact susceptible
to the influence of such experts. Until recently, though, there were no guidelines for qualification; nearly anybody could
be called an expert, which meant that, unlike other witnesses, the expert could present his "opinion" almost as if it were
fact. Experts have been asked to testify about the rate at which a tire would skid, and the distance blood would splatter
when a certain calibre bullet smashed into a skull. They have lectured scores of juries on the likelihood that a medicine
could cause a particular side effect; they have interpreted polygraphs and handwriting, and have pronounced on whether a bite
mark was made by one set of teeth to the exclusion of all others.
Although forensic evidence has proved particularly
powerful with juries, it is particularly weak as a science. By the nineteen-eighties, the kind of evidence that was routinely
admitted into court without any statistical grounding or rationale had earned a name: "junk science." And junk science had
become ubiquitous. With the problem growing out of control, in 1993 the Supreme Court took up a lawsuit called Daubert v.
Merrell Dow Pharmaceuticals. The case involved a child who suffered from serious birth defects. His lawyers claimed that the
defects were caused by Bendectin, a drug that was for many years routinely prescribed for morning sickness, which his mother
took while she was pregnant. The company argued that no valid evidence existed to support the claim. The Court's decision
set a new standard for scientific evidence in America: for the first time, it was no longer enough for expert
witnesses simply to assert what was "generally accepted" to be true in their field. Judges had to act as "gatekeepers," the Court
said; if an expert lacked reliability he was no longer allowed in the courtroom. The ruling, and others that expanded upon
it, laid down clear guidelines for the federal bench, requiring judges to consider a series of questions: Could a technique
be tested or proved false? Was there a known or potential error rate? (DNA identification has provided the model, because
experts have gathered enough statistical evidence to estimate the odds—which are astronomical—that one person's DNA profile could
be mistaken for another's.) The Court also instructed judges to consider whether a particular theory had ever been subjected to
the academic rigor of peer review or publication.
The Daubert ruling forced federal judges to become more sophisticated
about science, which has not been easy for them. "Daubert changed everything," Michael J. Saks, a law professor at Arizona
State University, who has written widely on the subject, told me. "And it is pretty clear when you look at those criteria
that fingerprinting simply doesn't satisfy any of them.'' Since the Daubert ruling, federal courts have judged handwriting
evidence and hair identification to be unscientific. The use of polygraph data has also been curtailed. Questions have been
raised about ballistics—say, whether a bullet can be traced back to a particular gun. Somehow, though, until Judge Pollak
came along, challenges to fingerprinting continued to be regarded as heresy.
Relying largely on testimony presented
by Robert Epstein in U.S. v. Byron Mitchell, the first post-Daubert case involving fingerprint testimony, Judge Pollak ruled
in January that an expert could say whether he thought fingerprints belonged to the people accused of the crime, but he could
not say that the fingerprints he had examined were, beyond doubt, those of the defendant.
Pollak is one of the federal
judiciary's most respected judges. Federal prosecutors were so concerned that any ruling he issued would carry a significance
even greater than its legal weight that they asked the Judge to reconsider his precedent-shattering decision. Pollak agreed.
Late
in February, Pollak held a hearing on the reliability of fingerprint evidence. For three days, several of the world's most
prominent experts discussed their field in his courtroom. The F.B.I.'s Stephen B. Meagher testified that no Bureau analyst
had ever misidentified a person in court, and that the Bureau's annual proficiency test was among the reasons that the Judge
should be confident about admitting expert testimony. Allan Bayle, the British forensic specialist, flew in from London at
the request of the defense. He had a different view. He told Pollak that the F.B.I.'s proficiency test was so easy it could
be passed with no more than six weeks of training. "If I gave my experts [at Scotland Yard] these tests, they would fall about
laughing," he told Pollak in court. Later, in conversation with me, he expanded on those comments. "The F.B.I. are conning
themselves and they are conning everybody else,'' he said. "They don't even use real scene-of-crime marks for the fingerprint
tests." He pointed out that the fingerprints used in the exams were so different from each other that almost anybody could
tell them apart. "Let's say I asked you to look at a zebra, a giraffe, an elephant, and a lion. Then I asked you to find the
zebra. How hard would that be? What the Bureau should be doing is comparing five zebras and selecting among them." Bayle and
other critics stopped short of calling fingerprint evidence junk science, but they noted that there are few data showing how
often latent prints are properly identified.
By February 27th, the final day of the hearing, the fissures in an old
and accepted discipline had become visible, and Judge Pollak promised to issue a final ruling within a couple of weeks.
A
few days after Pollak's hearing ended, I flew to Cardiff to attend the annual meeting of the Fingerprint Society. It was raining
in Wales, and the members of the society were deeply unsettled because their profession was under assault. Each year, the
society gathers for a few days to listen to lectures and to talk about developments in the field. The society has always been
a club—the type where you might expect to stumble upon Sherlock Holmes or G. K. Chesterton. The bar at the Thistle Hotel,
where the conference was held, was filled with police officers from Sussex, Aberdeen, and most places in between. The conference
was well attended by representatives of the United States Secret Service and the F.B.I. There were also a few stray academics
interested in the latest obscure technology, such as magnetic nanoflake powders, which are able to capture fingerprints without
disturbing whatever traces of DNA may be present. (With conventional methods, an investigator has to choose: either swab a
mark to harvest the DNA or lift it to find the print.)
By the time I arrived, the society was preoccupied by two issues:
the Pollak hearings and the lingering ill will from the McKie case, in Scotland. One of those in attendance was Meagher, the
lead F.B.I. witness in Judge Pollak's courtroom. I introduced myself, and told him that I understood he couldn't discuss the
Philadelphia case while it was under review, but asked if we could talk about the field in general. "No,'' he said, without
a moment's hesitation.Iain McKie had also come to Cardiff that weekend, as had Allan Bayle. McKie, a tall, reedy man with
a great nimbus of curly white hair, presented a lecture on the ethics of fingerprinting. He remained livid about the fact
that a fingerprint had destroyed his daughter's career; although she had been acquitted of perjury, she felt unwelcome on
the police force after having been strip-searched and jailed by her colleagues, and had resigned soon after her trial. She
never returned to work. Today, she spends much of her time trying to force Scottish authorities to admit that what they did
to her was wrong. "I believe a person made a mistake, and instead of admitting it they were prepared to send me to jail,''
Shirley McKie said after she was acquitted of perjury. "It ruined my life, and now I am trying to pick up the pieces."
The
Scottish Criminal Record Office has never acknowledged the error, nor has the Fingerprint Society issued any statement about
the incident. (David Asbury, the man convicted of the murder, was released in August of 2000, pending an appeal. As expected,
the judge in the case questioned the validity of the fingerprint evidence that had led to his conviction.) In Cardiff, McKie
told the Fingerprint Society that the system they represented was "incestuous, secretive, and arrogant. It has been opened
to unprecedented analysis and it's sadly lacking. It pains me to say that, because I was a police officer for thirty years.
You are indicted on the basis of a fingerprint. You are not innocent till proven guilty; if the police have a print, you are
assumed to be guilty. We need to start a new culture. The view that the police and fingerprint evidence are always right,
the rest of the world be damned, has to end.''
Afterward, the corridors and conference rooms were buzzing; it was as
if somebody had challenged the fundamentals of grammar at the annual meeting of the Modern Language Association. But McKie
was far from the only speaker at the conference to raise questions about the field. Christophe Champod, who works for a British
organization called the Forensic Science Service, has long attempted to apply rigorous statistical methods to fingerprinting.
Champod spoke in an understated and academic manner, but what he had to say was even more forceful than McKie's presentation.
He told the audience that they had only themselves to blame for the state of the field, that for years they had resisted any
attempts to carry out large trials, which would then permit examiners to provide some guidance to juries about the value of
their analysis, as is the case with DNA. "What we are trying to do in this field is reduce, reduce, reduce the population
so that there is only a single individual that can possess a set of fingerprints. . . . But we can never examine the fingerprints
of the entire universe. So, based on your experience, you make an inference: the probability that there is another person
in the universe that could have a good match for the mark is very small. In the end, it's like a leap of faith. It's a very
small leap, but it is a leap nonetheless."
Half an hour had been allotted for questions, but there was only silence.
Afterward, one of the organizers explained it to me: "He was using the terms of religion to describe our science. That's just
not fair."
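Champod's inference can be made concrete with a back-of-the-envelope calculation. The numbers below, both the chance that a random, unrelated person would also fit a latent mark and the population figure, are invented purely for illustration; the point is only that the conclusion is a probability, however small, rather than a certainty.

# Hypothetical chance that a random person's finger would also fit the mark.
p_random_match = 1e-12
# Roughly the world's population at the time the article was written.
population = 6_000_000_000

# Probability that at least one *other* person, somewhere, could also match.
p_someone_else = 1 - (1 - p_random_match) ** (population - 1)
print(f"{p_someone_else:.2%}")  # about 0.60% with these made-up numbers: small, but not zero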
Allan Bayle invited me to visit him in London after the meeting. Bayle is six feet five with sandy hair
and flecks of gray in his blue eyes. He had recently married and he lives with his wife, child, and mother-in-law just steps
from the M1 motorway entrance in Hendon, on the northern edge of the city. We sat in his conservatory on a cloudy day while
his five-month-old boy slept in a stroller beside us.
Bayle was frustrated. For the past five years, he had worked
mostly as a lecturer on fingerprints for the Metropolitan Police. "I taught advanced forensic scene examination, and I loved
it. Once I said I would give evidence in the McKie case, though, I was no longer allowed to go to meetings. But that is not
why I left. They did nothing about this mistake in identity. When you know something is wrong, how can you stay silent?" He
told me he was particularly upset that Shirley McKie's career as a police officer had ended for no reason. Bayle's life, too,
has changed. He now works as an independent consultant. Although he has been portrayed as a critic of fingerprint analysis,
he is critical only of the notion that it should never be questioned. "It's a valuable craft," he said. "But is it a science
like physics or biology? Well, of course not. All I have been saying is, let's admit we make errors and do what we can to
limit them. It is such a subjective job. The F.B.I. want to say they are not subjective. Well, look at what David Ashbaugh—certainly
among the most noted of all fingerprint analysts—said when he testified in the Mitchell case." Ashbaugh had clearly stated
that fingerprint identification was "subjective," adding that the examiner's talents are his "personal knowledge, ability,
and experience."
Bayle took out a large portfolio containing dozens of fingerprints, as well as gruesome pictures of
crime scenes. "Look at the mess,'' he said. He showed me a series of photographs: jagged fingerprints—black smudges, really—recovered
from the scenes of several murders he had investigated. "With all that information, you then come to your conclusions. You
have to somehow match that to this clean image''—he handed me a picture of a perfect print, taken at a police booking—"and
say, finally, it's one man's print. You have got to look at everything, not just points. The Bureau has not had a missed ident
in all their years of working, and I applaud that. But they are not testing their experts' ability. And that is dangerous.''
The
following week, Stephen Meagher agreed to speak with me at the F.B.I. headquarters, on Pennsylvania Avenue in Washington.
Meagher is perhaps the best known and most forceful advocate for the view that fingerprint evidence is scientifically valid
and that it ought to be welcome in courts.
"But is it really a science?" I asked as soon as we settled down to talk
in his office. Meagher said that he didn't think of science as a term that could be easily defined or tailored to fit all
disciplines in the same way. "There is academic science, legal science, and forensic science,'' he told me. "They are different.
You can be an expert in the field and give testimony without having an academic level of scientific knowledge. . . . It is
not achievable to take pure science and move it into a legal arena.'' This seemed surprising, since Meagher had often argued
that, when performed correctly, fingerprint analysis is an "objective'' science. In 1999, when he was asked in court whether,
based on the unique properties of fingerprints, he had an opinion of the error rate associated with his work, he said, "As
applied to the scientific methodology, it's zero." (Scientists don't talk this way; it is an axiom among biomedical researchers
that nothing in biology is true a hundred per cent of the time.)
Later, when I asked David Faigman, the Hastings law
professor, whether it made sense to divide science into legal, academic, and forensic subgroups, he laughed.
"Of course
it makes no sense,'' he said. "Mr. Meagher operates on a sixteenth-century notion—a Francis Bacon idea—of what science is
all about. To me, the analogue for law is meteorology. It deals with physics and chemistry—the most basic sciences. Yet it
has to make predictions and empirical statements regarding complex reality. That is because so many factors determine the
weather that it's really a probabilistic science. And I think fingerprinting is the same."
"Most fields of normal science
could pull from the shelf dozens or hundreds, if not thousands, of studies testing their various hypotheses and contentions,
which had been conducted over the past decades or century, and hand them to the court,'' Michael Saks wrote in "Modern Scientific
Evidence." For fingerprinting there was nothing. In 1999, the F.B.I. conducted its study in preparation for the Byron Mitchell
trial. The study asked examiners to match the two actual latent prints taken from the car in the Mitchell case with the known
set of fingerprints of the man on trial. Both sets of prints were sent to the crime laboratories of fifty-three law-enforcement
agencies. Of the thirty-five agencies that examined them and responded, most concluded that the latent prints matched the
known prints of the accused; eight said that no match could be made for one of the latent prints, and six said that no match
could be made for the other print. The F.B.I., realizing it had a problem, sent annotated enlargements of all the prints to
those examiners who had said the fingerprints couldn't be matched. In these photographs, the points of similarity on the fingertips
were clearly marked. This time, every lab adopted the F.B.I.'s conclusions.
When I asked Meagher about the study, he
told me that the test was supposed to demonstrate the uniqueness of the prints; it was not meant to be a test of competency.
He claimed opponents have used the data unfairly. At the same time, he conceded that it would not matter how clean a fingerprint
was if the person examining it hadn't been trained properly. "Our system is a huge statistical-probability model, but it doesn't
make identifications, because it doesn't have all the information that is needed," he said. "It's a job for human beings."
On
March 13th, Judge Pollak vacated his earlier order. He issued a new opinion, in which he stated that the defense had succeeded
in raising "real questions about the adequacy of the proficiency tests taken annually by certified F.B.I. fingerprint examiners."
Yet he was persuaded by the F.B.I.'s record of accuracy, and wrote that "whatever may be the case for other law-enforcement
agencies" the Bureau's standards seemed good enough to permit F.B.I. experts to testify in his courtroom. "In short,'' he
concluded, "I have changed my mind.'' It was, naturally, a blow to the opposition—though Pollak was careful to rule only on
the case before him and only with regard to the F.B.I.
I met with the Judge shortly after he issued his decision. Having
arrived early for our meeting, I watched as he led the jury-selection process in the case in which Meagher will now be permitted
to testify. Like most courtrooms, it was decorated with an American flag, but it was filled with art as well: prints by Matisse,
Cézanne, and Eakins and drawings by Victor Hugo lined the walls.