Crime Labs Still in Crisis

by Matt Clarke

The October 2010 Prison Legal News cover story, “Crime Labs in Crisis: Shoddy Forensics Used to Secure Convictions,” provided an extensive examination of problems at crime labs nationwide. Apparently, and unfortunately, little has changed since that time.

In 2012, the U.S. Department of Justice (DOJ) announced a review of thousands of old cases in which FBI agents presented testimony about forensic hair comparison analysis. The announcement followed a series of articles in the Washington Post reporting that FBI crime lab experts had exaggerated results of hair analysis, resulting in the wrongful convictions of at least three defendants. The review will encompass all cases processed by the FBI lab’s hair and fiber unit prior to 2000 – which includes thousands of convictions. Not included in the review are cases processed by forensic examiners at state and local crime labs who were trained by FBI agents to use the same methods and standards now being examined at the FBI lab.

Additionally, a number of state and local crime labs are under investigation for a variety of issues, ranging from blatant misconduct by lab analysts to improper testing methods.

Previous FBI Lab Scandal

Long considered an unassailable bastion of forensic sciences, the sterling reputation of the FBI lab was tarnished when allegations of sloppiness, exaggerated results and poor science were first reported in the mid-1990s. That was when Fredrick Whitehurst, an FBI chemist, revealed that he had seen colleagues contaminate evidence and exaggerate their results. Whitehurst said the first thing he noticed about the crime lab when he was hired was that it was a “pigsty,” with outdated equipment and a black soot the agents called “black rain” coating all the surfaces. He was also surprised that tours of the lab allowed outsiders into what should have been a controlled environment.

After he complained about these issues another lab worker told him, “Before you embarrass the FBI in a court of law, you’ll perjure yourself. We all do it.”

During his years at the FBI crime lab, Whitehurst acquired nearly new equipment from the National Institutes of Health and formulated lab protocols, something the lab did not previously use. When pressured by supervisors to invent a media story about the explosive used in the 1993 World Trade Center bombing and testify that it was a urea nitrate bomb, Whitehurst refused. The FBI found another lab employee willing to testify.

Whitehurst wrote a total of 257 letters to the DOJ’s Inspector General complaining about the crime lab, the longest of which was 640 pages. His letters dealt with issues such as the fact that for nine years one of the lab workers had been rewriting his forensic reports to favor the prosecution with the approval of supervisors. Nothing happened in response to his complaints.

“You get patted on the head if you’re the guy who saves the case,” Whitehurst said, to explain the motives of lab technicians who provide exaggerated or incorrect scientific testimony. “They get promoted; they’re the guys everyone crowds around. It’s a very tight family. A scientist who asks a question and doesn’t go along, he gets isolated.”

In 1995, Whitehurst testified in a New York U.S. District Court that “There was a great deal of pressure put upon me to bias my interpretation” of forensic tests performed in various cases, including the O.J. Simpson murder trial and the World Trade Center and Oklahoma City bombings. His public testimony about the FBI crime lab’s shortcomings produced the one thing that finally led to changes: national media attention.

DOJ/FBI Task Force and Inspector General Report

Whitehurst’s revelations led then-FBI Director Louis J. Freeh and Attorney General Janet Reno to form a task force to investigate criminal cases involving the crime lab, and from 1996 to 2004 the task force reviewed and made findings in thousands of cases. Despite initial promises of transparency, the findings were kept secret from the public and revealed only to prosecutors; thus, the very officials who brought the original charges and tried the cases were expected to determine whether the new evidence should be given to the defense. Unsurprisingly, few prosecutors opted to share the findings with defendants or their attorneys.

According to the Inspector General’s office, the DOJ “failed to ensure that prosecutors made appropriate and timely disclosures to affected defendants, particularly in cases where the prosecutor determined that lab analysis or testimony was material to the conviction and the report of the independent scientists established that such evidence was unreliable.”

The Washington Post’s review of more than 10,000 pages of task force documents obtained through Freedom of Information Act requests and litigation revealed that high-ranking DOJ and FBI officials took steps to control the information uncovered by the investigation. In the end, the task force didn’t even publicly release the names of the defendants whose cases were examined, much less the results of the reviews. The names in around 400 of the cases were disclosed for the first time in July 2014 – a decade after the task force had concluded its investigation.

Although the task force’s report was kept secret, a report generated by DOJ Inspector General Michael R. Bromwich following his investigation of the crime lab was not. The 517-page report, released in April 1997, concluded that FBI lab managers had failed to respond to warnings about the conduct of agents assigned to the lab and concerns about the lab’s scientific integrity, in some cases for years.

Among other problems, Bromwich’s report noted that the supervisor of the crime lab’s explosives unit “repeatedly reached conclusions that incriminated the defendants [in the Oklahoma City bombing case] without a scientific basis.” Similarly, the report concluded that the lab’s toxicology supervisor lacked judgment and credibility, and had overstated results in the O.J. Simpson murder case. Also, according to the report, the key FBI witness in the World Trade Center bombing “worked backwards” and tailored his testimony to reach the result desired by prosecutors. Another finding of the report was that some agents employed at the lab “spruced up” their notes for trial, altered the reports of other agents without the authors’ permission or did not document their forensic findings.

Initially, the FBI promised that the task force would release any exculpatory information found during its review of cases involving the agency’s crime lab.

“We are undertaking that review,” FBI assistant general counsel Jim Maddock said during an April 15, 1997 news conference. “And when it is done, we will give a full accounting of our findings.”

While that may have been an early goal of the task force, interviews and documents indicate that a decision was soon made at the highest levels to exclude defense attorneys from the review process and not release the findings to the public. As the investigation plodded on, files on the cases under review were lost or destroyed and prosecutors quit or retired, making it harder to determine how important testimony from the FBI crime lab had been in a particular case. Therefore, investigators canceled their original plan to have state and local prosecutors sign a statement that a discredited lab agent’s testimony was pivotal to a case, or explain why it was not. Some prosecutors, such as Harry Lee Cole III in Tampa, Florida, refused to cooperate. Cole said the attorneys in his office were too overworked to review the cases, which included death penalty prosecutions.

Further, even when defendants or defense attorneys were notified of the task force’s findings, the letter of notification often did not contain the name of the defendant, referencing only FBI and DOJ file numbers and unexplained technical language stating the scientific results. One example was a letter sent by a Tampa prosecutor concerning an 18-year-old conviction that stated, in its entirety, “Please find enclosed a copy of Attachment to Independent Case Review Report for CDRU#6480 Case File #95-253567, which we received, from the U.S. Department of Justice.”

Foot-Dragging and Limiting the Scope of Review

The DOJ and FBI also tried to limit the scope of the task force’s investigation. In June 1997, John C. Keeney, head of the Department of Justice’s criminal division, wrote a memo stating that his division and the FBI would “arrange for an independent, complete review of the Laboratory’s findings and any related testimony” in every conviction in which there was a “reasonable probability” of work by discredited lab agents that affected the case. Two months later, Lucy L. Thompson, the senior DOJ attorney in charge of the task force, informed Keeney’s deputy that the FBI planned only a “cursory paper review” and did not intend to reexamine evidence.

In an August 19, 1997 memo, Thompson told Deputy Assistant U.S. Attorney General Kevin V. DiGregory the FBI had requested that the focus of the investigation be diverted from the most vulnerable cases, by removing from consideration those cases still being litigated or on appeal. The following year, Thompson complained to DiGregory that “no scientists had been retained to this date” to review cases in which defendants might have been wrongfully convicted due to misconduct or errors by the crime lab. She said that such reviews were “needed as soon as possible to avoid possibly undercutting prosecutors’ arguments ... and to ensure that defendants will not exhaust opportunities to file post-conviction relief motions.”

The DOJ and FBI set up strict rules about what would be revealed to defendants and prepared to counter challenges to convictions, according to a January 4, 1996 memo authored by Keeney. Between secrecy, cursory reviews and delaying tactics, little was done before the events of 9/11 swept the crime lab issue onto the furthest back burner for the FBI and DOJ.

Such delays in investigating problems at the FBI lab persist today.

According to a July 29, 2014 report by the Washington Post, the FBI halted the review of convictions involving hair comparison analysis by the crime lab in August 2013 after “nearly every criminal case reviewed by the FBI and the Justice Department ... included flawed forensic testimony,” including cases where lab analysts had “exceeded the limits of science.” At that time, the agency had reviewed around 160 cases; more than 2,400 convictions remain to be examined.

The FBI resumed its review 11 months later, after another report on problems at the crime lab, released by the Inspector General’s office in July 2014, noted that while the review was being conducted, three prisoners whose convictions involved evidence presented by FBI lab analysts had been executed and a fourth had died on death row. It had taken the task force nearly five years to identify a total of 64 death penalty cases involving potentially faulty testimony by FBI experts.

“Failures of this nature undermine the integrity of the United States’ system of justice and the public’s confidence in our system,” the report stated.

FBI officials claimed that the delay in reviewing the cases was due to “a vigorous debate that occurred within the FBI and DOJ about the appropriate scientific standards we should apply when reviewing FBI lab examiner testimony – many years after the fact.”

“[W]ith the benefit of hindsight, the Department agrees that certain aspects of the Task Force review could have been more efficient or effective,” stated Brette L. Steele, a senior adviser on forensic science with the DOJ.

U.S. Senator Chuck Grassley was more blunt, stating, “The initial shoddy lab work was compounded by the Justice Department’s unconscionable failures to fix the problems it caused in hundreds of cases. The FBI and the Justice Department will have a lot of work to do to restore the public’s confidence in their integrity following these revelations.”

It is unlikely that such confidence will be restored anytime soon.

In January 2015, federal district court judge Jed Rakoff, who served on the National Commission for Forensic Science, resigned from the Commission in protest after the deputy attorney general of the Department of Justice determined that pretrial forensic discovery – disclosing forensic data and testing methods to defense counsel – was not within the Commission’s purview. Judge Rakoff called the decision a “major mistake.”

Hair Comparison Analysis Problems

The idea of forensic hair comparison was first popularized in the late 1800s by Arthur Conan Doyle’s stories about his fictional detective, Sherlock Holmes, though hair comparison did not become an established forensic tool until the 1950s. Prior to the advent of DNA testing, hair analysis could at best narrow the pool of criminal suspects, not accurately match a hair to a specific individual. Nonetheless, even before TV programs such as “CSI” instilled in juries a near-absolute faith in forensic science, jurors had come to expect the prosecution to present conclusive scientific evidence to help them reach a verdict. Testimony by an FBI Special Agent who worked at the crime lab and specialized in forensics made convictions a near certainty, and prosecutors came to rely on such testimony.

“The public has this idea about forensics from what they see on TV that simply does not correspond with reality,” stated former FBI agent Jim Fisher, author of the 2008 book Forensics Under Fire: Are Bad Science and Dueling Experts Corrupting Criminal Justice?

The problems at the FBI crime lab brought to light by the Washington Post revolved around the testimony of the lab’s hair and fiber experts, with respect to how much evidentiary weight microscopically similar hair samples should be given. To perform a hair sample comparison, the analyst places an unknown hair alongside a known hair under a microscope. The analyst then determines whether the microscopic characteristics of the hair samples are similar.

All of that is acceptable science. What is unacceptable and not scientific is when hair experts testify that two samples “match,” or cite statistics indicating it is improbable that two hairs would have the same microscopic characteristics and come from different individuals. There is no scientific basis for such statements, as a scientifically valid statistical analysis of microscopic hair characteristics has never been performed.

Thus, when FBI crime lab experts said things like, “I’ve compared tens of thousands of hairs and never found two from different people that matched,” or prosecutors made statements like “the odds could be one in two million that the hair came from someone else,” they were attributing a level of accuracy to forensic hair comparison analysis that simply does not exist. In truth, no one knows, to a scientific certainty, the probability that two different people have microscopically similar hairs.

According to Myron T. Scholberg, who led the FBI crime lab’s hair analysis unit from 1978 to 1985, and Alan T. Robillard, the unit’s director from 1988 to 1990, that is the only thing a hair comparison expert should say when testifying about the probability of similar hairs coming from different sources. Both admitted they had failed to properly train their agents for this type of testimony; when asked how often hairs from different people matched, agents were never taught to give the correct response: “there is no scientific way to know.”

Some experts disagree with this criticism of the testimony given in hair comparison cases. Edward L. Burwitz, who led the FBI’s hair analysis unit from 1985 to 1988, called the critiques “Monday morning quarterbacking.”

Morris Samuel Clark, who was head of the FBI’s hair analysis unit in 1973, when it began training state and local crime lab technicians, said he long believed that crime scene hairs could be traced to a specific individual with a high degree of probability. Clark, who admitted it was not scientifically possible to prove his belief, retired in 1979.

“This was not fly-by-night stuff, not idle conclusions on our part. I think we made a very significant contribution to the criminal justice system,” he stated. “If [lab examiners] made a mistake, it’s a personal mistake, and it’s not a matter of [our] training them ... nor the whole science of microscopic hair exams, because we did our best.”

Unfortunately, their best was not good enough. In 2002, the FBI finally conducted DNA testing in 80 selected cases, which determined that crime lab experts had incorrectly matched hair samples more than 11% of the time. Extrapolated to the tens of thousands of cases in which FBI analysts had testified about hair comparison “matches,” that failure rate meant incorrect forensic evidence had been presented in thousands of prosecutions.

Former FBI agent Michael P. Malone was singled out as the lab analyst who had made the “most frequent exaggerated testimony” in criminal cases, according to the Washington Post. After retiring in 1999, Malone worked as a contractor for the FBI conducting security reviews until June 2014.

The Consequences of Secrecy: Wrongful Convictions

Perhaps the most egregious miscarriage of justice involving the FBI crime lab was the April 21, 1997 execution of Benjamin Henry Boyle in Texas. At the time Boyle was put to death, the DOJ task force had been investigating the lab for a year and it was known that, but for the flawed testimony of a lab agent, Boyle would not have been eligible for the death penalty. Nonetheless, the DOJ and FBI allowed the execution to go forward without informing Boyle or his attorney about problems with the testimony of the FBI hair and fiber expert.

Then there are the cases of Santae A. Tribble, 52, and Kirk L. Odom, 50, which were highlighted in the Washington Post series. Tribble was convicted of the 1978 murder of a D.C. taxi driver while Odom was convicted of sexual assault in 1981. In both cases, an FBI agent testified that hair associated with the crime scene came from the defendant. However, DNA testing performed in 2012 completely cleared Odom and virtually eliminated Tribble from the list of suspects. Both men had their convictions overturned and were officially exonerated; Tribble served 28 years in prison while Odom served more than 22. Neither case was included in the DOJ task force’s review.

A third case examined by the Post was that of Donald Eugene Gates, 60, who was imprisoned for 28 years following his 1982 conviction for raping and murdering a Georgetown University student based on an FBI expert’s “match of his hair to hair found on the victim.” His case was included in the task force investigation but prosecutors never informed him of flaws in the expert’s trial testimony. In 2009, DNA tests established Gates’ innocence; he was officially exonerated in July 2012.

Tribble’s case serves to illustrate the errors made by hair comparison analysts. While working on the case, Sandra K. Levick, chief of special litigation for the D.C. Public Defender Service, was able to obtain the hair samples used to convict Tribble and submitted them to a private lab. FBI Special Agent James A. Hilverda had testified at Tribble’s trial that all the hairs recovered from a stocking found by a police dog about a block from the murder scene were human, and one matched Tribble “in all microscopic characteristics.” When forwarding the hair samples to the private lab in August 2012, Harold Deadman, a senior hair analyst with the D.C. police who worked for 15 years at the FBI crime lab, submitted a report stating he had found 13 hairs, one of which had the characteristics of a Caucasian.

The private lab’s DNA test results, however, showed that 12 of the hairs came from three different human sources, all of African origin, and the other was a dog hair. None had come from Tribble.

“Such is the state of hair microscopy,” Levick wrote in a court filing. “Two FBI-trained analysts, James Hilverda and Harold Deadman, could not even distinguish human hairs from canine hairs.”

In Odom’s case, FBI Special Agent (and crime lab hair analysis unit director) Myron Scholberg testified that a hair from the victim’s nightgown was “microscopically like” Odom’s, and that he had been unable to distinguish between hair samples from different individuals only “eight or ten times in the past ten years.” But DNA testing at the same private lab used in Tribble’s case, Mitotyping Technologies, determined the hair could not have come from Odom. A second lab, selected by prosecutors, tested stains left on a pillowcase and robe at the crime scene; the results also excluded Odom as the perpetrator.

State Forensic Hair Examiners

If there is a flood of flawed forensic hair comparisons by the FBI lab, there must be a tsunami of faulty comparisons by state labs – especially those trained by the FBI. The FBI crime lab trained over 1,000 state hair and fiber analysts; Deadman alone trained 600. No one knows how many cases the state analysts handled, but it could be hundreds of thousands.

Not only were state examiners trained in the flawed methods used at the FBI crime lab; worse, their training was cursory compared to that of FBI analysts. An FBI agent was required to spend an entire year working with an experienced analyst before testifying at trial. State and local analysts underwent a one-week course at the FBI academy, then were turned loose with the credential of having been trained by the FBI. Many immediately began testifying in criminal cases about the “matches” they had made despite having received no other training. Some were used as “criminalists” and told to compare everything imaginable, from paint chips to glass shards, though they had no instruction in those areas.

FBI spokeswoman Ann Todd said the training of state and local crime lab analysts was “continuing education,” intended to supplement formal training at other labs. She noted the FBI did not qualify the examiners, which was the responsibility of individual labs and certifying organizations. But neither did the FBI monitor what was being done by analysts with the coveted credentials of having received training by the FBI crime lab.

“There was no monitoring of people.... The whole thing for something this complex was ill-conceived and maybe [the FBI] should have recognized that,” said one expert, referring to the training of state and local analysts in the early 1990s.

Former Montana crime lab director Arnold Melnikoff, who later worked at the lab for the Washington State Patrol, was fired in 2004 after investigators reported egregious scientific errors regarding the accuracy of hair comparison analyses in over 700 cases dating back to the 1970s. [See: PLN, Aug. 2008, p.34; March 2006, p.28; Nov. 2004, p.12]. Melnikoff defended his testimony, saying he was FBI-trained and had testified to the same extent as FBI lab agents.

Melnikoff’s flawed testimony led to the reversal of at least three convictions. In another case, a man convicted of sexual assault was exonerated by DNA tests after serving over 15 years in prison; at his trial, Melnikoff had testified there was less than one chance in 10,000 of a coincidental microscopic hair match. He arrived at that official-sounding statistic by multiplying together the 1-in-100 frequencies at which he claimed to have been unable to distinguish head hairs and pubic hairs. Yet Melnikoff had a point: he was simply following the practices taught at the FBI crime lab.
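The arithmetic behind that 1-in-10,000 figure is easy to reconstruct; a sketch, assuming (as Melnikoff implicitly did) that the two claimed frequencies could simply be multiplied as independent events:

```latex
P(\text{coincidental match}) \approx \underbrace{\frac{1}{100}}_{\text{head hairs}} \times \underbrace{\frac{1}{100}}_{\text{pubic hairs}} = \frac{1}{10{,}000}
```

Neither the 1-in-100 frequencies nor the independence assumption had any empirical basis, which is what made the resulting statistic scientifically meaningless despite its precise-sounding form.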

“I took the [FBI] class in 1982 and was not advised to avoid the use of probabilities.... We were taught that our own experience was most important and that is what Mr. Melnikoff was doing,” stated 24-year Oregon State Police veteran Michael A. Howard. “They didn’t say, ‘Use it,’ and they didn’t say ‘Don’t use it.’” Instead, they said, “You’re going to have to decide for yourself, based on your experience, how strong you can state it.”

Joyce A. Gilchrist, the supervisor of Oklahoma City’s police crime lab, was fired in 2001 after an FBI reviewer questioned her testimony in 1,400 cases, saying she made claims about hair matches that were “beyond the acceptable limits of science.” In court filings, it was shown that Gilchrist had no hair comparison training other than the one-week FBI course, and was not supervised at the city crime lab.

Testifying in the case of David Johns Bryson, who spent 16 years in prison for rape before being exonerated by DNA in 2003, Gilchrist said, “I would think it would be impossible not to be able to distinguish hairs from two different individuals.”

Although Bryson filed a lawsuit over his wrongful conviction and settled with Gilchrist for $16.5 million, Oklahoma City refused to indemnify her and the Tenth Circuit ruled against Bryson’s efforts to force the city to pay the settlement. [See: PLN, April 2012, p.48].

Forensic Bite Mark Analysis

Another controversial branch of forensic science is bite mark analysis, in which experts try to match suspects’ teeth to bite marks left on victims. Using DNA tests, the Innocence Project of New York recently helped prove that bite mark analysis had resulted in the wrongful conviction of Gerard Richardson for the 1994 murder of Monica Reyes in Elizabeth, New Jersey. The Innocence Project has identified 17 other similarly flawed cases and believes this is only the tip of the bite mark wrongful conviction iceberg. According to Innocence Project director of strategic litigation M. Chris Fabricant, bite mark analysis should be abolished because it is “based on subjective speculation” and “responsible for more wrongful convictions than any other forensic discipline that’s still in practice today.” [See, e.g., PLN, April 2006, p.20].

An article in the peer-reviewed journal Science & Justice agreed, concluding that there was a “lack of valid evidence to support many of the assumptions made by forensic dentists during bite mark analysis.”

Two prisoners who were sentenced to death, in part based on bite mark evidence, were subsequently exonerated by DNA – including Ray Krone, called the “snaggletooth killer,” who served over a decade in prison.

Dr. Gregory Golden, former president of the American Board of Forensic Odontology, supports the use of bite mark analysis. He estimates that it has been used to convict defendants “in nearly 1,000 cases” and is accurate 98% of the time. Of course, even with his optimistic statistics, that means 20 people were wrongly convicted based on incorrect bite mark evidence.

But the flawed analysis rate is likely much higher than Golden’s estimate. In Mississippi there has been a 20-year-long scandal in which dentist Michael West, supported by coroner Dr. Steven Hayne, performed bite mark analyses that no one else could duplicate. [See: PLN, Jan. 2011, p.1; Oct. 2010, p.1].

Two Mississippi prisoners whose convictions involved bite mark comparisons by Dr. West, Kennedy Brewer and Levon Brooks, were exonerated through DNA tests. Brewer served 13 years in prison; Brooks, 15. Both received $500,000 in compensation from the state.

According to an article published by The Clarion-Ledger, Dr. West stated in a 2011 deposition that he “no longer believe[s] in bite-mark analysis.” He added, “I don’t think it should be used in court. I think you should use DNA. Throw bite marks out.”

West said he had been involved in 16,000 cases and testified as a bite mark expert 81 times over almost three decades. Hayne was barred from conducting autopsies for the State of Mississippi in 2008, and lawmakers finally agreed that his work requires scrutiny.

“Our experience with Dr. Hayne is extremely troubling, and there needs to be a look at anything he is involved in,” state Senator Hob Bryan, chairman of the Judiciary Committee, said in June 2013.

Asked about the longstanding reluctance of Mississippi officials to address issues related to errors made by West and Hayne in criminal cases, forensic pathologist Vincent Di Maio stated, “This is a political problem, not a medical problem. The government needs to do the right thing. But doing the right thing could hurt political careers, subject people to lawsuits, and force people to admit to making mistakes. Governments are made up of people. And people don’t like to put themselves in those positions.”

The American Board of Forensic Odontology, which supports the validity of bite mark comparison analysis, sponsored a “Bitemark Workshop” on February 15, 2015.

“Bitemark analysis, involving every phase of the process, from evidence collection of a patterned injury, the analysis of that patterned injury, the comparison of bitemarks to suspected biters and the linkage conclusions made therefrom, has been and continues to be reliable and accepted within the relevant scientific community,” the Board said in a statement.

Other controversial methods of forensic analysis – such as “dog scent” evidence, early arson investigation techniques and comparative bullet lead analysis (CBLA, which was discontinued by the FBI crime lab in 2005) – have been widely discredited, though all have been used to secure convictions.

State Crime Labs in Crisis

Beyond problems with hair comparisons, bite mark analysis and other questionable forensic investigatory methods, there is a general crisis with respect to crime labs. In 2009, the National Academy of Sciences, the nation’s premier scientific institution, released a comprehensive report on crime labs in the United States. The report found serious deficiencies and made a number of recommendations for improvement; six years later, however, few labs have implemented those recommendations, and the problems persist.

California

Former San Francisco police crime lab analyst Deborah Madden, 64, was sentenced in July 2013 to one year of home confinement, five years of probation, a $5,000 fine and 300 hours of community service after pleading guilty to a federal misdemeanor charge of cocaine possession. Following her arrest, Madden admitted she had taken cocaine from the crime lab in an attempt to control an alcohol problem. Her misconduct led to the dismissal of more than 700 drug cases and the temporary closure of the lab’s drug unit.

In separate state criminal proceedings, Madden pleaded guilty to possessing 0.09 grams of cocaine found when her home was searched, and was sentenced to participate in drug counseling. She had previously been tried twice on a felony charge of obtaining cocaine by deception or fraud; both trials ended in hung juries.

Colorado

On June 7, 2013, Colorado Attorney General John Suthers announced that a report on an internal review of the Colorado Department of Public Health and Environment’s Laboratory Services Division, which tests evidence in criminal cases, was highly critical of the lab.

The report found that analysts felt they were “not adequately trained to provide fact or expert testimony in court,” blood-alcohol training protocols were inadequate and the toxicology lab was understaffed. It also found that refrigerators used to store blood and urine samples were not secured, making them accessible to unauthorized personnel. Further, a supervisor “made statements that suggest s/he is biased against defendants in criminal cases” and “impose[d] unreasonable burdens on toxicology analysts by making excessive accommodations for prosecutors and law enforcement agencies.”

Defense attorneys questioned why the report was withheld for over two months before being released.

An internal review of the lab also revealed that former supervisor Cynthia Burbach had operated with a bias “against individuals accused of crimes who were seeking exoneration at trial” and “in favor of getting convictions over doing justice and of helping prosecutors win trials over advancing science and the truth seeking process.” Burbach, who was the lab’s supervisor for more than a decade, abruptly resigned shortly before the review was made public.


Florida

Former Florida Department of Law Enforcement crime lab chemist Joseph Graves, 33, was arrested on February 4, 2014 on charges of grand theft, tampering with evidence and drug trafficking, in connection with stealing and selling prescription drugs from the lab. He was accused of switching the stolen drugs with over-the-counter medications. Graves was arraigned on 41 additional counts of drug trafficking in May 2014; previously freed on bond, he was booked into the Bay County jail with a new bond of over $1 million. He had reportedly worked on 2,600 criminal cases from 25 counties over an eight-year period.

As a result of Graves’ arrest, some convictions involving drug evidence may be called into question. Further, the Florida Department of Law Enforcement announced that it is considering policy changes, including policies related to employee drug testing.


Massachusetts

A massive scandal at the Massachusetts state crime lab in Boston revolved around Annie Dookhan, 36, a chemist. Dookhan was arrested in September 2012 on 26 felony counts of obstruction of justice and one count of misdemeanor perjury. The obstruction charges stemmed from allegations that she had improperly removed drug samples from the evidence room, forged co-workers’ signatures on forms to cover up those removals, added weight to drug samples to meet the whims of prosecutors, added cocaine to samples that had tested negative for that drug and “identified” drugs in evidence samples by their appearance rather than using chemical tests. The perjury charge related to an allegation that Dookhan had falsely testified she held a Master’s degree in chemistry from the University of Massachusetts.

Dookhan was considered a “superwoman” by colleagues, testing over 500 samples a month when 150 was typical. But her co-workers were also suspicious, according to a 100-page report issued by the Massachusetts State Police. A supervisor said he “never saw Dookhan in front of a microscope” and a chemist stated she “would submit a cocaine sample and it would come back heroin, or vice versa.” Regardless, Dookhan continued to testify in criminal cases even after being suspended during an investigation, and lab officials failed to tell defense attorneys about her suspension.

During her nine-year career at the state crime lab, Dookhan tested over 60,000 samples involving more than 40,000 defendants. Those cases are being reviewed and already hundreds of prisoners have been freed due to Dookhan’s involvement in their convictions.

Special courts were set up to review the cases. Boston Mayor Thomas Menino requested $15 million from the state to pay for emergency housing assistance, mental health services and drug treatment needed for the hundreds of former prisoners returning to the city after being released due to the crime lab reviews. Governor Deval Patrick ordered the state crime lab closed in August 2012, and asked the legislature to appropriate $30 million to cover costs associated with the scandal.

“Any person who’s been convicted of a drug crime in the last several years whose drugs were tested at the lab was very potentially a victim of a very substantial miscarriage of justice,” said John Martin, a defense attorney who represented David Danielli, the first prisoner freed as a result of the scandal. “Talented defense attorneys will be able to strongly suggest that any results from that lab are tainted, and people who deserve to be incarcerated for a very long time are going to walk – and that’s the reality of it.”

The drug evidence unit at the Massachusetts state crime lab was closed and its duties transferred to the state police lab. At least five officials charged with overseeing the state crime lab resigned or were fired, including Department of Public Health Commissioner John Auerbach, who stepped down in September 2012. Norfolk Assistant District Attorney George Papachristos resigned one month later after emails revealed that he had had an improper, nonsexual personal relationship with Dookhan that may have influenced the forensic results she reported in the cases he prosecuted.

On April 1, 2013, Sonja J. Farak, a drug analyst at the Massachusetts State Crime Laboratory at Amherst, was charged with four counts of tampering with drug evidence samples. In two of the cases, prosecutors allege Farak added counterfeit drugs to the samples to cover up her theft of drugs. In the other two cases the samples disappeared altogether. She was also charged with possession of cocaine.

Farak once worked with Dookhan at the state crime lab in Boston, known as the William A. Hinton State Laboratory Institute. The Amherst lab closed in January 2013 when Farak was initially arrested; it was replaced with a state police lab in Springfield. Farak pleaded guilty to theft, tampering with evidence and drug possession in January 2014, and was sentenced to 2½ years in prison with 18 months to serve and five years’ probation.

Dookhan pleaded guilty to 27 counts of evidence tampering and obstruction of justice for falsifying forensic test results and received a three-to-five-year prison sentence plus two years of probation on November 22, 2013.

“The chapter in Annie Dookhan’s life is over, but the story goes on for the thousands of individuals who have been affected by her conduct, by the conditions of the lab, and by the effect that it has had on the criminal justice system as a whole,” declared Anthony J. Benedetti, chief counsel for the Committee for Public Counsel Services.

A report issued by the Office of the Inspector General in March 2014 concluded that while no other analysts had engaged in misconduct at the crime lab, “The management failures of [] lab directors contributed to Dookhan’s ability to commit her acts of malfeasance. The directors were ill-suited to oversee a forensic drug lab, provided almost no supervision, were habitually unresponsive to chemists’ complaints and suspicions, and severely downplayed Dookhan’s major breach in chain-of-custody protocol upon discovering it.”


Minnesota

The police crime lab in St. Paul, Minnesota has also experienced a major scandal. Drug testing at the lab was suspended in July 2012 after employees testified there were no written protocols or standard operating procedures for technicians to follow and one of the drug testing machines was clogged with cocaine but still being used to test samples. Lab director Sgt. Shay Shackle was removed from his position after he testified that he did not understand how the lab’s machines worked and had no knowledge of national standards for crime labs.

Lab workers further stated they never calibrated or verified testing equipment or even checked to see if it was operating properly, didn’t keep a log of when materials used in the testing had expired and didn’t consider the margin of error when weighing drug samples. One employee said some evidence had been stored in a hallway, not in a secured room.

A state Bureau of Criminal Apprehension expert testified that the crime lab and the evidence samples kept there might be contaminated with drugs and toxic chemicals from two improperly-vented gas chromatography/mass spectrometry machines, and that employees never tested lab surfaces for contamination. One worker said she didn’t change gloves after each test and sometimes reused disposable equipment, possibly contaminating subsequent evidence samples.

The problems at the St. Paul crime lab were exposed by public defenders Lauri Traub and Christine Funk. The lab provides drug testing and other services for Ramsey, Washington and Dakota counties. Drug samples from those counties were sent to the Bureau of Criminal Apprehension lab until August 2013, when the St. Paul crime lab resumed operations with improved training and procedures, and with four police officers assigned to the lab instead of the previous two.

“In some ways, this is even worse than what has happened in Boston and elsewhere,” said Traub. “These people didn’t know what they were doing. They had no business running a lab in the first place. And yet they came into court every day and acted as if they did.”

The county attorneys from the three affected counties announced that all drug samples would be retested in 350 pending cases. They did not say what they would do about prisoners who had been convicted based on tests previously performed by the crime lab.

In November 2013, Traub filed a federal lawsuit against various police departments and sheriff’s offices in Minnesota, claiming that officers had improperly accessed her driver’s license and motor vehicle information several years before she exposed problems at the lab.


New York

In New York City, the medical examiner’s office announced in January 2013 that it was reviewing more than 800 rape cases spanning a 10-year period after an internal audit uncovered possible mishandling of DNA evidence by Serrita Mitchell, a technician who resigned in 2011. At the time of the announcement, the review had already turned up 26 cases in which DNA was present but had not been detected by Mitchell and 19 cases in which DNA from two different cases was mixed together. One person was arrested in a decade-old case due to the newly-discovered DNA evidence.

The medical examiner also revealed that in 55 cases, the perpetrator’s DNA profile had been uploaded to the state database and checked for matches, but was not checked against the federal DNA database.

In Nassau County, New York, serious problems in drug analyses performed at the county’s crime lab, which officials knew about but did nothing to correct, led to the lab’s closure in February 2011. Twice in the previous four years, the crime lab had been placed on probation following highly critical reports by a private accrediting agency, the American Society of Crime Laboratory Directors/Laboratory Accreditation Board (ASCLD/LAB).

The police officials supervising the lab had failed to report its deficiencies to the district attorney or county executive. New York Governor Andrew Cuomo ordered the state’s Inspector General to investigate. Her 166-page report, released in November 2011, found “chronic managerial negligence, inadequate training, and a lack of professional standards,” and said the lab “lacked formal and uniform protocols with respect to many of its basic operations, including training, chain of custody and testing methods.” The report further criticized the New York State Commission of Forensic Sciences for abdicating its responsibility to accredit and monitor the state’s crime labs.

“The confluence of these failures in oversight enabled the [lab] to operate as a substandard laboratory for far too long,” Inspector General Ellen N. Biben said in the report. “In doing so, these failures deprived Nassau County, the criminal justice system and the public of their right to have complete and unfettered confidence in forensic testing.”


North Carolina

An independent audit of the North Carolina State Bureau of Investigation crime lab in 2010 found that analysts had distorted or withheld evidence in more than 230 cases over a 16-year period. The audit was triggered by the exoneration of a prisoner who served 16 years of a life sentence for a murder he did not commit. His wrongful conviction revealed that the lab had a long-standing policy of reporting preliminary tests for the presence of blood as positive even when more-accurate confirmatory tests were negative or inconclusive. An investigation by the Raleigh News & Observer revealed a crime lab that was unabashedly pro-prosecution, where prosecutors made performance evaluations of lab analysts based upon how favorable their testimony was to the prosecution’s case.

The crime lab had been inspected and accredited by ASCLD/LAB five times during the period that the serology section was misrepresenting the results of blood tests. This called into question the value of such accreditation, as did the fact that ASCLD/LAB is based in Raleigh and run by two former state crime lab agents. Even the organization’s executive director, John Neuner, noted that “Accreditation can’t prevent wrongdoing by corrupt individuals.”

“ASCLD/LAB could more properly be described as a product service organization which sells, for a fee, a ‘seal of approval’ ... which laboratories can utilize to bolster their credibility,” said Marvin E. Schechter, a member of the New York State Commission of Forensic Science.


Texas

In March 2013, the Texas Court of Criminal Appeals – the state’s highest court over criminal cases – began overturning convictions due to a scandal at an accredited crime lab in Houston operated by the Texas Department of Public Safety.

According to a report by the state’s Forensic Science Commission, it was known for years that lab analyst Jonathan Salvador had a high error rate on drug tests and a poor understanding of the chemistry required by his job. Nevertheless, his supervisors promoted him due to a culture at the lab that only took into account time in service. “The section supervisor indicated that ... he did not feel it was appropriate to deny a promotion unless that person was totally inept, which Salvador was not,” the report stated.

Salvador was fired after an investigation by the Texas Rangers found he had falsified drug test results in one case, using the results from testing in a different case – a practice known as dry-labbing. He had performed tests in 4,944 drug-related cases from 36 counties. Then-Harris County District Attorney Pat Lykos submitted the findings of the investigation to a grand jury, which did not return an indictment.

The Court of Criminal Appeals first began overturning convictions involving drug tests by Salvador in which the evidence had been destroyed, but in Ex Parte Patrick Lynn Hobbs, 393 S.W.3d 780 (Tex. Crim. App. 2013), the Court wrote, “While there is evidence remaining that is available to retest in this case, that evidence was in the custody of the lab technician in question. This Court believes his actions are not reliable; therefore custody was compromised, resulting in a due process violation.”

However, this does not mean that all criminal convictions involving evidence tested by Salvador will be reversed. On June 4, 2014, in Ex Parte Coty, 432 S.W.3d 341 (Tex. Crim. App. 2014), for example, the Court of Criminal Appeals denied habeas relief in a case where retesting of the drug evidence confirmed that it was cocaine.

Incentives for Misconduct at Crime Labs

A corrupt lab technician might tamper with evidence to cover up the theft of drugs. Another analyst may falsify test results to help the police or prosecutors obtain a conviction, or to seek personal glory and acclaim, or to achieve professional advancement. In those kinds of cases, errors made by crime labs are often portrayed as the work of a “bad apple.” Rarely is it considered that well-intentioned lab analysts might also skew their forensic conclusions.

Yet a 2013 research study published in the journal Criminal Justice Ethics, authored by Roger Koppl and Meghan Sacks, found that fingerprint analysts will change the results of their comparisons almost 17% of the time if they are presented with biased information, such as a police officer saying, “We know this is the guy, we just need you to confirm it.” That isn’t surprising, because fingerprint analysis contains a subjective component in which the analyst compares two prints and decides whether they are the same. Hair, fiber, firearms and tool mark comparison – and even DNA analysis – also contain subjective components and presumably are subject to influence by information from outside sources. Such bias typically operates at the level of individual analysts, but the more closely a crime lab works with police and prosecutors, the more of its workers one would expect to succumb to outside influence.

“People say we’re tainted for the prosecution,” said an FBI crime lab veteran. “Hell, that’s what we do! We get our evidence and present it for the prosecution.”

Renowned FBI fingerprint expert Bruce Budowle acknowledged that “a latent print examiner tends to approach the comparison to ‘make an ident’ rather than to attempt to exclude” a suspect. That might lead the examiner to focus only on similarities in two prints while ignoring differences, yet there only needs to be one significant difference to show that the prints came from different people. This type of error could affect the accuracy of an entire section of a crime lab.

As for the “bad apple” argument, according to New York University law professor Erin Murphy, “It’s not as though this is one bad apple or even that this is one bad-apple discipline. There is a long list of disciplines that have exhibited problems, where if you opened up cases you’d see the same kinds of overstated claims and unfounded statements.”

But what might prejudice a crime lab to skew results of forensic testing in favor of the prosecution? The answer is a financial interest in convictions.

A dirty open secret about crime labs in many jurisdictions is that they are paid only if the defendant is convicted. This is because some labs are funded through fees assessed as court costs, and those fees are paid following a conviction. By statute, public crime labs in at least 14 states receive all or part of their funding in this manner.

One could assume that an individual analyst would not skew results simply because the lab stands to make or lose money. However, budget-strapped crime labs are more likely to cut analyst positions and less likely to grant promotions and raises, so each employee’s job depends in a concrete way on the lab’s financial health.

“Collection of court costs is the only stable source of funding for the Acadiana Crime Lab. $10 is received for each guilty plea or verdict from each speeding ticket and $50 from each DWI (Driving While Impaired) offense,” said Ray Wickenheiser, director of the New York State Police Crime Lab System. Likewise, according to Koppl and Sacks’ study, funding for the crime lab in Broward County, Florida comes “principally [from] court costs assessed upon conviction of driving or boating under the influence ($50), or selling, manufacturing, delivery or possession of a controlled substance ($100).”

A North Carolina statute requires payment of a $600 fee to the crime lab upon conviction if DNA testing was performed. Illinois law mandates the payment of fees upon conviction in a sex offense, DWI or drug case. Alabama, Kentucky, Michigan, Mississippi, New Jersey, New Mexico and Virginia also have crime lab funding statutes.

Washington state law requires a $100 fee for any conviction involving crime lab analysis; in Kansas, the fee is $400. Similar fees exist in Arizona, California, Missouri, Tennessee and Wisconsin.

Koppl and Sacks wrote that their research indicated “the very choice to submit a suspect’s sample to the lab makes the lab more inclined (than it would be otherwise) to announce a match, indicating that the suspect is guilty.” Add to that a financial interest in the outcome of a case, and it is easy to see how an entire crime lab – especially one that is under the control of police or prosecutors – might become prejudiced against defendants.

The question then becomes how to avoid such bias.

The Path Forward

Two basic strategies have been widely proposed for improving the performance of crime labs: oversight and uniform standards. Texas is the leading state in terms of oversight.

In 2002, a news series by a Houston television station reported mishandling of DNA evidence at the city’s police lab. Retesting of 3,500 samples found that analysts had fabricated or misrepresented test results, which led to the release of at least three wrongly convicted prisoners, including Ronald Taylor, who served 12 years for rape before being exonerated by DNA evidence. The scandal spurred the legislature into action.

Lawmakers created the Texas Forensic Science Commission in 2005 to investigate claims of misconduct in the state’s crime labs and make recommendations on how to conduct forensic testing and other scientific aspects of crime investigation. By 2012 the Commission had investigated 40 complaints and made many recommendations for improvements.

The budget for the agency is only around $250,000 a year, and the Commission has not escaped controversy – including when then-Governor Rick Perry replaced three members of the Commission days before a hearing on a report that an innocent man may have been executed based on faulty arson evidence. [See: PLN, Oct. 2010, p.40].

Both the Dallas County and Harris County District Attorney’s offices operate conviction integrity units that investigate allegations of misconduct in forensics and other aspects of criminal investigations; similar units exist in Philadelphia, Chicago, Manhattan and elsewhere. [See: PLN, Nov. 2014, p.1]. The Texas legislature also appropriates about $9 million a year for continuing education in new research and practices to avoid wrongful convictions.

The District of Columbia’s crime lab is leading the charge when it comes to the implementation of standards. The $220 million D.C. lab, which opened in 2012, is the first crime lab in the nation to incorporate many of the recommendations made by the National Academy of Sciences. This means, among other things, that firearm analysts maintain a “reference library” of bullets recovered from crime scenes and use a water tank for known bullet recovery, and that the lab conducts its own DNA testing rather than outsourcing such tests. Most important is the fact that the crime lab is civilian-run and independent of police and prosecutors; it has its own crime scene investigators who collect samples and usually do not know who, if anyone, has been arrested for a particular crime. This independence should help prevent even inadvertent improper influence of lab workers by the police and prosecution.

But even the model D.C. crime lab has been accused of wrongdoing. The Washington Post reported in March 2015 that prosecutors had stopped sending DNA samples to the lab, alleging that analysts made mistakes when interpreting DNA test results. Independent experts have been hired to review 116 cases, and D.C. Mayor Muriel Bowser recently ordered an audit of the lab.

Further, some critics argue that the error rate at crime labs decreases significantly when analysts know their results could be subjected to retesting. Therefore, they advocate for a right to forensic testing by the defense, in which defense counsel is allowed to have retesting performed by an independent lab. If the defendant is indigent, such retesting would be paid for in a manner similar to the payment of court-appointed attorneys.

Another useful reform for crime labs generally would be to require lab directors and employees to have scientific credentials, such as degrees in the forensic fields in which they practice. In some cases there is no requirement that lab personnel have scientific backgrounds, nor is that a requirement for accreditation by ASCLD/LAB.

“If they started requiring labs to be led by scientists, 60 to 70% would have to close down immediately,” said Brent E. Turvey, an adjunct professor at Oklahoma City University and the author of Forensic Criminology.

Joseph P. Bono, former president of the American Academy of Forensic Sciences, agreed: “In an ideal world, all labs would be run by scientists in government agencies, but we don’t live in an ideal world.”

Conclusion

There are 389 publicly-funded crime labs nationwide: 210 are state or regional labs, 84 are county labs, 62 are municipal labs and 33 are federal. Yet there are no standardized protocols or policies, leaving each lab a kingdom unto itself. Hence, the widespread scandals described above will likely continue so long as crime labs remain largely unregulated, lacking in uniform standards and beholden to law enforcement.

If one accepts an estimated wrongful conviction rate of 4.1%, based on a study of exonerations in death penalty cases published by the National Academy of Sciences’ journal in April 2014, then with over one million felony convictions in the United States each year there are at least 41,000 wrongful convictions annually.
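The article’s figure follows from simple arithmetic, sketched below using only the numbers stated above (a 4.1% estimated wrongful conviction rate and roughly one million felony convictions per year); both inputs are the article’s estimates, not new data:

```python
# Back-of-the-envelope check of the wrongful conviction estimate.
# Both figures come from the article itself: a 4.1% rate (from the
# April 2014 PNAS death penalty exoneration study) applied to an
# estimated one million U.S. felony convictions per year.
wrongful_rate = 0.041
felony_convictions_per_year = 1_000_000

wrongful_per_year = round(wrongful_rate * felony_convictions_per_year)
print(wrongful_per_year)  # 41000
```

Since the one-million figure is itself described as a floor (“over one million”), the 41,000 result is likewise a lower bound.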

Many of these injustices are due to errors by crime labs, whether intentional or not, and public officials who oversee our justice system have a responsibility to take action to reduce the number of wrongful convictions. Adoption of the recommendations made by the National Academy of Sciences as mandatory uniform standards for crime labs, establishing stringent regulation and oversight, isolating crime labs from the influence of police and prosecutors, and recognizing a right to forensic testing by defense counsel are good first steps to achieve that goal. Assuming, of course, that those in charge of our nation’s criminal justice system care more about achieving justice than simply obtaining convictions.


Sources: “The Criminal Justice System Creates Incentives for False Convictions,” by Roger Koppl and Meghan Sacks, Criminal Justice Ethics (July 2013); “An Assessment of the 1996 Department of Justice Task Force Review of the FBI Laboratory,” DOJ Office of the Inspector General (July 2014); The Washington Post; Boston Herald; The Boston Globe; Detroit News; Associated Press; www.abajournal.com; www.npr.org; www.cbsnews.com; www.pewstates.org; CNN; www.huffingtonpost.com; www.fbi.gov; www.minnesotapublicradio.org; www.wctv.tv; www.correctionsone.com; www.kdvr.com; The New York Times; www.wuwf.org; http://gritsforbreakfast.blogspot.com; www.theverge.com; www.houstonchronicle.com; www.johntfloyd.com; www.forejustice.org; www.startribune.com; www.hattiesburgamerican.com; The Texas Tribune; www.dailyreportonline.com; www.businessinsider.com; www.innocenceproject.org


Related legal cases

Ex Parte Coty, 432 S.W.3d 341 (Tex. Crim. App. 2014)

Ex Parte Patrick Lynn Hobbs, 393 S.W.3d 780 (Tex. Crim. App. 2013)