
G o o d   M o r n i n g !
via THE WEEKLY DETAIL
 
Monday, August 22, 2005

 
The purpose of the Detail is to help keep you informed of the current state of affairs in the latent print community, to provide an avenue to circulate original fingerprint-related articles, and to announce important events as they happen in our field.
_________________________________________
Breaking NEWz you can UzE...
compiled by Jon Stimac

Fingertip Transactions ST LOUIS POST-DISPATCH, MO - Aug 19, 2005 ...no cash, no credit card or no checks equals no problem -- just use your finger...

Marine Who Deserted In 1965 Caught In Florida MARINETIMES.COM, VA  - Aug 17, 2005 ...fingerprints matched those of a 40-year deserter...

Madrid Bombing Suspect Arrested in Serbia   LEDGER, FL  - Aug 17, 2005 ...sample of his fingerprints to Interpol, where they matched prints provided by Spain...

Britain's Thickest Shoplifter   GLASGOW DAILY RECORD, UK  - Aug 20, 2005 ...he's caught nicking a computer from a shop selling CCTV cameras...

__________________________________________
Recent Message Board Posts
CLPEX.com Message Board

I need input, ALL trainers, Kasey, Pat, Glenn, Ron, Jon, etc.
Guest Sat Aug 20, 2005 3:54 pm

AFIS
stacy Thu Aug 18, 2005 3:19 pm

ThermaNin
Michelle Wed Aug 17, 2005 4:39 am

Fluorescein
Mike French Tue Aug 16, 2005 2:51 pm


(http://clpex.com/phpBB/viewforum.php?f=2)

UPDATES ON CLPEX.com

No major updates on the website this week.


_________________________________________

Last week

we looked at the recent NPR radio program that was critical of forensic science.  It was based on data from the Innocence Project, as is this week's Detail.

This week

Glenn Langenburg brings us three points in response to a recent article that appeared in Science magazine.  Unless you are a member of Science Online, access to the full article costs $10: www.sciencemag.org/cgi/content/abstract/309/5736/892

Abstract: Converging legal and scientific forces are pushing the traditional forensic identification sciences toward fundamental change. The assumption of discernible uniqueness that resides at the core of these fields is weakened by evidence of errors in proficiency testing and in actual cases. Changes in the law pertaining to the admissibility of expert evidence in court, together with the emergence of DNA typing as a model for a scientifically defensible approach to questions of shared identity, are driving the older forensic sciences toward a new scientific paradigm.

Glenn relates:
"I am responding to the article by Saks and Koehler, from the August issue of Science magazine. This is a rough draft of the letter I am sending the editors. It is longer than I had anticipated, and it needs to be trimmed. I may send two versions to them and just see what happens. Any comments would be appreciated. I am trying to stick to the big points, because there was a lot of little "nit-picky" stuff I had to just chalk up to "Saksian statements".

I think it is very important for the community to write the editors and respond to this article. Unlike the critics' usual platform of legal and forensic science journals, they are now biasing a general science community with their one-sided, and at times blatantly false, arguments. We need to show them that if they are going to step into a scientific arena, then they have to play by the same rules that we do and "ante up" if they wish to play at the table with the Big Boys (and Girls, of course). I would encourage others to write Science magazine and let them know your opinions. I would also encourage Science magazine to contact us for an informative article on forensic science and the issues that we face."

_________________________________________
Response of a Scientist to Messrs. Saks and Koehler's article entitled "The Coming Paradigm Shift in Forensic Identification Science" in Science, vol. 309, Aug 2005.
by Glenn Langenburg

The author holds a Master of Science in Chemistry and is a Ph.D. candidate at the University of Lausanne in Switzerland, where his thesis concerns the statistical analysis of fingerprint identification methodology.  He is currently employed as a Forensic Scientist by the Minnesota Bureau of Criminal Apprehension and teaches introductory forensic science at two universities in Minnesota.

Having followed the works of Michael Saks for years in the forensic science and law review journals, I read with surprise and interest that he would publish in a general scientific publication read by millions of scientists and "scientifically intrigued minds" worldwide. Great was my dismay when I realized that Mr. Saks has once again published an article critical of forensic science, with little proof, substance, or research to substantiate his attacks. The data and proof he does offer are at best woefully "misinterpreted" and at worst deliberately misrepresented. I recognize that Mr. Saks is not the sole author of this paper, and from this point on I will simply refer to the authors jointly.

I agree with the basic premises of the authors' article. There IS a paradigm shift in forensic identification science. Research opportunities have become available in the last decade, in part due to Daubert (admissibility) challenges. Other influences not discussed by the authors include: 1) affordable and powerful computer systems and software, never available in the past, for analyzing large databases of images; 2) an increase in scientific education and an evolution of the modern forensic scientist; 3) the biometric community's interest in identification methods; and 4) research grant money, traditionally pumped into DNA research, that has now become available. Research is vital to improving the forensic identification sciences and the methodologies that we employ.

Beyond this basic message, the authors, who do not work within this field and have a superficial knowledge of forensic science, made numerous misstatements. I wish to address three major areas of concern to inform your readers of the gross misinformation they have read.

“Lacking theoretical or empirical foundations, [there is an] assumption of discernible uniqueness”

This is not true. All of the established identification sciences (e.g., fingerprints, firearms examination, handwriting comparison) have established theoretical and empirical foundations for determining that two impressions share a common source (a match). The manner in which information is transferred from a unique biological or physical source is well known and well researched in each discipline. What the authors have failed to distinguish and state clearly is that HOW the expert makes the determination is the issue. The BASIS and foundation is there, not an assumption as the authors state; the real issues are how we make matches and whether we can improve our methodologies. Research is investigating our standards, criteria, and methods for declaring matches.

The technology, databases, and opportunities have not been available. For 100 years, examiners have done the best they can with training and experience. We are now using these new tools to investigate better methods and criteria for forensic identifications. Our experience in this field has shown that, yes, errors can and do happen, and like any scientific method, the method is only as fallible as the practitioner. The number of reported errors has been relatively minuscule in the big picture, but there is always room for improvement.

The authors’ “study” and graph

Perhaps through no fault of the authors' own, the popular media that have seized on this article have referred to the authors' data from the Innocence Project as a "study". My response to this misnomer is near vitriolic. The very thing that Mr. Saks has criticized forensic scientists for in the past, he apparently feels he is above.

The data shown in their graph have not been published in any scientific journal, nor have they been peer-reviewed. There is NO discussion of their sampling techniques, methods, or procedures. There simply is no foundation for these rather damning data.

The Innocence Project currently has 160+ cases of wrongful convictions overturned, a true testament to the need for this valuable organization. Yet the authors cited only 86 cases. How did they choose these 86? Forensic testing and testimony were a major cause of the erroneous convictions. Were they from a handful of scientists or many, i.e., a few "bad apples" or rampant incompetence?

Did the authors actually do a study, the way such a study should be done? Did they retest the evidence in these cases and then have the appropriate experts review the original scientists' work and testimony to determine whether the test was improperly conducted or the testimony was "false or misleading" based on the original scientist's results? OR were these cases and testimonies reviewed by lay people, law students, and Innocence Project volunteers, with a scientific determination made by these unqualified individuals? And just what were the criteria of the authors, who have no experience themselves within this field, for "false/misleading testimony" by forensic scientists?

Of the "forensic testing errors", were these actual testing errors or simply limitations of the tests and technology of the era? Mr. Koehler, in his August 12, 2005, NPR interview regarding this article, referred to serology tests that included an individual whom more discriminating DNA tests, performed years later, could exclude. Are these the types of "errors" counted in this "study"?
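
To make that distinction concrete, here is a minimal arithmetic sketch; the blood-type frequency and population size are invented for illustration and are not taken from any actual case:

    # Illustrative only: assumed numbers, not data from any case or study.
    # A serology "inclusion" is a statement about class membership, not an
    # identification, so a later DNA exclusion does not make it an error.
    type_frequency = 0.40   # assumed share of the population with the detected blood type
    population = 100000     # assumed size of the relevant population

    possible_donors = int(type_frequency * population)
    print(f"People consistent with the serology result: {possible_donors}")
    # The defendant was one of roughly 40,000 people who could not be excluded.
    # DNA later narrowed that class and excluded him; the serology performed
    # exactly as designed: a limitation of the test, not a testing error.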

In the article text, the authors state that "erroneous forensic science expert testimony is the second most common contributing factor to wrongful convictions, found in 63% of those cases". Yet the graph attributes 63% to forensic testing errors, a different category. Which is it?

I find these numbers outrageous and implausible. Having worked with our local chapter of the Innocence Project myself, I have had many conversations with attorneys about errors and their causes. Eyewitness testimony has always been cited as the leading cause (with which the authors seem to agree). Additionally, attorneys have lamented to me about the lack of scientific knowledge (and funding) within the legal defense community to recognize and properly challenge inappropriate or false forensic evidence and testimony. Yet in the authors' graph, incompetent defense representation was one of the LOWEST causes!

I challenge these authors to submit these data to a peer-reviewed scientific journal with their methods, procedures, and source data clearly listed. If the data and methods hold up under proper peer review and professional scrutiny, then so be it. Until then, I caution the readers to view these data with skepticism, as they simply do not appear accurate.

Error Rates and Proficiency Testing

I disagree with the assertion that error rates can be calculated for comparisons performed by human examiners and then used as a "predictive" measure to assess the probative value of an identification and to predict the probability that an error (a false match) occurred.

In fact, one of the very "blue ribbon panels" (the National Research Council) touted by the authors for its evaluation of forensic DNA use concluded that using error rates, especially error rates gathered from proficiency testing, is inappropriate! The National Research Council explained in great detail the fallacy of using error rates from proficiency tests as a measure of the probability of error.

Were I to use the authors' logic, I could look at the number of errors committed in a baseball game between the Mets and Yankees; let us say it was quite high, at five for each team. By that logic, the next time these two teams meet, each team should again be expected to commit five errors. Obviously, if this were the case, bookmakers would be out of business. The fact is, humans learn from mistakes. In some instances mistakes will not be made because the task is quite simple and the human is quite skilled and experienced. In other instances the conditions are quite challenging and a mistake could be made. In forensic casework, the conditions are varied and we are human and fallible. Proper quality assurance and quality control are imperative to reducing (but not eliminating) the chance of error.
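
The arithmetic below is a minimal sketch of this point; the error rates and the mix of easy versus difficult comparisons are assumed values, not figures from any real proficiency test:

    # Illustrative only: all numbers are assumptions chosen to show why a
    # pooled proficiency-test error rate is not a per-case probability of error.
    easy_share, easy_error = 0.90, 0.001   # assumed: most test comparisons are straightforward
    hard_share, hard_error = 0.10, 0.040   # assumed: a minority are genuinely difficult

    pooled = easy_share * easy_error + hard_share * hard_error
    print(f"Pooled 'proficiency test' error rate: {pooled:.2%}")   # about 0.49%
    print(f"Error rate on an easy comparison:     {easy_error:.2%}")
    print(f"Error rate on a difficult comparison: {hard_error:.2%}")
    # The single pooled figure describes none of the individual comparisons well,
    # so quoting it as the probability of error in a particular case is misleading.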

The authors claimed that proficiency test errors of fingerprint experts amounted to "about 4 to 5%" false-positive errors on at least one fingerprint comparison. This statement is patently false. The manufacturer of these proficiency tests did not report a 4 to 5% "false-positive" error rate (erroneous matches); rather, they reported that 4 to 5% of the answers "differed from the manufacturer's expected results".

This is a critical distinction. If an examiner reports "inconclusive" (perhaps they lacked the training and experience to make the match, or the match did not meet their agency's criteria for a match, etc.), this is counted as "differing from the manufacturer's expected results". It is not a false match, as the authors report it. Additionally, an examiner may identify the correct individual but record the incorrect digit, a common transcription error; the gravity of such an error is minuscule compared to making a false match. The authors did not make any of these distinctions. Sadly, Mr. Saks has been made aware of these issues in the past, but he simply chooses to mislead and misrepresent. Again I take exception to this gross, unethical, and inappropriate use of these data.
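
The following sketch shows, with invented counts, how lumping every answer that "differed from the manufacturer's expected results" into one figure can overstate the false-positive rate; none of these numbers come from the actual proficiency tests discussed above:

    # Illustrative only: hypothetical counts, not the manufacturer's data.
    answers = 1000          # assumed total number of reported answers
    inconclusive = 30       # examiner declined to conclude (not a false match)
    wrong_digit = 12        # correct individual, incorrect finger recorded
    false_matches = 3       # the only true false positives

    differed = inconclusive + wrong_digit + false_matches
    print(f"'Differed from expected results': {differed / answers:.1%}")    # 4.5%
    print(f"Actual false-positive rate:       {false_matches / answers:.1%}")  # 0.3%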

I cannot in good conscience read another Saks diatribe, this time one reaching a general science audience. It is unfortunate, because the basic message (the need for research, and the amazing Kuhnian paradigm shifts that are occurring: challenging old dogmatic beliefs and supporting new theories with research and statistics) is wonderful and a great benefit to this discipline. Yet the authors' message is lost to the forensic science and jurisprudence communities because it is immersed in a sea of misrepresentations and false statements.

I challenge the authors to actually perform a research experiment designed to test their theories and statements. I challenge the authors to attend the identification conferences and become involved in the community that is already performing the research for which they are calling. They will find a new generation of scientifically gifted and objective scientists, skilled at what they do but interested in discovering new ways to improve it.

Sincerely,

Glenn Langenburg
 

Responses may be submitted at the following link:

http://www.sciencemag.org/cgi/eletter-submit/309/5736/892

______________________________________________________________________
Feel free to pass The Detail along to other examiners.  This is a free newsletter FOR latent print examiners, BY latent print examiners. There are no copyrights on The Detail, and the website is open for all to visit.

If you have not yet signed up to receive the Weekly Detail in YOUR e-mail inbox, go ahead and join the list now so you don't miss out!  (To join this free e-mail newsletter, send a blank e-mail from the e-mail address you wish to subscribe, to: theweeklydetail-subscribe@topica.email-publisher.com)  If you have problems receiving the Detail from a work e-mail address, there have been past issues with department e-mail filters considering the Detail as potential unsolicited e-mail.  Try subscribing from a home e-mail address or contact your IT department to allow e-mails from Topica.  Members may unsubscribe at any time.  If you have difficulties with the sign-up process or have been inadvertently removed from the list, e-mail me personally at kaseywertheim@aol.com and I will try to work things out.

Until next Monday morning, don't work too hard or too little.

Have a GREAT week!