Updated the Detail Archives
No major announcements this week
We looked at US v. Keita, a new Motion to Exclude Fingerprint Testimony based on the NAS report. I have heard through private communication that several other jurisdictions have already received news that similar challenges are in the pipeline, so preparation is essential.
I had the opportunity to attend two latent print related meetings that NIST hosted, and they are the subject of this week's Detail. The first meeting was a presentation by the Co-Chairs of the NAS Committee, Judge Edwards and Dr. Gatsonis. Attendees arrived on the NIST campus the morning of December 9th and assembled in an auditorium for the presentation. Attendees were welcomed and the speakers were introduced by the Co-Chairs of the NSTC Subcommittee on Forensic Science. Mr. Stanley opened the session with the following comments. (Comments in this transcript are the typed notes of an attendee, and therefore are not intended to be direct quotations of the speakers): Forensics isn't infallible, as portrayed in the media. In reality, it is limited by the science and by practitioners' skill and experience in interpreting the data they analyze. There is uncertainty in forensic science, which can lead to error, and this has raised concerns about the accuracy of forensic science. Because of this, Congress asked the NAS to investigate forensic science in the US and to issue a report with recommendations for improvement. These recommendations have started a serious dialogue within NIST and the forensic community.
Comments of Judge Edwards:
On Feb 18, 2009, the NAS committee on identifying needs in forensic science issued a report entitled "Strengthening Forensic Science in the United States: A Path Forward". I am here today to share reflections on the report and its implications. Reactions to the report have been nothing short of extraordinary. It's not surprising that it garnered media attention in the first few months. What is much more interesting is that attention increased in succeeding months even as the country was consumed by larger issues such as the economy and health care reform. Committees have been formed, and conferences have been organized and held, to address the issues laid out in the report. The reason for such a high level of interest is simple: the problems implicated by the report are incredibly serious. No preconceived views about the forensic science community were brought into the formation of the committee. We all assumed that forensic science was well grounded in scientific methodology and that practitioners followed proven practices. I was surprised at how mistaken that assumption was. In reality, practices vary, and the work suffers because of that variation.
There are many examples that underscore the importance of the recommendations in the report. Often, forensic scientists overstate the claims allowed by the method they use. For example, there is a dearth of scientific research to show that the error rate in latent print examination is essentially zero, as is often claimed. At one point in testimony to the committee, when asked what the scientific basis for a match in latent prints was, a SWGFAST member conceded that the research has yet to be done. The committee concluded that when there is no scientific research to support a discipline, and an expert can't articulate certainty, evidence is often overstated or even fabricated. Contextual bias studies show this, such as Dr. Itiel Dror's studies in latent print examination, where one third of examiners reached different conclusions under the influence of context.
There is inconsistency in crime labs – many lack standards and QA measures. Examples of these have been seen recently in the media - Houston,
I would like to address 4 issues this morning: law, science, practice, and federal oversight. Dr. Gatsonis will amplify some of the science issues, so I will spend more time on the others.
The work of the forensic science community is critically important in criminal justice. Forensic science evidence and expert testimony are routinely used in the criminal justice system. So it matters whether the experts are qualified to testify and whether the evidence is sufficiently reliable to support the truth that the testimony purports to support. (A description of Melendez-Diaz was given to emphasize the importance of the testimony of analysts.) The Melendez-Diaz decision is seminally important to the forensic science community. The Supreme Court went out of its way to reject the suggestion that forensic testing was "neutral", citing the NAS Report's findings, and held that cross examination of forensic analysts is important in the criminal justice process. The Court's statements in Melendez-Diaz are a not-so-subtle indictment of our present forensic science system. The Court said that confrontation is one means of ensuring accurate forensic analysis. Cross examination is a minimal constitutional safeguard that helps to assess forensic science. The judicial system may be encumbered by courts' limited ability to interpret and challenge scientific evidence. And the system embodies a case-by-case approach that isn't well suited to addressing the systemic problems in many forensic science disciplines. It may be that defense attorneys are better able to challenge forensic science in a particular case, but admissibility under the Federal Rules of Evidence is a flexible standard, and trial judges have great discretion. More important is the realization that whether forensic evidence in a case is admissible is not the same question as whether particular disciplines are valid or reliable. Success in individual cases doesn't remedy the paucity of supporting research.
There are dramatic reforms needed to fix the problems in the forensic science community. Good science includes 2 attributes: validity and reliability. Methodologies and practices are needed that minimize the risk of results being affected by bias. Without them, the forensic science community cannot consistently serve the judicial system as well as it might. We need the forensic science community, not the legal system, to improve forensic science to better serve justice. Simply increasing crime lab staff won't solve these problems. What is needed is interdisciplinary, peer-reviewed scientific research to achieve technological advancement and establish the validity and reliability of the disciplines, along with training and education to pursue research in forensic science – to add a culture of science to the forensic science community. We still have a long, long way to go.
We need to adopt and enforce better and more consistent practices. Overhaul is essential if we expect forensic science practitioners to support the goals of justice. This will take time - so the question is how we can ensure sound practices until then. If we can't quantify measures of uncertainty, how can we establish best practices? The question is not whether we can require mandatory best practices, but who should require them and on what terms. As studies are being conducted, the committee had 3 recommendations: 1) Use standardized and clear terminology. Forensic science experts should offer nothing more than they know. The concern is that forensic science practitioners may not know what they don't know, so there needs to be training. 2) Model lab reports specifying minimum information are needed. They are intended to facilitate the ability of courts to interpret the forensic evidence offered in a case. 3) Remove all public forensic laboratories from the administrative control of law enforcement. The simple point here is that forensic science should function independently of law enforcement administrators. The Supreme Court in Melendez-Diaz affirmed this – it said that forensic science experts sometimes respond to pressure or incentives to alter evidence in a manner favorable to the prosecution.
The committee concluded that what is needed to correct the problem is a new, strong, and independent entity that is as objective and free of bias as possible to take on the task of implementing a fresh agenda, etc. (Quoted the NIFS recommendation from the report.) I do not believe that truly meaningful reforms will be adopted without the support and oversight of such an entity. Much more is needed than what the forensic science community will do with the NAS report on its own. I hope Congress puts into effect the full package that we need.
Comments of Dr. Gatsonis:
Focus of my presentation: 1) research, education, and standards; 2) errors and error rates; 3) evaluation of accuracy in forensic science.
(Professor Gatsonis conducted a review of the NAS report sections)
More and better research is needed to address accuracy, reliability, and validity in the forensic science disciplines. For the most part this has not been done – and where it has, it has been in disciplines that have linked their techniques to established scientific technologies, such as DNA. We need quantifiable measures of reliability, accuracy, and uncertainty in conclusions. This is difficult to achieve, but the answer is that we need to work on it rather than show up and claim 100% accuracy. For example, at a conference in
Standards are important; they should reflect best practices and serve as accreditation tools for laboratories. Enforcement of standards should have teeth, unlike the current situation. Some standards are more rhetoric than science. The notions of standard terminology and model lab reports are two things that came out loud and clear in the report. AFIS interoperability is not there yet. The call for substantial development of educational programs goes hand in hand with the development of research.
Assessing and quantifying the accuracy of forensic analysis. First item: define the task. The term "error rate" is used without reference to what it means, who calculated it, or how it was calculated. One task is individualization to a particular source; the other task is classification to a particular class of sources. Some people will say only the first is important. I don't think the answer is as simple as that. Very often, classification plays an important role, because very few disciplines have the capability to individualize - DNA and maybe fingerprints. But there are many that have the potential for classification. The research on either of these has not been done in an organized, fully-developed fashion. The question then is to keep the research focused on the task and to avoid mission creep. Only then can you develop the test for error. We can use a standard screening/diagnostic 2x2 table of dichotomous test result versus truth for classification studies. One way of thinking about error is sensitivity and specificity - assume you know the truth and put in the numerator what the test says - how reliably does the discipline detect the truth. Another way is predictive value - positive and negative. Already we have defined 4 types of error rate. Then how do we go about using this type of thinking for individualization studies? Experiment with pairs of samples where truth is known, asking whether there is a match or not. Large studies like this have not been done. This brings up several types of questions about error rates in the forensic sciences. How do we go about these types of studies? Design them considering case difficulty, experience and training of the analysts, and contextually available information. And ideally, we need to know the average accuracy and the range (variability) of accuracy across analysts or laboratories. We are very far from this goal at this point in time.
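The 2x2 screening/diagnostic table described above can be sketched in a few lines. The function name and all counts below are hypothetical illustrations, not data from the report or any study; the four measures are the standard sensitivity, specificity, and positive/negative predictive values Dr. Gatsonis names.

```python
# A minimal sketch of a 2x2 table of test result vs. known ground truth,
# yielding the four error-rate-related measures named in the presentation.
# All counts are hypothetical, for illustration only.

def two_by_two_measures(tp, fn, fp, tn):
    """Compute the four standard measures from the cells of a 2x2 table.

    tp/fn: truly matching pairs the test did / did not detect
    fp/tn: truly non-matching pairs the test did / did not flag
    """
    sensitivity = tp / (tp + fn)  # of true matches, fraction detected
    specificity = tn / (tn + fp)  # of true non-matches, fraction excluded
    ppv = tp / (tp + fp)          # of reported matches, fraction correct
    npv = tn / (tn + fn)          # of reported exclusions, fraction correct
    return {"sensitivity": sensitivity, "specificity": specificity,
            "PPV": ppv, "NPV": npv}

# Hypothetical study: 1000 comparison pairs with known ground truth.
measures = two_by_two_measures(tp=190, fn=10, fp=5, tn=795)
for name, value in measures.items():
    print(f"{name}: {value:.3f}")
```

Note that sensitivity and specificity condition on the (known) truth, while the predictive values condition on the test's output - exactly the two ways of "thinking about error" distinguished above.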
This paradigm doesn't address other important questions, such as the definition of a "match" or the estimation of random match probabilities. The accuracy paradigm addresses performance over repeated instances of the analysis, and does not necessarily guarantee the correct answer in a specific case. We may know that a medical diagnostic test has a particular sensitivity or specificity, but any particular case is either a zero or a one. So there is still the question of what will be accepted as adequate accuracy in a court of law.
I will close with some experiences from diagnostic medicine. In the moving target problem, technology evolves so rapidly that you can't pinpoint an error rate. There is also modality (discipline) performance versus reader (analyst) performance. Also, in terms of delivering information to the criminal justice community, we need to think about the effectiveness of the forensic discipline - how it operates in real practice. In the medical field, you may eventually learn whether there really was cancer. In forensics, you don't know the truth very often. You may know it in a classification case, but not in individualization cases. So this will be a difficult question - not to say we shouldn't study it, but first we need to find data - or generate it in the research community. A significant research effort and infrastructure in forensics is required to do this.
The studies may also highlight sobering realities - for example, the performance of mammography in detecting cancer (showed a chart demonstrating poor performance). So if we start studying analysts interpreting fingerprints, what is going to happen when that data goes into a court of law? Certainly there will be variability; even experts may not agree. For example, in radiology, measures of agreement between real experts on whether a particular kind of cancer has progressed (showed a chart of data embarrassing to that community). High technology is not necessarily better. For example, ROC curves for identifying which part of the prostate is cancerous using MR and MRSI showed less than anticipated results for all readers. But we should still push for that research.
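For readers unfamiliar with the ROC (receiver operating characteristic) curves mentioned above, here is a minimal sketch of how one is computed from analyst scores against known ground truth. The data and function names are hypothetical; real reader studies in diagnostic imaging use the same sweep-the-threshold idea.

```python
# Hypothetical sketch: build an ROC curve by sweeping a decision threshold
# over analyst confidence scores, then compute the area under the curve.
# An AUC of 0.5 is chance performance; 1.0 is perfect discrimination.

def roc_points(scores, truth):
    """Return (false positive rate, true positive rate) pairs, one per threshold."""
    pos = sum(truth)              # number of truly positive cases
    neg = len(truth) - pos        # number of truly negative cases
    pts = [(0.0, 0.0)]
    for thr in sorted(set(scores), reverse=True):
        tp = sum(1 for s, t in zip(scores, truth) if s >= thr and t)
        fp = sum(1 for s, t in zip(scores, truth) if s >= thr and not t)
        pts.append((fp / neg, tp / pos))
    return pts

def auc(points):
    """Trapezoidal area under the ROC curve."""
    pts = sorted(points) + [(1.0, 1.0)]
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

# Four hypothetical cases: two true matches, two true non-matches.
print(auc(roc_points([0.9, 0.7, 0.6, 0.3], [1, 0, 1, 0])))
```

Comparing such curves across analysts is one way to separate modality (discipline) performance from reader (analyst) performance, as discussed above.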
In summary, we need to know what error rate we are talking about. We need to define the task and measures, conduct experiments, and monitor practice.
Q&A for both presenters:
Question for Judge Edwards: Since the release of your report, have you seen any movement in Congress that would give you the idea that they might act on some of your recommendations to establish a specific agency for forensics?
A: There has been movement in the Senate Judiciary Committee. Personally, I'm not concerned about the pace of movement right now, because of the health care and economic issues. But the effort has not died in the interim, as there are staffers there who are very dedicated to the forensic science cause. Our history is to be wary of establishing new government agencies, but I think the people who have seriously considered this in the committee are taking serious steps to fix things. I think it will be next year before we see whether they will proceed with legislation.
General Q: With respect to NIFS, we have heard a number of stakeholders suggest where it might reside - do you care to comment on where it should reside or be situated in the departments?
Gatsonis Answering: We walked through several agencies that could take on the role of NIFS, and what we decided was that none of them had the wherewithal to take on the entire scope we were asking of NIFS. We discussed DOJ; it could take on the regulatory role, but it doesn't have the scientific depth or presence in the scientific arena, and it is part of law enforcement. We talked about NIST, but the understanding was that this agency would have to develop and guide research like an NSF or NIH, and it was our understanding that perhaps this wasn't something NIST would do, because you have to develop, guide, fund, and judge research. We also looked at NSF - perhaps it could take care of the development of a research program, but it has no experience with standards or enforcement. So all the tasks envisioned for NIFS required a new entity that could combine all of these - that was the reason we went toward proposing an independent agency that would have links to all the other players, but would have forensics at its core and be able to move on the main axes - research and education, regulation and standards.
Q: The mandatory requirement you emphasized regarding accreditation and certification - why do you think a voluntary program wouldn't work?
Edwards Answering: Because it's voluntary now, and it is, respectfully, a disaster. There are many good practitioners who work very hard and do incredible work, but some of the material we saw was stunning and disastrous. And the reason I say this is because it really matters - it's the world I live in. It's the evidence that puts people in jail. The reason for mandatory requirements: the present system isn't working, and there are a lot of people out there giving testimony - and the reason they are winging it is because the research isn't there. Been doing it a long time? You may just be better at doing it badly. The system makes no sense. The simple answer: a way is needed to insist that the people who come into court and practice are trained, have follow-up education, and that someone has certified that they know what they are talking about.
Gatsonis Answering: 1) Given that we have certification, it won't solve all problems, but without it we are worse off. 2) There are voluntary certifications in other areas where there is a lot of education and research. Forensics is very different – it's often learned by apprenticeship, not rigorous training, and practitioners come from varied backgrounds.
Edwards Answering: On the research topic, we have had people insist there is lots of research and that we just didn't find it. However, we pulled everything there was to pull - it's just nothing you would consider research. You have peer reviewed systems that expose your findings to your world, and you move forward. That is not the culture of forensic science. They may have stuff in the drawer, but it's not scientific research the way I understand it's supposed to happen.
Q: Given that NIST isn't a regulatory agency, and yet voluntary compliance with its standards is often used by industry as a point of differentiation - why doesn't voluntary practice work even now?
Edwards Answering: There are problems with approaching this from the law side – there are not enough people in the legal community to assess it - not enough judges, jurors, or prosecutors who would understand SWG standards. The legal community is not a good place for this to happen. Some of the standards are just mush; some are quite good, others are not so good. Some of you wouldn't understand what the standard even was.
Q: Setting that aside, what prevents a defense attorney from raising it as a point today?
Edwards Answering: You can raise it, and assume the jury understands it. But the problem is that SWG standards haven't achieved a level of credibility where people say that failing to satisfy the standard is a problem. That's not what's happening.
Q: What type of forensic programs exist at
Gatsonis Answering: Nothing. Research programs, for example, exist in very few places. What exists is in specific departments - there could be someone who is interested in forensic X, Y, or Z - but there is no program called Forensic Science. We don't educate people in forensic science – there is no interdisciplinary unit on forensic science. That's why I was saying we have a long way to go to establish this; it needs its own focus, centers, and funding.
Q: I'm surprised to hear that an academic center doesn't latch on to this and start these types of programs when there are students watching CSI that would pay to come in to take them.
Gatsonis Answering: Good point. Actually, CSI has had an effect. Members of the committee that run training programs in universities say that they see a lot of interest from students in forensic science. But the universities don't have developed programs in that area. Part of it has to do with the availability of professors that are really knowledgeable in forensics as an interdisciplinary area. If you want to establish a graduate program and fellowships, there is no obvious place to go. Unless what you are doing is so important in that discipline, it will be one of those things that falls through the cracks. Or there has to be specific funding for interdisciplinary forensics.
Edwards Answering: There has been a lot of disdain among groups that have said this isn't science. So from the universities, there has been no movement among the PhDs to pull together their disciplines to create forensic science, because they don't take forensics seriously. That's one of the big hurdles - there has to be a central entity to give life to the idea. Now, if there is money, then there will be a lot of interest; they won't do it without funding.
Question for Gatsonis: Please expand on homeland security.
Gatsonis Answering: We didn't want to say much about Homeland Security, but it was part of our task, so we said something general. We felt it was very important, but the committee had enough on its plate with everything else; the topic needed a depth of treatment that we couldn't get into.
Q: This is more of a comment, amplifying the discussion: I sit on an ASTM committee – a well-accepted body – and up for ballot is something related to GC-MS of flammable liquids. As it came across my desk, it was interesting that the document says analysts are not required to analyze flammable gasolines. It's easy to get samples - yet the document called out that they are not required. Here's an obvious case - why tell practitioners they don't need to use standards? Even when NIST provides materials, it doesn't always mean they will be accepted by the community. In the future, is there some way to make those types of things required?
Edwards Answering: Yes. You have to have a central entity. Until we can generate some movement from the top, it's too fragmented. We need strings attached to stop seeing what you're seeing.
Gatsonis Answering: If there were a resurgence of education, and highly qualified people were getting involved in these disciplines, you would see a very different picture.
Question for Edwards: Do you foresee certification as a requirement, whether for the defense or for the prosecution?
Edwards Answering: Yes - standards across the board, absolutely.
Concluding remarks – dismissal
The second meeting was the 4th AFIS Interoperability working group meeting, held in
Many state and local budgets are shrinking due to the state of the economy, while at the same time casework and accreditation demands are growing in response to Daubert challenges, the NAS report, and calls for AFIS interoperability. Grant funding is becoming a more important way for agencies to purchase the new technology and consulting services necessary to meet these challenges and remain effective. At Complete Consultants Worldwide (www.clpex.com/CCW) we recognize this fact and have put together a guide with a list of grant resources to help you find and obtain additional funding.
Agencies are much more likely to obtain a grant if the project aligns with the priorities of the funding organization. By doing research, talking directly with the contact personnel listed for the grant, and asking the right questions, you will increase the likelihood of getting the technology and services from the grant you are seeking to obtain.
On February 13, 2009 both the Senate and the House passed the American Recovery and Reinvestment Act (H.R. 1) which was signed into law by the President on February 17, 2009. $4 billion was included within this stimulus bill to provide support to state, local and tribal law enforcement. Of this $4 billion, $2 billion was allocated for the Byrne Justice Assistance Grant (JAG) Program, $225 million was allocated for Byrne competitive grants and $1 billion was allocated for the Office of Community Oriented Policing Services (COPS) hiring grants. A list of available funds associated with this stimulus bill can be found at the Library of Congress webpage http://thomas.loc.gov/home/approp/app09.html.
In addition to those directly associated with the stimulus bill, numerous other funding sources are available to the forensic science and law enforcement communities. Extensive resources are available on the websites below. While a majority of the websites listed are related to federal sources, do not overlook the opportunities within your State or Agency funds collected through confiscated monies or court technology funds.
http://www.ojp.usdoj.gov/nij/topics/forensics/welcome.htm -- Office of Justice Programs funding opportunities, resources, publications, training and technical assistance
www.fedstats.gov -- statistics and census data to help substantiate your proposal
www.grants.gov -- a listing of federal grants
www.sfda.gov -- a catalog of Federal Domestic Assistance
www.guidestar.org -- learn how much an organization might fund
www.dhs.gov/xgovt/grants -- about $3 billion for grants available through the Department of Homeland Security
www.nlectc.org/virlib/infolist.asp?strtype=funding -- research available grant money with emphasis on equipment, testing, evaluation, and technology improvements
www.federalgrantswire.com -- free resource guide to federal grants and loans
www.psfa.us -- the Public Safety Foundation of America, providing grants with emphasis on public safety functions including planning, equipment procurement and training
www.foundationcenter.org -- reflects the top 50 donors in your state
My company, Complete Consultants Worldwide, provides consulting services to the friction ridge examination community. We have successfully coordinated AFIS outsourcing projects for the federal government by accurately and efficiently formatting, encoding, submitting and comparing latent prints to multiple AFIS systems and reporting results in a timely manner. We have contracted over 40 examiners, most IAI certified, and we are ready and willing to assist your department in obtaining grant funding to assist with backlog or workflow challenges. We eagerly await your phone call or e-mail at the number or email address listed below.
Feel free to pass The Detail along to other examiners for Fair Use. This is a not-for-profit newsletter FOR friction ridge examiners, BY friction ridge examiners. The website is open for all to visit!
If you have not yet signed up to receive the Weekly Detail in YOUR e-mail inbox, go ahead and join the list now so you don't miss out! (To join this free e-mail newsletter, enter your name and e-mail address on the following page: http://www.clpex.com/Subscribe.htm You will be sent a Confirmation e-mail... just click on the link in that e-mail, or paste it into an Internet Explorer address bar, and you are signed up!) If you have problems receiving the Detail from a work e-mail address, there have been past issues with department e-mail filters considering the Detail as potential unsolicited e-mail. Try subscribing from a home e-mail address or contact your IT department to "whitelist" the Weekly Detail. Members may unsubscribe at any time. If you have difficulties with the sign-up process or have been inadvertently removed from the list, e-mail me personally at firstname.lastname@example.org and I will try to work things out.
Until next Monday morning, don't work too hard or too little.
Have a GREAT week!