
Understanding of Forensic Science Is Poor in UK Criminal Justice System

Posted: 2nd November 2018 by Lawyer Monthly
Last updated: 1st November 2018

The level of understanding of forensic science among lawyers, judges, and juries is poor, according to evidence submitted to parliament by a group of researchers from Queen Mary University of London.

The researchers suggest that forensic science is contributing to injustices because of misunderstandings about matching trace evidence to a particular person.

The group have submitted evidence to a House of Lords Science and Technology Committee inquiry into Forensic Science.

The inquiry was set up to explore the role of forensic science within the UK Criminal Justice System in light of concerns over the weaknesses of current forensic methods in the delivery of justice.

The researchers involved include Professor Norman Fenton (School of Electronic Engineering and Computer Science), Dr Primoz Skraba (School of Mathematical Sciences), Amber Marks (School of Law), and Dr Ian Walden (Centre for Commercial Law Studies).

When asked about the level of understanding of forensic science within the criminal justice system amongst lawyers, judges and juries, Professor Fenton believes that there needs to be much greater awareness that all evidence is subject to potential errors.

He noted: “Errors can and do occur at every level of evidence evaluation: sampling, measurement, interpretation of results, and presentation of findings. Forensic scientists should articulate, and attempt to quantify, all such possible sources of error. And legal professionals should understand and expect this information, and probe for possible sources of uncertainty when it is not presented by the experts.”

Professor Fenton also believes that injustices are occurring widely because of misunderstandings about the probative value of forensic match evidence.

He advised: “Because many forensic traces from crime scenes are only ‘partial’ and may be subject to various types of contamination, the resulting ‘profile’ is not sufficient to ‘identify’ the person; many people would have a partial profile that matches.

“I have been involved in cases where such assertions have a dramatic impact on the judge and the jury, while even defence lawyers assume their case is impossible to defend. But to interpret this as ‘proof’ that the defendant must have been at the crime scene may be to grossly exaggerate the probative value of the evidence in favour of the prosecution case.”
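To make this point concrete, here is a minimal, purely illustrative sketch (the population size and random match probability below are assumptions, not figures from the submission) showing why a partial-profile match does not by itself identify one person:

# Purely illustrative: the figures below are assumptions, not values from the submission.
population = 1_000_000          # assumed pool of people who could have left the trace
random_match_prob = 1 / 10_000  # assumed chance that a random person matches the partial profile

expected_matches = population * random_match_prob
print(f"Expected number of matching people: {expected_matches:.0f}")  # about 100
# If roughly 100 people would match, the match alone cannot 'identify' the defendant.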

Furthermore, Professor Fenton argues that the meaning of the word “match” in the context of forensic evidence needs to be re-evaluated. Currently, “a match” between two pieces of evidence is widely understood to mean that they come from the same source, when in fact two pieces of evidence are branded “a match” simply because their measured characteristics are the same (within an agreed tolerance).
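As a hedged illustration of that definition (the measurements and tolerance below are hypothetical), declaring “a match” is a test of closeness, not of common origin:

# Hypothetical measurements and tolerance, for illustration only.
crime_scene_value = 0.482   # measured characteristic of the crime-scene trace
suspect_value = 0.479       # measured characteristic of the suspect's sample
tolerance = 0.005           # agreed tolerance (assumed)

is_match = abs(crime_scene_value - suspect_value) <= tolerance
print(is_match)  # True: the traces 'match', yet other sources could also fall within the tolerance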

The committee was also advised that lawyers and the judiciary should receive basic training in probability and statistics because the current training available is ‘suboptimal’. This would enable them to understand the statistical analyses presented, to identify any weaknesses in the analyses presented, and to avoid common fallacies such as the prosecutor’s fallacy. In forensic investigations “there is virtually always some degree of uncertainty,” Professor Fenton added.
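The prosecutor’s fallacy itself can be shown with a short, hedged sketch (all figures are assumed for illustration): it confuses the probability of a match given innocence with the probability of innocence given a match.

# Illustrative only: priors and match probabilities are assumed, not taken from any case.
prior_source = 1 / 1_000_000        # prior probability the defendant left the trace (assumed pool of 1,000,000)
p_match_if_source = 1.0             # simplifying assumption: the true source always matches
p_match_if_not_source = 1 / 10_000  # assumed random match probability

# Bayes' theorem: probability the defendant is the source, given the match
numerator = p_match_if_source * prior_source
posterior = numerator / (numerator + p_match_if_not_source * (1 - prior_source))
print(f"P(defendant is source | match) = {posterior:.4f}")  # roughly 0.01
# Reporting '1 in 10,000' as though it were the probability of innocence is the prosecutor's fallacy.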

Elsewhere in the submission, Dr Primoz Skraba highlights an emerging gap: the increasing use of demographic and personal data by companies to identify individuals, which is likely to be used in forensic science in the future.

According to Skraba: “While a company’s misidentification may result in a misplaced advertisement, the consequences in forensic science may be more severe.”

This is not limited to use by private companies; forensic technologies are already being used by agencies such as the Metropolitan Police through its Gangs Matrix, which has raised concerns about the legitimacy of using predictive tools in criminal justice.

Amber Marks noted: “Risk scores generated by police algorithms are shared with multiple agencies and this results in often stigmatic and punitive repercussions for the individual involved, including in policing, educational and medical settings, decisions on benefits and housing entitlements and deportation proceedings, while obviating the procedural safeguards of the criminal trial.”

(Source: House of Lords Website)

