Funding cuts lead to dodgy digital forensics
Efforts to clear a growing backlog with automation threaten quality of evidence
Stringent funding cuts are threatening the safety of prosecutions based on digital evidence, experts have claimed. Police officers failing to understand complex digital evidence, a huge backlog in evidence processing and a lack of legal aid for expert defence witnesses are all contributing to miscarriages of justice.
Digital devices, particularly mobile phones and PCs, are routinely used in criminal cases and can provide detailed, time-stamped evidence that’s useful to prosecutors and defence teams, but what happens when the system fails?
“Simultaneous budget cuts and reorganisation, together with exponential growth in the need for new services such as digital evidence, have put forensic science providers under extreme pressure,” a House of Lords select committee said in its Forensic science and the criminal justice system: a blueprint for change report last year. “The result is a forensic science market which is becoming dysfunctional.”
With swingeing cuts to the Ministry of Justice, experts fear the situation is unlikely to improve.
Bulk collection
Police forces have long been criticised for a lack of digital crime expertise. Most now have digital units of their own, with official data showing that 80% of digital forensics is performed in-house. However, these units are largely set up to process routine actions, such as capturing hard drive data or phone images, and often lack the expertise to delve deeper or reconstruct actions logged on a device.
“The issue is cost,” Peter Sommer, who works on criminal cases and lectures on digital forensics at Birmingham City University, said. “Most units are operating at more than 100% capacity and they are still trying to deal with backlogs. They have expanded dramatically, but they are still doing a lot of volume processing and are heavily reliant on particular tools.
“What they don’t have is the luxury of people who can spend a day thinking about implementing tests or the latest standards,” Sommer said.
Mark Stokes, chief of the Metropolitan Police’s Digital, Cyber and Communications Forensics Unit, has admitted his unit is working with a seven-month backlog and, despite renewed efforts, told a House of Lords committee last year that the service “cannot meet the demand currently with what we have”.
Against this backdrop, experts believe the police are becoming over-reliant on the output of automated systems, which streamline evidence gathering but issue reports that untrained officers don't understand.
“The volume side of things has got to a level where it’s becoming dangerous,” says Angus Marshall, a lecturer in digital evidence, forensic investigations and quality standards at the University of York and another independent forensic expert witness.
“Most police forces have adopted an automated digital device examination workstation that is deployed in police stations. Regular police officers are trained to use the kiosk, which spits out a report and it’s left to the officer to interpret it. The officers often don’t understand what they are seeing.”
Marshall cited a case where police were ready to charge a suspect they believed had been using his mobile phone in the moments before a serious accident – something that could lead to a lengthy prison term.
“The officer was convinced that was what had happened because the report showed a lot of files being changed in the time of the collision,” he explains. “On closer inspection, what it actually showed was that the mobile phone had disconnected from the Bluetooth system of the car when it lost power and had shut down some applications – that’s a major difference and could have gone forward as a prosecution and resulted in a wrongful conviction.”
The officer had noted files labelled "plists" and assumed they were playlists, which would suggest user interaction with the handset before the crash. In fact, the files were property lists – configuration files that changed when the handset disconnected.
“That motorist was particularly lucky,” says Marshall. “It could have ended with all manner of problems.”
Data black hole
Collecting the data is only the first step in the prosecution process. According to experts, the way that data is then handled and shared is equally problematic – especially when it comes to clarity over which evidence from handsets the prosecution intends to use, and what has been collected but isn't part of its case.
“It should be automatic that any evidence that’s been found that’s not being presented by the prosecution should be disclosed, at least in list form, so the defence can ask for it,” says Marshall. “This is again symptomatic of cost-cutting within organisations like the CPS [Crown Prosecution Service], that they don’t have the time and resources to make sure the job is done properly.”
The disclosure situation came to a head when a series of rape cases collapsed after it became clear prosecutors had not provided the defence with all the evidence they had collected.
In one example, the defendant in a rape case pleaded not guilty, claiming the act had been consensual and part of an ongoing sex game. At the last moment, the defence was given access to 57,000 messages showing the pair had a relationship, evidence that saw the accused acquitted after a fraught two-year investigation.
“There have been a number of cases where the disclosure of evidence has been highly problematic and led to the collapse of cases,” says Ruth Morgan, director of University College London’s Centre for the Forensic Sciences.
“There were a number of cases that collapsed in 2017 due to issues with the disclosure of data that led to a review by the CPS of all rape cases in 2018, and this is an ongoing challenge. It’s a particular challenge as the amount of data available is increasing so rapidly and finding what is relevant can be difficult when time frames are limited.”
Unaffordable expertise
In addition, not all defendants will have access to an expert witness to delve into the investigation’s reports, as the legal aid budget has been cut by £350 million since 2013.
“If the defence hasn’t engaged an expert and taken advice, they might not know that a dataset is worth looking at,” says Marshall. “Whether you get the funding depends on the strength of your argument, but it can be a bit hit and miss.”
The Legal Aid Agency requires three quotes from forensic experts and for the defence to justify reasons why forensic testing is necessary. It will usually offer the work to the cheapest company regardless of quality, leaving defendants at risk.
“The only way you can challenge a [digital forensic] reconstruction that’s been done for the prosecution is to say, ‘Haven’t you been making assumptions?’ or ‘Maybe we could rearrange the evidence and present another story’,” says Sommer. “It works on an adversarial system – prosecution and defence – and for this reconstruction work you need to actually have an expert hired by the defence to test the assumptions made by the prosecution.
“The great weakness is that the funding available for defence experts is pretty miserable – the legal aid authority will have arguments about how much is necessary and their grant may not be sufficient for everything that a defendant would like.”
AI to the rescue?
The solution often proposed by senior officials is artificial intelligence: that an ever more automated service will reduce backlogs and streamline the system. But there are fears amongst practitioners that more automation could lead to more mistakes.
“AI in the criminal justice system terrifies me,” says Marshall. “At least with a human being, you can challenge them and ask how they reached their conclusions – with AI systems, good luck. For triage it’s fine, for suggesting devices to look at it’s fine, but producing primary evidence, please no.”