Quantum computing could help fight terrorism, says UK gov
Counter-terrorism strategy embraces tech, but warns of future extremist digital capabilities
Machine learning and analytics are set to play key roles in the UK government's new counter-terrorism strategy, unveiled yesterday.
Technological developments are helping terrorists co-ordinate and spread propaganda, the government's CONTEST report states, but it makes clear that technology can also help keep the UK safe.
The report, a review of the UK's counter-terrorism strategy in light of last year's London and Manchester attacks, outlines innovations as diverse as cryptocurrencies, drones and stronger encryption as aiding extremists' bids to launch attacks.
"As the threat evolves so must our response," said home secretary Sajid Javid. "Ultimately, our approach is about ensuring that there are no safe spaces for terrorists to operate - internationally, in the UK or online."
The new strategy will seek to use AI to "filter and identify crucial information", and even to combine machine learning with quantum computing to extract relevant data from huge datasets.
It will also deploy new screening and detection technologies at airports, borders and in crowded places.
"The threat from cyber terrorism may increase in the future, but the current technical capability of terrorists is judged to be low," the report reads.
New legislation will underpin the revised strategy, updating terrorism offences to reflect new practices online, and making it an offence to repeatedly view streamed terrorist video content.
The government will also beef up the sentencing framework for terrorism, increasing the maximum penalty for some offences.
It also promised "robust action" to leave terrorists with no safe places online to spread propaganda, and it will work with the tech sector on software that can identify and remove terrorist content before it's widely seen.
The Home Office has already created an AI tool with London-based ASI Data Science to detect terrorist content in online videos, which it said in February identifies 95% of Daesh propaganda with a 99.9% success rate.
In a reference to the government's long-held stance that messaging tools must create backdoors to allow security services to intercept suspicious conversations, the report said that "we do not want unfettered access to communications, but we must ensure our agencies are able to access the information they need to keep us safe".
The review referred to the Investigatory Powers Act (IPA), saying the government wants to protect police powers within a "robust legal framework". It cited powers such as requiring telecoms companies to retain records of the websites people visit, and to remove encryption when a request to do so is authorised by the home secretary and a judicial commissioner.
However, the IPA's central tenets of keeping people's browsing histories, emails and phone usage were found to be unlawful in a judicial review that reached the High Court in April. The government has until 1 November to rewrite it.