Less than a week ago, at the second AI Forum Hungary conference in Budapest, one of the speakers offered some insight into the possible malicious applications of Artificial Intelligence (AI).

The presentation described a range of AI applications that could easily be manipulated to support crime or terrorism, from the use of drones and cyber tools to the manipulation of the biological life sciences. Indeed, with regard to terrorism, it was suggested that certain AI applications could enable the design and deployment of terrorism-related weapons of mass destruction (WMD).

This suggestion – which is neither new nor unique amongst security specialists – was reinforced by the publication this month of the United States' new ‘National Strategy for Countering Weapons of Mass Destruction Terrorism’.

The new US policy fully acknowledges the risk that such forms of AI pose to national security, especially in the field of terrorism, and recognises that the full panoply of WMD counter-proliferation and counter-terrorism means will be required to dampen and suppress such threats and challenges.

However, two key points emerged from the strategy which should encourage further study of this phenomenon within our own Forum. The first is the clear link between such weapons and the need to avoid technological surprise, a point raised tangentially last week at our conference as we discussed the embedding of ethics in AI design. The second concerns the utility of AI applications in countering such malicious intent through the enhanced use of data analytics, especially in fields such as predictive intelligence and technology trend analysis.

As AI Forum Hungary begins its quest to deepen its research, looking at ways to keep society safe should be an attractive field of study. Indeed, given that we seem to be keeping pace with developments in the AI field, at least in terms of trend recognition, perhaps we should consider a workshop in the not-too-distant future to explore the subject in more depth?