AI Use in Terrorist Plots and Attacks Surges in 2025
By: Clara Broekaert and Lucas Webber for Militant Wire
The piece begins as follows:
It has been over three years since OpenAI launched ChatGPT, the chatbot built on a large language model (LLM) that became the fastest consumer application to reach 100 million users and has become synonymous with the term “artificial intelligence” (AI). Since then, a vast array of generative AI tools has entered the market, including chatbots and image, video, and audio generation tools. Much of the research on AI and terrorism has focused on these generative applications, examining how extremist groups might use them to create propaganda, amplify narratives through bot networks, and experiment with personalized “radicalization bots.” While this body of work has illuminated AI’s impact on the digital realm, more attention needs to be paid to how AI tools, including those under the generative umbrella, are concretely used by terrorists and violent extremists for operational planning. This article addresses that gap by exploring how terrorists and violent extremists have leveraged AI in the operational planning of attacks, and it examines what this tells us about the incentives and benefits of AI use as perceived by perpetrators.
The year 2025 has witnessed a notable rise in incidents in which terrorists and violent extremists have leveraged AI tools to plan, research, and prepare attacks. Just as terrorists and violent extremists, especially lone wolves and inspired actors, have long relied heavily on internet forums, social media platforms, and messaging applications to acquire operational knowledge and guidance for executing plots, we note an uptick in perpetrators and plotters turning to freely available AI products to optimize their operational toolkit. Specifically, according to our database of 2025 plots and attacks in which AI was used for operational planning, these tools serve three roles: learning (ranging from operational security to the details of constructing explosives), visualizing scenarios (e.g., creating images of the planned attack), and refining tactics through conversational, personalized guidance (e.g., step-by-step guidance on acquiring the chemical precursors needed for explosives). This trend underscores the urgent need for states, technology companies, and social media platforms to anticipate and adapt to the new realities of digitally enabled extremist activity and to implement strategies to disrupt misuse.
The piece continues as follows:
- Operational Use of AI in 2025 by Terrorists and Violent Extremists
- Threat Assessment and Forecast