20 June 2025

The Gospel

"The Gospel," also known as "Habsora" (meaning "The Gospel" or "Good News" in Hebrew), is an artificial intelligence system reportedly employed by the Israeli military for rapidly identifying and generating targets during military operations, particularly in the Gaza Strip. Similar to the "Lavender" system, "The Gospel" signifies a critical advancement in the automation of warfare, prompting considerable discussion about the ethical implications of AI in modern conflict.

Unlike "Lavender," which primarily focuses on identifying individual human targets, "The Gospel" is designed to pinpoint infrastructural targets. This includes buildings, equipment, and other structures believed to be associated with enemy combatants or militant groups. The system achieves this by autonomously analyzing vast quantities of surveillance data. This data can encompass a wide range of intelligence, though specific details about its inputs are not publicly disclosed. By identifying patterns and anomalies within this data, "The Gospel" can quickly flag potential targets, significantly accelerating the targeting process.

The operational efficiency attributed to "The Gospel" is a key aspect of its reported use. Military sources have indicated that the system can generate targets at an unprecedented rate, reportedly producing dozens or even around a hundred targets per day, compared with the far smaller number that human analysts could previously identify in an entire year. This rapid generation of targets enables a much more extensive bombing campaign, removing the bottleneck of targets that human analysts alone could identify. Once "The Gospel" identifies a potential target, it presents a recommendation to a human analyst, who then decides whether to proceed with the strike.
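Reporting describes this as a recommendation-and-approval flow rather than autonomous engagement. The snippet below is again hypothetical and only illustrates the shape of such a human-in-the-loop gate: every machine-generated recommendation requires an explicit analyst decision before anything proceeds.

```python
# Hypothetical illustration of a human-in-the-loop approval gate; the
# types and fields are invented and do not describe the actual system.

from dataclasses import dataclass
from enum import Enum
from typing import Callable, List


class Decision(Enum):
    APPROVE = "approve"
    REJECT = "reject"


@dataclass(frozen=True)
class Recommendation:
    target_id: str
    rationale: str      # machine-generated summary shown to the analyst
    confidence: float


def review(recommendations: List[Recommendation],
           analyst_decision: Callable[[Recommendation], Decision]) -> List[Recommendation]:
    """Pass through only what an analyst explicitly approves; nothing is
    auto-approved, whatever the machine's confidence score says."""
    return [r for r in recommendations if analyst_decision(r) is Decision.APPROVE]
```

The ethical debate that follows turns precisely on whether such a gate remains meaningful when the review queue grows to dozens of items per day.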

However, the rapid and automated nature of "The Gospel" raises serious ethical and legal concerns. While the system is presented as an efficiency tool, critics question how thorough human oversight can be when targets are generated at such high volumes. Misidentification, reliance on flawed data, or biases inherent in algorithmic decision-making could lead to the targeting of civilian infrastructure or areas, resulting in increased civilian casualties and destruction. International humanitarian law requires a meticulous distinction between combatants and civilians, and between military objectives and civilian objects. A high-speed, AI-driven targeting process, even with human review, introduces complex challenges to upholding these principles.

Furthermore, "The Gospel" operates within a larger ecosystem of AI-powered military technologies used by Israel, including other targeting and surveillance systems. This integrated approach suggests a systematic reliance on AI to manage and execute military operations, transforming the landscape of warfare. The implications of such extensive automation extend beyond the immediate conflict, contributing to the broader debate about autonomous weapons systems and the future of war.

"The Gospel" project illustrates the ongoing integration of artificial intelligence into military targeting. While it offers efficiency in identifying a large volume of infrastructural targets, its deployment underscores critical questions about accuracy, human accountability, and adherence to international legal frameworks in a rapidly evolving technological battlefield.