- The Algorithm of War: Inside Palantir’s Role in Israel’s AI-Driven Gaza Conflict
- Who is Palantir? From CIA-Backed Startup to Pentagon Powerhouse
- The Man Behind the Machine: Alex Karp’s Unapologetic Vision
- A “Strategic Partnership”: Palantir’s Deepening Ties with Israel
- “Lavender” and “Where’s Daddy?”: The Chilling Logic of AI Targeting
- Lavender: The AI “Kill List” Generator
- Where’s Daddy?: Targeting Families at Home
- The Human Cost: When Algorithms Erase Humanity
- A Future Decided by Code
The modern battlefield is no longer defined solely by tanks, missiles, or soldiers. Today, wars are fought with algorithms, data, and the cold, hard calculations of artificial intelligence. At the centre of this new frontier stands Palantir Technologies, one of Silicon Valley’s most secretive and powerful companies, and at its helm, a CEO who makes no apologies for his company’s role in global conflict.
As Israel’s military campaign in Gaza has intensified, reports have surfaced of AI-powered systems generating thousands of potential targets, often with devastating consequences for civilians. While Palantir denies direct involvement in specific systems, its deepening ties to Israel’s defence apparatus and its broader role in developing military-grade AI raise urgent questions about accountability, ethics, and the future of warfare itself.
This article takes a closer look at Palantir’s role in AI-driven warfare, the influence of its CEO Alex Karp, and the deadly consequences of handing life-and-death decisions to algorithms.
Who is Palantir? From CIA-Backed Startup to Pentagon Powerhouse

Palantir Technologies was born over two decades ago, not in a garage, but with initial funding from In-Q-Tel, the venture capital arm of the CIA. From its inception, the company was designed to serve the clandestine world of espionage and intelligence. Its purpose was to sift through massive, disparate datasets to find patterns and connections that human analysts might miss.
Co-founded by controversial billionaire Peter Thiel and led by CEO Alex Karp, Palantir quietly embedded itself within the U.S. defence establishment. It became an indispensable tool for organisations like:
- The Federal Bureau of Investigation (FBI)
- The National Security Agency (NSA)
- The Central Intelligence Agency (CIA)
- The Pentagon
This private tech company was granted unprecedented access to America’s most sensitive intelligence networks, a level of integration few firms have ever achieved. But its reach would not stop at America’s borders.
The Man Behind the Machine: Alex Karp’s Unapologetic Vision
CEO Alex Karp is an outspoken and controversial figure who sees no moral conflict in building technology used for lethal force. He has publicly stated that Palantir is dedicated to the service of the West and the United States.
“Our product is used on occasion to kill people… I am proud that we are supporting Israel in every way we can.” – Alex Karp
Karp believes the survival of Western democracies depends on an unbreakable bond between military power and private tech giants. He dismisses critics, such as student protestors, as adherents to a “pagan religion infecting our universities.” His unwavering stance has drawn both praise from defence circles and condemnation from human rights advocates.
A “Strategic Partnership”: Palantir’s Deepening Ties with Israel
In January 2024, Palantir announced a “strategic partnership” with the Israeli Ministry of Defence to supply its technology to Israeli forces. This solidified a relationship that has been growing for years. As Israel has increasingly relied on technology in its military operations, dubbing its 2021 assault on Gaza the ‘world’s first AI war,’ Palantir’s expertise in data analysis has become highly valuable.
This partnership has led some to label Palantir the “AI arms dealer of the 21st century.” The concern is that its powerful platforms, which aggregate surveillance data, phone records, and intelligence feeds to create detailed digital profiles, are being used to generate target lists in the ongoing Gaza conflict.
The ethical implications have not gone unnoticed. KLP, one of the Nordic region’s largest pension funds, divested all its shares in Palantir, citing the risk of complicity in war crimes and violations of international law.
“Lavender” and “Where’s Daddy?”: The Chilling Logic of AI Targeting
While Palantir’s exact role remains opaque, investigations by the publications +972 Magazine and Local Call have shed light on two specific AI-powered systems used by the Israeli military in Gaza.
Lavender: The AI “Kill List” Generator
Lavender is an AI system reportedly developed by Unit 8200, Israel’s elite signals intelligence division. Its function is to rapidly analyse data and flag thousands of Palestinians as potential militant operatives.
- Mass Target Generation: At one point, Lavender had marked as many as 37,000 Palestinians as suspected militants.
- Known Error Rate: Sources within the Israeli military acknowledged that the system had a known error rate of approximately 10%, meaning thousands of the individuals marked for assassination were not Hamas militants.
- Human Oversight: The speed of target generation often left human analysts with only about 20 seconds to review and approve a strike, effectively turning them into a “rubber stamp” for the algorithm.
Even if Palantir did not build Lavender directly, its close collaboration and advanced AI tools have fueled suspicions that its technology may have supported or enhanced the system’s operations.
Where’s Daddy?: Targeting Families at Home
A second, chillingly named AI system called “Where’s Daddy?” was designed to track individuals on the Lavender-generated kill list and identify when they had returned to their family homes.
According to an intelligence officer who spoke to investigators, the military preferred to strike targets in their homes.
“It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
The strategy was to bomb homes at night, often killing entire families, including women, children, and extended relatives, along with the intended target. The sanitised, almost childlike name masks a brutal logic that turns family homes into designated combat zones and civilians into collateral damage.
The Human Cost: When Algorithms Erase Humanity
The result of this AI-assisted warfare is a staggering human toll. Technology of this kind turns civilians into targets, families into coordinates, and entire neighbourhoods into data points to be erased. Sold as innovation, such software becomes a silent executioner, enabling lethal decisions to be made far from the people who bear the consequences.
As one intelligence source bluntly put it, “The algorithm made the decisions. We just signed off.” This detachment from consequence is one of the most dangerous aspects of AI-driven warfare. It sanitises killing, transforming it into a data-driven process devoid of empathy or ethical reflection.
A Future Decided by Code
The story of Palantir’s involvement in the Gaza conflict is a stark warning about the convergence of data, power, and war. While technology itself is neutral, its application is not. In the hands of unchecked power, it can be used to annihilate the most vulnerable and systematically destroy entire communities with terrifying efficiency.
The atrocities in Gaza highlight a critical question for our time: who is accountable when an algorithm gets it wrong? As we move deeper into an era of automated warfare, we must demand transparency, oversight, and a renewed commitment to human dignity.
Because in this new age, technology doesn’t just predict the future. It decides who gets to have one.