Google Employees Arrested in Protests Against $1.2B Israeli Military Contract

Google employees protest $1.2B "Project Nimbus" contract with Israel, citing human rights concerns over AI-powered military operations. Arrests highlight growing tensions between workers and the company's business dealings.

Bijay Laxmi

On Tuesday, nine Google employees were arrested during sit-in protests at the company's offices in New York City and Sunnyvale, California. The protesters were demanding an end to Google's $1.2 billion "Project Nimbus" cloud computing contract with the Israeli government and military.

In New York, four workers occupied the 10th-floor common area for nearly eight hours before police arrived and arrested them for trespassing when they refused to leave. The protesters hung a banner reading "Google Worker Sit-In Against Project Nimbus. No Tech for Genocide." In Sunnyvale, five workers occupied the office of Google Cloud CEO Thomas Kurian and were likewise arrested after refusing to leave voluntarily.

The protesters argued that Google's work on Project Nimbus, which provides the Israeli government with artificial intelligence and machine learning services, is aiding in human rights abuses. Software engineer Billy Van Der Laar, one of the arrested protesters in Sunnyvale, stated during a live-stream on Twitch: "We did not leave voluntarily. We were arrested."

Why this matters: The protests underscore growing tensions between Google employees and the company's business dealings, particularly the controversial Project Nimbus contract with Israel. The arrests show that some Google workers are willing to take direct action against what they see as the unethical use of technology.

The Israeli military has been using an advanced artificial intelligence system called "Lavender," developed by the elite intelligence unit Unit 8200, to identify potential targets associated with Hamas during the ongoing airstrikes in Gaza. Lavender rapidly processes large amounts of data and has flagged as many as 37,000 Palestinian men with alleged links to Hamas for targeting. Experts have linked the use of the Lavender system to rising civilian casualties; figures from the Hamas-controlled health ministry report 33,000 Palestinian deaths in the nearly six-month-long conflict.

The Kill List: To build the Lavender system, information on known Hamas and Palestinian Islamic Jihad operatives was fed into a dataset, along with data on people 'loosely' affiliated with Hamas, such as employees of Gaza's Internal Security Ministry. Lavender was trained to identify "features" associated with Hamas operatives, including being in a WhatsApp group with a known militant, changing cellphones every few months, or changing addresses frequently. That data was then used to rank other Palestinians in Gaza on a 1–100 scale based on how similar they were to the known Hamas operatives in the initial dataset. People who reached a certain threshold were then marked as targets for strikes.

Collateral impact: Intelligence officers were given wide latitude when it came to civilian casualties. During the first few weeks of the war, officers were allowed to kill up to 15 or 20 civilians for every lower-level Hamas operative targeted by Lavender; for senior Hamas officials, the military authorized "hundreds" of collateral civilian casualties. Suspected Hamas operatives were also targeted in their homes using a system called "Where's Daddy?", which kept targets generated by Lavender under ongoing surveillance, tracking them until they reached their homes, at which point they would be bombed, often alongside their entire families.

The use of AI technology like Lavender in warfare raises significant legal and ethical concerns. Intelligence officials have noted that Lavender's statistical approach is relied upon more than the judgment of emotionally impacted soldiers, representing a shift towards a more dispassionate manner of conducting military operations. The arrested Google employees argue that the company's AI and machine learning services are supporting these controversial military actions through the Project Nimbus contract.

Key Takeaways

  • 9 Google employees arrested in sit-in protests against Project Nimbus contract with Israel
  • Protesters demand end to $1.2B cloud computing contract, citing human rights abuses
  • Israeli military using advanced AI "Lavender" to identify and target Palestinians in Gaza
  • Lavender trained on data from known Hamas operatives and 'loosely' affiliated individuals
  • Intelligence officers allowed to cause significant civilian casualties when targeting Hamas
  • Experts raise legal and ethical concerns over AI's role in warfare and civilian casualties
  • Protests highlight growing tensions between Google employees and company's business dealings