On March 4, a Google Cloud Engineer was escorted out of Google’s “Mind the Tech” conference when he stood up and declared, “I refuse to build technology that powers genocide.” The contract he was protesting dates back to April 2021, when Google and Amazon signed a $1.2 billion deal with the Israeli government known as Project Nimbus. According to Google, the project aimed to “deliver cloud services to all government entities from across the state, including ministries, authorities and government-owned companies.” As it turned out, the technology would come to be used heavily in military affairs.
More and more Google workers are voicing their discontent with the company through the No Tech For Apartheid campaign, a coalition co-founded by former Google Product Marketing Manager Ariel Koren.
“I learned about (Project Nimbus) during the May 2021 siege of Gaza by Israel, where 250 Palestinians were killed,” Koren said. “There was a lot of violence within the company where they were silencing workers’ voices who were speaking up for Palestine. And I learned that the company, during the same time as the siege on Gaza, put out this contract where they were going to be partnering with the Israeli military and government to profit a billion dollars off of the apartheid, settler colonial violence of Israel.”
As the implications of Project Nimbus became clear, Koren grappled with her conscience. Speaking out bore the possibility of inspiring substantial change, but it also carried grave repercussions for her position at the company.
“As soon as I spoke out publicly, Google moved my role overseas, and gave me 17 days, and told me if I didn’t move, I would be ousted from the company and lose my role,” Koren said. “It was obvious that this was an act of retaliation.”
Koren also reported facing heavy backlash from HR and even threats against her life from Zionist coworkers attempting to intimidate her into silence. Tower reached out to Google for an interview; the company declined to provide a statement at this time.
“There’s a horrible culture of censorship within Google where they do not allow workers to speak up for Palestine,” Koren said. “And workers who do speak up for Palestine are called into HR and punished and threatened. It’s a very deeply entrenched culture of censorship.”
The technology employed in Project Nimbus allows for heightened and potentially illegal surveillance not just of members of Hamas, but of the Palestinian people as a whole. With AI-powered tools like facial recognition, sentiment analysis and behavior detection, Project Nimbus grants Israel greater access to the personal data of Palestinian civilians and enables the Israeli military to exert unregulated control over Palestine.
“The way that I was feeling,” Koren said, “was that the bare minimum that workers who are based in the United States and working in large companies can do is speak out when their company is actively violating human rights and profiting off apartheid, settler colonial and genocidal violence. That’s just the bare minimum, and it should be a protected right of all workers to speak out.”
There is frequent debate over the ethical applications of AI surveillance technology, and as an Engineering Technical Adviser working under the Tactical Technology Office of the U.S. Department of Defense, Sam Nazari is uniquely positioned to weigh in. Nazari made clear that he is speaking as an individual, not on behalf of the Tactical Technology Office or the U.S. Department of Defense as a whole; none of his comments are intended to represent the beliefs or intentions of his employers.
“What’s happening specifically in this project is that they’re using these techniques, and essentially tuning those techniques for specific behaviors they want to catch,” Nazari said. “I don’t think there’s anything out of the ordinary there, this is a time when the world is in a very difficult place, there’s a lot of conflict and turmoil, lots of people are suffering on both sides of this issue. We can’t say without too much contradiction that one country cannot defend itself, but at the same time, the question arises, what is the limit of the use of that technology?”
With family in both the United States and Lebanon, Mia Fakih ’25 sees both sides of the conflict up close. Based on her relatives’ experiences and the past actions of the Israeli government, Fakih finds the lack of defined limits on Project Nimbus’ application troubling, yet expected.
“I’m not surprised (by Project Nimbus) at all,” Fakih said. “There have been multiple examples of (Israeli) propaganda and extreme use of force against Palestinian civilians. They have broken multiple laws, including laws that are placed within the UN, and the only two (countries) who aren’t objecting to these violations of basic human rights are the U.S. and Britain.”
As the conflict between Israel and Palestine continues, more and more reports of human rights violations against Palestinian civilians are emerging, and there is growing concern that the technology in Project Nimbus could exacerbate what some are calling a genocide of the Palestinian people.
“I think it’s disgusting, and I think it’s a huge invasion of privacy,” Fakih said. “But nobody really has the right to privacy, especially in Arab countries.”
Given the developing nature of AI, especially in Project Nimbus, Nazari emphasizes the potential consequences of bugs in the technology and the resulting need to vet the system.
“Will it be classifying people as hostile if they are just engaging in, for example, feeding their kids?” Nazari said. “Will it be classifying people as hostile if they have some sort of neurobiological disorder and they don’t walk the same way as everybody else (which) could look suspicious? These are the kinds of questions that must be brought up.”
Still, Nazari points out that the tools Project Nimbus provides are something of a mixed bag. For Israeli citizens, the technology could provide a sense of security, though that security isn’t necessarily extended to Palestinians.
“(Surveillance technology) does provide the citizens that live nearby a sense of security,” Nazari said. “In other words, there’s cameras, there’s things pointed at where, quote-unquote, ‘the bad guys’ are coming from, and it’s a computer that’s monitoring that place.”
While the technology employed in Project Nimbus may give Israelis a sense of security, its ability to analyze situations has yet to match that of a human.
“If you compare a human being, and some sort of artificial intelligence agent, the human will be much less susceptible to being duped, but they were still duped,” Nazari said. “So I don’t think surveillance systems are going to prevent major attacks, and I don’t think that they will provide a framework for us to have a lasting peace.”
Although there is no precedent for handling conflicts in a world of burgeoning AI technology, Nazari says it’s important that people continue to ask questions about developments like Project Nimbus.
“What it really takes is people who are curious about the world and care about the matter and to ask questions to keep the world headed in the right direction,” Nazari said. “My personal opinion is that there is no magic solution. Artificial intelligence or not, there’s quite a lot of suffering until we get to a place where we can actually have peace in that region, and I hope that comes soon.”
There are few times in an individual’s life when they are faced with the opportunity to change the world. Sure, we all have the chance to better our own lives and the lives of those in our communities. But the opportunity to spur systemic change around the globe is seldom, if ever, granted. And when it is, it is rarely capitalized on.
“We have so many mass-direct actions and protests that we’ve done,” Koren said. “I think that shows the power of workers to speak out and that we’re growing in numbers.”