Some say AI will make war more humane. Israel’s war in Gaza shows the opposite.

Vox
A December 2023 photo shows a Palestinian girl injured as a result of the Israeli bombing of Khan Yunis in the southern Gaza Strip. | Saher Alghorra/Middle East images/AFP via Getty Images

AI nudges us to prioritize speed and scale. In Gaza, it’s turbocharging mass bombing.

Israel has reportedly been using AI to guide its war in Gaza — and treating its decisions almost as gospel. In fact, one of the AI systems being used is literally called “The Gospel.”

According to a major investigation published last month by the Israeli outlet +972 Magazine, Israel has been relying on AI to decide whom to target for killing, with humans playing an alarmingly small role in the decision-making, especially in the early stages of the war. The investigation, which builds on a previous exposé by the same outlet, describes three AI systems working in concert.

“Gospel” marks buildings that it says Hamas militants are using. “Lavender,” which is trained on data about known militants, then trawls through surveillance data about almost everyone in Gaza — from photos to phone contacts — to rate each person’s likelihood of being a militant. It puts those who get a higher rating on a kill list. And “Where’s Daddy?” tracks these targets and tells the army when they’re in their family homes, an Israeli intelligence officer told +972, because it’s easier to bomb them there than in a protected military building.

The result? According to the Israeli intelligence officers interviewed by +972, some 37,000 Palestinians were marked for assassination, and thousands of women and children have been killed as collateral damage because of AI-generated decisions. As +972 wrote, “Lavender has played a central role in the unprecedented bombing of Palestinians,” which began soon after Hamas’s deadly attacks on Israeli civilians on October 7.

The use of AI could partly explain the high death toll in the war — at least 34,735 killed to date — which has sparked international criticism of Israel and even charges of genocide before the International Court of Justice.

Although there is still a “human in the loop” — tech-speak for a person who affirms or contradicts the AI’s recommendation — Israeli soldiers told +972 that they essentially treated the AI’s output “as if it were a human decision,” sometimes only devoting “20 seconds” to looking over a target before bombing, and that the army leadership encouraged them to automatically approve Lavender’s kill lists a couple weeks into the war. This was “despite knowing that the system makes what are regarded as ‘errors’ in approximately 10 percent of cases,” according to +972.

The Israeli army denied that it uses AI to select human targets, saying instead that it has a “database whose purpose is to cross-reference intelligence sources.” But UN Secretary-General António Guterres said he was “deeply troubled” by the reporting, and White House national security spokesperson John Kirby said the US was looking into it.

How should the rest of us think about AI’s role in Gaza?

While AI proponents often say that technology is neutral (“it’s just a tool”) or even argue that AI will make warfare more humane (“it’ll help us be more precise”), Israel’s reported use of military AI arguably shows just the opposite.

“Very often these weapons are not used in such a precise manner,” Elke Schwarz, a political theorist at Queen Mary University of London who studies the ethics of military AI, told me. “The incentives are to use the systems at large scale and in ways that expand violence rather than contract it.”

Schwarz argues that our technology actually shapes the way we think and what we come to value. We think we’re running our tech, but to some degree, it’s running us. Last week, I spoke to her about how military AI systems can lead to moral complacency, prompt users toward action over non-action, and nudge people to prioritize speed over deliberative ethical reasoning. A transcript of our conversation, edited for length and clarity, follows.

Sigal Samuel

Were you surprised to learn that Israel has reportedly been using AI systems to help direct its war in Gaza?

Elke Schwarz

No, not at all. There have been reports for years saying that it’s very likely that Israel has AI-enabled weapons of various kinds. And they’ve made it quite clear that they’re developing these capabilities and consider themselves one of the most advanced digital military forces globally, so there’s no secret around this pursuit.

Systems like Lavender or even Gospel are not surprising because if you just look at the US’s Project Maven [the Defense Department’s flagship AI project], that started off as a video analysis algorithm and now it’s become a target recommendation system. So, we’ve always thought it was going to go in that direction and indeed it did.

Sigal Samuel

One thing that struck me was just how uninvolved the human decision-makers seem to be. An Israeli military source said he would devote only about “20 seconds” to each target before authorizing a bombing. Did that surprise you?

Elke Schwarz

No, that didn’t either, because the conversation in militaries over the last five years has been about accelerating the “kill chain”: using AI to increase lethality. The phrase that’s always used is “to shorten the sensor-to-shooter timeline,” which basically means making it really fast from the input to when some weapon gets fired.

The allure of these AI systems is that they operate so fast, and at such vast scales, suggesting many, many targets within a short period of time, so that the human just kind of becomes an automaton who presses the button and thinks, “Okay, I guess that looks right.”

Defense publications have always said Project Convergence, another US [military] program, is really designed to shorten that sensor-to-shooter timeline from minutes to seconds. So having 20 seconds fits quite clearly into what has been reported for years.

Sigal Samuel

For me, this brings up questions about technological determinism, the idea that our technology determines how we think and what we value. As the military scholar Christopher Coker once said, “We must choose our tools carefully, not because they are inhumane (all weapons are) but because the more we come to rely on them, the more they shape our view of the world.”

You wrote something reminiscent of that in a 2021 paper: “When AI and human reasoning form an ecosystem, the possibility for human control is limited.” What did you mean by that? How does AI curtail human agency or reshape us as moral agents?

Elke Schwarz

In a number of ways. One is about the cognitive load. With all the data that is being processed, you kind of have to place your trust in the machine’s decision. First, because we don’t know what data is gathered and exactly how it then applies to the model. But also, there’s a cognitive disparity between the way the human brain processes things and the way an AI system makes a calculation. This leads to what we call “automation bias,” which is basically that as humans we tend to defer to the machines’ authority, because we assume that they’re better, faster, and cognitively more powerful than us.

Another thing is situational awareness. What is the data that is incoming? What is the algorithm? Is there a bias in it? These are all questions that an operator or any human in the loop should have knowledge about but mostly don’t have knowledge about, which then limits their own situational awareness about the context over which they should have oversight. If everything you know is presented to you on a screen of data and points and graphics, then you take that for granted, but your own sense of what the situation is on the battlefield becomes very limited.

And then there’s the element of speed. AI systems are simply so fast that we don’t have enough [mental] resources to not take what they’re suggesting as a call to action. We don’t have the wherewithal to intervene on the grounds of human reasoning. It’s like how your phone is designed in a way that makes you feel like you need to react — like, when a red dot pops up in your email, your first instinct is to click on it, not to not click on it! So there’s a tendency to prompt users toward action over non-action. And the fact is that if a binary choice is presented, kill or not kill, and you’re in a situation of urgency, you’re probably more likely to act and release the weapon.

Sigal Samuel

How does this relate to what the philosopher Shannon Vallor calls “moral de-skilling” — her term for when technology negatively affects our moral cultivation?

Elke Schwarz

There’s an inherent tension between moral deliberation, or thinking about the consequences of our actions, and the mandate of speed and scale. Ethics is about deliberation, about taking the time to say, “Are these really the parameters we want, or is what we’re doing just going to lead to more civilian casualties?”

If you’re not given the space or the time to exercise these moral ideas that every military should have and does normally have, then you’re becoming an automaton. You’re basically saying, “I’m part of the machine. Moral calculations happen somewhere prior by some other people, but it’s no longer my responsibility.”

Sigal Samuel

This ties into another thing I’ve been wondering about, which is the question of intent. In international law contexts like the genocide trial against Israel, showing intent among human decision-makers is key. But how should we think about intent when decisions are outsourced to AI? If tech reshapes our cognition, does it become harder to say who is morally responsible for a wrongful act in war that was recommended by an AI system?

Elke Schwarz

There’s one objection that says, well, humans are always somewhere in the loop, because they’re at least making the decision to use these AI systems. But that’s not the be-all, end-all of moral responsibility. In something as morally weighty as warfare, there are multiple nodes of responsibility — there are lots of morally problematic points in the decision-making.

And when you have a system that distributes the intent, then with any subsystem, you have plausible deniability. You can say, well, our intent was this, then the AI system does that, and the outcome is what you see. So it’s hard to attribute intent and that makes it very, very challenging. The machine doesn’t give interviews.

Sigal Samuel

Since AI is a general-purpose technology that can be used for a multitude of purposes, some beneficial and some harmful, how can we try to foretell where AI is going to do more harm than good and try to prevent those uses?

Elke Schwarz

Every tool can be refashioned to become a weapon. If you’re vicious enough, even a pillow can be a weapon. You can kill somebody with a pillow. We’re not going to prohibit all pillows. But if the trajectory in society is such that it seems there’s a tendency to use pillows for nefarious purposes, and access to pillows is really easy, and in fact some people are designing pillows that are made for smothering people, then yes, you should ask some questions!

That requires paying attention to society, its trends and its tendencies. You can’t bury your head in the sand. And at this point, there are enough reports out there about the ways in which AI is used for problematic purposes.

People say all the time that AI will make warfare more ethical. It was the claim with drones, too — that we have surveillance, so we can be a lot more precise, and we don’t have to throw cluster bombs or have a large air campaign. And of course there’s something to that. But very often these weapons are not used in such a precise manner.

Making the application of violence a lot easier actually lowers the threshold to the use of violence. The incentives are to use the systems at large scale and in ways that expand violence rather than contract it.

Sigal Samuel

That was what I found most striking about the +972 investigations — that instead of contracting violence, Israel’s alleged AI systems expanded it. The Lavender system marked 37,000 Palestinians as targets for assassination. Once the army has the technological capacity to do that, the soldiers come under pressure to keep up with it. One senior source told +972: “We were constantly being pressured: ‘Bring us more targets.’ They really shouted at us. We finished [killing] our targets very quickly.”

Elke Schwarz

It’s kind of a capitalist logic, isn’t it? It’s the logic of the conveyor belt. It says we need more — more data, more action. And if that is related to killing, it’s really problematic.
