this post was submitted on 04 Apr 2025
1668 points (98.9% liked)

Technology

A Microsoft employee disrupted the company’s 50th anniversary event to protest its use of AI.

“Shame on you,” said Microsoft employee Ibtihal Aboussad, speaking directly to Microsoft AI CEO Mustafa Suleyman. “You are a war profiteer. Stop using AI for genocide. Stop using AI for genocide in our region. You have blood on your hands. All of Microsoft has blood on its hands. How dare you all celebrate when Microsoft is killing children. Shame on you all.”

Sources at Microsoft tell The Verge that shortly after Aboussad was ushered out of Microsoft’s event, she sent an email to a number of email distribution lists that contain hundreds or thousands of Microsoft employees. Here is Aboussad’s email in full:

archive.today link

[–] CheeseToastie@lazysoci.al 46 points 1 day ago (7 children)

Can anyone ELI5 how they're using AI for genocide? I have awful IT skills so I don't understand AI

[–] DrDeadCrash@programming.dev 19 points 10 hours ago

Here are her words on it:

When I moved to AI Platform, I was excited to contribute to cutting-edge AI technology and its applications for the good of humanity: accessibility products, translation services, and tools to “empower every human and organization to achieve more.” I was not informed that Microsoft would sell my work to the Israeli military and government, with the purpose of spying on and murdering journalists, doctors, aid workers, and entire civilian families. If I knew my work on transcription scenarios would help spy on and transcribe phone calls to better target Palestinians (source), I would not have joined this organization and contributed to genocide. I did not sign up to write code that violates human rights.

[–] GreyAlien@lemm.ee 93 points 1 day ago (5 children)

In long.

In short:

- The AI system labeled tens of thousands of Gazans, mostly men, as suspected militants, with a 10% error rate, meaning thousands were likely civilians.
- Human officers spent ~20 seconds per target, often just confirming gender, before approving airstrikes.
- "Where’s Daddy?": a companion AI tracked targets to their homes, prioritizing bombings at night when families were present.
- The military authorized 15–20 civilian deaths per low-ranking militant and 100+ for senior Hamas officials.
- Strikes frequently used unguided munitions, maximizing destruction and civilian harm.
- Officers admitted acting as "stamps" for AI decisions, with one calling the process "hunting at large".

Additional information: Project Nimbus

[–] stormdahl@lemmy.world 15 points 12 hours ago

That is seriously dystopian. Wow, what the fuck...

[–] CheeseToastie@lazysoci.al 6 points 10 hours ago

When families were present??? That's indefensible. It's not just the children in the home, it's all the neighbours' kids, visitors, and passers-by. It's inhumane.

[–] Knock_Knock_Lemmy_In@lemmy.world 6 points 12 hours ago (1 children)

Google and Amazon are mentioned in the Wikipedia article but not Microsoft.

Not defending them, just asking for better evidence.

[–] Bazoogle@lemmy.world 13 points 11 hours ago (1 children)

Microsoft isn't mentioned in the text, but the 3rd citation is a reference to Microsoft: Microsoft to Launch Much Awaited Cloud Server Farm in Israel in 2021

Separately, I did a quick search: Revealed: Microsoft deepened ties with Israeli military to provide tech support during Gaza war

In recent years, documents show, Microsoft has also provided the Israeli military with large-scale access to OpenAI’s GPT-4 model – the engine behind ChatGPT – thanks to a partnership with the developer of the AI tools which recently changed its policies against working with military and intelligence clients.

[–] Knock_Knock_Lemmy_In@lemmy.world 4 points 10 hours ago

Thanks. Not doubting, but it's good to have detailed sources.

[–] Gormadt@lemmy.blahaj.zone 26 points 1 day ago* (last edited 1 day ago)

Holy fuck!

That's fucking awful!

[–] OmegaLemmy@discuss.online 23 points 1 day ago

This is comedically evil. I don't even know what to say.

[–] tauren@lemm.ee 38 points 1 day ago (1 children)

From the article:

The Israeli military uses Microsoft Azure to compile information gathered through mass surveillance, which it transcribes and translates, including phone calls, texts and audio messages, according to an Israeli intelligence officer who works with the systems.

From my understanding, they use AI to automate the processing of text, audio, and video data collected by the intelligence services.

[–] Realitaetsverlust@lemmy.zip 33 points 1 day ago (2 children)

AI is being pushed into war machines big time. America and China are both working on it. With Ukraine showing how incredibly effective drones are in warfare, just imagine the damage and destruction a swarm of AI-controlled drones could cause.

[–] stormdahl@lemmy.world 3 points 12 hours ago

Hey, I’ve seen that episode of Black Mirror!

[–] CheeseToastie@lazysoci.al 15 points 1 day ago

It's scary and awful.

[–] supersquirrel@sopuli.xyz 16 points 1 day ago* (last edited 1 day ago)

Stop believing in the veneer of smartness and superiority around these genocidal fuckers.

There is no non-ELI5 explanation here of how they're using AI for genocide, because the truth is horrible, stupid, and brutal.

They are using AI because it is the best tool bullshitters have currently to offload blame for things they, individual human beings, chose to do onto obscure abstract entities like corporations, AI decisions and other bullshit.

There is nothing more to it than that, I promise you. It is all just layers of bullshit attempting to obscure culpability for participating in a genocide, and honestly it is the perfect technology for that.

[–] TheFriar@lemm.ee 35 points 1 day ago (2 children)
[–] ganbramor@lemmy.world 23 points 1 day ago (1 children)

But critics warn the [AI] system is unproven at best — and at worst, providing a technological justification for the killing of thousands of Palestinian civilians.

This 2023 article didn’t age well.

[–] buddascrayon@lemmy.world 14 points 1 day ago

The critics were right

[–] CheeseToastie@lazysoci.al 18 points 1 day ago

Fucking hell that's bad

[–] j0ester@lemmy.world 4 points 1 day ago* (last edited 1 day ago)

There are also many university professors working on smart drones. I was at a conference where they showed off drones flying autonomously and demonstrated how they performed in different weather conditions.