this post was submitted on 28 Mar 2024

Israel has deployed a mass facial recognition program in the Gaza Strip, creating a database of Palestinians without their knowledge or consent, The New York Times reports. The program, which was created after the October 7th attacks, uses technology from Google Photos as well as a custom tool built by the Tel Aviv-based company Corsight to identify people affiliated with Hamas.

[–] GrymEdm@lemmy.world 89 points 5 months ago* (last edited 5 months ago) (3 children)

Israel is the type of control-heavy far-right state other dictators wish they could govern, and it's made possible by Western money and technology (I was going to name just the US but my country of Canada, among others, is not blameless either). This news also sucks because there's no way that tech is staying in Israel only. Citizens of the world better brace for convictions via AI facial recognition.

"Our computer model was able to reconstruct this image of the defendant nearly perfectly. It got the hands wrong and one eye is off-center, but otherwise that's clearly them committing the crime."

[–] wanderingmagus@lemm.ee 18 points 5 months ago (1 children)

From what I remember, AI facial recognition tech was already being used by police and agencies worldwide, like the FBI and PRC police, or am I misinformed? I remember something about Chinese and American facial recognition software.

[–] GrymEdm@lemmy.world 18 points 5 months ago* (last edited 5 months ago) (1 children)

I had not read anything like that, but a quick search pulled up this story from last September by Wired that supports your post: FBI Agents Are Using Face Recognition Without Proper Training. "Yet only 5 percent of the 200 agents with access to the technology have taken the bureau’s three-day training course on how to use it, a report from the Government Accountability Office (GAO) this month reveals." So it sounds like you're right, and also that agents are probably inadequately trained even when they do complete all three days, given that identifying people carries legal ramifications.

[–] wanderingmagus@lemm.ee 11 points 5 months ago

And I wonder how many of those 95% have already used misapplied AI facial recognition to justify FISA court warrants for ~~stalking~~ investigating ~~random people~~ suspected terrorists?

[–] Sanctus@lemmy.world 10 points 5 months ago* (last edited 5 months ago) (3 children)

Facial tattoos of DROP TABLE commands. Embed computer worms into your iris. We can get insane and fuck all this shit up too. I bet there's a way to embed a computer virus on your own face.

[–] GrymEdm@lemmy.world 15 points 5 months ago* (last edited 5 months ago) (2 children)

I guess I'll adjust my life goals to "hot cyberpunk partner in technological dystopia", because that sounds like some Bladerunner/Cyberpunk 2077 stuff.

[–] Sanctus@lemmy.world 6 points 5 months ago (2 children)

It's not that far off. We'll see exactly what I said soon enough. You can put a virus or worm inside an image in an email. You can do the same thing with a tattoo. It's unfortunate it will arrive so long before the superhuman cybernetics.

[–] rottingleaf@lemmy.zip 5 points 5 months ago (1 children)

You can put a virus or worm inside an image in an email.

I'd much prefer that people who haven't done this wouldn't talk.

[–] Sanctus@lemmy.world 2 points 5 months ago (1 children)

Are you implying you can't use steganography techniques on real objects and images? You act like I stated it would be easy.

[–] rottingleaf@lemmy.zip 0 points 5 months ago* (last edited 5 months ago) (1 children)

OK, so who'll decode your "virus" from those real objects? Or it's a case of "I'm a poor Nigerian virus, please kindly run me with root privileges on a system with such and such"?

EDIT: I mean, steganography, too, is a word a person should know the meaning of before using it.

[–] Sanctus@lemmy.world 1 points 5 months ago* (last edited 5 months ago) (1 children)

Just because you say this wouldn't work like SQL injection doesn't mean it won't. You don't know either. Have you worked on facial recognition databases? How do they store their data? It's most likely just a database. Then I would start by looking at steganography techniques to see how those can be applied. Obviously I'm not hiding an executable in there, but I don't see why you couldn't try for unsanitized input, you never know. Now if you want to continue into realism, you would just wear a full face mask outside. You also never answered my question about steganography.
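For what it's worth, the unsanitized-input pattern being speculated about looks like this. Everything here is invented for illustration: the table, the `label` column, and the assumption that such a system stores attacker-influenced strings at all (it may well store only numeric embeddings).

```python
import sqlite3

# Hypothetical sketch: a "faces" table with a plain-text label column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE faces (label TEXT)")

label = "Robert'); DROP TABLE faces;--"  # attacker-controlled string

# Vulnerable pattern: building SQL by string concatenation lets the input
# escape its quotes. (sqlite3 refuses multiple statements per execute(),
# but many drivers and hand-rolled layers will not.)
# conn.execute(f"INSERT INTO faces (label) VALUES ('{label}')")

# Safe pattern: a parameterized query treats the whole string as data.
conn.execute("INSERT INTO faces (label) VALUES (?)", (label,))
row = conn.execute("SELECT label FROM faces").fetchone()
print(row[0])  # the hostile string is stored inertly, table intact
```

So the face-tattoo-as-injection idea only gets off the ground if some layer in the pipeline concatenates recognized text into a query, which is exactly what parameterization is meant to rule out.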

[–] rottingleaf@lemmy.zip 0 points 5 months ago (1 children)

Your question doesn't make any fucking sense in the context of attacking anything. Steganography is encoding your message inside redundant encoding for something else.

So, about that word.

A "virus in an image" situation is for cases when a program which will open that image has some vulnerability the attacker knows about, so the image is formed specifically to execute some shellcode in this situation.

Same with "a virus in an MP3", some MP3 decoder has a known vulnerability allowing a shellcode.

Same with PDFs and anything else.

There are more high-level situations where programs with their own complex formats (say, DOCX which is a ZIP archive with some crap inside) execute stuff.

All this is not steganography.

Steganography is when, a dumb example, you have an image and you hide your message in lower bits of pixel color values. Or something like that with an MP3 file.

Obviously I’m not hiding an executable in there, but I don’t see why you couldn’t try for unsanitized input, you never know.

Attacks are a matter of probabilities, and "you never know" doesn't suffice.
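The lower-bits trick described above can be sketched in a few lines. This uses a flat list of 8-bit grayscale values as a stand-in for an image; a real implementation would work on an actual pixel buffer.

```python
# Minimal LSB steganography sketch: hide bytes in the lowest bit of each
# pixel value. The cover "image" is just a list of 8-bit grayscale values.

def embed(pixels, message: bytes):
    # Flatten the message into bits, least-significant bit first per byte.
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    out = list(pixels)
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & ~1) | bit  # overwrite the lowest bit only
    return out

def extract(pixels, length: int):
    # Read back length*8 low bits and reassemble them into bytes.
    bits = [p & 1 for p in pixels[: length * 8]]
    return bytes(
        sum(bits[i * 8 + j] << j for j in range(8)) for i in range(length)
    )

pixels = [120, 121, 119, 118] * 20  # 80 dummy pixels
stego = embed(pixels, b"hi")
print(extract(stego, 2))  # recovers b"hi"; pixel values change by at most 1
```

Note this only hides data in an image; it does nothing by itself to attack whatever reads the image, which is the distinction being drawn here.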

[–] Sanctus@lemmy.world 1 points 5 months ago (1 children)

So they're just storing all this facial data unencoded somewhere? There's no way to figure that out? There is no sort of encoding/decoding going on with the facial data at all? "It's impossible, chief, pack it up, the bots won"? I don't think so, man. People are gonna find all sorts of ways to fuck with this. Now you can join in the speculation or get expectorating all over this post. The choice is yours.

[–] rottingleaf@lemmy.zip 0 points 5 months ago

You seem to be talking to your imagination.

[–] Lath@kbin.earth 2 points 5 months ago (1 children)

Sounds like a great time to start a costume & mask making company named "The ministry of silly walks".

[–] Sanctus@lemmy.world 1 points 5 months ago

This is probably what people would actually do. Just wear a full mask.

[–] wanderingmagus@lemm.ee 3 points 5 months ago

Honestly, with enshittification, "technological dystopia" sounds like exactly where we already are. Now, if only implants weren't being R&D'd by Muskrat and there were some open source non-invasive version...

[–] fruitycoder@sh.itjust.works 2 points 5 months ago

Adversarial AI tattoos, face masks, and clothing have been attempted before. They basically exploit the model's lack of a deeper understanding of the world, so specific visual artifacts can trick it.
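The core trick can be sketched with an invented linear "classifier" as a stand-in; real adversarial attacks (FGSM and its relatives) apply the same gradient-sign nudge to deep models at far higher dimension, and adversarial patches confine the perturbation to a printable region.

```python
import numpy as np

# Toy adversarial perturbation against a tiny linear scorer.
# Weights and the "image" are random stand-ins, not a real face model.
rng = np.random.default_rng(0)
w = rng.normal(size=64)  # classifier weights
x = rng.normal(size=64)  # a flattened "image" the model scores

def score(v):
    return float(w @ v)  # higher means "match"

eps = 0.5
# Nudge every pixel a small, bounded amount against the gradient of the
# score; for a linear model the gradient is just w.
x_adv = x - eps * np.sign(w)

print(score(x), score(x_adv))  # the second score is strictly lower
```

The per-pixel change is at most `eps`, but the score drops by `eps * sum(|w|)`, which is why small, structured artifacts can move a model's decision far more than random noise would.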

[–] rottingleaf@lemmy.zip 2 points 5 months ago (1 children)

Which might even work, with a 0.001% probability that the recognized string goes unscreened.

There's a difference between SQL injections on thematic web forums and the same in such a system.

That "we can ... too" is lazy complacency. "They" will get even stronger while "we" talk like this.

[–] Sanctus@lemmy.world 0 points 5 months ago (2 children)

Nothing is casual about this. Be pessimistic if you want. But we will not stop jabbing the eye that watches. This is an arms race.

[–] rottingleaf@lemmy.zip 1 points 5 months ago (1 children)

What I'm saying is that you personally haven't done any of this and look stupid.

Yep, people do use vulnerabilities in software and hardware to do things. Just not you, so that "we" seems weird.

Neither have I; I just played with crackmes and shellcode a bit. But I'm not the one writing pretentious posts with that "we".

[–] Sanctus@lemmy.world 0 points 5 months ago

The original commenter I replied to was speculating about this becoming commonplace. You came in insisting that people must have done something themselves before they can talk about it, in a thread about speculation.

[–] feedum_sneedson@lemmy.world 0 points 5 months ago

You are doing absolutely nothing, are you.