this post was submitted on 27 Jul 2023
222 points (97.8% liked)

World News


A.I. company Worldcoin has rolled out 1,500 Orbs to more than 35 cities in a bid to create digital identities for the world's citizens.

top 37 comments
[–] Arotrios@kbin.social 44 points 1 year ago (1 children)

Just hell no. Sounds like a spray paint campaign is in order. I'm gonna go post this on the anarchy subs and see how they feel about it (unless you already got there first).

[–] BroBot9000@lemmy.world 17 points 1 year ago (1 children)

Baseball bats sound more effective. Make ‘em eat the costs.

[–] Tibert@compuverse.uk 7 points 1 year ago (1 children)

Just hide your eyes or they'll scan you before you can beat the ball

[–] BroBot9000@lemmy.world 9 points 1 year ago

Sunglasses and a mask are still totally fine to wear nowadays. Just walk up and pretend to trip.

[–] fearout@kbin.social 31 points 1 year ago* (last edited 1 year ago) (2 children)

“I’ve been very interested in things like universal basic income and what’s going to happen to global wealth redistribution,” said Sam Altman, Worldcoin’s cofounder.

Holy crap it’s Sam Altman, the CEO of OpenAI. After that recent article about his $2 Kenyan workers it’s much harder to believe in benevolent intentions.

[–] explodicle@local106.com 3 points 1 year ago

Any time someone creates a new coin instead of using the thousands already available, it's 99.9% a scam. We don't need a new money supply for a UBI; this has been discussed to death in crypto circles for over a decade.

[–] DaveNa@lemmy.ml 1 points 1 year ago

That's twice the minimum wage where I live.

[–] Dubious_Fart@lemmy.ml 28 points 1 year ago (1 children)

Well I look forward to hearing the endless tales of people smashing the fuck out of these, and taking the hardware to figure out how to do greater damage to the entire project.

[–] killernova@lemmy.world 5 points 1 year ago

I think these would look pretty cool in my art deco living room, and they're free too! Such a great deal ;)

[–] StalksEveryone@lemmy.villa-straylight.social 24 points 1 year ago (1 children)

People trying to force social credit onto the free world.

[–] jonesy@lemmy.ml 7 points 1 year ago

Shiit, if you think about it, we kinda already have a social credit system in the US. It's less social, I suppose, but it does affect things that determine our social status, like being able to finance a car or house affordably.

[–] NaibofTabr@infosec.pub 15 points 1 year ago (1 children)

Hmm, based on the pictures in the article this thing is basically a camera in a shiny ball about 1ft in diameter (it appears to be about the width of 3 bricks laid side by side). It's not like a Cloud Gate-sized object. To get a scan of your irises you would have to be pretty close to it for at least a few seconds - it's not like it could get a scan if you're just walking by a few feet away. You'd have to walk up and point your face at it on purpose. The camera in it also looks fixed - I doubt it can rotate to follow you, that would be mechanically complex, expensive and prone to failure.

Based on the description, their software takes an image of your irises and reduces it to a hash value. The original image is deleted (they claim) and the hash value is stored as an ID code. It seems likely that the hash value will be unique to their software - e.g. if you wrote your own code to produce hash values from images, you would get a different number even if you had the same picture of the same eyes. So the hash value doesn't necessarily represent anything about your eyes that would be much of a privacy invasion... It's just a mathematically derived number string which is unique to their software.

It's not clear what part of this system is "AI", though my guess would be it has something to do with re-identifying your eyes next time you want to access whatever is secured with your hash code. It's really not clear how that would work... a new image of your eyes collected a year later under different lighting conditions would probably produce a different hash value, so how does this system match them, if it only records the hashes?

FWIW, I think smashing or spray painting these things, while fun ideas in the rebellious teenager sense, is probably overkill and likely to get you more attention from law enforcement than you want. But, you could probably just walk up behind it and slap a sticker or tape over the camera... they'd still have to pay someone to go out and peel it off.

[–] jon@kbin.social 7 points 1 year ago (1 children)

Taking a picture instantly after would probably create a different hash value. The thing about hashing is that even if one bit is different between source images, the resulting hashes would look entirely different.
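For the curious, that avalanche effect is easy to demo with an ordinary cryptographic hash (a quick Python sketch; SHA-256 here is just a stand-in, since nobody knows what Worldcoin actually uses):

```python
import hashlib

# Two "images" that differ by exactly one bit.
img_a = bytes([0b00000000] * 16)
img_b = bytes([0b00000001] + [0b00000000] * 15)

ha = hashlib.sha256(img_a).hexdigest()
hb = hashlib.sha256(img_b).hexdigest()

# Count how many of the 64 hex positions differ between the digests.
diff = sum(x != y for x, y in zip(ha, hb))
print(ha)
print(hb)
print(f"{diff}/64 hex positions differ")
```

One flipped input bit scrambles essentially the whole digest, which is exactly why a plain cryptographic hash can't do fuzzy matching of similar photos.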

I suppose I could conceive of a proprietary hash algorithm that allows fuzzy matching of iris photos, but as you said, images of eyes taken years apart in different conditions wouldn't match the original hash, or it could falsely match similar-looking eyes. It's not as if this system lets them get high-resolution, perfectly lit iris photos, after all.

The whole thing sounds dubious, and I suspect AI is mentioned solely to secure investor funding, much like how several years back everything mentioned Blockchain.

[–] UFODivebomb@programming.dev 1 points 1 year ago

They are likely using a form of https://en.wikipedia.org/wiki/Perceptual_hashing

The noise level a perceptual hash is sensitive to can be tuned.

The "falsely match similar looking" is harder than one would expect. I used to work on an audio fingerprinting system which was extremely robust to "similar" audio matching. What sounded similar to us was always identified uniquely by the hash with high confidence.

For example: take the same piano piece, played by the same artists on the same piano, performed as close to identically as they could manage. With ~10 sec of audio, the perceptual hash never confused the two. Not once. We could even identify how much of a pre-recorded song was used in a "live" performance.

There are adversarial attacks for perceptual hashes. However, "similar eyes" would not be one to a standard perceptual hash. More like: a picture of an abstract puppy happens to have the same hash as an eye.
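To make the idea concrete, here's a toy "average hash" in Python. This is purely illustrative: real perceptual hashes (pHash, iris codes) are far more sophisticated, and Worldcoin hasn't published theirs.

```python
# Toy average hash: threshold each pixel against the image mean.
# Similar inputs flip few bits, so they land at a small Hamming
# distance instead of a random one (unlike a cryptographic hash).

def average_hash(pixels):
    """pixels: flat list of grayscale values (e.g. a downscaled 8x8 image)."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

base = [10, 200, 30, 180, 20, 220, 40, 190] * 8   # 64 "pixels"
noisy = [p + 3 for p in base]                      # slight brightness shift
other = list(reversed(base))                       # a different image

print(hamming(average_hash(base), average_hash(noisy)))  # 0: same hash
print(hamming(average_hash(base), average_hash(other)))  # 64: every bit flips
```

The noise tolerance comes from the thresholding: a small brightness shift moves the mean along with the pixels, so few (here, zero) bits flip, while a genuinely different image lands at a large Hamming distance.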

I'd be curious about the details of the hash; that's necessary to know what the adversarial attacks are. But I see no mention of the details, which is suspicious on its own.

[–] Erikatharsis@kbin.social 11 points 1 year ago

The absolute absurdity of a news article on nefarious data collection requiring that I enable JS to read it, just so that it can load a ridiculous number of trackers.

[–] Tibert@compuverse.uk 10 points 1 year ago (1 children)

What the hell is this? How can they even do this without getting deleted.

I'm not sure I understood it correctly. Do people just need to look at the mirrored surface to get scanned, and then they get a coin?

You didn't know and don't have an account? Your bad! We have your eyes now!

Or do people need to read a privacy policy and accept everything before they get scanned?

[–] Tibert@compuverse.uk 6 points 1 year ago (1 children)

Well it's not magic at least https://worldcoin.org/blog/engineering/opening-orb-look-inside-worldcoin-biometric-imaging-device

It's not a 360° camera. People have to be able to look at the dark spot where the glass for the cameras is.

[–] BroBot9000@lemmy.world 2 points 1 year ago (1 children)

Cool now we know how to approach and destroy these abominations.

[–] BaroqueInMind@kbin.social 4 points 1 year ago* (last edited 1 year ago) (1 children)

Don't destroy them. Ship them to me so I can convert them into stationary cameras for my front door and yard with my self hosted security system.

[–] Goathound@kbin.social 2 points 1 year ago

Eyes for the Eye God!

[–] morry040@kbin.social 6 points 1 year ago

This article described some concerning methods they used to develop and deploy the system...
https://www.technologyreview.com/2022/04/06/1048981/worldcoin-cryptocurrency-biometrics-web3/

[–] CeleryFC@beehaw.org 3 points 1 year ago (1 children)

One step closer to Minority Report

[–] Cybersteel@lemmy.ml 1 points 1 year ago

Yeah, all those futuristic dystopias, and Gantz.

[–] inspxtr@lemmy.world 3 points 1 year ago

So this company has ties with OpenAI? That is kinda concerning …

[–] CreativeTensors@beehaw.org 2 points 1 year ago* (last edited 1 year ago)

So, no way in hell this could comply with GDPR then.

[–] coco@lemmy.world 2 points 1 year ago

Another shitcoin lol !!!

[–] yip-bonk@kbin.social 0 points 1 year ago* (last edited 1 year ago) (2 children)

Passersby need only gaze into its mirrored surface, whereupon the device will scan their irises and generate a unique hash or numeric code attached to their particular set of eyes. In exchange, each participant will receive a World ID and a WLD token.

What th’ . . . are you kids on _dope?!_

[–] jungle@lemmy.world 1 points 1 year ago (1 children)

The article is misrepresenting the whole thing. It's voluntary, the devices are not just randomly picking up iris scans.

Clickbait at its finest.

[–] mPony@kbin.social 0 points 1 year ago

BUT WE WANT TO FREAK OUT AND BE AFRAID FOREVER : The Internet

[–] Blastasaurus@lemm.ee 0 points 1 year ago (1 children)

Have none of you ever travelled?! They already have our retinal scans...

[–] explodicle@local106.com 1 points 1 year ago

I have traveled and have no idea what you're talking about.
