this post was submitted on 18 Jul 2023
7 points (100.0% liked)

Technology


The time has come for us to make passwords for identifying each other.
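The "password" idea can be done without either person ever saying the secret aloud on the call, using a simple challenge-response: one side sends a random challenge, the other returns an HMAC of it keyed with the shared secret. This is only an illustrative sketch of the idea (the function names and secret are made up, not from any real protocol):

```python
import hashlib
import hmac
import secrets

def make_challenge() -> str:
    """Generate a fresh random challenge so old responses can't be replayed."""
    return secrets.token_hex(16)

def respond(shared_secret: str, challenge: str) -> str:
    """Prove knowledge of the secret without transmitting it."""
    return hmac.new(shared_secret.encode(), challenge.encode(),
                    hashlib.sha256).hexdigest()

def verify(shared_secret: str, challenge: str, response: str) -> bool:
    """Check the response in constant time to avoid timing leaks."""
    expected = respond(shared_secret, challenge)
    return hmac.compare_digest(expected, response)
```

The point of the random challenge is that even if a scammer records a whole call, the recorded answer is useless next time, since the challenge will be different.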

top 14 comments
[–] MargotRobbie@lemmy.world 0 points 1 year ago

With deepfake technology being so advanced nowadays, how will we ever know if the person we are talking with on the internet is who they say they are?

[–] redcalcium@c.calciumlabs.com 0 points 1 year ago* (last edited 1 year ago) (1 children)

Right now, deepfakes don't work well when the face is viewed from extreme angles, so you can ask the person to slowly turn their face to the side, or up and down, as far as they can until the face is no longer visible. Deepfakes also don't work well when something obstructs the face, so ask them to put a hand in front of their face. They also can't seem to render the mouth right if you open it too wide or stick out your tongue.

I base this on a deepfake app I tried: https://github.com/s0md3v/roop . But as the tech improves, it might be able to handle those cases in the future.

Edit: there's a chance the scammer uses a live deepfake app like this one: https://github.com/iperov/DeepFaceLive . It also supports the Insight model, which only needs a single well-lit photo to impersonate someone.

[–] 14th_cylon@lemm.ee 1 points 1 year ago

Right now, deepfakes don't work well when the face is viewed from extreme angles, so you can ask the person to slowly turn their face to the side, or up and down, as far as they can until the face is no longer visible.

Or, you know, you can just pick up the phone and call them.

[–] kn33@lemmy.world 0 points 1 year ago (1 children)

I got one of these a few months ago. I could tell it was fake before I even answered, but I was curious so I pointed my own camera at a blank wall and answered. It was creepy to see my friend's face (albeit one that was obviously fake if you knew what to look for) when I answered.

[–] Kodemystic@lemmy.kodemystic.dev 0 points 1 year ago (1 children)

How do these scammers know who our friends are? Also, how are they able to get pictures or video of said friend to create the fake?

[–] kn33@lemmy.world 1 points 1 year ago

In my case, the friend's Facebook account was compromised, so they were able to get his pictures and call me from his account.

[–] preasket@lemy.lol 0 points 1 year ago (1 children)

Here's hoping this popularises secure communication protocols. It's gonna become a must at some point.

[–] riskable@programming.dev 0 points 1 year ago (1 children)

WhatsApp video calls are end-to-end encrypted. A secure protocol means nothing in this context.

[–] Takumidesh@lemmy.world 1 points 1 year ago

But key exchanges work.

Signal, for example, will warn you when the person you are talking to is using a new device.

As long as the user heeds the warning, it is an effective stop, and at the very least it gives the user pause.

If the Signal safety number changes but the communication stays on track, as in, the context of the conversation is the same, it's unlikely to be a problem. But if the safety number changes and the next message is asking for money, that is a very simple and easy-to-process situation.
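For anyone curious how that warning works under the hood: Signal's real safety numbers are derived from both parties' identity keys with a specific construction, but the core idea is just "hash both public keys into a short string both sides can compare, and flag when it changes." A toy sketch of that idea (the key bytes and function names here are made up for illustration, this is not Signal's actual derivation):

```python
import hashlib

def safety_number(pubkey_a: bytes, pubkey_b: bytes) -> str:
    """Derive a short, order-independent fingerprint of both identity keys.

    Sorting the keys first means both parties compute the same string
    regardless of who is 'a' and who is 'b'.
    """
    digest = hashlib.sha256(b"".join(sorted([pubkey_a, pubkey_b]))).hexdigest()
    # Break into readable chunks so it can be compared aloud on a call
    return " ".join(digest[i:i + 8] for i in range(0, 32, 8))

def key_changed(old_number: str, new_number: str) -> bool:
    """A changed fingerprint means one side's identity key was replaced."""
    return old_number != new_number
```

If an impostor swaps in their own key (a compromised account or a new device), the derived number changes and the app can surface exactly the warning described above.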

[–] AmbientChaos@sh.itjust.works 0 points 1 year ago* (last edited 1 year ago) (1 children)

I'm in the US and have a well-off friend who had his Facebook hacked. The bad actors sent messages to his friends asking to borrow $500 until tomorrow because his bank accounts were locked and he needed the cash. Someone who was messaged by the bad actors posted a screenshot of a deepfaked video call he received that caused him to fall for it. Wild times we live in!

[–] tallwookie@lemmy.world 0 points 1 year ago (1 children)

I know someone who fell for a similar scam but it involved purchasing gift cards.

[–] djmarcone@lemmy.world 0 points 1 year ago (1 children)

I routinely get emails from the owner of the company I work for asking me to kindly purchase several large gift cards and forward them and the receipt to him for prompt reimbursement.

[–] graphite@lemmy.world -1 points 1 year ago

asking me to kindly purchase several large gift cards

kindly give me your money, thanks

[–] JazzAlien@lemm.ee -1 points 1 year ago

Dude had too much money. Simple.