this post was submitted on 19 Jul 2023
60 points (96.9% liked)
Asklemmy
Nah, once deepfakes become simple enough for the majority to make, citizen-created video evidence will be worthless.
Only 'tamper-proof' sources will be trusted, even though they too can be tampered with.
I don't remember if this came from cybersecurity logging practices or from anti-deepfake advice I saw online, but maybe physical cameras could constantly upload video to a trusted third-party server, which would save a checksum of, say, every minute's worth of data. Then the source of the video would have no way to retroactively replace the content on that server with deepfake footage without leaving evidence in the checksums.
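A minimal sketch of that idea, assuming the server chains each minute's checksum to the previous one (names and sample data here are hypothetical). Chaining means a chunk can't be swapped out later without invalidating every checksum after it:

```python
import hashlib

def chunk_checksums(chunks, prev_digest=b""):
    """Compute a chained SHA-256 checksum for each chunk of video data.

    Each digest mixes in the previous digest, so replacing one chunk
    retroactively breaks its own checksum and every later one.
    """
    digests = []
    for chunk in chunks:
        h = hashlib.sha256(prev_digest + chunk)
        prev_digest = h.digest()
        digests.append(h.hexdigest())
    return digests

# Hypothetical stand-ins for three one-minute chunks of raw video.
minutes = [b"minute-1 video data", b"minute-2 video data", b"minute-3 video data"]
checksums = chunk_checksums(minutes)

# Tampering with minute 2 changes its checksum *and* minute 3's.
tampered = [minutes[0], b"deepfaked minute-2", minutes[2]]
assert chunk_checksums(tampered)[0] == checksums[0]
assert chunk_checksums(tampered)[1] != checksums[1]
assert chunk_checksums(tampered)[2] != checksums[2]
```

This is basically how append-only audit logs work: the server only ever appends digests, so the history itself becomes tamper-evident.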
I'm not sure if/how the third-party server would be able to tell that it's listening to a real bodycam/dashcam rather than simply receiving data from a deepfake-generating AI model. I guess to use a video for evidence, you'd have to have corroborating evidence from nearby people who recorded the same event from a different angle (AI-generated videos would have trouble with creating different angles of the same event, right?).
And even if you can't use a video as evidence, witness testimony has always been used in court. Someone else on Lemmy wrote that people have been making arguments in court since before there was photo/video evidence; our justice system (whoever "our" refers to) will simply revert to pre-camera ways when a photo/video cannot be trusted.
Another option related to the checksum solution is that camera manufacturers could implement a system on the physical camera where the raw file is tagged with some checksum/stamp and the same is stored on the device. In a situation where the validity of the photo/video is in question, you could use the raw files and the physical device that captured them as the proof.
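A rough sketch of the on-device stamp, using an HMAC keyed with a per-device secret (everything here is hypothetical; a real camera would more likely keep an asymmetric key in secure hardware and publish the public half so anyone could verify without the device):

```python
import hashlib
import hmac

# Hypothetical per-device secret, stood in for a key that a real camera
# would burn into secure hardware at manufacture.
DEVICE_KEY = b"secret-burned-into-camera-at-manufacture"

def stamp(raw_file: bytes) -> str:
    """Tag a raw capture with an HMAC; the device stores this stamp too."""
    return hmac.new(DEVICE_KEY, raw_file, hashlib.sha256).hexdigest()

def verify(raw_file: bytes, stored_stamp: str) -> bool:
    """Re-derive the stamp from the raw file; a mismatch means tampering."""
    return hmac.compare_digest(stamp(raw_file), stored_stamp)

raw = b"raw sensor data straight off the camera"
tag = stamp(raw)
assert verify(raw, tag)            # untouched file checks out
assert not verify(b"edited footage", tag)  # edited file fails
```

The weak point is the same one as with the server idea: you have to trust that the stamping hardware was fed real sensor data rather than generated frames.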
I'm sure we will see multiple attempts to solve this, whether adversarial "de-fake" AIs, some form of physical verification, or something completely different. It will be interesting to see what works and what doesn't, and what may turn out to become the standard way of verification.
Then some company will put out a camera that uploads all the video to the cloud with verification and makes it read-only.