I wonder if GPG will gain mainstream adoption now that AI can generate convincing videos. Cryptographic signing could become a solution to fake videos: not signed, can't trust.
4 months ago · 👍 maxheadroom
@ruby_witch Right... that's not something I'd considered. But I don't think it's an issue, is it?
For example, say there's an image floating around claiming to be from a big news source like the Associated Press or Reuters, or claiming it was published by the White House. It would be easy to confirm that using GPG, and those outlets would want to keep signing to maintain their reliability and credibility. But if it's something made up by Some-Random-Dush-Meme-2024 on Facebook, either the signature is missing or it won't match up with AP/Reuters. Then we know it's not trustworthy. · 4 months ago
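For anyone curious, the verification step really is a one-liner. A sketch, assuming the outlet ships a detached signature next to the file and you already have their public key in your keyring (the filename and key id here are made up for illustration):

```shell
# Outlet signs the video with its private key (hypothetical key id):
gpg --detach-sign --armor --local-user press@ap.example video.mp4
# -> writes video.mp4.asc next to the video

# Anyone with the outlet's public key can then check it:
gpg --verify video.mp4.asc video.mp4
# "Good signature from ..." means the bytes are unmodified and were
# signed by the holder of that private key.
```

The detached-signature approach matters here: the video file itself stays a normal playable file, and the `.asc` just travels alongside it.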
@haze Maybe that's an idea that would work for you if you only consume media that's sent directly to you from your friends who've signed it.
But when grandpa sees a video signed by his trusted news source "VKontakteMemes420" and shares it with all of his friends on Facebook, you'd better believe they're going to know it's real! And those are the people who are causing all of the trouble in the world today. You're already smart enough to know better. · 4 months ago
@ruby_witch I agree on the first issue, but I **think** it can be mitigated with some really good UX design.
For the second: bots won't have the same GPG key as the real person. Say someone is trying to impersonate me: they won't have my GPG key. (Even if I get hacked and the key leaks, I can always revoke it.) · 4 months ago
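On the impersonation point: verification fails loudly when the signature doesn't match the key or the file. A sketch with made-up filenames:

```shell
# If even one byte of the file changed after signing, or the signature
# was made by somebody else's key, verification does not pass:
gpg --verify video.mp4.asc tampered.mp4
# -> "BAD signature" (file was modified), or
#    "Can't check signature: No public key" (signer not in your keyring)
```

So a bot that re-signs a fake with its own key can't produce a "Good signature" from the person it's impersonating; at worst it produces a valid signature from an unknown key, which is the same as no endorsement at all.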
Two problems:
1. I think at this point we've safely proven that normies, the kind of people who are likely to be influenced by a fake AI video, will never know or care what GPG or cryptographic signatures are.
2. This is the more important one: What's to prevent an AI from automatically signing every video it generates? Or a bad actor signing them after the fact before distributing them to the public? · 4 months ago