The White House wants to ‘cryptographically verify’ videos of Joe Biden so viewers don’t mistake them for AI deepfakes

Biden’s AI advisor Ben Buchanan said a method of clearly verifying White House releases is “in the works.”

  • @Blackmist@feddit.uk · 25 months ago

    Honestly I’d say that’s on the way for any video or photographic evidence.

    You’d need a device private key to sign with, and probably internet connectivity to get a timestamp from a third party (rough sketch below).

    Could have lidar included as well, so you can verify the camera isn’t just pointed at a screen playing something fake.

    Is there a cryptographically secure version of GPS too? Not sure if that’s even possible, and it’s the weekend so I’m done thinking.
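
    A minimal sketch of what that device-side signing could look like, assuming Python’s `cryptography` package and Ed25519; the key, the capture bytes, and the timestamp source are all stand-ins, and a real design would use a hardware-backed key plus a trusted timestamp authority (e.g. RFC 3161) rather than the local clock:

    ```python
    # Sketch: hash the capture, bind a timestamp, sign with a device-held key.
    # Everything here is a placeholder for what real camera hardware would do.
    import hashlib
    import time

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    device_key = Ed25519PrivateKey.generate()   # stand-in for a secure-element key

    video_bytes = b"raw sensor data..."         # stand-in for the actual capture
    digest = hashlib.sha256(video_bytes).hexdigest()

    # A real scheme would fetch a signed timestamp from a third party;
    # the local clock is just a placeholder.
    timestamp = str(int(time.time()))

    payload = f"{digest}|{timestamp}".encode()  # what actually gets signed
    signature = device_key.sign(payload)

    print("digest:   ", digest)
    print("signature:", signature.hex())
    ```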

    • @SpaceCowboy@lemmy.ca · 15 months ago

      It’s way more feasible to simply require social media sites to do the verification and display something like a blue check on verified videos (a toy version of that check is sketched below).

      This is actually a really good idea. Sure there will still be deepfakes out there, but at least a deepfake that claims to be from a trusted source can be removed relatively easily.

      Theoretically a social media site could boost content that was verified over content that isn’t, but that would require social media sites not to be bad actors, which I don’t have a lot of hope for.
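
      A toy version of that platform-side check, assuming the publisher (say, the White House) distributes an Ed25519 public key out of band and attaches a signature to each official upload; the function name and the throwaway key pair below are hypothetical, not any platform’s actual API:

      ```python
      # Sketch: decide whether to show a "verified source" badge.
      # Assumes the publisher's public key is already known and trusted.
      import hashlib

      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives.asymmetric.ed25519 import (
          Ed25519PrivateKey,
          Ed25519PublicKey,
      )

      def is_verified(video_bytes: bytes, signature: bytes,
                      publisher_key: Ed25519PublicKey) -> bool:
          """True only if the signature covers this exact video's hash."""
          digest = hashlib.sha256(video_bytes).digest()
          try:
              publisher_key.verify(signature, digest)  # raises on any mismatch
              return True
          except InvalidSignature:
              return False

      # Demo with a throwaway key pair standing in for the publisher's real one.
      publisher_priv = Ed25519PrivateKey.generate()
      video = b"official release bytes..."
      sig = publisher_priv.sign(hashlib.sha256(video).digest())

      print(is_verified(video, sig, publisher_priv.public_key()))              # True
      print(is_verified(b"tampered bytes", sig, publisher_priv.public_key()))  # False
      ```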

      • @kautau@lemmy.world · 5 months ago

        I agree that it’s a good idea. But the people most swayed by deepfakes of Biden are definitely the least concerned with whether their bogeyman, the “deep state,” has verified them.