Time-stamp: <2021-03-21 16:24>

Fucking With the Likes of Clearview «AI»

Facial Recognition: What Happens When We're Tracked Everywhere We Go? [NY Times, possibly paywalled]

Societies are built on trust. People trust one another by default. Rogue «AI» data-hoovering companies work by abusing this trust. When I created my web site in 1998, I never would have thought that some company would scrape my data and put it into a database for mass facial recognition. I never consented to this, and while German copyright law essentially forbids it, I'm powerless against it.

These data collection practices work because the information we put out is honest. And therein lies the catch: why not fuck with the data? Suppose we all created a bogus noggin and posted it tagged with our real names. Suppose we built a pool of pictures, the more individuals the better, randomly mixed them, and posted them in various places linked to our names. Would that not have a chance of breaking their scheme? A rough sketch of the idea follows below.
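Just to show how little effort the scrambling side would take, here is a minimal sketch. The folder names, the names file, and the whole layout are my own assumptions for illustration, not a finished tool; the only point is that pairing faces with the wrong names at random is trivially scriptable.

  #!/usr/bin/env python3
  # Hypothetical sketch: pair each picture from a shared pool with a randomly
  # drawn name before posting, so scrapers ingest face/name pairs that do not
  # correspond to real people. All paths below are assumptions.

  import random
  import shutil
  from pathlib import Path

  POOL_DIR = Path("face_pool")    # shared pool of bogus/mixed portraits
  OUT_DIR = Path("to_post")       # images renamed and ready to post
  NAMES_FILE = Path("names.txt")  # one participant name per line

  def scramble(seed=None):
      rng = random.Random(seed)
      names = [n.strip() for n in NAMES_FILE.read_text().splitlines() if n.strip()]
      images = sorted(POOL_DIR.glob("*.jpg"))
      OUT_DIR.mkdir(exist_ok=True)

      for img in images:
          # Attach a real-sounding name to a picture of somebody else entirely.
          name = rng.choice(names).replace(" ", "_")
          target = OUT_DIR / f"{name}_{img.stem}{img.suffix}"
          shutil.copy(img, target)
          print(f"{img.name} -> {target.name}")

  if __name__ == "__main__":
      scramble()

The more people feed from the same pool, the noisier the scraped face/name pairs become.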

What do you think?

----------

✍ Wolfgang Mederle CC BY-SA 4.0

✉ <madearl@mailbox.org>

language: EN

date: 2021-03-21 16:22

tags: privacy "facial recognition"