💾 Archived View for nicksphere.ch › 2021 › 01 › 18 › consumer-data-protection-is-a-distraction captured on 2022-04-29 at 12:16:44. Gemini links have been rewritten to link to archived content
nicksphere
This post is a public service announcement.
Businesses collect data from consumers for many reasons: to improve customer service, personalize the customer experience, refine marketing strategy, derive other data, suggest new products, make predictions and recommendations, and guide business decisions. But as internationally renowned security technologist and author Bruce Schneier points out, data is a toxic asset[1].
I recommend reading his full blog post[2]. But even from just a consumer perspective, Schneier argues, giving companies your data is dangerous for several reasons:
"Saving it is dangerous because many people want it. Of course companies want it; that’s why they collect it in the first place. But governments want it, too. In the United States, the National Security Agency and FBI use secret deals, coercion, threats and legal compulsion to get at the data. Foreign governments just come in and steal it. When a company with personal data goes bankrupt, it’s one of the assets that gets sold.
Saving it is dangerous because it’s hard for companies to secure. For a lot of reasons, computer and network security is very difficult. Attackers have an inherent advantage over defenders, and a sufficiently skilled, funded and motivated attacker will always get in."
That last part is important: "...a sufficiently skilled, funded and motivated attacker will always get in". The problem is that you cannot trust corporations to keep your data safe, and no exceptions come to mind. Even if we suppose the data is encrypted on the server and only you control the encryption key, that isn't a case of a corporation being trustworthy with your data: they couldn't leak the plaintext even if they wanted to. That's what's called trustless design. The system is set up so you don't have to trust whoever you're doing business with. The best systems are set up that way. It's good for the consumer, and it minimizes risk for the business.
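The trustless idea above can be sketched in a few lines of Python. This is purely illustrative (the one-time-pad XOR cipher and the function names are my own invention, not anything from Schneier's post); the point is only that when encryption happens on the client and the key never leaves it, the server holds ciphertext it cannot leak in readable form. A real system would use an authenticated cipher such as AES-GCM instead.

```python
# Sketch of trustless storage: the client encrypts locally and keeps the
# key, so the server only ever sees ciphertext. The one-time pad (XOR with
# a random key as long as the message) is used here for simplicity.
import os

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Return (key, ciphertext); the key never leaves the client."""
    key = os.urandom(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# The "server" stores only the ciphertext; a breach leaks nothing readable.
key, stored_on_server = encrypt(b"my private notes")
assert stored_on_server != b"my private notes"
assert decrypt(key, stored_on_server) == b"my private notes"
```

Nothing the server holds is usable without the key, so you don't have to trust the business at all.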
The central reason you can't trust businesses to keep your data safe is that you don't know how it's handled once it's out of your hands. Even if a business claims to have reasonable data protection, how can you possibly know that for sure? All it takes is one incompetent or malicious employee for your data to be leaked. All it takes is one out-of-date software package or one software vulnerability. All it takes is one government willing to steal the data or coerce the business into handing it over. And if there's ever a merger or acquisition, some other business acquires your data as an asset by default.
And let's not forget that data brokers combine your data with other data to derive things about you that you never explicitly shared. You might think a five-minute YouTube video of yourself doesn't reveal much, but disturbing uses of AI[3] can be applied to it to derive information you never intended to include. And AI will only get better over time; you can't predict what capabilities future AI will have to derive new information from your data. Even if it's just metadata[4], remember what former CIA and NSA director Michael Hayden said about NSA bulk surveillance: "We kill people based on metadata". Put simply, consumer data protection is, has always been, and will be for the foreseeable future a house of cards.
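To make the metadata point concrete, here is a toy Python sketch of the kind of inference Hayden was alluding to. The records below are entirely made up, and the "analysis" is deliberately crude: from nothing but timestamps and cell-tower IDs (no content at all), you can guess where someone sleeps by counting which tower their phone uses at night.

```python
# Toy metadata inference: given only (hour of day, cell tower) pairs,
# guess the subject's home as the most frequent nighttime tower.
# All records here are invented for illustration.
from collections import Counter

records = [  # (hour of day, tower id)
    (1, "tower_A"), (2, "tower_A"), (3, "tower_A"),
    (9, "tower_B"), (13, "tower_C"), (14, "tower_B"),
    (22, "tower_A"), (23, "tower_A"), (0, "tower_A"),
]

# Keep only nighttime observations (10pm-5am) and take the mode.
night = [tower for hour, tower in records if hour >= 22 or hour <= 5]
likely_home = Counter(night).most_common(1)[0][0]
print(likely_home)  # tower_A
```

Real data brokers do this at scale with far richer signals, which is why "it's only metadata" is no comfort.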
The only foolproof way to protect yourself from data leaks is to never give your data to businesses in the first place. "Consumer data protection" is a distraction campaign: the more businesses talk about "consumer data protection", the less bandwidth there is in public discourse to talk about outright refusing to give up your data. Businesses can tout their data security practices all they want, but that distracts from the truth, which is that you can simply choose not to hand your data over. We now live in a culture of "I agree", to the point that people forget they can say no. Don't consent. Don't click "I agree" unless you've actually read the terms. Don't provide identifying information without serious consideration.
And for those of you who say "I have to give Goolag[5] my data! Rearranging my life to protect my data would be too hard! I need a Goolag account for my job or university or whatever the case may be." I leave you with a quote from the Roman stoic Seneca:
"It's not that we don't dare do things because they are difficult; rather, they are difficult because we don't dare" -- Seneca
Link(s):
1: https://www.schneier.com/blog/archives/2016/03/data_is_a_toxic.html
2: https://www.schneier.com/blog/archives/2016/03/data_is_a_toxic.html
3: https://github.com/daviddao/awful-ai
4: https://wikiless.org/wiki/Metadata
5: https://www.urbandictionary.com/define.php?term=Goolag
Unless otherwise noted, the writing in this journal is licensed under CC BY-SA 4.0.
Copyright 2019-2022 Nicholas Johnson