💾 Archived View for dioskouroi.xyz › thread › 29375949 captured on 2021-11-30 at 20:18:30. Gemini links have been rewritten to link to archived content
Facebook has a lot of very smart engineers, and I am certain that once they put some effort behind it, they will be able to delay this until 2030 or even later.
Or delay it forever. They have no interest in E2E.
I know being a blind hater is cool, but this has nothing to do with Facebook's desires; it's your elected officials begging them not to turn off the data hose.
I was actually commenting more on the fact that they announced only a few weeks ago that they were delaying until 2022.
And I actually don't see how E2E and their business model fit together in any possible way. They make their money off of knowing everything about what you do on and off their sites, so how does walling off all of that data make any sense?
So they say they are working on it, but I see a distinct contrast between their 'vision' and their needs as a company.
Am I really understanding this article correctly? It's claiming that it's bad that FB is delaying encryption because encryption is good, but also bad that it's adding encryption because encryption will help child predators, but that these two points are not contradictory because child predation is only an issue on social networks and not on other messaging services?
This article sounds like it was written by a markov chain trained on members of Congress bloviating about encryption and "protecting the children".
It may not be as nonsensical as it sounds.
Unlike WhatsApp, FB Messenger is tied to a public profile, where people extensively share their lives. This gives a predator contextual info like age, likes/dislikes, family problems and more, which helps them find a victim and set a trap for the child.
In the US, it would be illegal for law enforcement to surveil all messages in order to look for suspected child abusers.
We're talking about hoping that private companies will do it on their own (or be required to do it somehow?) in order to turn over to the police what their systems flag as looking like predators? And that it's important E2EE doesn't interfere with that?
> In the US, law enforcement surveilling all messages in order to analyze to look for suspected child abusers would be illegal
What if they do it to look for suspected terrorists (which is legal) and accidentally also check for child abusers?
> We're talking about hoping that private companies will do it on their own (or be required to do it somehow?) in order to turn over to the police what their systems flag as looking like predators
IIRC Facebook already does this, and has been criticized by charities in the field for it not being enough.
I don't believe it's legal to do for terrorism, but they may do it anyway semi-secretly, and it is currently legally contested (and they keep it semi-secret to make it harder to contest legally).
So, what if they do?
By reports they do indeed already do this, and it was earnestly explained in the tweet[1] linked elsewhere in this thread by a FB developer that the delay in E2EE is to make sure they can continue to do it.
I don't like it, but as you note, others have other opinions.
I think we should be clear that this is what's going on: what it is we're talking about when we talk about delaying encryption to protect children, and exactly what is being proposed. Others in this thread still think "oh no, that's not what we're talking about; they'd never do that." Let's be clear about what we're talking about, whether we support it or not.
[1]: https://twitter.com/elegant_wallaby/status/14628453362888253...
I don't think that's it.
If you do suspect that someone is a child predator, encrypting what are probably the most likely stashes of incriminating evidence would make it harder to build a strong case. I could see a lack of encryption being useful even when everyone follows rules that many would see as too strict (I don't fall in that camp, I think).
I don't think the trade-off is worth it, though.
That makes sense, but I didn't understand the point very well from the article because it seems to flip back and forth between unrelated criticisms of Facebook.
I don't know enough about the structure of predation to have an opinion on the merits of the point, but it sure seems like a reasonable excuse for delaying E2EE.
Whereas government files on children, often used by child predators who work for, in, or with child services or their private contractors, contain both public AND non-public info (e.g. medical data on the child), helping out even more. Child-services child predators often stand out due to the enormous number of children they victimize; the current record is over one thousand. You might wonder how one person can even do that, as one person physically doesn't have enough time, so... prepare to be disgusted:
https://www.newyorker.com/magazine/2021/07/26/the-german-exp...
Yet, somehow, despite child predators using these institutions regularly (in fact, even the often-complained-about "loverboys" recruit girls from child services institutions), there is no need to do anything about that. In fact, quite the opposite.
Child abuse overwhelmingly happens in the circle of influence of said kids. So the internet will never have a significant impact. Try to explain that to angry and concerned parents...
It's like when Apple asked "Do you want to opt out of internet tracking?" and 90% of people said yes (the rest were bots anyway). This is like asking "Do you want to opt out of us reading your conversations and selling everything you say?" Who wouldn't opt out? And it would only diminish saleable assets further. You might as well ask a bank to loan money with no interest.
Facebook doesn't use the contents of messages for ads targeting (except link clicks).
There are other entities interested in the content, so the content has value; why shouldn't Facebook make deals here?
Facebook has lied before though.
Facebook also uses the contents of messages to decide which messages to censor (some explicitly, some links shadowbanned) and which accounts to delete or suspend.
There are many URLs you can't send to people on Facebook.
I opted in to tracking. I would rather see ads relevant for me than useless garbage that is nowhere near my interests.
For a company the size of Facebook, I am having trouble understanding what the difficulty is in rolling this out.
At this rate, the virtual world they are building will take 100 years; or, more simply, it will become even harder to trust Facebook.
Here’s a good thread from someone who saw it from the inside:
https://twitter.com/elegant_wallaby/status/14628453362888253...
Basically management assumed (as it seems that you do) that E2EE is a technical feature that can be flipped on without changing anything in how the social model works on Meta’s three distinct, unconnected messaging systems.
It is a technical feature. It can be "flipped on", though that exaggerates the ease.
That they're delaying E2EE doesn't surprise me, but that they claim to want to implement it in the first place does. I think the stated intent to implement it is effectively a farce, meant to convince us that the "hard choice" FB will make not to give users privacy was all about protecting our children.
Child sex abuse could be brought to a practical minimum without reading everyone's communication, and reading comms is not going to stamp it out. Reading everyone's communication is the prize, not saving kids.
The unencrypted messaging didn't seem to stop Jeffrey Epstein and Ghislaine Maxwell, nor the many who participated in the activities with them.
That's the point:
Large system = difficult to change
Not only because of technical challenges, but also because of regulatory issues.
The Metaverse doesn't really have users, so you could progress faster - but I highly doubt it as well.
It's difficult to roll out because the vast majority of users don't want it. Facebook is trying to find ways to add it without breaking UX and without making it too difficult for people in the EU (where encryption will be required) to talk to people outside the EU (where it may be optional).
Actually, the EU is working on mandating backdoored E2E, so they don't want this.
https://news.ycombinator.com/item?id=29200506
https://news.ycombinator.com/item?id=29308617
Why would someone want to opt out of end to end encryption?
- server-side searching of messages becomes impossible; this essentially breaks message search for most people
- depending on how it's done, it can mean the loss of messages when changing devices (one of the top complaints I hear about Signal)
They could start with something like Telegram’s “secret chats”, where users explicitly start e2ee conversations separate from regular conversations.
They already have secret chats which are supposedly encrypted.
I don't want E2EE because I don't care at all whether or not Facebook has the plaintext of my chats, and I don't want to make any of the usability tradeoffs required for E2EE to work.
What useability tradeoffs?
If you don't care if your chats are in plain text, then you shouldn't care if they are encrypted.
If my chats are encrypted and FB doesn't have the private key, I can't send messages from my desktop without either
1. my phone is on and does a secure key exchange each time I login on desktop, in which case I have to carefully audit complex novel crypto or trust that FB doesn't snoop the key during the exchange.
or
2. I have to scan some QR code or some other bullshit to make desktop chat work.
And more importantly, it's possible for me to lose my message history if I lose all my devices.
I don't want any of these.
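For context on what "secure key exchange" in option 1 means: the general shape is a Diffie-Hellman exchange, where each device derives the same shared secret even though the server only ever relays public values. Here is a toy finite-field sketch in Python's standard library; real messengers use vetted curves like X25519 with authenticated handshakes (that's what the QR-code scan in option 2 verifies out of band), and the prime below is far too small for real use:

```python
import secrets

# Toy Diffie-Hellman sketch, illustrative only. Real systems use
# vetted elliptic-curve groups (e.g. X25519), not a bare Mersenne prime.
P = 2**127 - 1  # a prime modulus (demo-sized, NOT secure)
G = 5           # public generator

# Each device picks a private exponent and publishes only G^x mod P.
phone_secret = secrets.randbelow(P - 2) + 1
desktop_secret = secrets.randbelow(P - 2) + 1

phone_public = pow(G, phone_secret, P)      # relayed through the server
desktop_public = pow(G, desktop_secret, P)  # relayed through the server

# Both devices compute the same shared secret; the server sees only
# the public values, so recovering it would require solving the
# discrete log problem.
phone_shared = pow(desktop_public, phone_secret, P)
desktop_shared = pow(phone_public, desktop_secret, P)
assert phone_shared == desktop_shared
```

The remaining problem is exactly the one the parent raises: without an out-of-band check (the QR code), the server could substitute its own public value on each side and sit in the middle unnoticed.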
> 1. my phone is on and does a secure key exchange each time I login on desktop, in which case I have to carefully audit complex novel crypto or trust that FB doesn't snoop the key during the exchange.
This isn't a usability tradeoff, is it? If you don't care that your messages are in clear text, then I don't see why you would be forced to audit their crypto to be sure it works before using it for messaging. I don't audit how my web browser uses SSL; do you?
> 2. I have to scan some QR code or some other bullshit to make desktop chat work.
> And more importantly, it's possible for me to lose my message history if I lose all my devices.
Yea, these do seem like potential usability issues and I could see these being valid reasons for them not wanting to roll it out.
That being said, they are Facebook and they have a ridiculous amount of engineering talent to apply. I would be really curious to know why they think they can solve the problem well in a few years but not in a shorter time frame, or whether they don't think they can achieve it in a few years and are just kicking the can down the road in the hope that everyone forgets about it later.
Or 3, use a time- and memory-hard key derivation algorithm that generates your private key from a password that is never sent to the server.
And if a user forgets the password, all their messages are gone!
Also, requiring users to memorize another password is a pretty huge UX burden. Remember that FB has billions of users, so we're talking about mostly non-savvy users. No popular E2EE messaging service does this, as far as I'm aware.
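For what it's worth, option 3 can be sketched with the standard library: `hashlib.scrypt` is a deliberately memory- and CPU-hard KDF, so a server that only ever sees ciphertext (and the salt) can't cheaply brute-force the password. The cost parameters below are illustrative, not a security recommendation:

```python
import hashlib
import os

def derive_key(password: str, salt: bytes) -> bytes:
    """Derive a 256-bit key from a password; the password itself
    never needs to leave the device. Cost parameters are illustrative."""
    return hashlib.scrypt(
        password.encode("utf-8"),
        salt=salt,
        n=2**14,   # CPU/memory cost factor
        r=8,       # block size
        p=1,       # parallelism
        dklen=32,  # 256-bit output key
    )

salt = os.urandom(16)  # can be stored server-side alongside the account
key = derive_key("correct horse battery staple", salt)

# Same password + salt always yields the same key; anything else doesn't.
assert derive_key("correct horse battery staple", salt) == key
assert derive_key("wrong password", salt) != key
```

Which is exactly why the objection above bites: the key is deterministic from the password, so a forgotten password means the history is unrecoverable by design.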
I've begun using E2EE on FB Messenger and these are the tradeoffs I've noticed:
- Photos don't pinch-to-zoom
- No read-receipt indicators
- No reactions besides the primary emoji
- Primary emoji can't be changed
- No nicknames or other frills
...none of which have _anything_ to do with E2EE itself. Feels like MVP issues, to be honest.
Was there some kind of poll showing that users don't want encryption? What reasons do those users have for not wanting it? How would adding encryption break UX?
I submitted an app to the iOS App Store recently, and they said something about certain encryption needing some type of government OK before it can be on the App Store. Is this related to that?