💾 Archived View for dioskouroi.xyz › thread › 29396643 captured on 2021-11-30 at 20:18:30. Gemini links have been rewritten to link to archived content

-=-=-=-=-=-=-

FBI's ability to legally access secure messaging app content and metadata [pdf]

Author: sega_sai

Score: 306

Comments: 258

Date: 2021-11-30 19:53:32

________________________________________________________________________________

throw_away_dgs wrote at 2021-12-01 00:07:37:

Some FBI agents came to my house once and told me that my home Internet had been used to visit Islamic Extremist websites. They brought a local police officer with them and a 'threat assessment' coordinator from my workplace. They asked me if my family was Muslim and wanted to know if we had been radicalized.

We are not religious (at all). We do not attend church, synagogue or mosque. We are lower middle class white Americans born and raised in the USA and have never traveled outside the country.

I have no idea why they thought this about us. Maybe it was an IP mix-up, but it was very disturbing. I feared that I may lose my job. I became very afraid of the FBI that day. I think this could happen to anyone at anytime.

ClumsyPilot wrote at 2021-12-01 01:21:33:

"threat assessment' coordinator from my workplace"

"I feared that I may lose my job."

I understand that the police/FBI have to conduct investigations. What I don't understand is the involvement of the employer; it's extremely disturbing - you have not been convicted, you have not been charged, you are not even a suspect or accused of anything at this point - how is your private life the business of your employer?

Why is your privacy being breached and your livelihood being placed at risk?

Surely the FBI is not allowed to publicise random dirt they find on innocent people?

BLKNSLVR wrote at 2021-12-01 01:22:24:

It's (scarily) interesting that they react with actual personal attendance based purely on a very limited set of electronic information.

From your further description:

> We are not religious (at all). We do not attend church, synagogue or mosque. We are lower middle class white Americans born and raised in the USA and have never traveled outside the country.

Would the FBI not have been able to do some amount of background searching (read: further electronic information gathering) that would be less effort-intensive than arranging a 'threat assessment' coordinator from throw_away_dgs' actual workplace and a local police officer for an in-person door-knock? If such background checks were performed, then they either don't have much data or their threat weightings are set to red-scare levels of paranoia. Either way, it's scary.

Unless there's more to the story.

Eelongate wrote at 2021-12-01 01:42:00:

I think what he experienced is another manifestation of the same phenomenon as zero-tolerance policies in schools; institutions ask their enforcers to suspend common sense and strictly enforce the letter of the law/guideline/etc., even in situations where any reasonable person would decide it made no sense. They do this because such common sense and gut feelings are how bias and prejudice might creep into their oh-so-perfect system.

It used to be that if a teacher saw a kid get bullied and then punch his bully back, the teacher was empowered to evaluate the situation using their best judgement, and punish the bully while congratulating the bullied kid who stuck up for himself. The system sees a problem with that; the teacher's perception of the incident might have bias and prejudice. The system's solution is to have zero tolerance for any violence and punish both students equally. The system's solution to the possibility of prejudice against one student is to ensure prejudice against _both_ students.

Terry_Roll wrote at 2021-12-01 01:32:01:

> It's (scarily) interesting that they react with actual personal attendance based purely on a very limited set of electronic information.

Either their intel is better than they let on and they didn't think they would be walking into an ambush, or they are more stupid than we think.

sjs382 wrote at 2021-12-01 01:21:47:

> and a 'threat assessment' coordinator from my workplace.

What was the reason for this? What type of workplace?

giantg2 wrote at 2021-12-01 00:27:39:

It absolutely can.

xanaxagoras wrote at 2021-11-30 20:47:01:

They left off one very popular messenger, SMS:

heavyset_go wrote at 2021-11-30 21:07:21:

There's also:

* Law enforcement simply asks nicely: can render all message content for the last 1-7 years

gnopgnip wrote at 2021-11-30 21:30:36:

The Stored Communications Act makes disclosing the contents of messages without a search warrant unlawful

AnthonyMouse wrote at 2021-11-30 21:47:03:

The people responsible for investigating and prosecuting such crimes have some not so great incentives to avoid doing so and keep the whole thing secret though, don't they?

And then when they get caught, they do this:

https://cdt.org/insights/the-truth-about-telecom-immunity/

2OEH8eoCRo0 wrote at 2021-11-30 22:21:13:

Sounds like an easy way to have your case tossed out in court.

It's funny how much this differs from my own personal experience with law enforcement. The friends I know are timid as hell and don't do anything without a warrant just to stay on the safe side - even if they probably don't need one.

marricks wrote at 2021-11-30 22:31:29:

I'm just glad you're here to stick up for your friends without any corroboration or linking story. It's just a good thing to do.

jdavis703 wrote at 2021-11-30 23:23:54:

Being charitable, let’s assume his friends work as homicide or theft detectives. If so, they need a high standard for admissible evidence to build their case.

If on the other hand his friends are street cops tasked with clearing a corner of drug dealers because some neighbor complained to their council person who complained to the police chief then those cops don’t necessarily care about extrajudicial activities.

Having been harassed by street cops and interacted with homicide detectives, I can tell you they vary tremendously in professionalism.

xanaxagoras wrote at 2021-11-30 23:28:41:

They definitely need a high standard for admissible evidence; that doesn't stop them from purchasing large amounts of data from all-too-willing communications companies and using parallel construction to build their case once they find out what happened via warrantless spying.

2OEH8eoCRo0 wrote at 2021-11-30 23:38:53:

Cybercrime. Lots of scams and child abuse.

intricatedetail wrote at 2021-11-30 23:40:47:

They can also query these messages to see if there is something on the dealers they get paid from and then warn them if something comes up. It works both ways, no?

King-Aaron wrote at 2021-12-01 00:23:11:

Imagine a world where the entire law enforcement complex followed the law. What a world.

kingcharles wrote at 2021-11-30 23:06:19:

Good luck with that. In my case there was a ton of violations of the SCA. Violations of the SCA are only actionable if they are "constitutional" in nature. (That essentially means that if the government indicts you based on information they illegally gathered through violating the SCA, but the information did not belong to you - say it belonged to your wife or business partner - then you can't get the information suppressed/excluded in court.)

In my case the government did violate the SCA and my constitutional rights, but two judges have looked at it and both stated the same answer - the police must be allowed to commit crimes to gather evidence. Next stop: appeal courts.

giantg2 wrote at 2021-12-01 00:36:24:

Yep, the courts side with law enforcement. The whole 'truth comes out in a fair fight' is completely undermined by this. The system protects itself above all else.

I was involved with a case that sounds similar - the judges don't care about your rights and blatantly misapply the law. Also, magistrates are _complete_ BS, and don't even know basic legal stuff. I had one think I called him prejudiced when requesting a case be dismissed with prejudice... Complaints do nothing. There's no real oversight, leading to a completely incompetent system.

heavyset_go wrote at 2021-12-01 01:17:52:

> _There's no real oversight, leading to a completely incompetent system._

It's the system working as intended. If you want something that looks like justice, you'll need substantial wealth to get it.

giantg2 wrote at 2021-12-01 00:29:19:

"Sounds like an easy way to have your case tossed out in court."

This is terribly naive in my experience.

heavyset_go wrote at 2021-12-01 01:15:11:

Let's be honest, how often do people tell their pals how they commit crimes, or are less than scrupulous, at work - assuming their pals aren't criminals as well? People tend to keep things like that secret, even from people who are close to them.

op00to wrote at 2021-11-30 22:25:25:

The really smart cops get the tips using “less than legal” means, then walk back and reconstruct using legal evidence.

a4isms wrote at 2021-11-30 22:41:17:

“Parallel Construction:”

https://en.wikipedia.org/wiki/Parallel_construction

jfrunyon wrote at 2021-11-30 22:08:29:

The reality is that many times the only barrier to sensitive information is a shared login which many people know and a statement that users represent that they have legal authority to access that info.

kevin_thibedeau wrote at 2021-11-30 22:36:46:

EO12333 makes it lawful without a warrant.

gnopgnip wrote at 2021-12-01 00:30:53:

> EO12333

An EO making it lawful for a federal agency to collect doesn't mean it is lawful for a private company to disclose; it doesn't change when a company is permitted to disclose the content of messages under the SCA.

giantg2 wrote at 2021-12-01 00:37:54:

I mean, this whole discussion is moot since nobody will enforce things like this, especially against themselves.

dkdk8283 wrote at 2021-12-01 00:39:21:

Companies also sell data to law enforcement.

heavyset_go wrote at 2021-12-01 01:20:15:

Many tech companies even develop nice portals for law enforcement to use where they can request and view data, with or without a warrant or subpoena.

Consultant32452 wrote at 2021-11-30 21:23:58:

* Law enforcement wants to stalk ex-girlfriend: can render all message content for the last 1-7 years

lsiebert wrote at 2021-12-01 00:13:46:

It's about secure messaging

grumple wrote at 2021-12-01 00:26:14:

Source is a few years old, but I suppose we can make another FOIA request to find out how long carriers store text messages these days - it was basically 0-5 days a decade ago:

https://www.nbcnews.com/technolog/how-long-do-wireless-carri...

authed wrote at 2021-11-30 23:02:12:

You forgot email... and they don't need a warrant for messages older than 180 days if they're in the cloud (they never delete them, either):

https://www.consumerreports.org/consumerist/house-passes-bil...

kingcharles wrote at 2021-11-30 23:20:36:

IIRC the only reason this amendment was made was because the 180 day limit was found unconstitutional anyway by an appellate court. So, technically the amendment did nothing.

It doesn't matter where your data is held, locally or cloud, (if you are an American resident and your data is in the USA) as it is _your_ data and it is unconstitutional for them to read it without a warrant. In theory.

authed wrote at 2021-11-30 23:23:59:

> It doesn't matter where your data is held, locally or cloud

In the US it does

encryptluks2 wrote at 2021-12-01 00:14:44:

If they are local and encrypted... oops, forgot the encryption key.

kingcharles wrote at 2021-11-30 23:59:11:

Citation?

This ruling has been adopted by the US Supreme Court:

https://privacylaw.proskauer.com/2007/06/articles/electronic...

xster wrote at 2021-11-30 22:42:15:

This seems like a good place to say that I strongly recommend Yasha Levine's Surveillance Valley book (

https://www.goodreads.com/book/show/34220713-surveillance-va...

) where he suggests that all of this is working as intended, going all the way back to the military counter-insurgency roots of the ARPANET, first in places like Vietnam and then back home against anti-war and leftist movements. The relevant contemporary theme is that current privacy projects like Tor, Signal, OTF and BBG are fundamentally military funded and survive on government contracts. It distracts from the needed political discourse into a technological one, where "encryption is the great equalizer" and everyone can resist big brother in their own way on the platforms the government has built. Encryption does exist, but it also distracts from other vectors: vulnerabilities (which led to Ulbricht getting caught), what services you would E2E-connect to, how you get the clients that connect to those services, what store can push binaries for said clients, etc.

adamfisk wrote at 2021-11-30 23:04:08:

Yasha Levine is a conspiracy theorist hack. There’s really no other way to say it. His narrative is attractive to a left leaning audience with shallow knowledge in this area, but the reality is that without publicly funded software like Tor, Signal, OTF, and my own Lantern, our world would be more fully saturated with corporate control of the internet. We need more public funding for open source software (with public security audits, mind you), not less. Without them, we’d basically be left with Wikipedia as the only popular entity on the internet outside of corporate control.

All of these projects are more properly grouped with government funding in other spheres, such as the BBC or PBS in media, than they are with the surveillance state or the NSA. Levine overlooks basic details, such as reproducible builds, that quickly collapse the house of cards that is his narrative. He tries to paint them all with the NSA brush, when, in fact, they’re simply projects that have historically received some of their funding from the government while fulfilling missions with extraordinary humanitarian benefits. Levine’s own knowledge and experience in this area is shallow. Look elsewhere.

xster wrote at 2021-11-30 23:23:14:

I don't disagree with what you're saying. I'm not sure your statement is in disagreement with mine either? I don't think he's saying less OSS is better or anything dogmatic? All he's saying is that using Tor/Signal shouldn't be the end all be all of your surveillance concerns.

> would be more fully saturated with corporate control of the internet

You might disagree. His point was that the "corporate controllers of the internet" support projects like Tor because A) it gives a (somewhat ineffective) channel for people to focus on rather than political recourse, and B) there's no real threat to the corporate model. What would you do in this e2e encrypted internet without corporate services?

> such as reproducible builds

Seems like a tangential point. You can have an untampered copy of a client with a vulnerability.

> funding from the government while fulfilling missions with extraordinary humanitarian benefits

I don't think this is in disagreement with anything either

pphysch wrote at 2021-12-01 00:34:45:

This comment is an incredibly naive attempt at a smear.

> Without them, we’d basically be left with Wikipedia as the only popular entity on the internet outside of corporate control.

Wikipedia is absolutely not "outside of corporate control". It is trivially astroturfed to advance special interests.

> All of these projects are more properly grouped with government funding in other spheres, such as the BBC or PBS in media

Both BBC and PBS routinely publish outright disinformation to advance the special interests of their corporate/government clients, including the intelligence community. For example, look at PBS Frontline's ridiculous puff piece for the violent extremist group HTS last year.

> Levine overlooks basic details, such as reproducible builds

Reproducible builds are also easily circumvented by _selectively_ deploying backdoors and other malware, based on IP or other fingerprints.

If there are good reasons to dispute Levine's investigative journalism, they're not here.

CyanBird wrote at 2021-12-01 00:43:30:

> from the government while fulfilling missions with extraordinary humanitarian benefits

Ahh yes, the famed operation Condor, operation Gladio, operation iceberg and so many other famed "humanitarian" projects

At the end of the day, all that you mentioned comes back to a post-facto "it is good because *we* do it". I would go so far as to say that most people here on HN are well aware of the start of Google, when it was funded by US intel as a way to parse Vietnam-era datasets, or of how US intel uses Radio Free Asia to destabilize enemy countries abroad, but again, it is only good/not bad when "*we*" do it.

Apologies for a rather low quality comment, but these types of persons handwaving away the actual structure behind all of this really get on my nerves, especially when I have had family members tortured as a consequence of these US activities.

b8 wrote at 2021-11-30 23:49:11:

Ulbricht was caught via poor OPSEC and not via a Firefox/Tor 0day afaik. Though there was/is speculation that a Firefox/Tor 0day was used to bring down some Tor markets and possibly to locate the Silk Road's server. Silk Road 2.0 was brought down in like a few months, which could indicate such a 0day existed. Or that it was run by some former Silk Road staff members who got doxed when Silk Road 1.0 was shut down.

ichydkrsrnae wrote at 2021-12-01 00:50:51:

Ulbricht was caught because an FBI agent, who would read things slowly and twice, recognized these 4 letters : _heyy_.

That's how Ulbricht sometimes spelled _hey_, and the agent had seen that particular spelling before in his investigation, in an email from Ulbricht's student email address.

Nick Bilton's book “American Kingpin: The Epic Hunt for the Criminal Mastermind Behind the Silk Road” is a great read, highly recommended.

unobatbayar wrote at 2021-12-01 01:13:53:

That's what they want you to think. He was caught because nothing can match the surveillance arsenal of the NSA.

ichydkrsrnae wrote at 2021-12-01 01:24:48:

That's not what I think, that's what Bilton thinks. The quality of his book makes me partial to his opinion, of course, but NSA conspiracy blah adds nothing.

SavantIdiot wrote at 2021-11-30 23:51:08:

Where is any evidence of Tor being a military surveillance project? I find it hard to believe an open source project like this has been infiltrated. Yes, there is suspicion some ECC curves are compromised, but only the ones provided by NIST. I'd really like to see evidence of Tor.

tbihl wrote at 2021-12-01 00:35:42:

The seed for that line of thinking is the fact that a US Navy lab built it.[0] Having said that, I believe that's the only basis and is a far cry from making the theory convincing or even probable.

[0]

https://en.m.wikipedia.org/wiki/Tor_(network)

SavantIdiot wrote at 2021-12-01 00:43:12:

Wow, I feel like an idiot. All this time I had no idea the Navy built it, when a simple Wiki search would have pointed that out. Thanks!

tptacek wrote at 2021-11-30 23:45:56:

Signal isn't funded by the military, by OTF/BBG, or by any branch of the US government. People who claim otherwise are confused (deeply) about a program OTF ran that sponsored third-party security reviews and development projects (summer-of-code style), none of which was mediated through OTF --- it was just a bucket of money.

You should be extremely skeptical about people who bring OTF/BBG up in these discussions. I have complicated feelings about Tor stemming mostly from culture and effectiveness concerns and would push back on claims that it's co-opted by the Navy or corporate interests, but at least I can see a clear (if silly) line connecting Tor to these supposed conflicts of interest.

tialaramex wrote at 2021-12-01 00:22:49:

As I understand it the technology behind Tor is strengthened by an arms race. You _want_ several different well-funded entities running nodes, because that makes the service better for everybody. Even if some of those entities are _hostile_ they still help unless one entity controls a large portion of interior nodes and even then you're only giving metadata to that single entity (whichever it is) by using Tor, not anybody else - which is better than you're going to do with alternative technologies.

suetoniusp wrote at 2021-11-30 23:39:56:

Thank you. I never knew the source of the ridiculous theory that the internet sprang from spying attempts on the Vietnamese. I am always looking for keywords to filter conspiracy weirdos. Yasha Levine: added to the list.

hutzlibu wrote at 2021-11-30 22:54:06:

"are the fact that current privacy movements like Tor, Signal, OTF, BBG are fundamentally military funded and survive on government contracts."

Are those "facts" available for investigation, without having to buy the book?

(That Tor is partly US-administration funded is known, but Signal? And what are OTF and BBG?)

xster wrote at 2021-11-30 23:30:49:

https://www.opentech.fund/results/supported-projects/open-wh...

Funded by Open Technology Fund (OTF)

https://en.wikipedia.org/wiki/Open_Technology_Fund

Which is funded by Radio Free Asia (RFA)

https://en.wikipedia.org/wiki/Radio_Free_Asia

. It had a few reboots but was created as a CIA program in 1951 (

https://en.wikipedia.org/wiki/Radio_Free_Asia_(Committee_for...

) to blast shortwave broadcasts into China from Manila to try to overthrow the Chinese government. It was rebooted more recently after the advent of the Great Firewall of China.

evgen wrote at 2021-12-01 00:32:15:

Wow, that is so thin it is transparent. If this is the sort of 'proof' that we are going to find then I am glad you posted the ref here so that I could add yet another kook to the list of those whose privacy/security rantings and books I can ignore. The biggest danger to long-term privacy projects is not the risk of taking advantage of an opportune partnership with a government agency when incentives align, it is conspiracy nutjobs poisoning the well with their paranoia and delusions.

hutzlibu wrote at 2021-12-01 01:30:40:

And Signal?

The main tool, used for private communication?

georgyo wrote at 2021-11-30 20:11:54:

It says Telegram has no message content. Isn't Telegram not E2EE by default, instead requiring explicit steps to make a conversation encrypted?

Either way, it looks like Signal wins by a lot. The size of its spot is so small, it seems almost squeezed in. But only because they have nothing to share.

nimbius wrote at 2021-11-30 20:42:58:

for signal users this means the messages of course _do_ exist on your phone, which will be the first thing these agencies seek to abscond with once you're detained, as it's infinitely more crackable in their hands.

as a casual reminder: the fifth amendment protects your speech, not your biometrics. do not use face or fingerprint to secure your phone. use a strong passphrase, and if in doubt, power down the phone (android), as this offers the greatest protection against the offline brute-force and side-channel attacks currently used to exploit running processes on the phone.

leokennis wrote at 2021-11-30 21:13:19:

My advice if you’re not on the level where three letter agencies are actively interested in your comings and goings:

- Use a strong pass phrase

- Enable biometrics so you don’t need to type that pass phrase 100 times per day

- Learn the shortcut to have your phone disable biometrics and require the pass phrase, so you can use it when the police are coming for you, you're entering the immigration line at the airport, etc. - on iPhone this is mashing the side button 5 times

ndesaulniers wrote at 2021-11-30 23:22:25:

> Learn the shortcut to have your phone disable biometrics and require the pass phrase

On my Pixel (Android), it's hold the power button for ~2 seconds then select Lockdown.

wskinner wrote at 2021-11-30 21:33:33:

On recent iPhones, the way to disable biometrics is to hold the side button and either volume button until a prompt appears, then tap cancel. Mashing the side button 5 times does not work.

minhazm wrote at 2021-11-30 21:42:46:

Not sure how recent you're talking but I have an iPhone 11 Pro and I just tested pressing the side button 5 times and it takes me to the power off screen and prompts me for my password the same way that side button + volume does.

Apple's docs also say that pressing the side button 5 times still works.

> If you use the Emergency SOS shortcut, you need to enter your passcode to re-enable Touch ID, even if you don't complete a call to emergency services.

https://support.apple.com/en-us/HT208076

cyral wrote at 2021-11-30 22:08:22:

Pressing it five times starts the emergency SOS countdown (and requires the passcode next time) on my iPhone XS. Maybe you have the auto-calling disabled?

samtheprogram wrote at 2021-11-30 23:25:54:

It doesn't on my 2nd Gen iPhone SE (2020). That said, anything that causes the "swipe to power off" screen to appear has the same effect, so essentially holding down the button for 5 seconds does the trick.

diebeforei485 wrote at 2021-11-30 23:56:46:

The side button 5 times thing is disabled by default, but can be enabled from Settings > Emergency SOS.

I just verified this on iOS 15.1 on an iPhone 12.

croutonwagon wrote at 2021-11-30 22:13:13:

Works fine on my 11, my wife's 12, her backup SE gen 2 and my backup SE gen 1.

Just tested all of them

upofadown wrote at 2021-11-30 23:09:59:

In most cases you are going to want to separately passphrase your messaging stuff so it is locked up when you are not using it. That makes everything else a lot easier. For example, there is a Signal fork that supports such operation:

*

https://github.com/mollyim/mollyim-android

babypuncher wrote at 2021-11-30 23:48:19:

So you're saying I should have to type a secure passcode every single time I want to read or send a message on my phone?

No thanks.

kingcharles wrote at 2021-11-30 23:29:42:

If you _are_ at the level where TLAs are interested in you they will not give you a chance to mash that button. You will have a loaded gun pointed at your head out of nowhere and you will freeze. From experience.

quenix wrote at 2021-11-30 23:32:11:

Is that a story you mind sharing?

14 wrote at 2021-11-30 21:26:41:

I just tried this and it does not work on my iPhone. Is it only on a certain iOS? I am a bit behind on updates. Thanks

david_allison wrote at 2021-11-30 21:35:38:

Try: Hold "volume up" and "power" for 2 seconds

You'll feel a vibration, and biometric login will be disabled until you enter your passcode.

14 wrote at 2021-11-30 23:36:16:

That did the trick, thanks. But ultimately I'm behind on updates, so my phone could probably be broken into trivially with the forensic tools available to most law enforcement. I'm going to update soon.

ribosometronome wrote at 2021-11-30 21:34:19:

That's actually the old method for iPhone 7 and before. Now, you can activate emergency SOS by holding the power button and one of the volume buttons. Assuming you don't need to contact any emergency contacts or services, just cancel out of that and your passcode will be required to unlock.

https://support.apple.com/en-us/HT208076

croutonwagon wrote at 2021-11-30 22:11:47:

Also, iOS has a panic button. Hit the main/screen button (on the right) five times really fast and Face ID/Touch ID is disabled and the passcode is required.

vorpalhex wrote at 2021-11-30 21:28:46:

Your statement on the 5th amendment is no longer broadly accurate, but the matter still has some cross-jurisdictional disagreement:

https://americanlegalnews.com/biometrics-covered-by-fifth-am...

torstenvl wrote at 2021-11-30 21:52:21:

District courts don't make law. Magistrates working for those district courts even less so. The case this news article cites has no precedential value anywhere - not even within N.D.Cal. - and should not be relied upon.

IAAL but IANYL

kingcharles wrote at 2021-11-30 23:37:44:

Agreed. That decision is unlikely to be repeated by any appellate court. IMO, all the rulings on biometrics not being testimonial are constitutionally correct, even if that sucks. A lot of constitutional rulings suck.

The real solution is for a federal statute to require warrants.

timbit42 wrote at 2021-11-30 20:59:01:

Signal recently added 'disappearing messages' which lets you specify how long a chat you initiate remains before being deleted.

bigiain wrote at 2021-11-30 22:13:22:

Not "recently". Disappearing messages have been there for at least 5 years.

Almost _all_ my Signal chats are on 1 week or 1 day disappearing settings. It helps to remind everyone to grab useful info out of the chat (for example, stick dinner plan times/dates/locations into a calendar) rather than hoping everybody on the chat remembers to delete messages intended to be ephemeral.

The "$person set disappearing messages to 5 minutes" has become shorthand for "juicy tidbit that's not to be repeated" amongst quite a few of my circles of friends. Even in face-to-face discussion, someone will occasionally say something like "bigiain has set disappearing messages to five minutes" as a joke/gag way of saying what used to be expressed as "Don't tell anyone, but..."

(I just looked it up,

https://signal.org/blog/disappearing-messages/

from Oct 2016.)

timbit42 wrote at 2021-11-30 22:18:10:

Maybe it was only added recently on the desktop client.

upofadown wrote at 2021-11-30 22:41:37:

Keep in mind that any time a message is on flash storage there might be a hidden copy kept for flash technical reasons. It is hard to get to (particularly if the disk is encrypted) but might still be accessible in some cases.

I think encrypted messengers should have a "completely off the record" mode that can easily be switched on and off. Such a mode would guarantee that your messages are never stored anywhere that might become permanent. When you switch it off then everything is wiped from memory. That might be a good time to ensure any keys associated with a forward secrecy scheme are wiped as well.
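
As a rough illustration of what such a mode might look like, here is a minimal Python sketch. It is entirely hypothetical - the class and method names are invented, and no real messenger (Signal included) works exactly this way - but it shows the core idea: messages live only in mutable in-memory buffers, and switching the mode off overwrites both the messages and the session key before dropping them.

```python
import os

class OffTheRecordSession:
    """Hypothetical 'completely off the record' mode: messages are held
    only in RAM and are overwritten when the session is closed."""

    def __init__(self):
        # Ephemeral session key (stands in for forward-secrecy key material);
        # never written to disk.
        self._key = bytearray(os.urandom(32))
        self._messages = []

    def add(self, text: str):
        # Store as a mutable bytearray so it can be zeroed later.
        self._messages.append(bytearray(text, "utf-8"))

    def read_all(self):
        return [bytes(m).decode("utf-8") for m in self._messages]

    def close(self):
        # Best-effort wipe: zero every buffer before releasing references.
        for buf in self._messages:
            for i in range(len(buf)):
                buf[i] = 0
        self._messages.clear()
        for i in range(len(self._key)):
            self._key[i] = 0

session = OffTheRecordSession()
session.add("meet at 6")
assert session.read_all() == ["meet at 6"]
session.close()
assert session.read_all() == []
```

Note that in a garbage-collected language this wipe is best-effort only: the runtime may have copied the data elsewhere, which is why real implementations keep such secrets in carefully managed native memory - and even then, as noted above, flash wear-leveling can leave hidden copies below the filesystem.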

noasaservice wrote at 2021-11-30 21:11:37:

And a screenshot, or another camera, or a rooted phone can easily defeat that.

The analog hole ALWAYS exists. Pretending it doesn't is ridiculous.

wizzwizz4 wrote at 2021-11-30 21:14:51:

> _And a screenshot, or another camera, or a rooted phone can easily defeat that._

Not if the message has already been deleted. Auto-deleting messages are so the recipient doesn't have to delete them manually, not so the recipient can't possibly keep a copy.

summm wrote at 2021-11-30 21:37:58:

Exactly this. Even more: auto-deleting messages are also so that the sender doesn't have to delete them manually. Most people do not understand this. I even had a discussion with an open source chat app implementer who insisted on not implementing disappearing messages because they couldn't really be enforced.

hiq wrote at 2021-11-30 22:01:46:

That's a different threat model, no messaging app is trying to protect the sender from the receiver. Disappearing messages are meant to protect two parties communicating with each other against a 3rd party who would eventually gain access to the device and its data.

bigiain wrote at 2021-11-30 22:15:14:

Wickr has a "screenshot notification to sender" feature (which of course, can be worked around by taking a pic of the screen without Wickr knowing you've done it).

timbit42 wrote at 2021-11-30 21:30:00:

What made you think I was pretending it doesn't?

jfrunyon wrote at 2021-11-30 22:09:44:

The fifth amendment doesn't protect either speech or biometrics. Nor does it protect passwords.

kingcharles wrote at 2021-11-30 23:39:05:

You are wrong. It protects passwords as speech, as they are testimonial, per many court rulings. It does not protect biometrics based on law that basically says the police can force you to give up your fingerprints for their records, so they can sure as fuck force your finger onto a reader.

xuki wrote at 2021-12-01 00:06:59:

Can they force someone to LOOK at the phone? FaceID with attention check will need you to look before it opens.

FalconSensei wrote at 2021-11-30 21:15:31:

> do not use face or fingerprint to secure your phone

but can't they force you to put your password in that case, instead of your finger?

caseysoftware wrote at 2021-11-30 21:20:38:

In general, no.

The contents of your mind are protected because you must take an active part to disclose them. Of course, they can still order you to give them the password and stick you in jail on contempt-of-court charges if you don't.

Check out Habeas Data. It's a fascinating/horrifying book detailing much of this.

ribosometronome wrote at 2021-11-30 21:40:04:

To err on the side of caution, it's best to make all your passcodes themselves an admission to a crime.

emn13 wrote at 2021-11-30 22:01:02:

They don't actually need your passphrase to unlock your phone; they just need somebody with the passphrase to unlock it for them. If there's any doubt about who that is, then having the passphrase counts as testimonial; but if there's not, it might not count as testimonial.

Although there are apparently a whole bunch of legal details that matter here; courts have in some cases held that defendants can be forced to decrypt a device when the mere act of being able to decrypt it is itself a foregone conclusion.

(If you want to google a few of these cases, the all writs act is a decent keyword to include in the search).

The defendant never needs to divulge the passphrase - they simply need to provide a decrypted laptop.

oceanplexian wrote at 2021-11-30 23:18:18:

We really should up our game on encryption: perhaps some kind of time-based key rotation that inherently self-destructs, rendering the data unusable if you don't authenticate every so often. If you are physically unable to unlock a device, you can't be compelled to do so.

Y_Y wrote at 2021-11-30 21:57:27:

My passwords are so obscene it's a crime to write them down.

dylan604 wrote at 2021-11-30 22:50:40:

great, so they'll just be able to hit you with lewd charges on top of everything else they are filing.

shadowgovt wrote at 2021-11-30 21:55:09:

"Your honor, the state agrees to not prosecute on any information inferrable from the text of the password."

"Understood. The defendant's Fifth Amendment right to protection from self-incrimination is secured. As per the prior ruling, the defendant will remain in custody for contempt of court until such time as they divulge the necessary password to comply with the warrant."

kingcharles wrote at 2021-11-30 23:44:17:

I don't know why you're being downvoted. For a start, if it was a third party that had the passcode and refused to divulge it they can be held in jail until they release it, e.g. if your wife knows it. (There are many cases where people have been sentenced to years or decades in prison for not testifying)

If it is you not divulging your own passcode, then legally the judge can't give you contempt, but in reality they could give you contempt until you fought it through the appellate court. Contempt is a special type of thing - certainly here in Illinois you have no right to a jury trial on contempt charges. You're just fucked.

randomluck040 wrote at 2021-11-30 21:22:05:

I think a fingerprint is easier to get if you’re not willing to cooperate. However, I think if they really, I mean really want your password, they will probably find a way to get it out of you. I think it also depends if it’s the local sheriff asking for your password or someone from the FBI while you’re tied up in a bunker somewhere in Nevada.

chiefalchemist wrote at 2021-11-30 22:05:33:

Apple should allow for two passwords: the real one, and another that triggers a "self-destruct" mode.

Knowing that is possible, law enforcement would then hesitate to ask.
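As a sketch of how such a dual-passcode scheme could work (all names and parameters here are hypothetical, not any real Apple feature; a real phone would wipe keys in secure hardware, not a Python dict):

```python
import hashlib
import os
import secrets

def _hash(passcode: str, salt: bytes) -> bytes:
    # A real implementation would use a slow, memory-hard KDF (e.g. scrypt);
    # PBKDF2 keeps this toy self-contained.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

class DuressLock:
    """Toy lock screen with a real passcode and a wipe-on-entry duress passcode."""

    def __init__(self, real: str, duress: str):
        self.salt = os.urandom(16)
        self.real_hash = _hash(real, self.salt)
        self.duress_hash = _hash(duress, self.salt)
        self.data = {"messages": ["hello"]}  # stand-in for user data

    def unlock(self, attempt: str) -> bool:
        h = _hash(attempt, self.salt)
        if secrets.compare_digest(h, self.duress_hash):
            self.data.clear()  # destroy data, but appear to unlock normally
            return True
        return secrets.compare_digest(h, self.real_hash)
```

With `DuressLock("1234", "9999")`, entering "9999" reports success while leaving `lock.data` empty; this only illustrates the control flow the comment proposes.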

detaro wrote at 2021-11-30 22:09:49:

_using_ such a self-destruct mode would be a certain way of getting yourself charged with destroying evidence/contempt of court/... though.

dylan604 wrote at 2021-11-30 22:53:02:

I was under such duress that I was shaking so badly that I made typos in my 30-character password 10 times. The loss of evidence is not my fault; it is the fault of the people putting me under that duress. Don't think it'll hold up, though.

kingcharles wrote at 2021-11-30 23:48:47:

This would be difficult to prove. They would have to know for certain the evidence was on there to begin with. I don't see the prosecutor easily meeting their burden of proof on this charge.

This is how the statute is worded here in Illinois:

"A person obstructs justice when, with intent to prevent the apprehension or obstruct the prosecution or defense of any person, he or she knowingly commits any of the following acts: (1) Destroys, alters, conceals or disguises physical evidence."

Ugh. It's a vague law. I don't even know how they would prosecute that for virtual evidence held on a device that they didn't already have a view inside of.

oceanplexian wrote at 2021-11-30 23:28:31:

FaceID can already prevent a device from unlocking if someone is sleeping. In theory devices could detect if they were being unlocked "under duress" by using biometrics to look at facial expressions, heartbeat, etc, and then wipe themselves. I don't know how practical in reality but perhaps it could be a feature you turn on in a sensitive environment.

cronix wrote at 2021-11-30 21:25:12:

How? They can physically overpower you and place the sensor against your finger, or in front of your eye and pry it open without your consent and gain access with 0 input from you. How do they similarly force you to type something that requires deliberate, repeated concrete actions on your part?

xur17 wrote at 2021-11-30 23:27:01:

https://arstechnica.com/tech-policy/2020/02/man-who-refused-...

kingcharles wrote at 2021-11-30 23:50:02:

In my case they threatened to harm my wife if I didn't stop refusing. After my case is over I'll happily release the video tapes so you can see how this shit works.

adgjlsfhk1 wrote at 2021-11-30 21:19:51:

no. The 5th amendment has been read weirdly by the supreme court.

kingcharles wrote at 2021-11-30 23:28:22:

Don't have any family or friends, either. If you refuse to talk and invoke your rights the government will just threaten to hurt those you love until you break and give up your passwords. From experience.

I liked it in Wrath of Man where one guy is acting tough as fuck until they bring his girl into the room.

Also, if you can, if you are encrypting data, use a hidden volume inside the first - that way you can give the government the outer password and they'll be happy thinking they have everything.

skrowl wrote at 2021-11-30 21:23:47:

Telegram is encrypted OVER THE WIRE and AT REST by default with strong encryption no matter what you do. It's E2EE if you select private chat with someone.

Lots of FUD out there about Telegram not being encrypted that's just not true. There's nothing either side can do to send a message in clear text / unencrypted.

Andrew_nenakhov wrote at 2021-11-30 23:04:57:

"Encrypted OVER THE WIRE and AT REST" means that telegram has easy and unfettered access to chat logs. So they can give it up to authorities. (I don't argue that they DO, just that they very much CAN).

This is proven by an extremely simple experiment: you log in on your new phone, enter password and instantly see all chats.

Another simple experiment suggests the chats are unlikely to even be encrypted at rest: Telegram has extremely fast server-side message search. You log into a web client, and half a second later you can type a search query and uncover chats from years ago.

nicce wrote at 2021-11-30 23:43:17:

It kind of depends on whether images and videos are encrypted separately and only indexed at first.

How much data is there in your chats? One megabyte is around one thick book in plaintext.

AES-CBC, as an example method, decrypts at more than 2 gigabits per second with hardware opcodes (on a 2012 processor); for example, see this data:

https://www.bearssl.org/speed.html

At this scale, it is impossible to tell from search delay alone whether there is encryption.
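A quick back-of-the-envelope calculation illustrates the point (the 50 MB history size is an assumption for illustration; the throughput figure is from the BearSSL data linked above):

```python
# Rough arithmetic behind the claim above: even a large plaintext chat
# history decrypts in well under a second at hardware AES speeds, so
# search latency alone can't reveal whether the data was encrypted at rest.
AES_THROUGHPUT_BITS_PER_S = 2e9     # ~2 Gbit/s, per the BearSSL figures
CHAT_HISTORY_BYTES = 50 * 1024**2   # assume a generous 50 MB of chat text

seconds = CHAT_HISTORY_BYTES * 8 / AES_THROUGHPUT_BITS_PER_S
print(f"decryption adds ~{seconds * 1000:.0f} ms")  # ~210 ms
```

A fifth of a second on a decade-old core is easily hidden inside normal network and indexing latency.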

shawnz wrote at 2021-12-01 00:09:21:

Encryption over the wire and at rest is a basic expectation of any web service today. They would meet that criterion just by using SSL and disk encryption on their servers. E2EE is a much stronger criterion.

octorian wrote at 2021-12-01 00:28:11:

> It's E2EE if you select private chat with someone.

And it's not E2EE if you fail to select private chat.

What this means is that any conversations where you do select E2EE are the ones the "authorities" will take interest in, even if only to the extent of metadata.

That's the fundamental problem with E2EE-by-exception, rather than by default. It calls attention to specific data, even if it's not cleartext, rather than obscuring everything.

542458 wrote at 2021-11-30 21:25:20:

For somebody who isn’t super cryptography-savvy, what’s the difference between over the wire and E2EE? Does the former mean that Telegram itself can read non-private-chat messages if it so chooses?

skinkestek wrote at 2021-11-30 21:42:24:

> For somebody who isn’t super cryptography-savvy, what’s the difference between over the wire and e2ee?

E2EE: As long as it is correctly set up and no significant breakthroughs happen in math, nobody except the sender and the receiver can read the messages.

> Does the former mean that telegram itself can read non-private-chat messages if it so chooses?

Correct. They say they store messages encrypted and store keys and messages in different jurisdictions, effectively preventing themselves from abusing it or being coerced into giving it away, but this cannot be proven.

If your life depends on it, use Signal; otherwise use the one you prefer and can get your friends to use (preferably not WhatsApp, though, as it leaks all your connections to Facebook and uploads your data _unencrypted_ to Google for indexing(!) if you enable backups).

Edited to remove ridiculously wrong statement, thanks kind SquishyPanda23 who pointed it out.

SquishyPanda23 wrote at 2021-11-30 21:49:55:

> nobody except the sender, the receiver and the service provider can read the messages

E2EE means the service provider cannot read the messages.

Only the sender and receiver can.

skinkestek wrote at 2021-11-30 21:54:16:

Thanks! I edited a whole lot and that came out ridiculously wrong! :-)

loeg wrote at 2021-11-30 21:32:23:

Yeah, if you connect to

https://facebook.com

and use messenger, it's encrypted over the wire because you're using HTTPS (TLS). But it's not E2EE.

Gigachad wrote at 2021-11-30 21:29:15:

Pretty much. End-to-end uses the encryption keys of both _users_ to send. Over the wire has both sides use the platform's keys, so the platform decrypts, stores in plain text, and sends it encrypted again to the other side. Over the wire is basically just HTTPS.

Daegalus wrote at 2021-11-30 21:31:03:

Over the wire is when it's encrypted during transmission between the user and Telegram's servers (HTTPS or SSL/TLS, etc.). At rest is when it's encrypted in their DBs or on their hard drives. Theoretically, Telegram can still read the contents if they wish, provided they set up the appropriate code or tools in between these steps.

E2EE means that the users exchange encryption keys and encrypt the data at the client, so that only the other client can decrypt it. Meaning Telegram could never inspect the data even if they wanted to.
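A toy sketch of the difference (the XOR "cipher" here is purely illustrative and NOT secure; the keys and messages are made up):

```python
import hashlib
from itertools import cycle

def toy_encrypt(key: bytes, msg: bytes) -> bytes:
    # XOR with a SHA-256-derived keystream: illustration only, NOT secure.
    stream = cycle(hashlib.sha256(key).digest())
    return bytes(b ^ k for b, k in zip(msg, stream))

toy_decrypt = toy_encrypt  # XOR is its own inverse

# Over-the-wire model: client <-> server key, so the server sees plaintext.
client_server_key = b"tls-session-key"
wire = toy_encrypt(client_server_key, b"meet at noon")
assert toy_decrypt(client_server_key, wire) == b"meet at noon"  # server can read

# E2EE model: only the two users hold the key; the server just relays blobs.
shared_user_key = b"key-known-only-to-alice-and-bob"
blob = toy_encrypt(shared_user_key, b"meet at noon")
assert blob != b"meet at noon"  # the server, lacking the key, sees ciphertext
assert toy_decrypt(shared_user_key, blob) == b"meet at noon"
```

The structural point is where the decryption key lives: on the platform's side in the first model, only on the users' devices in the second.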

Andrew_nenakhov wrote at 2021-11-30 23:08:27:

I very much doubt that Telegram really does encrypt messages "at rest": their server side full text search works extremely fast.

Daegalus wrote at 2021-11-30 23:50:58:

That's a fair assessment, I didn't make the original claim, just answered the definitions of the encryption states.

I haven't dug enough to know what telegram does or claims to do.

andreyf wrote at 2021-11-30 21:33:16:

Yes. Worth remembering also that even with E2EE, an ad-tech-driven company could have endpoints determine marketing segments based on the content of conversations and report those to the company to better target ad spend.

skinkestek wrote at 2021-11-30 21:44:51:

Also, as is the case with WhatsApp, they siphon off your metadata and even have the gall to make an agreement with Google to upload message content _unencrypted_ to Google when one enables backups.

blueprint wrote at 2021-11-30 21:50:34:

(how) does the telegram server prevent unencrypted content?

also curious - how does telegram support encryption for chatrooms without the parties being known in advance? or are those chats not encrypted?

racingmars wrote at 2021-11-30 23:00:13:

This chart is showing what messaging providers are _willing_ to give to law enforcement, _not_ a reflection of the technical capabilities of the messaging provider.

I assume what they're showing for Telegram (basically no data except IP/phone data if Telegram decides it's for a legit counter-terrorism activity) is a matter of Telegram business policy.

Signal gives the limited information they do because I assume they are subject to warrants from U.S. courts. Telegram is run, to my understanding, from jurisdictions where enforcing a U.S. court order would be difficult-to-impossible, and they keep the private keys to decrypt their stored message content split between servers in relatively non-overlapping legal jurisdictions, so even a successful seizure of data in one wouldn't be enough to decrypt message content.

That's all well and good -- and I appreciate Telegram for setting things up that way -- but that means at any time Telegram _could_ make a policy decision to cooperate with law enforcement and provide much more than what is shown on this chart. Signal, on the other hand, could choose to cooperate as much as they want but not have the technical capability to provide more information. (Barring them updating their client to intentionally build in a backdoor, etc., but I'm basing this on what the current implementation is.)

The other important thing about this chart: this is the unclassified version. Is there another classified document out there which says "we have a secret relationship with Telegram/whomever and they give us all the message content we want" but they don't advertise to the law enforcement community at large? They secretly use it to aid in parallel construction so they don't ever have to reveal that a messaging vendor is giving them message content in court? We have no idea.

tl;dr: Telegram looks great on this chart because of _policy_, not _technology_. I love Telegram, but I'm under no illusions that it's appropriate for talking about things I wouldn't want law enforcement to have access to. Luckily, I haven't found myself needing to talk to my friends about illegal activity.

godelski wrote at 2021-11-30 23:33:16:

TLDR: Telegram depends on trusting Telegram. Signal is trustless.

makeworld wrote at 2021-11-30 20:43:06:

> the FBI's ability to _legally_ access secure content

Maybe there are laws preventing legal access to message content? Maybe related to wherever Telegram is incorporated.

inetknght wrote at 2021-11-30 21:24:36:

> _Maybe there are laws preventing legal access to message content?_

Well sure. A lot of laws require a court order. In the U.S. that's usually not too difficult.

officeplant wrote at 2021-11-30 21:16:20:

It helps Telegram is HQ'd in the UK and the operational center is in Dubai.

rootsudo wrote at 2021-11-30 21:24:01:

Does it? UK and Dubai are USA partners in Intelligence gathering and work together several times.

Biggest example as of late:

https://www.bbc.com/news/world-middle-east-58558690

to11mtm wrote at 2021-11-30 20:17:47:

I don't know whether Telegram is E2EE by default (probably not). When you do a call on Telegram you are given a series of emoji that are supposed to match what the person on the other side has, and that's supposed to indicate E2EE for that call.

RL_Quine wrote at 2021-11-30 20:21:57:

Verification in band seems pretty meaningless, approaching security theatre.

Andrew_nenakhov wrote at 2021-11-30 23:14:27:

Real privacy is too burdensome for most users, so they feel just fine if the service owner promises in a stern voice that their chats are _really secure_.

It is not necessary to provide real security, do fingerprint verification, etc if the users are already happy with the level of security they are promised.

nitrogen wrote at 2021-11-30 20:34:11:

For voice? It's hard to fake the voice of someone you know.

Muromec wrote at 2021-11-30 21:00:14:

you don't have to fake the voice, just mitm and record cleartext

ajsnigrutin wrote at 2021-11-30 21:12:40:

But they have to fake the voice, if I call the other person and say "my emoji sequence is this, this and that" for the other person to verify and vice-versa.

wizzwizz4 wrote at 2021-11-30 21:16:27:

Person A calls you. I intercept the call, so person A is calling _me_, and then I call you (spoofing so I look like Person A). When you pick up, I pick up, then I transmit what you're saying to Person A (and vice versa).

How do you know I'm intercepting the transmission? Does the emoji sequence verify the _call_, perhaps?

tshaddox wrote at 2021-11-30 21:42:27:

The emoji sequence is a hash of the secret key values generated as part of a modified/extended version of the Diffie-Hellman key exchange. The emoji sequence is generated and displayed independently on both devices _before_ the final necessary key exchange message is transmitted over the wire, so a man-in-the-middle has no way of modifying messages in flight to ensure that both parties end up generating the same emoji sequence.

I'm not a cryptographer, but that's what I glean from their explanation:

https://core.telegram.org/api/end-to-end/video-calls#key-ver...
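A toy illustration of the idea described above (tiny textbook Diffie-Hellman parameters and a made-up emoji alphabet, nothing like Telegram's actual protocol):

```python
import hashlib
import secrets

# Toy finite-field Diffie-Hellman (tiny parameters, NOT secure; real DH
# uses 2048+ bit groups).
P = 0xFFFFFFFB  # largest prime below 2**32
G = 5

EMOJI = ["🐱", "🚀", "🍎", "🔑", "🌊", "🎲", "🦉", "🔥"]

def fingerprint(shared_secret: int, n: int = 4) -> str:
    """Derive a short emoji sequence from the shared key, like the
    verification strings both phones display during a call."""
    digest = hashlib.sha256(shared_secret.to_bytes(16, "big")).digest()
    return "".join(EMOJI[b % len(EMOJI)] for b in digest[:n])

# Alice and Bob each pick a private value and exchange only public values.
a = secrets.randbelow(P - 2) + 1
b = secrets.randbelow(P - 2) + 1
A, B = pow(G, a, P), pow(G, b, P)

alice_secret = pow(B, a, P)  # computed independently on each device
bob_secret = pow(A, b, P)
assert alice_secret == bob_secret
assert fingerprint(alice_secret) == fingerprint(bob_secret)
```

A man-in-the-middle running two separate exchanges would end up with two different shared secrets, so the two phones would display different emoji, which is exactly what reading the sequence aloud is meant to catch.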

summm wrote at 2021-11-30 21:39:57:

Both connections would show different emojis on both sides then. So you would need to somehow deep fake the voice of the one telling their emojis to the other one.

tptacek wrote at 2021-11-30 20:52:19:

It is not, by default, and none of the group chats are.

sgjohnson wrote at 2021-11-30 23:29:14:

Telegram isn’t E2EE by default.

My bet is on the fact that they are based in Russia, so they don’t give a shit about a US warrant or subpoena.

vadfa wrote at 2021-11-30 20:22:37:

That is correct. By default all messages sent over Telegram are stored permanently in their servers unencrypted.

Borgz wrote at 2021-11-30 20:44:49:

Not exactly. Non-secret chats are stored encrypted on Telegram's servers, and separately from keys. The goal seems to be to require multiple jurisdictions to issue a court order before data can be decrypted.

https://telegram.org/privacy#3-3-1-cloud-chats

https://telegram.org/faq#q-do-you-process-data-requests

anon11302100 wrote at 2021-11-30 21:02:25:

"Not exactly" means "completely incorrect" now?

Telegram doesn't store your messages forever; they are encrypted, and seizing the servers won't allow you to decrypt them unless you also seize the correct servers in another country.

skrowl wrote at 2021-11-30 21:24:58:

Source for Telegram storing the information unencrypted at rest?

pgalvin wrote at 2021-11-30 22:36:12:

It is widely known and confirmed by Telegram themselves that your messages are encrypted at rest by keys they possess.

This is a similar process to what Dropbox, iCloud, Google Drive, and Facebook Messenger do. Your files with cloud services aren’t stored unencrypted on a hard drive - they’re encrypted, with the keys kept somewhere else by the cloud provider. This way somebody can’t walk out with a rack and access user data.

Andrew_nenakhov wrote at 2021-11-30 23:15:58:

How do they provide near-instant full text search on server side if the chats are "encrypted at rest"?

t0mas88 wrote at 2021-11-30 20:09:58:

So if you have something to hide, don't use iCloud backup.

And WhatsApp will give them the target's full contact book (was to be expected), but _also_ everyone that has the target in their contact list. That last one is quite far-reaching.

kf6nux wrote at 2021-11-30 21:08:47:

> if you have something to hide

Most people don't realize that most people have something to hide. The USA has so many laws on its books. Many of which are outright bizarre[0] and some of which normal people might normally break[1].

And that's only counting _current/past_ laws. It wasn't that long ago a US President was suggesting all Muslims should be forced to carry special IDs[2]. If you have a documented history being a Muslim, it could be harder to fight a non-compliance charge.

[0]

https://www.quora.com/Why-is-there-a-law-where-you-can-t-put...

[1]

https://unusualkentucky.blogspot.com/2008/05/weird-kentucky-...

[2]

https://www.snopes.com/fact-check/donald-trump-muslims-id/

kingcharles wrote at 2021-11-30 23:53:23:

I always liked this one I found in the Illinois statutes - it basically criminalizes every person online:

Barratry. If a person wickedly and willfully excites and stirs up actions or quarrels between the people of this State with a view to promote strife and contention, he or she is guilty of the petty offense of common barratry[.]

https://www.ilga.gov/legislation/ilcs/ilcs4.asp?DocName=0720...

hunterb123 wrote at 2021-11-30 22:30:21:

> “Certain things will be done that we never thought would happen in this country in terms of information and learning about the enemy,” he added. “We’re going to have to do things that were frankly unthinkable a year ago.”

> “We’re going to have to look at a lot of things very closely,” Trump continued. “We’re going to have to look at the mosques. We’re going to have to look very, very carefully.”

That's all he said to the interviewer. The interviewer was asking the hypothetical and suggested the special identification! He wouldn't take the bait, so since he didn't answer the hypothetical they said "he wouldn't deny it" and wrote the campaign of hit-piece articles anyway. Whatever response they got, they would have written that same piece. If he had answered one way, they would have quoted him out of context. Since he responded generically, it's obviously drummed up. The fact check is hilarious. "Mixed", lol.

Never answer a hypothetical, it's always a trap.

daqhris wrote at 2021-12-01 00:43:45:

Your last sentence just made me freak out thinking that I've previously done such stupidity in front of a "law officer".

I never for one second thought it could be a trap; I was overly willing to cooperate and truthfully respond to a "theoretical" inquiry. Damn, it hurts in retrospect.

rootusrootus wrote at 2021-12-01 00:48:51:

> That's all he said to the interviewer

And then the next day, he clarified:

Reporter: "Should there be a database or system that tracks Muslims in this country?"

Trump: "There should be a lot of systems, beyond databases. I mean, we should have a lot of systems."

And then he tried to backpedal. Decided it was a watch list, not a database, etc. Basically the usual shtick of his where he tries to say everything and nothing at the same time.

president wrote at 2021-11-30 21:47:12:

Did you even read the Snopes article you referenced before making what seems like a definitive claim that Trump was suggesting Muslims carry special IDs? Snopes's own rating is a "Mixture" of truth and falsehood, and if you read the assessment, it is grasping at straws to even reach that conclusion.

kf6nux wrote at 2021-12-01 00:28:24:

Yes, "mixed" means you have to read the nuance. I think I accurately captured the reality. If you have a correction to offer, please do.

EDIT: Ultimately, the nuance in that history is not relevant to the point that criminal law changes to include new categories in unexpected ways.

georgyo wrote at 2021-11-30 20:14:29:

You and the person you are communicating with must both not use iCloud backup. And since Apple pushes the backup feature pretty heavily, you can be reasonably sure that the person you are communicating with is using backups. I.e., you cannot use iMessage.

xanaxagoras wrote at 2021-11-30 20:42:32:

I got off all Apple products when they showed me their privacy stance is little more than marketing during the CSAM fiasco, but IIRC the trouble with iCloud backup is it stores the private key used to encrypt your iMessages backup. Not ideal to be sure, but wouldn't iMessage users be well protected against dragnet surveillance, or do we know that they're decrypting these messages en masse and sharing them with state authorities?

vmception wrote at 2021-11-30 21:22:23:

iCloud backup can backup your whole phone, specifically the files section. iOS and OSX users can save anything to that.

fumar wrote at 2021-11-30 20:27:44:

Has Apple made any public statements regarding iCloud's lack of privacy features? It takes the wind out of their privacy marketing, which is effectively hurting ad tech but not truly protecting consumers from state-level actors with data access.

amatecha wrote at 2021-11-30 20:37:33:

Kind of. These details are indeed publicly written on their website[0]. Do many users ever read this page? Probably not.

[0]

https://support.apple.com/en-us/HT202303

fumar wrote at 2021-11-30 21:23:17:

Here is an excerpt. The language sounds like encryption is enabled, and the chart lists iCloud features as protected on the server and in transit. Seems like smoke and mirrors, then.

> On each of your devices, the data that you store in iCloud and that's associated with your Apple ID is protected with a key derived from information unique to that device, combined with your device passcode which only you know. No one else, not even Apple, can access end-to-end encrypted information.

nicce wrote at 2021-11-30 22:42:43:

E2EE backups were in the iOS 15 beta, but they were removed (they did not land in the release) after Apple changed the timetable for the CSAM scanning feature. So we will see whether we get E2EE backups once that image scanning lands.

sschueller wrote at 2021-11-30 20:13:25:

Can you turn that off if you have icloud or do you need to not use icloud all together?

ceejayoz wrote at 2021-11-30 20:35:51:

You can turn it off individually just for Messages, but you're still left not knowing the state of the setting on the other end.

KennyBlanken wrote at 2021-11-30 22:08:21:

Yes, and you can delete old backups on iCloud - and then switch to local, automatic, fully encrypted backups to a Mac or PC running iTunes.

HN tends to get very frothy-at-the-mouth over Apple and privacy, but the reality is that iPhones can easily be set up to offer security and privacy that is best in class; they play well with self-hosted sync services like Nextcloud... and unlike the Android-based "privacy" distros, you're not running an OS made by a bunch of random nameless people, you can use banking apps, etc.

The only feature I miss is being able to control background data usage like Android does.

lupire wrote at 2021-11-30 20:11:09:

If you have something to hide, don't give a copy to _any_ third party.

Even a second party is a risk.

skinkestek wrote at 2021-11-30 21:53:23:

Check the difference between Telegram and WhatsApp.

Add to this the fact that WhatsApp

- uploads messages unencrypted to Google if you or someone you chat with enables backups

- and send all your metadata to Facebook.

Then remember how many people here have tried to tell us that Telegram is unusable and WhatsApp is the bee's knees.

Then think twice before taking security advice from such people again.

PS: as usual, if your life depends on it I recommend using Signal, and also being generally careful. For post card messaging use whatever makes you happy (except WhatsApp ;-)

zer0zzz wrote at 2021-11-30 22:28:08:

This isn’t the case anymore with WhatsApp. Backups to iCloud and Google Drive are optionally fully encrypted. You have the choice of storing the decryption artifacts on Facebook's servers (where they are held in a Secure Enclave) or backing up the 64-character decryption code yourself.

KennyBlanken wrote at 2021-11-30 22:21:11:

Telegram defaults to no encryption, does not do encrypted group chats, and has a home-rolled encryption protocol, which almost guarantees it's weak, as nearly every home-rolled encryption system is (if not also backdoored). Coupled with its being headquartered in Russia, that makes it completely untrustworthy.

The only reason Telegram comes out on top of Whatsapp in the document in question is because Telegram is a foreign company with little interest in cooperating with a US domestic police agency; the FBI has no leverage over Russian companies.

What that list doesn't show is what Telegram does when the FSB knocks. By all means, give your potentially embarassing message content to a hostile nation's intelligence service.

nicce wrote at 2021-11-30 22:51:47:

That is a lot of speculation. If you read the encryption protocol, actual methods being used for encryption are well known. Client is open source and supports reproducible builds. If there is a backdoor, it is in front of our eyes.

> What that list doesn't show is what Telegram does when the FSB knocks. By all means, give your potentially embarassing message content to a hostile nation's intelligence service.

Telegram has had a lot of trouble operating in Russia. It was blocked for two years. [1]

If they are so cooperative, why pass up the opportunity to watch their own people? Or did they become cooperative after the unblock? It seems that they help on some level [2], but does this threaten other countries? Hard to say.

[1]

https://en.wikipedia.org/wiki/Blocking_Telegram_in_Russia

[2]

https://www.independent.co.uk/news/world/europe/telegram-rus...

Andrew_nenakhov wrote at 2021-11-30 23:19:10:

Telegram's block in Russia was likely a very successful PR action coordinated with authorities.

It was never removed from national appstores, and Google/Apple usually comply with such requests, and the fact that it was unbanned is unprecedented.

nicce wrote at 2021-12-01 00:08:17:

Apple did stop updates for Telegram. Google and Apple have a weak history of complying with Russian requests. Maybe they comply with other countries more, but not Russia.

https://www.pcmag.com/news/after-almost-2-months-apple-stops...

baby wrote at 2021-11-30 20:57:54:

I'm wondering how this was obtained, and how old this is?

For WhatsApp:

if target is using an iPhone and iCloud backups enabled, iCloud returns may contain WhatsApp data, to include message content

Probably not true since WhatsApp launched encrypted backups.

gnabgib wrote at 2021-11-30 23:59:38:

I mean the document says the data is accurate "as of November/2020", and the slide was prepared 7-Jan-2021

vorpalhex wrote at 2021-11-30 22:52:05:

Reading the document answers this for you: It is a declassified government document originally produced by the FBI and was prepared on Jan 2nd, 2021.

no_time wrote at 2021-11-30 20:21:37:

LINE, Telegram, Threema, and WeChat are not even American companies. Can't they just tell the FBI to suck a fat one when they ask for user data?

colechristensen wrote at 2021-11-30 20:28:31:

Not if they want to operate in the United States or have access to our banking system.

You don’t get to pick your jurisdiction and then operate globally. You’re obligated to follow the laws where you want to operate.

xxpor wrote at 2021-11-30 20:41:43:

Fwiw if you want to do any sort of FX whatsoever or accept credit cards, you need access to the American banking system.

no_time wrote at 2021-11-30 20:37:10:

I wonder how this affects nonprofits like Matrix/Element and Signal. What can they do with them? Gangstalk their developers? Coerce big tech to ban them from their appstores?

ev1 wrote at 2021-11-30 20:39:42:

Doesn't Signal's dev already get bothered every single time they travel?

viro wrote at 2021-11-30 21:11:22:

Refusing a valid court issued search warrant/order is a criminal offense. I think 180 days for each refusal of a legal order.

no_time wrote at 2021-11-30 21:22:37:

The issue is a bit more complex. I was thinking more on the lines of "will I get bothered for making crypto available for the masses that nobody can crack?"

millzlane wrote at 2021-11-30 22:42:14:

IIRC, didn't they jail a guy for that?

colechristensen wrote at 2021-11-30 21:21:10:

Well signal does not have the data, they comply with such orders with the tiny amount of metadata they have (like a timestamp of when your account was created and that’s about it)

smoldesu wrote at 2021-11-30 21:01:28:

The design of these decentralized/federated platforms is specifically so that their owners _can't_ easily be coerced into disclosing incriminating information. In some sense, it's similar to how BitTorrent implicates its users.

er4hn wrote at 2021-11-30 21:12:55:

Telegram, as I understand it, can access your messages when stored in their Cloud[1]. They just make a choice to not provide the content of those to anyone.

[1] -

https://telegram.org/privacy#4-1-storing-data

Decabytes wrote at 2021-11-30 20:28:32:

Depends on whether the countries these companies exist in have agreements with the U.S for surveillance and stuff

b8 wrote at 2021-11-30 23:55:13:

Can't the FBI get chatlogs from WeChat?

https://www.youtube.com/watch?v=N5V7G9IBomQ

In the short documentary that the FBI made about catching Kevin Mallory they mentioned catching him sending classified stuff via WeChat.

fractal618 wrote at 2021-11-30 20:38:28:

Now I just have to get my friends and family to use Signal.

yabones wrote at 2021-11-30 21:58:31:

I've had surprisingly good luck with strong-arming people into switching. The important part is having their trust; if they don't believe you, they won't listen. The next part is to make simple, verifiable, non-technical arguments for switching. Believe it or not, almost everybody is willing to take small steps if they're free.

Instead of rambling on and on about "end to end encryption" or "double-ratchet cryptographic algorithms" or other junk only nerds care about, approach it like this:

* There are no ads, and none of the messages you send can be used for advertising

* It's not owned by Facebook, Google, Microsoft, or any of the other mega-corporations, and you don't need an account on one of their sites to use it

* It will still work great if you travel, change providers, etc

* It's much safer to use on public Wi-Fi than other services or SMS

Honestly, don't even touch on law enforcement access as in the OP. That can strike a nerve for some people. The best appeals are the simple ones.

godelski wrote at 2021-11-30 23:42:45:

Also, a big one that works for me (especially with iPhone users, who are the hardest to convert): "You can send full-quality images and videos to Android users." The fact that Apple shoots itself in the foot here is an advantage for Signal.

ViViDboarder wrote at 2021-12-01 01:13:47:

It’s also an issue the other way around. MMS is the limitation.

godelski wrote at 2021-11-30 23:41:39:

The best advice I have to give to get people to switch is showing that you have cross platform capabilities. Essentially everyone can have the features of iMessage/WA: full resolution images and videos, responding to messages with emojis (WA doesn't have), stickers (unfortunately you have to grab from signalstickers.com instead of in-app), voice and video calling, etc. If Apple didn't have such a closed ecosystem then I think it would be harder to get people to switch. In this respect, Signal is more feature rich than anything else (except Telegram, but Telegram doesn't have the same security and isn't trustless).

I think the common mistake is trying to convince people with the security. Use that as a bonus, not the main feature. You're talking geek to people that don't speak geek (convince geeks with these arguments, not mom and dad). I also suggest strong arming people and using momentum (if 4 people in a group of 5 have Signal, switch the group to Signal. Or respond to WA messages on Signal).

djanogo wrote at 2021-11-30 21:31:13:

I switched to signal and got few people to switch too, then they started their shit coin(MOB). IMO Signal Messenger is just a way for that company to reach their shit coin goals. Uninstalled and never recommending that again.

MatekCopatek wrote at 2021-11-30 22:01:56:

I remember many people being pissed off when these features were announced some months ago.

As far as I can tell, nothing really happened afterwards. I use Signal on a daily basis and haven't noticed any coin-related functionalities. Either they were canceled, haven't been released yet or they're just buried somewhere deep and not advertised.

Do you have a different experience?

godelski wrote at 2021-11-30 23:44:36:

MOB is in beta and I think getting moved (if not already) to main soon. But it is non-intrusive and you won't notice it unless you look for it. People are just complaining about a feature that you have to look for. I'm not a fan of MOB and how the situation was handled, but I also think the reactions people are having are a bit over the top.

fossuser wrote at 2021-11-30 22:39:45:

It's in beta, you can enable it in settings.

It's still a pain to buy MOB in the US so it's not that usable in the states. It would have been interesting to me if they just used Zcash instead of rolling their own, but I'm not sure what's supposed to be special about MOB vs. Zcash.

I also don't think it's that big of a deal.

godelski wrote at 2021-11-30 23:47:01:

I'd love Zcash (forced private transactions). But honestly I'd also like if we could use different currencies. My dream was that you could send cash and they would just use MOB as the intermediate transaction (so your bank would just see a transaction to/from Signal and not who you were sending/receiving to/from). But that also has technical challenges and legal issues so I understand why not. I think a multi-currency wallet is the next best option imo.

fossuser wrote at 2021-12-01 01:08:10:

Yeah, my long term hope for this stuff is that Urbit succeeds and then a lot of the UX here gets fixed by that and all of these apps become redundant and unnecessary. I'm definitely in the minority there but I think there's a future path where that's possible and works well.

arthurcolle wrote at 2021-11-30 22:15:00:

Are you sure you're not thinking of Telegram? They had a thing called Telegram Open Platform or something (TOP rings a bell for some reason)

_-david-_ wrote at 2021-11-30 22:34:08:

Signal has some coin

https://support.signal.org/hc/en-us/articles/360057625692-In...

millzlane wrote at 2021-11-30 22:41:19:

I think it's

https://mobilecoin.com/

upofadown wrote at 2021-11-30 22:45:58:

Even if you do it is pretty much impossible to get them to check their safety numbers and keep them checked.

catlikesshrimp wrote at 2021-11-30 21:02:10:

I have been trying to get people to install signal for 2 years. No one has budged.

The day facebook went down for some hours I got phone calls.

basilgohar wrote at 2021-11-30 21:15:12:

I have had some success. It helps that many of the people I regularly contact were willing to migrate, even after some time. Most already used WhatsApp, so the friction to installing a new app was less than someone not accustomed to using a dedicated app for messaging.

But most of my American friends that don't have international contacts still just use SMS because they are not really accustomed to an app such as WhatsApp and so on.

anonporridge wrote at 2021-11-30 21:09:26:

It's incredibly disheartening how difficult it is to get most people to care about digital privacy.

goatcode wrote at 2021-11-30 22:44:08:

It's kind of funny that WeChat seems pretty locked-down to the FBI, especially for Chinese citizens. Makes sense, really, but still funny.

sgjohnson wrote at 2021-11-30 23:39:27:

Like Telegram, they simply don’t care about US warrants.

bredren wrote at 2021-11-30 21:46:48:

The way these became bullet points on the slide is ~

An active investigation leads an agent to a suspect known to have used one of these applications

An administrative subpoena is issued to the company asking for what information is available

The company is then ordered by a federal judge to provide information related to a particular account or accounts

The company complies.

This is why it is important to understand how your messaging service handles data and how you can compromise your own safekeeping of all or part of that data.

hexis wrote at 2021-12-01 00:10:52:

I wonder where Matrix/Element would fit into this chart.

stunt wrote at 2021-11-30 20:23:52:

Well, who cares when all they need is to use something like Pegasus to obtain full access to your phone simply by sending you a WhatsApp message (without having you even open the message).

Knowing how well guarded iOS is against app developers, I wonder what kind of zero-day would suddenly turn a message received in WhatsApp into full system access. I think NSO found a WhatsApp backdoor, not a zero-day bug.

JBiserkov wrote at 2021-11-30 21:37:45:

NSO can't send you a WhatsApp message if you don't have WhatsApp on your iPhone.

catlikesshrimp wrote at 2021-11-30 21:07:52:

Whatsapp is owned by facebook, not by apple. I don't think Apple wants to share a backdoor with facebook.

I don't know any detail of the whatsapp vulnerability that NSO exploited.

mmazing wrote at 2021-12-01 00:20:41:

Or compromise the device in some other way.

At the risk of being cliche here's a relevant xkcd -

https://xkcd.com/538/

the_optimist wrote at 2021-11-30 21:00:11:

Isn’t this simply imaginary, where in practice all the FBI has to do to up the ante is to request military-grade interception from a willing foreign counterpart?

anonporridge wrote at 2021-11-30 21:13:53:

The point of promoting and using privacy respecting software is not necessarily to make it _impossible_ for law enforcement to get what they want. It's to make it somewhat expensive and require targeted probes.

You simply want it to be cost-prohibitive to engage in mass surveillance on everyone, because that is an immensely powerful tool of totalitarian oppression that gets really bad if we happen to elect the wrong person once.

wizzwizz4 wrote at 2021-11-30 21:27:28:

I don't care if they spy on _me_; they probably have a good reason to! But I do care if they spy on _everyone_, so I make it hard to spy on me.

the_optimist wrote at 2021-11-30 21:47:04:

I agree with you on the level of my person, and naturally flag that this economic argument is extremely poor policy. It’s quite unclear that the marginal cost is non-zero, or even flat by person. One might reasonably conclude we are already each inside a high-resolution springing trap, waiting for the moment we find ourselves athwart the powers that be. Imagine in physical space where the local police could simply call in foreign air strikes upon domestic citizens, with only economics to prevent otherwise. We must have transparent and firm laws, reformed at a fundamental level.

upofadown wrote at 2021-11-30 22:48:06:

Can't the FBI do a Pegasus style remote access thing on an appropriate warrant themselves?

the_optimist wrote at 2021-11-30 23:04:05:

Seems like it. And can they also do it without an appropriate warrant [by asking someone else]?

tata71 wrote at 2021-11-30 21:07:09:

Tier of which requires....expense.

rodmena wrote at 2021-11-30 23:17:49:

WhatsApp -> FBI

Telegram -> KGB

Signal -> The rest of us?

anderber wrote at 2021-12-01 00:57:55:

Didn't Russia(KGB) try to block Telegram in the past and were unsuccessful? I feel like they are fairly safe and trustworthy. Of course, I like Signal best, but Telegram has so many nice features.

temptemptemp111 wrote at 2021-11-30 23:29:57:

Signal -> Mossad

timbit42 wrote at 2021-11-30 20:37:32:

I'd like to see Tox and Jami.

wizzwizz4 wrote at 2021-11-30 21:19:48:

I read somewhere that Tox's security was compromised.

timbit42 wrote at 2021-11-30 21:33:17:

I'd like to see that. Was it not fixed?

wizzwizz4 wrote at 2021-11-30 22:49:30:

I found

https://media.ccc.de/v/rc3-709912-adopting_the_noise_key_exc...

which might be it.

jareklupinski wrote at 2021-11-30 22:45:19:

is this only page two of an alphabetical list?

or are there no messaging systems with a name before 'i'

wwww3ww wrote at 2021-11-30 22:13:15:

I do DIY encryption with enigma reloaded and it works

wwww3ww wrote at 2021-11-30 22:14:08:

I use enigma reloaded to manually encrypt my messages

beervirus wrote at 2021-11-30 20:20:12:

What about regular text messages?

tag2103 wrote at 2021-11-30 20:29:46:

If I remember correctly, standard SMS has no security on it at all and is in the clear during transit. I may be wrong, and I'm never scared of being corrected.

markab21 wrote at 2021-11-30 20:28:25:

Assume anything sent over a cellular network carrier via normal SMS can not only be retrieved, but intercepted.

brink wrote at 2021-11-30 20:43:22:

Thank goodness a lot of companies regularly use it for 2FA.
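(As an aside on safer second factors: app-based TOTP codes, per RFC 6238, never transit the carrier network at all. A minimal stdlib-only sketch, using the RFC's published test secret rather than any real account's:)

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if t is None else t) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T=59s
assert totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59, digits=8) == "94287082"
```

Because the shared secret stays on the two endpoints, there is nothing for a carrier (or anyone intercepting SMS) to capture in transit.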

stunt wrote at 2021-11-30 20:28:54:

Telecommunication is highly regulated. They have to keep records for a long time and make them available to law enforcement.

bonestamp2 wrote at 2021-11-30 20:23:56:

They've been inside the phone companies for a long time so I assume they have full access to SMS.

yownie wrote at 2021-11-30 20:05:24:

link seems to be broken

https://propertyofthepeople.org/document-detail/?doc-id=2111...

mmh0000 wrote at 2021-11-30 20:31:23:

Not only is their main link broken, but their silly PDF reader is broken for me.

Here's a direct link to the PDF:

https://assets.documentcloud.org/documents/21114562/jan-2021...

finite_jest wrote at 2021-11-30 20:09:03:

Dupe of

https://news.ycombinator.com/item?id=29394945


fractal618 wrote at 2021-11-30 20:39:32:

Now I just have to get all my friends and family to use Signal.

tacLog wrote at 2021-11-30 20:57:49:

My family has really taken to it. Granted it's mostly just message family app to them, but they are very not technically fluent but yet seemed to have picked it up just fine.

I really think this is not discussed when hacker news brings up secure messaging. The user experience is so much more important than the underlying tech. My family doesn't care about end to end encryption. They care about video calling with the press of a button, and easy features that are just there and work like zoom or the many other software products that they have to use work.

Thank you Signal team for focusing so hard on the user experience.

headphoneswater wrote at 2021-11-30 21:00:25:

If this is what it takes to keep us safe, I and most Americans are OK with it. We live in dangerous times.

The US has a balanced criminal justice system -- as long as due process is preserved, privacy from the state should not be a major issue.

benlivengood wrote at 2021-11-30 22:36:08:

Current U.S. "due process" includes national security letters and other secret legal requests and secret courts to approve those requests. So there are still some checks and balances but it's less clear that they are working well enough or as intended.

Just look at the transparency reports of major Internet companies; they can report numbers of (certain types of) requests and that's about it. Mass surveillance under seal is not a great trend.

When political parties start advocating for jailing political opponents and treating the Supreme Court as a political office for nominations, I find it harder to trust the current due process.

numlock86 wrote at 2021-11-30 21:10:25:

How does them reading my messages keep me safe, though?

headphoneswater wrote at 2021-11-30 22:24:23:

In that example it's keeping me safe from you

wizzwizz4 wrote at 2021-11-30 21:29:52:

> _if it's not we have bigger problems anyway_

Really, we shouldn't do anything; we have the bigger problem of the eventual heat death of the universe.

Take into account not only the size of the problem, but how easy it is to do something about it.

efitz wrote at 2021-11-30 20:26:39:

This discussion is not very interesting from a security perspective. I tuned out at “cloud”.

If it’s not in your physical possession, it’s not your computer. If it’s not your computer, then whoever administers the computer, or whoever [points a gun at/gives enough money to] the administrator of that system can access whatever you put on that system.

If a “cloud” or “service” is involved, then you can trivially use them to move or store data that you encrypted locally on your computer with a key that was generated and stored locally and never left your system. But subject to the limits above, the administrators of the other computers will still be able to see metadata like where the data came from and is going to. And they might be able to see your data too if you ever (even once; ask Ross Ulbricht) failed to follow the basic encryption guidelines above.

You can make metadata access harder via VPNs and Tor, but you CANNOT make it impossible- in the worst case, maybe your adversary is controlling all the Tor nodes and has compromised the software.

Which leads me to my last point, if you did not write (or at least read) the code that you’re using to do all of the above, then you’re at the mercy of whoever wrote it.

And, if you try to follow perfect operational security, you will have a stressful and unpleasant life, as it’s really really hard.
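(The "encrypt locally, then let the cloud carry only ciphertext" idea above can be sketched in a few lines. This toy uses a one-time pad purely for illustration — the key must be truly random, as long as the message, and never reused; a real tool would use an authenticated cipher such as AES-GCM from a vetted library:)

```python
import secrets

def encrypt_locally(plaintext: bytes):
    # The key is generated and kept locally; it never leaves this machine.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt_locally(key: bytes, ciphertext: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, blob = encrypt_locally(b"meet at noon")
# Only `blob` is uploaded. The provider still sees metadata (size,
# timestamps, endpoints) but not the content.
assert decrypt_locally(key, blob) == b"meet at noon"
```

Note that, exactly as the parent says, this protects content only: the service still observes who uploaded what, when, and how big it was.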

michaelmior wrote at 2021-11-30 20:31:00:

> if you did not write (or at least read) the code that you’re using to do all of the above, then you’re at the mercy of whoever wrote it.

It's worse than that. Even if you read the code, you have to trust that the code you read is the code a service is actually using. Even if you deploy the code yourself, you have to trust that the infrastructure you're running on does not have some type of backdoor. Even if you run your own infrastructure, hardware can still have backdoors. Of course, the likelihood of any of these things actually becoming a problem decreases significantly as you read through the paragraph.

inetknght wrote at 2021-11-30 21:45:43:

> _the likelihood of any of these things actually becoming a problem decreases significantly as you read through the paragraph._

And yet, "likelihood" doesn't necessarily mean "hasn't been done".

Just look at:

* [0]: Intel ME

* [1]: Solarwinds attack and CI systems

* [2]: Ubiquiti attack and complete infrastructure compromise

* [3]: And the famous Ken Thompson statement

[0a]:

https://news.ycombinator.com/item?id=15298833

[0b]:

https://www.blackhat.com/eu-17/briefings/schedule/#how-to-ha...

[1]:

https://www.cisecurity.org/solarwinds/

[2]:

https://krebsonsecurity.com/2021/04/ubiquiti-all-but-confirm...

[3]:

https://users.ece.cmu.edu/~ganger/712.fall02/papers/p761-tho...

dointheatl wrote at 2021-11-30 21:57:48:

> Even if you read the code, you have to trust that the code you read is the code a service is actually using.

Don't forget to verify the code for the compiler to ensure that hasn't been compromised in order to inject an exploit into the binary at compile time.

judge2020 wrote at 2021-11-30 20:30:53:

If i'm reading this page correctly, AMD is working on something that would allow you to run trusted code that not even someone with physical access to the hardware could read (without breaking this system).

https://www.amd.com/en/processors/epyc-confidential-computin...

And this tech is already implemented by GCP:

https://cloud.google.com/confidential-computing

> With the confidential execution environments provided by Confidential VM and AMD SEV, Google Cloud keeps customers' sensitive code and other data encrypted in memory during processing. Google does not have access to the encryption keys. In addition, Confidential VM can help alleviate concerns about risk related to either dependency on Google infrastructure or Google insiders' access to customer data in the clear.

efitz wrote at 2021-11-30 20:44:16:

Then you only have to trust that AMD did not accidentally or intentionally introduce a bug in the system. Remember Spectre? Remember all the security bugs in the Intel management code?

You also have to trust that AMD generated and have always managed the encryption keys for that system properly and in accordance with their documentation.

And are you even sure that you’re actually running on an AMD system? If the system is in the cloud, then it’s hard to be sure what is executing your code.

And are you sure that your code didn’t accidentally break the security guarantees of the underlying system?

I have worked on all these problems in my day job, working on HSMs. At the end of the day there are still some leaps of faith.

smoldesu wrote at 2021-11-30 21:04:28:

_puts on tinfoil hat_

You'd also need to consider AMD's management engine, the Platform Security Processor. If we're really slinging conspiracy theories, AMD processors are likely just as backdoored as Intel ones. I don't mean to be grim, but I think it's safe to assume that the US government has direct memory access to the vast majority of computer processors you can buy these days.

[/conspiracy]

123pie123 wrote at 2021-11-30 21:32:00:

if you're going to that level, then have a look at Five Eyes (and its derivatives)

https://en.wikipedia.org/wiki/Five_Eyes

/ Echelon

smoldesu wrote at 2021-11-30 21:35:35:

I probably shouldn't have removed my tinfoil lining yet but yes, you're correct. Any information the US government has access to through these channels is also probably accessible by our surveillance/intelligence allies. It raises a lot of questions about how deep the rabbit hole goes, but I won't elucidate them here since I've been threatened with bans for doing so. I guess it's a do-your-own research situation, but always carry a healthy degree of skepticism when you read about anything government-adjacent.