💾 Archived View for dioskouroi.xyz › thread › 29396643 captured on 2021-12-04 at 18:04:22. Gemini links have been rewritten to link to archived content

-=-=-=-=-=-=-

FBI's ability to legally access secure messaging app content and metadata [pdf]

Author: sega_sai

Score: 554

Comments: 444

Date: 2021-11-30 19:53:32

________________________________________________________________________________

throw_away_dgs wrote at 2021-12-01 00:07:37:

Some FBI agents came to my house once and told me that my home Internet had been used to visit Islamic Extremist websites. They brought a local police officer with them and a 'threat assessment' coordinator from my workplace. They asked me if my family was Muslim and wanted to know if we had been radicalized.

We are not religious (at all). We do not attend church, synagogue or mosque. We are lower middle class white Americans born and raised in the USA and have never traveled outside the country.

I have no idea why they thought this about us. Maybe it was an IP mix-up, but it was very disturbing. I feared that I may lose my job. I became very afraid of the FBI that day. I think this could happen to anyone at any time.

ClumsyPilot wrote at 2021-12-01 01:21:33:

"threat assessment' coordinator from my workplace"

"I feared that I may lose my job."

I understand that the police/FBI have to conduct investigations. What I don't understand is the involvement of the employer; it's extremely disturbing. You have not been convicted, you have not been charged, you are not even a suspect or accused of anything at this point. How is your private life the business of your employer?

Why is your privacy being breached and your livelihood being placed at risk?

Surely the FBI is not allowed to publicise random dirt they find on innocent people?

IIAOPSW wrote at 2021-12-01 01:56:11:

The FBI still has buildings named after J Edgar Hoover. That should tell you everything you need to know about their institutional respect for justice and due process.

gruez wrote at 2021-12-01 04:11:05:

For non-americans, what was J Edgar Hoover known for?

stephen_g wrote at 2021-12-01 04:32:20:

I'm also not an American - but as far as I've read - massive abuse of power in using the FBI to spy on political rivals, illegal wiretapping, illegal surveillance of US congressmen and even presidents, running the FBI while they were doing extremely controversial programs like COINTELPRO and programs and investigations that tried to hinder the civil rights movement, etc.

dandare wrote at 2021-12-01 14:01:16:

As a non American, I think COINTELPRO is the single most anti-democratic abuse of power ever done by US government.

https://en.wikipedia.org/wiki/COINTELPRO#Range_of_targets

https://en.wikipedia.org/wiki/COINTELPRO#Alleged_methods

Clubber wrote at 2021-12-01 04:16:39:

https://en.wikipedia.org/wiki/J._Edgar_Hoover#Investigation_...

https://en.wikipedia.org/wiki/J._Edgar_Hoover#Reaction_to_ci...

black6 wrote at 2021-12-01 04:33:48:

Turning the FBI into a blackmail operation.

newsclues wrote at 2021-12-01 13:15:03:

Being in the pocket of the mob

BLKNSLVR wrote at 2021-12-01 14:07:18:

Cross dressing

heavyset_go wrote at 2021-12-01 02:45:25:

> _Surely the FBI is not allowed to publicise random dirt they find on innocent people?_

If they're doing an investigation, they very likely got the employer involved in order to get more information on the person they're investigating, and companies have liaisons for law enforcement, as well. If the FBI comes knocking and says, "we think you've hired a terrorist," it's going to ruffle some feathers at the company no matter how unfounded or untruthful the claim is.

It isn't just the suspicion of terrorism that might have law enforcement or the FBI knocking at an employer's door. If someone is suspected of any type of cyber crime, the FBI will be coming for all of their computers and electronic devices, including the ones they use at work.

ClumsyPilot wrote at 2021-12-01 10:16:04:

"If they're doing an investigation, they very likely got the employer involved in order to get more information on the person they're investigating"

What is an employer going to contribute, realistically? "Oh yeah, he always carries potassium nitrate and makes explosions during lunch breaks!"

hulahoof wrote at 2021-12-01 10:49:02:

Depending on the company, they would likely audit their activities in case the company itself was a vector, assuming that terrorists also require intelligence networks.

yownie wrote at 2021-12-01 09:01:02:

This is par for the course FBI intimidation tactics, along with interviewing everyone you've regularly conversed with. Serves a double purpose of investigation while simultaneously making you radioactive to be around.

Thereby isolating the person during a period of high emotional anxiety.

aj3 wrote at 2021-12-01 05:55:15:

The employer might have been a defense contractor. Most jobs without clearance don't even have a "threat assessment coordinator".

mike_d wrote at 2021-12-01 06:39:50:

> Most jobs without clearance don't even have a "threat assessment coordinator"

The title may vary from place to place but all companies have people filling this role, even if you've never met them.

Normally falls somewhere under a team like Global Intelligence, Workplace Security, Business Continuity, etc.

viro wrote at 2021-12-01 20:15:42:

No, most places do not have Global Intelligence or Workplace Security positions. Business Continuity is most often an IT business function ...

lrem wrote at 2021-12-01 15:38:05:

Companies that employ software engineers likely are divided into those that have that role and those that don't have it _yet_.

numpad0 wrote at 2021-12-01 03:18:24:

You deserve to always be presumed innocent until proven guilty, and you have to be proven guilty to be found guilty; realistically speaking, those premises are extremely technical.

Geezus-42 wrote at 2021-12-01 04:42:20:

You don't have to be found guilty to be punished; look up "case load". That can keep you on probation and monitoring as long as they want to draw out the case, and the whole time you are required to make monthly payments or risk going to jail.

x86_64Ubuntu wrote at 2021-12-01 05:07:56:

In the US, the process IS the punishment.

dredmorbius wrote at 2021-12-01 12:37:59:

One of the principal arguments for the "speedy trial" clause of the US 6th Amendment, and similar rights in other jurisdictions.

Note that the US law does _not_ apply to noncriminal processes --- civil lawsuits or other elements of law.

Geezus-42 wrote at 2021-12-01 18:40:41:

How about a State felony case that has taken nearly two years?

dredmorbius wrote at 2021-12-01 19:18:21:

How about it?

Without specifics, or some indication of who is triggering the delay (e.g., defendants may request delays), I couldn't possibly comment.

Given that law and legal processes are not my bailiwick, I'd probably not be able to comment intelligently regardless. But you've posed a null-content question.

Geezus-42 wrote at 2021-12-02 14:22:25:

The State Attorney General is dragging the case out because they refuse to look at it. They also filed it under the wrong statute, so their arguments are incorrect.

dredmorbius wrote at 2021-12-02 18:55:19:

Seems possible grounds for a challenge. The entire case can be dismissed if the right is denied.

https://www.justia.com/criminal/procedure/right-to-a-speedy-...

https://www.nolo.com/legal-encyclopedia/the-right-speedy-tri...

Geezus-42 wrote at 2021-12-02 23:36:36:

Not during COVID times...

neves wrote at 2021-12-01 17:37:15:

He is already being punished.

I'm reading a book where the main character receives a subpoena for an interview with the Portuguese dictatorship's political police. Nothing happens to him (so far), but everybody in the hotel where he is staying starts to treat him differently.

Who will be first in line when a firing is necessary? Probably the guy who has problems with the FBI.

BLKNSLVR wrote at 2021-12-01 01:22:24:

It's (scarily) interesting that they react with actual personal attendance based purely on a very limited set of electronic information.

From your further description:

> We are not religious (at all). We do not attend church, synagogue or mosque. We are lower middle class white Americans born and raised in the USA and have never traveled outside the country.

Would the FBI not have been able to do any amount of background searching (read: further electronic information gathering) that would be less effort-intensive than arranging a 'threat assessment' coordinator from throw_away_dgs' actual workplace and a local police officer for an in-person door-knock? If such background checks were performed, then either they don't have much data or their threat weightings are set to red-scare levels of paranoia. Either way, it's scary.

Unless there's more to the story.

Eelongate wrote at 2021-12-01 01:42:00:

I think what he experienced is another manifestation of the same phenomenon as zero-tolerance policies in schools: institutions ask their enforcers to suspend common sense and strictly enforce the letter of the law/guideline/etc., even in situations where any reasonable person would decide it made no sense. They do this because such common sense and gut feelings are how bias and prejudice might creep into their oh-so-perfect system.

It used to be that if a teacher saw a kid get bullied and then punch his bully back, the teacher was empowered to evaluate the situation using their best judgement, and punish the bully while congratulating the bullied kid who stuck up for himself. The system sees a problem with that; the teacher's perception of the incident might have bias and prejudice. The system's solution is to have zero tolerance for any violence and punish both students equally. The system's solution to the possibility of prejudice against one student is to ensure prejudice against _both_ students.

throaway46546 wrote at 2021-12-01 08:07:01:

At my school it was worse than that. Anyone "involved" in a physical altercation would be suspended. Someone could walk up and punch you and you would be suspended for it. This obviously had a chilling effect on reporting. No more bullying. Problem solved.

BeFlatXIII wrote at 2021-12-01 13:11:21:

Such policies also justify and encourage excessive retribution. If you’re getting suspended whether you fight back or not, may as well cause some real damage to earn it.

admax88qqq wrote at 2021-12-01 02:45:06:

> the teacher's perception of the incident might have bias and prejudice.

I mean that's not entirely wrong either. Bullying was still a thing before zero tolerance policies.

Not to say zero tolerance policies are the right solution, but personal bias _is_ a big problem when it comes to enforcement.

Eelongate wrote at 2021-12-01 02:50:18:

Of course. Bias and prejudice are always a real concern. In situations where the teacher gets it wrong and punishes the bullied kid, the kid learns an unfortunate but useful lesson: that some agents of the system cannot be relied on.

But the zero tolerance response to this circumstance ensures the bullied student is prejudiced against, judging him guilty before considering the facts of the individual circumstance. What does that teach the kid? That the _system itself_ cannot be relied on.

pope_meat wrote at 2021-12-01 03:49:09:

to be fair, that's a pretty valuable lesson to learn out here. it would be neat if we had a system we could rely on.

bigger_inside wrote at 2021-12-01 09:32:40:

I was about to comment the same thing. I teach future teachers, and I always say that everyone forgets their school math and chemistry lessons after cramming for the test. What sticks is learning how to survive in an unequal, dysfunctional system where you're the oppressed class, fighting among each other while you can't touch the people in power.

collaborative wrote at 2021-12-01 10:02:35:

This is how 95% of the world works. In most countries, people are conditioned to "join" the rulers from a very young age, and people who use critical thinking are a tiny minority (often invisible)

yosito wrote at 2021-12-01 07:57:14:

Bullying is still a thing.

raxxorrax wrote at 2021-12-01 07:52:51:

But they did not establish how legislation has elevated itself from that.

raxxorrax wrote at 2021-12-01 07:51:18:

They are right that everyone is biased, what they completely fail to establish is how they improved their own perception. Actions justified because of the presence of bias and prejudice very closely mirror religious dogma by a more objective metric.

Terry_Roll wrote at 2021-12-01 01:32:01:

> It's (scarily) interesting that they react with actual personal attendance based purely on a very limited set of electronic information.

Either their intel is better than they let on and they didn't think they would be walking into an ambush, or they are more stupid than we think.

CrazyCatDog wrote at 2021-12-01 03:55:01:

Actually, I think they had no intel. You NEED intel for a judge to order a subpoena—and if a subpoena were issued, the ISP would open their firehose and overwhelm the FBI with evidence suggesting that there's nothing to investigate. And having visited extremist sites a handful of times—even deliberately—is probably not going to meet the threshold for a subpoena.

If the FBI visited me and casually asked about my web history, I would casually ask them to pound sand (as should everyone!). But if the agent was accompanied with someone from my employer, I would eagerly cart up every single device in my home and offer to carry it out to their vehicle (as I fear most would).

It smells like someone is taking massive investigative shortcuts, at very significant cost to the accused. Then again, I can’t even fathom the upside for the FBI.

germinalphrase wrote at 2021-12-01 11:46:14:

My gut reaction is simply speed. Why sit at my desk for a few hours reading documents when I can make a couple of phone calls and be scary for 20 minutes to feel secure in saying “yep - not terrorists”.

Or - you know - “weeeelp, I’ve been sitting at this desk all morning, let’s go talk to someone”.

BLKNSLVR wrote at 2021-12-01 14:05:40:

As commenter below says: Power.

Why spend the extra time and effort? Let's just hit the road and totally and completely fuck at least one citizen's opinion of the entire system upon which their life and livelihood depend.

Saves me a couple of hours, and the sun's out. Sold!

Ironically, maybe this will actually radicalise the people they're investigating for radicalisation.

donw wrote at 2021-12-01 04:54:42:

> Then again, I can’t even fathom the upside for the FBI.

The upside is power.

You yourself said as much: "If the agent was accompanied with someone from my employer, I would eagerly cart up every single device in my home and offer to carry it out to their vehicle."

You fear them. Rightly so. The FBI has incredible power, backed by the full might of corporate media. To cross them is to be crushed.

Why would they need a warrant, when Apple and Google climb over each other to volunteer every scrap of your private information? Why take the time for a trial, when justice can be served more efficiently by both your employer _and_ your union gleefully ruining you financially upon request?

People have been demanding[1] this for years. Now it's here.

[1]

https://xkcd.com/1357/

MaxBarraclough wrote at 2021-12-01 20:47:29:

Apple have famously refused FBI requests.

https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_d...

coldtea wrote at 2021-12-01 12:22:29:

>_If such background checks were performed, then they either don't have much data or their threat weightings are set to red-scare levels of paranoia. Either way, it's scary._

They're not gonna have anything happen to them if they go tough on (and fuck over) an innocent guy.

They're gonna look bad if they miss a terrorist.

So they have no incentive to not have "red-scare levels of paranoia".

BLKNSLVR wrote at 2021-12-01 14:00:11:

That's true. I still remember that the Boston Bomber(s) were on international watch lists, and their home countries warned the US (whichever TLA; it may have been an issue of crossed wires) that these guys were on the move, and it was all ignored.

Now, visit a 'bad' website, or somehow be mistaken for someone that visited a 'bad' website, and you'll get some deep personal treatment.

The Feds can't win either way, but it seems to be through their own laziness, incompetence, or lack of interagency cooperation.

Notanothertoo wrote at 2021-12-01 14:54:30:

Or maybe it's because of their motives, and what level of capture they have over their 'customers'? Seems pretty simple to me. They have a monopoly on service, and the only retribution people can take is political, which means everything is done on appearance.

NmAmDa wrote at 2021-12-01 05:03:01:

Imagine being a Muslim in such a case. Trying to convince them that this could be a mix-up (which is easily possible) won't be successful.

coldtea wrote at 2021-12-01 12:24:31:

Imagine being a Muslim and e.g. having a kid who visits those sites.

Or just going there out of intellectual curiosity, like how a leftie might read Mein Kampf to check what that shit is.

You can end up in a very bad position...

selimthegrim wrote at 2021-12-01 13:48:00:

A couple years after 9/11, my father and I had donated to the Holy Land foundation.

The IRS proceeded to audit me (16 years old) and my $8k a year woodselling business I had with my dad. You tell me.

tw04 wrote at 2021-12-01 06:06:48:

I’ll be the dissenting voice and say this reads like a “sow discord in the US 101”. Why on earth would the FBI bring both the police and a “threat assessment” coordinator from your work to interview you? Why would your workplace ever agree to it? That screams lawsuit waiting to happen.

And on that note, why didn’t you sue your workplace for harassment? Whether you’re religious or not isn’t any of their business and is a protected class.

lrvick wrote at 2021-12-01 06:34:23:

A decade ago the FBI harassed me at my home waking me up from sleeping twice and at a past employer before on entirely unfounded claims.

They didn't care what the consequences were for targeting someone innocent.

They also made nasty threats like "Someone has to go down for this, and if you help us collect intel on your industry peers we suspect then someone else can be that person"

I told them politely to go die in a fire because I was not about to help them harass other innocent people, but it was terrifying nonetheless that they seemingly had the power to end my whole universe.

I became convinced through that ordeal that the FBI is a deeply corrupt organization that creates pressure to close cases by any means needed.

The OPs post seems totally believable and consistent with stories I have heard from others, particularly if they work for an organization that has the US government as a customer like a defense contractor.

matheusmoreira wrote at 2021-12-01 13:05:00:

> Someone has to go down for this

The so called "justice" system, I guess.

igammarays wrote at 2021-12-01 09:12:11:

You're incredibly naive if you think this kind of stuff doesn't happen _all the time_ since 9/11. I personally know several people with similar stories in the US.

tw04 wrote at 2021-12-01 13:01:36:

You know several people whose employers sent someone to their house with FBI agents to harass them about their religious beliefs?? And none of them sued?

I’m not surprised at all that the FBI is harassing people; I find it incredibly hard to believe a private business would touch the situation with a 4,000-foot pole. They have absolutely nothing to gain and massive liability.

vbezhenar wrote at 2021-12-01 05:40:20:

Is it prohibited to visit those websites? I was once interested in understanding the way radicals think and reading their arguments, so I spent some time hanging around some radical websites.

lrvick wrote at 2021-12-01 06:42:07:

I was visited by the FBI for doing security research that made them at least pretend to believe I was a blackhat they wanted to take down.

Use Tor Browser if you are going to research anything a criminal might, regardless of your pure motives.

If you so much as want to research lock picking, use Tor.

ISP traffic logs can and will be twisted against you in a court of law.

throaway46546 wrote at 2021-12-01 08:11:07:

I openly participate in locksport communities and I haven't had any visits from the FBI.

comboy wrote at 2021-12-01 11:35:51:

I'm fairly confident that those agencies use context in an automated manner to get any meaningful results.

So "keyword" (could be a word, domain or some other pattern) X may trigger only if Y and Z was already triggered. And some keyword A may only trigger if B was NOT present.

This way you can distinguish doctors, reporters, or people studying history or chemistry from those who are planning something.

Or e.g. ML applied to patterns over time. Globally.
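The context-dependent triggering speculated about above can be sketched in a few lines. Everything here is hypothetical: X, Y, Z, A and B are the placeholder "keywords" from the comment, not anything drawn from a known system.

```python
# Hypothetical context-dependent keyword rules: a keyword only "triggers"
# when its required context terms have also been seen, and is suppressed
# when an excluding term is present.
RULES = {
    "X": {"requires": {"Y", "Z"}, "excludes": set()},
    "A": {"requires": set(), "excludes": {"B"}},
}

def triggered(seen: set) -> set:
    """Return the rule keywords that fire for a set of observed terms."""
    return {
        kw for kw, rule in RULES.items()
        if kw in seen                        # the keyword itself was observed
        and rule["requires"] <= seen         # all required context present
        and not (rule["excludes"] & seen)    # no excluding context present
    }

print(triggered({"X", "Y", "Z"}))  # X fires: both context terms were seen
print(triggered({"X", "Y"}))       # nothing fires: Z is missing
print(triggered({"A", "B"}))       # A is suppressed by B
```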

And yes, I do not like it at all. HN is full of people who may well research some kind of bomb, religion, or whatever else out of pure curiosity, but since there are not many such people it can be a problem in court one day.

Mix in some Snowden, your hardware stack, gag orders, and the fact that we have more laws than anybody can read, and you may feel like just watching some stupid memes instead.

SturgeonsLaw wrote at 2021-12-01 20:24:20:

OPSEC is about lowering the probability of things going sideways, there are no guarantees either way

throwaway0a5e wrote at 2021-12-01 12:19:23:

To quote a Dartmouth history professor who taught a class on the subject: "if you don't get _randomly_ selected for a search on your next flight you aren't doing your homework"

It's not prohibited, but they notice, and they subject you to harassment at every interaction with every part of the system that is integrated with their database.

q1w2 wrote at 2021-12-01 16:47:10:

This seems like hyperbole.

rubyist5eva wrote at 2021-12-01 01:54:39:

Did they have a warrant? Never talk to the police without counsel, and refuse all searches without warrants. "We might think you went on a website" is not probable cause; you have a _right_ to an attorney and to silence.

matheusmoreira wrote at 2021-12-01 12:54:36:

That's extremely disturbing. Accessing some random website should never cause police to show up. They should never even know what you did. That's like keeping tabs on what books people read and raiding somebody's house because they looked up how bombs are made.

sandworm101 wrote at 2021-12-01 04:28:42:

Do you work for a government agency or contractor? That might explain why they contacted your employer so readily.

sjs382 wrote at 2021-12-01 01:21:47:

> and a 'threat assessment' coordinator from my workplace.

What was the reason for this? What type of workplace?

akira2501 wrote at 2021-12-01 01:56:07:

I'm assuming any workplace which requires a government security clearance to enter and work in.

aero-glide2 wrote at 2021-12-01 08:24:50:

I never use a VPN. That changes today.

fsflover wrote at 2021-12-01 10:42:44:

You should use Tor instead. With VPN, you just shift your browsing history from one place to another.

unobatbayar wrote at 2021-12-02 03:44:34:

Or worse yet, the VPN provider can sell your data.

Build your own VPN

https://github.com/hwdsl2/setup-ipsec-vpn

austincheney wrote at 2021-12-01 13:08:22:

> We are lower middle class white Americans born and raised in the USA and have never traveled outside the country.

I am most curious why you believe that is a defense against radicalization. In the US that is perhaps the most common demographic for radicalization of any type.

selimthegrim wrote at 2021-12-01 13:43:24:

Apparently, to him, "radical" only goes before "Islamic terrorism".

darkhorn wrote at 2021-12-01 11:34:51:

This is why you and everyone should use DNS over HTTPS (DoH).

The next day they might visit you to ask why you are visiting an opposition party's web site.

smolder wrote at 2021-12-01 15:23:10:

How exactly is DoH a protection? Wouldn't they just see that as a red flag? Then, get the data from Cloudflare or whomever.

darkhorn wrote at 2021-12-01 15:39:46:

Most of the time they log your plain DNS queries. But DoH is encrypted, so they won't be able to log your DNS queries. Cloudflare is not the only DoH provider; there are many. If you want, you can grab several lines of PHP code and create your own DoH endpoint in another country. Because DoH is HTTPS, they cannot distinguish it from normal HTTPS traffic. Of course, if they use deep packet inspection tools they will know what website you are visiting, but those are not used widely; they are used to target specific people. To sum up: DoH is better than plaintext DNS queries.
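For the curious: RFC 8484 DoH just wraps an ordinary DNS wire-format query in an HTTPS request, which is why it blends in with normal web traffic. A minimal sketch using only the Python standard library, building such a query and the corresponding GET URL (Cloudflare's public resolver endpoint is used purely as an example; nothing is actually sent here):

```python
import base64
import struct

def build_dns_query(hostname: str, qtype: int = 1) -> bytes:
    """Build a minimal DNS wire-format query (RFC 1035); qtype=1 is an A record."""
    # Header: ID=0 (RFC 8484 recommends 0 for cacheability), flags=0x0100
    # (recursion desired), 1 question, 0 answer/authority/additional records.
    header = struct.pack(">HHHHHH", 0, 0x0100, 1, 0, 0, 0)
    # Question section: QNAME as length-prefixed labels, then QTYPE, QCLASS=IN.
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    return header + qname + struct.pack(">HH", qtype, 1)

def doh_get_url(resolver: str, hostname: str) -> str:
    """URL for an RFC 8484 DoH GET request (base64url, '=' padding stripped)."""
    q = base64.urlsafe_b64encode(build_dns_query(hostname)).rstrip(b"=")
    return f"{resolver}?dns={q.decode('ascii')}"

# Fetching this URL over HTTPS (Accept: application/dns-message) would return
# a binary DNS response; on the wire it looks like any other TLS connection.
print(doh_get_url("https://cloudflare-dns.com/dns-query", "example.com"))
```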

pnut wrote at 2021-12-01 12:33:00:

Please disambiguate acronyms in the absence of context.

giantg2 wrote at 2021-12-01 00:27:39:

It absolutely can.

wbl wrote at 2021-12-01 12:54:37:

The only words you should ever say to the FBI are: "on advice of counsel, I am taking the Fifth".

ChrisKnott wrote at 2021-12-01 13:26:44:

This is awful advice for this specific situation.

OP apparently managed to clear up the mistake without much bother by speaking to them (although they were understandably shaken up by the experience). This presumably wouldn't have happened if they'd done what you suggest.

robbedpeter wrote at 2021-12-03 02:37:22:

Not speaking to law enforcement outside the presence of your attorney is excellent advice. There's no downside to having the attorney there, and there are potentially life-shattering downsides to doing otherwise.

P_I_Staker wrote at 2021-12-01 15:15:19:

On the other hand, they could accuse OP of lying (something that's highly subjective), which is a serious federal crime.

jtdev wrote at 2021-12-01 13:54:28:

The FBI is a criminal organization. Just look at the history of the FBI if you think that's a radical statement.

xanaxagoras wrote at 2021-11-30 20:47:01:

They left off one very popular messenger, SMS:

heavyset_go wrote at 2021-11-30 21:07:21:

There's also:

* Law enforcement simply asks nicely: can render all message content for the last 1-7 years

gnopgnip wrote at 2021-11-30 21:30:36:

The Stored Communications Act makes disclosing the contents of messages without a search warrant unlawful

freeflight wrote at 2021-12-01 02:06:36:

Just like the NSA spying on Americans is unlawful [0], the FBI terrorizing political movements is unlawful [1], and the CIA operating in the US is unlawful [2].

Yet, I'm pretty sure all these are still happening, to a certain degree, to this day.

[0]

https://www.reuters.com/article/us-usa-nsa-spying-idUSKBN25T...

[1]

https://en.wikipedia.org/wiki/COINTELPRO

[2]

https://en.wikipedia.org/wiki/Operation_CHAOS

nikcub wrote at 2021-12-01 02:50:36:

The larger point would be that if it's obtained unlawfully, it can't be used in a court of law against you.

Recent cases:

https://www.vice.com/en/article/pkppqk/court-throws-out-mess...

https://www.computerweekly.com/news/252503524/Berlin-court-f...

the_optimist wrote at 2021-12-01 03:11:20:

That is little consolation in the court of public opinion, where FBI management and the Justice Department have demonstrated willingness and capability to hold mob court and manipulate public opinion outside the formal legal system. They will SWAT you themselves if they like, on live TV.

dwiel wrote at 2021-12-01 03:45:59:

Parallel Construction makes this a technicality/nuisance, not a show stopper.

yownie wrote at 2021-12-01 09:25:21:

came here to say this.

filoeleven wrote at 2021-12-01 03:54:22:

Generally only if you have the means to hire a good lawyer.

phonethrowaway wrote at 2021-12-01 08:21:41:

The NSA doesn't need to illegally spy on Americans when an ally can do it for them and then share the data legally.

https://www.nationalarchives.gov.uk/ukusa/

https://en.wikipedia.org/wiki/Five_Eyes

freeflight wrote at 2021-12-01 14:18:15:

That's not really how it works. Sure, it is also a way to circumvent such local legislation, but for that to work American allies would need to run actual surveillance infrastructure on the US mainland proper, out in the open.

You know, like the US does in the countries of its "allies", like Germany [0].

Do you really think the US would allow German intelligence agencies to build whole complexes plugged right into the US's largest IXP?

That's why this situation is not nearly as "symbiotic" as it's often made out to be. At best that applies to Five Eyes countries, and even there only to a very limited degree, as no Five Eyes member has as much foreign presence as the US.

[0]

https://en.wikipedia.org/wiki/ECHELON#Examples_of_industrial...

the_optimist wrote at 2021-12-01 20:29:33:

To this rhetorical question, a resounding “yes” answer. There is credible suggestion that GCHQ has been invited to operate US facilities on US soil for this explicit purpose.

https://www.theguardian.com/uk-news/2013/aug/01/nsa-paid-gch...

HamburgerEmoji wrote at 2021-12-01 09:02:00:

Is it usually legal to compensate some other party to do something illegal? I don't think so. This situation seems something like paying an ambassador to steal something. The ambassador might not be prosecutable, but why isn't the local party? I think the real answer is "power", but that's not good enough.

AnthonyMouse wrote at 2021-11-30 21:47:03:

The people responsible for investigating and prosecuting such crimes have some not so great incentives to avoid doing so and keep the whole thing secret though, don't they?

And then when they get caught, they do this:

https://cdt.org/insights/the-truth-about-telecom-immunity/

2OEH8eoCRo0 wrote at 2021-11-30 22:21:13:

Sounds like an easy way to have your case tossed out in court.

It's funny how much this differs from my own personal experience with law enforcement. The friends I have in law enforcement are timid as hell and don't do anything without a warrant, just to stay on the safe side, even if they probably don't need one.

kingcharles wrote at 2021-11-30 23:06:19:

Good luck with that. In my case there were a ton of violations of the SCA. Violations of the SCA are only actionable if they are "constitutional" in nature. (That essentially means that if the government indicts you based on information they illegally gathered through violating the SCA, but the information did not belong to you - say it belonged to your wife or business partner - then you can't get the information suppressed/excluded in court.)

In my case the government did violate the SCA and my constitutional rights, but two judges have looked at it and both gave the same answer: the police must be allowed to commit crimes to gather evidence. Next stop: the appeals courts.

giantg2 wrote at 2021-12-01 00:36:24:

Yep, the courts side with law enforcement. The whole 'truth comes out in a fair fight' is completely undermined by this. The system protects itself above all else.

I was involved with a case that sounds similar; the judges don't care about your rights and blatantly misapply the law. Magistrates are also _complete_ BS and don't even know basic legal stuff. I had one think I called him prejudiced when I requested a case be dismissed with prejudice... Complaints do nothing. There's no real oversight, leading to a completely incompetent system.

heavyset_go wrote at 2021-12-01 01:17:52:

> _There's no real oversight, leading to a completely incompetent system._

It's the system working as intended. If you want something that looks like justice, you'll need substantial wealth to get it.

danShumway wrote at 2021-12-01 04:30:50:

You generally have to assume that the FBI and other government agencies are competent. My baseline starting assumption is that if everyone in the US were too scared to use programs like PRISM, they wouldn't have been built.

So these kinds of claims just don't make any sense in a world where we _know_ that government has conducted surveillance without a warrant, and where we know that the FBI has built entire programs designed to make it easier for them to conduct surveillance without a warrant.

From the article posted that you're replying to:

> What Administration officials tend to obscure is that what they seek is not immunity for future cooperation with lawful surveillance, but rather telecom immunity for assisting with unlawful surveillance conducted from October 2001 through January 17, 2007, as part of the warrantless wiretap program initiated by the White House.

I'm not sure I understand what your implication is. I don't understand how it's possible to respond to an article about telecoms seeking immunity for previous unlawful actions by saying, "the government/businesses would be way too scared to do anything unlawful." I mean... obviously not - they sought immunity for it. They wouldn't just randomly do that; the most likely explanation is that they made immunity a pressing issue because _they thought they needed it_.

It does not seem to me that the optimistic world you describe and the observable actions and lobbying efforts of companies/administrations line up with each other.

marricks wrote at 2021-11-30 22:31:29:

I'm just glad you're here to stick up for your friends without any corroboration or linking story. It's just a good thing to do.

jdavis703 wrote at 2021-11-30 23:23:54:

Being charitable, let’s assume his friends work as homicide or theft detectives. If so, they need a high standard for admissible evidence to build their case.

If on the other hand his friends are street cops tasked with clearing a corner of drug dealers because some neighbor complained to their council person who complained to the police chief then those cops don’t necessarily care about extrajudicial activities.

Having been harassed by street cops and interacted with homicide detectives, I can tell you they vary tremendously in professionalism.

xanaxagoras wrote at 2021-11-30 23:28:41:

They definitely need a high standard for admissible evidence, but that doesn't stop them from purchasing large amounts of data from all-too-willing communications companies and using parallel construction to build their case once they find out what happened via warrantless spying.

intricatedetail wrote at 2021-11-30 23:40:47:

They can also query these messages to see if there is something on the dealers they get paid from and then warn them if something comes up. It works both ways, no?

2OEH8eoCRo0 wrote at 2021-11-30 23:38:53:

Cybercrime. Lots of scams and child abuse.

op00to wrote at 2021-11-30 22:25:25:

The really smart cops get the tips using “less than legal” means, then walk back and reconstruct using legal evidence.

a4isms wrote at 2021-11-30 22:41:17:

“Parallel Construction:”

https://en.wikipedia.org/wiki/Parallel_construction

giantg2 wrote at 2021-12-01 00:29:19:

> _Sounds like an easy way to have your case tossed out in court._

This is terribly naive in my experience.

King-Aaron wrote at 2021-12-01 00:23:11:

Imagine a world where the entire law enforcement complex followed the law. What a world.

heavyset_go wrote at 2021-12-01 01:15:11:

Let's be honest: how often do people tell their pals how they commit crimes, or are less than scrupulous at work, assuming their pals aren't criminals as well? People tend to keep things like that secret, even from those close to them.

jc01480 wrote at 2021-12-01 05:20:07:

You are correct. There's also varying two-party/one-party consent required, depending on the state, in the absence of a warrant. But unless you're targeting the devices themselves, you will not get much at all from service providers. They simply don't keep it, contrary to what I read here.

kevin_thibedeau wrote at 2021-11-30 22:36:46:

EO12333 makes it lawful without a warrant.

gnopgnip wrote at 2021-12-01 00:30:53:

> EO12333

An EO making it lawful for a federal agency to collect doesn't mean it is lawful for a private company to disclose; it doesn't change when a company is permitted to disclose the content of messages under the SCA.

giantg2 wrote at 2021-12-01 00:37:54:

I mean, this whole discussion is moot since nobody will enforce things like this, especially against themselves.

jfrunyon wrote at 2021-11-30 22:08:29:

The reality is that many times the only barrier to sensitive information is a shared login which many people know and a statement that users represent that they have legal authority to access that info.

JohnWhigham wrote at 2021-12-01 12:25:47:

Tell that to the FAANG companies that provide white glove access to authorities any time they ask.

grouphugs wrote at 2021-12-01 03:15:08:

this guy, lmfao, completely fucking clueless to what policing is

Consultant32452 wrote at 2021-11-30 21:23:58:

* Law enforcement wants to stalk ex-girlfriend: can render all message content for the last 1-7 years

dkdk8283 wrote at 2021-12-01 00:39:21:

Companies also sell data to law enforcement.

heavyset_go wrote at 2021-12-01 01:20:15:

Many tech companies even develop nice portals for law enforcement to use where they can request and view data, with or without a warrant or subpoena.

jc01480 wrote at 2021-12-01 05:15:09:

Major service providers do not maintain SMS history beyond 24 hours, let alone 1-7 years (at least as of the last time I worked a case). They're transparent about it as well: look up the LE liaison contacts on their sites and they'll clearly list what is and isn't available. That's why it's crucial to get the actual devices themselves. Reason: the infrastructure to manage SMS content for every customer for 7 years, with zero business justification/use case, would be phenomenal, and they'd spend most of their time responding to civil and criminal subpoenas/warrants. That would be a feat the NSA would be proud of. Been there and done that a hundred times. (This also aligns with certain VPN providers refusing to keep logs. It's a cost that provides zero returns, so they cut it as a business decision, not because they're trying to stick it to the man.)

lrvick wrote at 2021-12-01 06:44:08:

I went to a major cell provider and asked them nicely for access to SMS for all their customers and they happily took money and gave me an API.

This was for a startup.

I have no doubt they do the same for governments.

jc01480 wrote at 2021-12-02 03:01:53:

If I understand this correctly you’re saying a major cell provider is selling you access to subscriber SMS message content?

superkuh wrote at 2021-12-01 05:36:32:

I'm surprised to hear this has changed so significantly since the Snowden leaks, especially after the blatant attack on Qwest CEO Joseph Nacchio for refusing to spy. It was established then that the major mobile telcos in the USA (T-Mobile, AT&T, Verizon, etc.) were keeping and providing full SMS data for 2-5 years.

yownie wrote at 2021-12-01 09:30:31:

and also that the government was subsidizing the programs when the companies complained about the added costs.

wyldfire wrote at 2021-12-01 05:24:26:

There's no reason _for_ them to keep those records, other than for law enforcement's sake. No use case for calling up your operator to ask about that text message you got "from Fred at 4am one day a couple years ago."

Terretta wrote at 2021-12-01 12:35:33:

> _Major service providers do not maintain SMS history beyond 24 hours, let alone 1-7 years_

Nobody should make decisions based on this comment.

jc01480 wrote at 2021-12-02 03:00:19:

Agreed. Do your own due diligence.

authed wrote at 2021-11-30 23:02:12:

You forgot email... and they don't need a warrant for messages older than 180 days if in the cloud (and they never delete them, either):

https://www.consumerreports.org/consumerist/house-passes-bil...

kingcharles wrote at 2021-11-30 23:20:36:

IIRC the only reason this amendment was made was because the 180 day limit was found unconstitutional anyway by an appellate court. So, technically the amendment did nothing.

It doesn't matter where your data is held, locally or cloud, (if you are an American resident and your data is in the USA) as it is _your_ data and it is unconstitutional for them to read it without a warrant. In theory.

authed wrote at 2021-11-30 23:23:59:

> It doesn't matter where your data is held, locally or cloud

In the US it does

kingcharles wrote at 2021-11-30 23:59:11:

Citation?

This ruling has been adopted by the US Supreme Court:

https://privacylaw.proskauer.com/2007/06/articles/electronic...

authed wrote at 2021-12-01 04:04:31:

Look at the link I posted above; it is 10 years newer than yours.

encryptluks2 wrote at 2021-12-01 00:14:44:

If they are local and encrypted... oops, forgot the encryption key.

grumple wrote at 2021-12-01 00:26:14:

Source is a few years old, but I suppose we can make another FOIA request to find out how long carriers store text messages these days - it was basically 0-5 days a decade ago:

https://www.nbcnews.com/technolog/how-long-do-wireless-carri...

FateOfNations wrote at 2021-12-01 04:51:45:

Idk... back in the mid 2000s my parents managed to get a transcript of all of my (minor) sister's SMS messages going back a few months (as part of a billing dispute).

jc01480 wrote at 2021-12-01 05:26:44:

You’ll be lucky if it’s any longer than 24 hours now. There’s no business use case for building and maintaining the technological infrastructure to manage it for years. It’s private info, and they can’t sell it to anyone without legal liability. If LE gave them the funds to build this infrastructure and use it for retention, then the service provider would essentially be an agent of the state at that point.

xanaxagoras wrote at 2021-12-01 18:03:03:

You're overstating the technical difficulty of archiving and retrieving text.

grumple wrote at 2021-12-01 20:00:59:

I can only imagine that the scale of all US SMS messages is absolutely staggering. It probably eclipses all other text formats combined in terms of daily production. Here's a blog post from a few years ago estimating it at 26 billion text messages per day and rising:

https://www.textrequest.com/blog/how-many-texts-people-send-...

Not counting media and assuming they are all 160-byte messages, that's about 4 terabytes per day, or about 200 Wikipedias per day. I guess that's not too bad in terms of storage requirements - certainly a manageable amount of data for a telecom to store. But assuming you want those messages indexed and easily retrievable, it could get very burdensome to manage and interact with, and that tends to balloon the size at least a little as well.
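The arithmetic above can be sanity-checked with a quick sketch. The 26-billion-messages-per-day and 160-byte figures are the assumptions from the linked blog post and the GSM single-segment limit, not measured values:

```python
# Back-of-envelope check of the SMS storage estimate above.
# Assumed inputs: 26 billion SMS/day, 160 bytes per message (one GSM
# segment), no media attachments, no indexing overhead.
messages_per_day = 26_000_000_000
bytes_per_message = 160

daily_terabytes = messages_per_day * bytes_per_message / 1e12
yearly_petabytes = daily_terabytes * 365 / 1000

print(f"~{daily_terabytes:.2f} TB/day")    # ~4.16 TB/day
print(f"~{yearly_petabytes:.2f} PB/year")  # ~1.52 PB per year of retention
```

Raw storage is modest by carrier standards; the burden the comment describes comes from indexing, retrieval, and the legal exposure of holding it.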

The liability and legal issues around it (both externally and internally - don't want employees spying on their exes, leaking data from celebs, in addition to the policing issues, etc) makes it pretty undesirable to store though.

lsiebert wrote at 2021-12-01 00:13:46:

It's about secure messaging

xster wrote at 2021-11-30 22:42:15:

This seems like a good place to say that I strongly recommend Yasha Levine's Surveillance Valley book (

https://www.goodreads.com/book/show/34220713-surveillance-va...

) where he suggests that all of this is working as intended, going all the way back to the military counter-insurgency roots of the ARPANET, first in places like Vietnam and then back home against anti-war and leftist movements. The contemporary theme is that current privacy projects like Tor, Signal, OTF, and BBG are fundamentally military funded and survive on government contracts. This diverts the needed political discourse into a technological one, where "encryption is the great equalizer" and everyone can resist Big Brother in their own way on the platforms the government has built. Encryption does exist, but it also distracts from other attack vectors: vulnerabilities (that led to Ulbricht getting caught), which services you would connect to over e2e encryption, how you get the clients that connect to those services, which store can push binaries for said clients, etc.

adamfisk wrote at 2021-11-30 23:04:08:

Yasha Levine is a conspiracy theorist hack. There’s really no other way to say it. His narrative is attractive to a left leaning audience with shallow knowledge in this area, but the reality is that without publicly funded software like Tor, Signal, OTF, and my own Lantern, our world would be more fully saturated with corporate control of the internet. We need more public funding for open source software (with public security audits, mind you), not less. Without them, we’d basically be left with Wikipedia as the only popular entity on the internet outside of corporate control.

All of these projects are more properly grouped with government funding in other spheres, such as the BBC or PBS in media, than they are with the surveillance state or the NSA. Levine overlooks basic details, such as reproducible builds, that quickly collapse the house of cards that is his narrative. He tries to paint them all with the NSA brush when, in fact, they're simply projects that have historically received some of their funding from the government while fulfilling missions with extraordinary humanitarian benefits. Levine's own knowledge and experience in this area are shallow. Look elsewhere.

xster wrote at 2021-11-30 23:23:14:

I don't disagree with what you're saying. I'm not sure your statement is in disagreement with mine either? I don't think he's saying less OSS is better or anything dogmatic? All he's saying is that using Tor/Signal shouldn't be the end all be all of your surveillance concerns.

> would be more fully saturated with corporate control of the internet

You might disagree. His point was that the "corporate controllers of the internet" support projects like Tor because A) it gives people a (somewhat ineffective) channel to focus on rather than political recourse, and B) there's no real threat to the corporate model. What would you do on this e2e-encrypted internet without corporate services?

> such as reproducible builds

Seems like a tangential point. You can have an untampered copy of a client with a vulnerability.

> funding from the government while fulfilling missions with extraordinary humanitarian benefits

I don't think this is in disagreement with anything either

CyanBird wrote at 2021-12-01 00:43:30:

> from the government while fulfilling missions with extraordinary humanitarian benefits

Ahh yes, the famed operation Condor, operation Gladio, operation iceberg and so many other famed "humanitarian" projects

At the end of the day, all that you mentioned comes back to a post-facto "it is good because *we* do it". I would go so far as to say that most people here on HN are well aware of the start of Google, when it was funded by US intelligence as a way to parse Vietnam-era datasets, or of how US intelligence uses Radio Free Asia to destabilize enemy countries abroad - but again, it is only good/not bad when *we* do it.

Apologies for a rather low-quality comment, but these types of people handwaving away the actual structure behind all of this really get on my nerves, especially when I have had family members tortured as a consequence of these US activities.

adamfisk wrote at 2021-12-01 02:25:01:

I’m certainly not defending all US government actions. That’s exactly the point. Levine tries to lump all of this in with surveillance. The US government funds the NSA, that is true. It also funds food stamps. And torture. The trick is to untangle it.

CyanBird wrote at 2021-12-01 08:00:04:

> The trick is to untangle it.

USAID is _specifically_ designed and named so _as to tangle it_. Tell me, how would your average Joe understand that USAID is an intelligence-agency spinoff designed to sound "good" while doing evil all over the world, rather than what its name suggests? You know... aid?

The NSA, CIA, extraordinary rendition, and so many other things don't exist there by accident. If said """government""" wishes to spend such amounts of money and resources to enact such evil under the veil of security, then I don't know about you, but to me and several other people that just reads as "US Gov being flat out evil".

Do remember that there was *wide* support and acceptance back in the Kennedy days for just dissolving the CIA.

> Levine tries to lump all of this in with surveillance.

I am not particularly kind to the guy, but he's merely looking at it at a holistic, system-design level - any programmer-minded person would do the exact same thing when presented with a black-box problem.

But as far as the food stamps go, wouldn't it be great if the system were set up so that food stamps were not needed to begin with? And on the flip side, why would "the government" allow a societal structure where the maintenance of "food stamps" is necessary for the organization of the nation? I see that last bit, if anything, as a national security problem...

As Clintonites would say: "It is the economy stupid"

orbifold wrote at 2021-12-01 09:38:14:

It seems obvious that USAID is an intelligence front (I've encountered a few instances where it was mentioned that someone worked for USAID at the time, while it was simultaneously obvious that it would make way more sense if they were Intelligence), but is there any concrete evidence for that?

CyanBird wrote at 2021-12-01 11:43:35:

> any concrete evidence

What do you mean by "concrete evidence"?

None of this is disputed; they even have their own Wikipedia pages for their different operations and branches within USAID:

https://en.wikipedia.org/wiki/Office_of_Public_Safety

*Especially* since we are talking about USAID. In the case of NED, for example, things get slightly murkier because then it is a matter of private rather than public record, but it still works as a tool for the management of semi-clandestine operations and operations which need plausible deniability from the CIA's end - or at least as much deniability as it can muster. These days they prefer to work with shell groups and other associated partners such as, for example, the Atlas Network; Radio Free Asia also falls into that category, as does Voice of America.

If you are interested in books both, Killing Hope by William Blum and Legacy Of Ashes by Weiner are very, very, very good authoritative sources on the matter

If you prefer podcasts, Radio War Nerd has a couple of very good episodes on the National Endowment for Democracy, though they both quote excerpts from the books above.

Radio War Nerd EP 274 — National Endowment for Democracy, Part 1

https://podcastaddict.com/episode/121232504

Radio War Nerd EP 275 — National Endowment for Democracy, Part 2

https://podcastaddict.com/episode/121522126

yownie wrote at 2021-12-01 09:43:16:

While those programs certainly existed, this is a blatant false equivalence: you can still have humanitarian programs while being a military hegemony. It's not one or the other.

This is in fact one distinct reason the CIA/NSA won't accept recruits who have previously served in the Peace Corps (and vice versa), among other reasons.

pphysch wrote at 2021-12-01 00:34:45:

This comment is an incredibly naive attempt at a smear.

> Without them, we’d basically be left with Wikipedia as the only popular entity on the internet outside of corporate control.

Wikipedia is absolutely not "outside of corporate control". It is trivially astroturfed to advance special interests.

> All of these projects are more properly grouped with government funding in other spheres, such as the BBC or PBS in media

Both BBC and PBS routinely publish outright disinformation to advance the special interests of their corporate/government clients, including the intelligence community. For example, look at PBS Frontline's ridiculous puff piece for the violent extremist group HTS last year.

> Levine overlooks basic details, such as reproducible builds

Reproducible builds are also easily circumvented by _selectively_ deploying backdoors and other malware, based on IP or other fingerprints.

If there are good reasons to dispute Levine's investigative journalism, they're not here.

adamfisk wrote at 2021-12-01 02:28:36:

Um, ok. All of the above projects use not only reproducible builds for many platforms, but they’re all open source, and they all have public security audits. Those three pillars are about as good as it gets. Is there something you would add?
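For readers unfamiliar with the term, the reproducible-builds pillar boils down to a byte-for-byte comparison: anyone can rebuild the client from source and check that the result matches the shipped binary, which is what makes a secretly modified release detectable. A minimal sketch of that check (file paths here are illustrative, not any real project's artifacts):

```python
# Sketch of a reproducible-build verification: two independent builds of
# the same source should be byte-identical, so their digests must match.
import hashlib

def digest(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_reproducible(official_binary: str, rebuilt_binary: str) -> bool:
    """True if the shipped binary matches an independent local rebuild."""
    return digest(official_binary) == digest(rebuilt_binary)
```

Real projects additionally pin toolchain versions and normalize timestamps so that independent rebuilds actually come out byte-identical.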

I’m not claiming PBS and the BBC are perfect entities, but they do offer an alternative source of information that runs against the grain of corporate media. You would prefer…what exactly?

pphysch wrote at 2021-12-01 03:14:18:

> Is there something you would add?

Let's start with "not being created/funded by the State Department or Pentagon".

> You would prefer…what exactly?

Again, let's start with "not being blatant propaganda produced by warmongers".

adamfisk wrote at 2021-12-01 04:18:32:

First, there’s a vast difference between the State Department and the Pentagon; lumping the two together just reflects an unsophisticated understanding of the federal government. Signal has never received any State Department or Pentagon money. Tor had a significant early contribution from a researcher at the Naval Research Laboratory - that’s the extent of any Pentagon funding. They have received significant State Department funding, but to call the State Department “warmongers” is just not accurate.

pphysch wrote at 2021-12-01 17:20:58:

Please stop spreading misinformation. From the Tor Project's public IRS documents:

> While funding for Tor originally focused on basic research to better understand anonymity, privacy, and censorship-resistance, the majority of funding now falls into three categories: development funding from groups like Radio Free Asia and DARPA to design and build prototypes based on research done both inside Tor and at other institutions; deployment funding from organizations like the US State Department and Sweden's Foreign Ministry; and unrestricted contributions from private foundations, corporations, and individual donors. Following is a breakdown of the Tor Project's funding sources for the period ended June 30, 2020.

> Funding from US government sources: US State Dept - Bureau of Democracy, Human Rights and Labor $752,154; Georgetown University - National Science Foundation $98,727; Radio Free Asia/Open Technology Fund $908,744; New York University - Institute of Museum and Library Services $101,549; Georgetown University - Defense Advanced Research Projects Agency $392,008.

> Funding from non-US government sources: Digital Impact Alliance - United Nations $25,000; Swedish International Development Cooperation Agency (SIDA) $284,697.

> Funding from corporate sources: Mozilla $157,500; Avast $50,000; Mullvad $50,000.

> Funding from private foundations: Open Source Collective $23,100; Media Democracy Fund $270,000; Zcash Foundation $51,122; Mozilla Open Source Support (MOSS) $75,000; RIPE $53,114; Craig Newmark Philanthropic Fund $50,000; Stefan Thomas Charitable Foundation $50,000; Kao Foundation $10,000; Marin Community Foundation $1,000; individual donations $890,353.

adamfisk wrote at 2021-12-01 20:36:41:

Yes they’ve received funding from DARPA. I realized I forgot that after I posted. Good catch. To my knowledge, that funding is for new anti-censorship transports to sneak traffic in and out of censored countries.

pphysch wrote at 2021-12-02 00:06:36:

And the State Department are definitely warmongers.

SecState Kissinger orchestrated the incineration of Laos, Cambodia and Vietnam.

SecState Powell orchestrated the flattening of Iraq.

SecState Clinton orchestrated the butchering of Libya.

SecState Pompeo tried and failed to orchestrate the annihilation of Iran by assassinating top officials and drawing them into war.

And so on and so forth. These aren't even theories. The State Department is closely involved in destabilizing sovereign governments through the full spectrum of means, including war, to advance Washington's interests.

yownie wrote at 2021-12-01 09:37:04:

>my own Lantern

Brilliant riposte, but I am curious: what software are you referring to here?

necovek wrote at 2021-12-01 11:21:58:

A quick look through their comment submissions points at

https://www.getlantern.org/

:

      https://news.ycombinator.com/item?id=20824759#20826587

tptacek wrote at 2021-11-30 23:45:56:

Signal isn't funded by the military, by OTF/BBG, or by any branch of the US government. People who claim otherwise are confused (deeply) about a program OTF ran that sponsored third-party security reviews and development projects (summer-of-code style), none of which was mediated through OTF --- it was just a bucket of money.

You should be extremely skeptical about people who bring OTF/BBG up in these discussions. I have complicated feelings about Tor stemming mostly from culture and effectiveness concerns and would push back on claims that it's co-opted by the Navy or corporate interests, but at least I can see a clear (if silly) line connecting Tor to these supposed conflicts of interest.

CyanBird wrote at 2021-12-01 08:09:47:

> Signal isn't funded by the military

Correct, it is not funded by "the military", but this is incorrect

> any branch of the USG government

Because Signal/TextSecure received considerable seed capital from Radio Free Asia, a CIA spinoff, with the explicit aim of funding the development of the cryptography at the grassroots level - not to have full control of it as the NSA would have, but because having strong cryptography on such platforms (Telegram might be another) is highly effective against perceived US enemies like Iran or Syria, and allows their assets/agents to communicate more easily while abroad without bulky extra proprietary phones or software.

All of that above is mentioned at length in Surveillance Valley, btw.

tptacek wrote at 2021-12-01 15:42:32:

It's like you read 4 words from my comment and stopped.

tialaramex wrote at 2021-12-01 00:22:49:

As I understand it the technology behind Tor is strengthened by an arms race. You _want_ several different well-funded entities running nodes, because that makes the service better for everybody. Even if some of those entities are _hostile_ they still help unless one entity controls a large portion of interior nodes and even then you're only giving metadata to that single entity (whichever it is) by using Tor, not anybody else - which is better than you're going to do with alternative technologies.
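A toy model can make the "large portion of interior nodes" point concrete. Assuming relays were chosen uniformly at random (real Tor uses persistent guards and bandwidth weighting, so this is only a first-order illustration), a passive adversary running a fraction f of relays can correlate roughly the circuits where it holds both the entry and exit position, i.e. about f² of them:

```python
# Toy estimate: fraction of circuits a passive adversary can correlate
# end-to-end if it runs a fraction `f` of relays and relays were picked
# uniformly. Ignores guard pinning, bandwidth weighting, and active
# attacks - illustrative only.
def correlated_fraction(f: float) -> float:
    return f ** 2  # adversary must hold both the entry and the exit

for f in (0.05, 0.10, 0.25):
    print(f"{f:.0%} of relays -> ~{correlated_fraction(f):.2%} of circuits")
```

The quadratic falloff is why adding honest (or even mutually hostile) operators helps: each entity's usable share of circuits shrinks faster than its share of nodes.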

yownie wrote at 2021-12-01 09:56:51:

This analogy unfortunately cuts both ways: if you've got technology that securely undermines the majority government / power structure, an intelligence agency can always come in and use it to foment an insurgency movement.

Which also, unfortunately, points to them having exploits in said tools that no one has discovered yet.

They can still maintain generalized situational control via additional superiority vectors(MASINT, HUMINT, GEOINT, OSINT, FININT etc.)

b8 wrote at 2021-11-30 23:49:11:

Ulbricht was caught via poor OPSEC and not via a Firefox/Tor 0day, afaik. Though there was/is speculation that a Firefox/Tor 0day was used to bring down some Tor markets, and possibly to locate the Silk Road's server. Silk Road 2.0 was brought down in a few months, which could indicate such a 0day existed - or that it was run by some former Silk Road staff members who got doxed when Silk Road 1.0 was shut down.

ichydkrsrnae wrote at 2021-12-01 00:50:51:

Ulbricht was caught because an FBI agent, who would read things slowly and twice, recognized these 4 letters : _heyy_.

That's how Ulbricht sometimes spelled _hey_, and the agent had seen that particular spelling before in his investigation, in an email from Ulbricht's student email address.

Nick Bilton's book “American Kingpin: The Epic Hunt for the Criminal Mastermind Behind the Silk Road” is a great read, highly recommended.

nyolfen wrote at 2021-12-01 03:30:10:

it strikes me as extremely naive to take this at face value. see

https://www.reuters.com/article/us-dea-sod-idUSBRE97409R2013...

much more likely -- sigint tooling was applied to identify ulbricht, bulk metadata was turned over for his comms history, and it was pored over for things they could connect with sr to get warrants. imo, at least.

but getting to claim you're such a sharp investigator that you can figure it out by noticing the word _heyy_ makes for a much better story to tell an author.

ichydkrsrnae wrote at 2021-12-01 09:47:52:

It was more complicated than just _heyy_, but I won't spoil the book.

It's been a while since I've read it, but my impression was that solving the case was mostly traditional casework - and a lot of it, by many different people/agents/agencies.

That Reuters article certainly gives pause. Thanks for the link.

unobatbayar wrote at 2021-12-01 01:13:53:

That's what they want you to think. He was caught because nothing can match the surveillance arsenal of the NSA.

ichydkrsrnae wrote at 2021-12-01 01:24:48:

That's not what I think; that's what Nick Bilton thinks. The quality of his book makes me partial to his thesis, of course, but NSA conspiracy blah adds nothing.

Also, lots more went into catching him than just _heyy_, but that was the lucky break that got him caught. Now he shares a prison with Dr. Unabomber Kaczynski.

twobitshifter wrote at 2021-12-01 01:59:49:

That could be the story, but since parallel construction is routinely used to hide the existence of surveillance tools and back doors, it's not unreasonable to doubt it.

I thought I had heard Stack Overflow was involved - is that looped in somehow?

ichydkrsrnae wrote at 2021-12-01 02:21:55:

I don't recall StackOverflow being mentioned, no, but it's been a few years since I've read it.

twobitshifter wrote at 2021-12-01 13:22:13:

https://slate.com/technology/2013/10/silk-road-s-dread-pirat...

ichydkrsrnae wrote at 2021-12-01 02:30:18:

Correction: He was transferred to a penitentiary in Tucson, Arizona.

ichydkrsrnae wrote at 2021-12-01 02:48:43:

Have to admit, I was impressed with the US government's ability to recover bitcoin ransoms paid for cyberattacks. I'm not sure if impressed is the right word.

rzz3 wrote at 2021-12-01 07:32:13:

Wtf, who doesn’t add extra y’s to hey sometimes? That wasn’t evidence.

ichydkrsrnae wrote at 2021-12-01 07:54:12:

I don't want to spoil the book; but, yes, that detail got him caught.

acoard wrote at 2021-12-01 15:26:49:

It’s not fiction you’re spoiling, but a factual conversation about events that you’re not going into due to spoilers. It is an odd defence that kills the conversation when other people bring up good points.

The parallel construction argument seems way more plausible if there’s nothing else besides “heyy”. If there is more, please say what it is instead of mentioning it exists but refusing to say it.

SavantIdiot wrote at 2021-11-30 23:51:08:

Where is any evidence of Tor being a military surveillance project? I find it hard to believe an open-source project like this has been infiltrated. Yes, there is suspicion some ECC curves are compromised, but only the ones provided by NIST. I'd really like to see evidence regarding Tor.

tbihl wrote at 2021-12-01 00:35:42:

The seed for that line of thinking is the fact that a US Navy lab built it.[0] Having said that, I believe that's the only basis and is a far cry from making the theory convincing or even probable.

[0]

https://en.m.wikipedia.org/wiki/Tor_(network)

SavantIdiot wrote at 2021-12-01 00:43:12:

Wow, I feel like an idiot. All this time I had no idea the Navy built it, when a simple Wiki search would have pointed that out. Thanks!

adamfisk wrote at 2021-12-01 02:21:59:

“The Navy built it” is a bit of an exaggeration. Paul Syverson did early work on it at the Naval Research Lab, and Roger Dingledine and Nick Mathewson added to the collaboration at approximately the same time, with neither having anything to do with the Navy. That’s the extent of the military connection - some relationship in the first year or so of an 18 year or so project.

yownie wrote at 2021-12-01 10:03:10:

There's a been a suspiciously downplayed number of ephemeral hidden services that get raided / internationally taken down on the Tor network for it to be mere circumstance.

No one takes much notice since they regularly host the worst content on the internet.

raxxorrax wrote at 2021-12-01 15:54:32:

Could just as well be insiders, though, and operations that were planned for years.

nova22033 wrote at 2021-12-01 03:41:53:

Did you even click on the link? Signal gives away NOTHING.

suetoniusp wrote at 2021-11-30 23:39:56:

Thank you. I never knew the source of the ridiculous theory that the internet sprang from spying attempts on the Vietnamese. I am always looking for keywords to filter conspiracy weirdos. Yasha Levine added.

hutzlibu wrote at 2021-11-30 22:54:06:

"are the fact that current privacy movements like Tor, Signal, OTF, BBG are fundamentally military funded and survive on government contracts."

Are those "facts" available for investigation, without having to buy the book?

(that Tor is partly US administration funded is known, but Signal? And what are OTF and BBG?)

xster wrote at 2021-11-30 23:30:49:

https://www.opentech.fund/results/supported-projects/open-wh...

Funded by Open Technology Fund (OTF)

https://en.wikipedia.org/wiki/Open_Technology_Fund

Which is funded by Radio Free Asia (RFA)

https://en.wikipedia.org/wiki/Radio_Free_Asia

It had a few reboots but was created as a CIA program in 1951 (

https://en.wikipedia.org/wiki/Radio_Free_Asia_(Committee_for...

) to blast shortwaves into China from Manila to try to overthrow the Chinese government. Rebooted more recently since the advent of the Great Firewall of China.

evgen wrote at 2021-12-01 00:32:15:

Wow, that is so thin it is transparent. If this is the sort of 'proof' that we are going to find then I am glad you posted the ref here so that I could add yet another kook to the list of those whose privacy/security rantings and books I can ignore. The biggest danger to long-term privacy projects is not the risk of taking advantage of an opportune partnership with a government agency when incentives align, it is conspiracy nutjobs poisoning the well with their paranoia and delusions.

hutzlibu wrote at 2021-12-01 01:30:40:

And Signal?

The main tool, used for private communication?

t0mas88 wrote at 2021-11-30 20:09:58:

So if you have something to hide, don't use iCloud backup.

And WhatsApp will give them the target's full contact book (that was to be expected), but _also_ everyone who has the target in their contact list. That last one is quite far-reaching.

kf6nux wrote at 2021-11-30 21:08:47:

> if you have something to hide

Most people don't realize that most people have something to hide. The USA has so many laws on its books, many of which are outright bizarre[0] and some of which normal people might break in the course of ordinary life[1].

And that's only counting _current/past_ laws. It wasn't that long ago that a US President was suggesting all Muslims should be forced to carry special IDs[2]. If you have a documented history of being a Muslim, it could be harder to fight a non-compliance charge.

[0]

https://www.quora.com/Why-is-there-a-law-where-you-can-t-put...

[1]

https://unusualkentucky.blogspot.com/2008/05/weird-kentucky-...

[2]

https://www.snopes.com/fact-check/donald-trump-muslims-id/

kingcharles wrote at 2021-11-30 23:53:23:

I always liked this one I found in the Illinois statutes - it basically criminalizes every person online:

Barratry. If a person wickedly and willfully excites and stirs up actions or quarrels between the people of this State with a view to promote strife and contention, he or she is guilty of the petty offense of common barratry[.]

https://www.ilga.gov/legislation/ilcs/ilcs4.asp?DocName=0720...

akira2501 wrote at 2021-12-01 01:58:32:

Barratry typically implies that this is specifically being done with lawsuits and other legal instruments, not in the general case.

raxxorrax wrote at 2021-12-01 15:58:04:

There is a renaissance of such laws regarding causing offense. That would basically cover anybody whose face you don't like? I wonder how much consideration goes into suggestions like this. The side effects should normally hit you like a truck.

president wrote at 2021-11-30 21:47:12:

Did you even read the Snopes article you referenced before making what seems like a definitive claim that Trump was suggesting Muslims carry special IDs? Because Snopes' own rating is a "Mixture" of truth and falsehood, and if you read the assessment, it is grasping at straws to even reach that conclusion.

kf6nux wrote at 2021-12-01 00:28:24:

Yes, "mixed" means you have to read the nuance. I think I accurately captured the reality. If you have a correction to offer, please do.

EDIT: Ultimately, the nuance in that history is not relevant to the point that criminal law changes to include new categories in unexpected ways.

president wrote at 2021-12-03 00:01:56:

Sure, I can accept there is some nuance, but the phrasing and definitive manner of your original statement is very misleading. I'm not the biggest fan of the guy, but casually claiming that he suggested the idea, when in actuality it was an idea posed by a reporter, is bad faith in my opinion.

hunterb123 wrote at 2021-11-30 22:30:21:

> “Certain things will be done that we never thought would happen in this country in terms of information and learning about the enemy,” he added. “We’re going to have to do things that were frankly unthinkable a year ago.”

> “We’re going to have to look at a lot of things very closely,” Trump continued. “We’re going to have to look at the mosques. We’re going to have to look very, very carefully.”

That's all he said to the interviewer. The interviewer asked the hypothetical and suggested the special identification! He wouldn't take the bait, so since he didn't answer the hypothetical they said "he wouldn't deny it" and wrote the campaign of hit-piece articles anyway. Whatever response they got, they would have written that same piece. If he had answered one way, they would have quoted him out of context. Since he responded generically, it's obviously drummed up. The fact check is hilarious. "Mixed", lol.

Never answer a hypothetical, it's always a trap.

daqhris wrote at 2021-12-01 00:43:45:

Your last sentence just made me freak out, realizing I've previously done exactly that in front of a "law officer".

I never for one second thought it could be a trap; I was overly willing to cooperate and truthfully respond to a "theoretical" inquiry. Damn, it hurts in retrospect.

rootusrootus wrote at 2021-12-01 00:48:51:

> That's all he said to the interviewer

And then the next day, he clarified:

Reporter: "Should there be a database or system that tracks Muslims in this country?"

Trump: "There should be a lot of systems, beyond databases. I mean, we should have a lot of systems."

And then he tried to backpedal. Decided it was a watch list, not a database, etc. Basically the usual shtick of his where he tries to say everything and nothing at the same time.

hunterb123 wrote at 2021-12-01 02:14:19:

Again that's a generic response:

> There should be a lot of systems, beyond databases. I mean, we should have a lot of systems

Beyond databases. What does that mean? That could be analog systems; that could be anything not stored in a computer.

Nothing to do with identification, which would need a database. It's a generic answer to avoid a hypothetical. It's a non-answer.

He said nothing, not everything. You are attributing the reporters question to him. The reporter is posing the hypothetical that they created in the first place by the initial interview.

My main point was that hypotheticals are always a trap (unless among friends!), but that's a great example of an obvious one.

The usual shtick is to say nothing, because the journalistic usual shtick is to ask gotcha hypotheticals.

danShumway wrote at 2021-12-01 04:44:24:

You're kind of quibbling over details. The below quote is already bad enough:

> "We’re going to have to look at the mosques. We’re going to have to look very, very carefully."

I already do not trust the person who has said that. Does it really matter if he proposed a full-fledged ID system? He still proposed monitoring mosques. He still proposed surveillance based on religious identity.

The correct answer to that question, "should Muslims be subject to special scrutiny", is a simple "no". I don't really get the debate about hypotheticals; this is a question that does have a straightforward, right answer. And the implications here in regards to surveillance and ordinary people having stuff to hide -- those implications are all the same regardless of whether or not Trump actually proposed a literal database.

He was open to increased surveillance on Americans based on their religious identity, he didn't immediately shut the idea down.

hunterb123 wrote at 2021-12-01 16:10:39:

Details are important. The media campaigns are claiming he wanted Muslim identification, a system THEY proposed in their hypothetical. When he didn't confirm they said "he wouldn't deny it" as their proof of support.

> The below quote is already bad enough. He still proposed surveillance based on religious identity.

He said nothing about citizens or monitoring them based on religious identity. He said look at mosques, that's all. Mosques are often the target of attacks.

https://search.brave.com/search?q=mosque+coordinated+attack&...

danShumway wrote at 2021-12-01 20:23:01:

Are you proposing that increased surveillance of mosques is to _protect them?_ That requires a certain level of imagination given the full context of the quote:

> "Certain things will be done that we never thought would happen in this country in terms of information and learning about the enemy," he added. "We’re going to have to do things that were frankly unthinkable a year ago."

> "We’re going to have to look at a lot of things very closely," Trump continued. "We’re going to have to look at the mosques. We’re going to have to look very, very carefully."

----

And once again, it kind of doesn't matter. An increased focus on monitoring places of worship _is_ monitoring people based on their religious identity. I don't know a single Christian who would argue to me that monitoring churches isn't the same thing as monitoring Christians.

Mosques and churches are not abstract concepts that are divorced from the people inside of them. When you monitor an institution, you are necessarily monitoring the people inside of it, and it is reasonable for them to be concerned about the government taking an interest in their religious-identity. To argue otherwise requires someone to completely divorce religious identity from the _practice_ of religion, and that's just not a reasonable argument to make.

----

> Details are important.

Not in the context of the original statement, "ordinary people often do have something to hide, and should care about privacy." Look, whatever, you trust Trump. You shouldn't, but you do. Fine.

Do you trust Biden? Do you trust the current government not to attempt to monitor you based on your vaccine status?

You're fighting over the idea that "your guy" wouldn't surveil ordinary people, but this also kind of doesn't matter because your guy isn't in the White House right now, and I can guarantee you that Republicans are never going to have permanent power over the government. No party wins forever. You have as much reason as anyone else to care about personal privacy, so why fight over who specifically is a threat? Does it change anything about the overall privacy debate?

rootusrootus wrote at 2021-12-01 16:50:25:

> Again that's a generic response

Like I said, he always manages to say exactly the right things so the people who support him will read between the lines, but leave just enough ambiguity so those same people can quibble constantly over whether that was what he really meant.

> hypotheticals are always trap

He could have just said "No." Or "I have no such plans at this time." if he wanted to sound like a typical politician. His circumlocution is legendary, because it allows _everyone_ to believe what they want to believe. Politicians all have this problem, but Trump elevates it to a whole new level.

georgyo wrote at 2021-11-30 20:14:29:

You and the person you are communicating with must both not use iCloud backup. And since Apple pushes the backup feature pretty heavily, you can be reasonably sure that the person you are communicating with is using backups. I.e., you cannot use iMessage.

xanaxagoras wrote at 2021-11-30 20:42:32:

I got off all Apple products when they showed me their privacy stance is little more than marketing during the CSAM fiasco, but IIRC the trouble with iCloud backup is it stores the private key used to encrypt your iMessages backup. Not ideal to be sure, but wouldn't iMessage users be well protected against dragnet surveillance, or do we know that they're decrypting these messages en masse and sharing them with state authorities?

novok wrote at 2021-12-01 10:24:23:

Wouldn't you think most large states have hacked Apple's iCloud backup servers 20 times over by this point?

vmception wrote at 2021-11-30 21:22:23:

iCloud backup can backup your whole phone, specifically the files section. iOS and OSX users can save anything to that.

fumar wrote at 2021-11-30 20:27:44:

Has Apple made any public statements regarding iCloud's lack of privacy features? It takes the wind out of their privacy marketing, which is effectively hurting ad tech but not truly protecting consumers from state-level actors with data access.

amatecha wrote at 2021-11-30 20:37:33:

Kind of. These details are indeed publicly written on their website[0]. Do many users ever read this page? Probably not.

[0]

https://support.apple.com/en-us/HT202303

fumar wrote at 2021-11-30 21:23:17:

Here is an excerpt. The language sounds like encryption is enabled and the chart includes iCloud features as server and in transit protected. Seems like smoke and mirrors then.

> On each of your devices, the data that you store in iCloud and that's associated with your Apple ID is protected with a key derived from information unique to that device, combined with your device passcode which only you know. No one else, not even Apple, can access end-to-end encrypted information.

nicce wrote at 2021-11-30 22:42:43:

E2EE for backups was in the iOS 15 beta but was removed (it did not land in the release) after they changed the timetable of the CSAM scanning feature. So we will see if we get E2EE backups once that image scanning lands.

sschueller wrote at 2021-11-30 20:13:25:

Can you turn that off if you have iCloud, or do you need to not use iCloud altogether?

KennyBlanken wrote at 2021-11-30 22:08:21:

Yes, and you can delete old backups on iCloud - and then switch to local, automatic, fully encrypted backups to a Mac or PC running iTunes.

HN tends to get very frothy-at-the-mouth over Apple and privacy, but the reality is that iPhones can easily be set up to offer security and privacy that are best in class. They play well with self-hosted sync services like Nextcloud... and unlike the Android-based "privacy" distros, you're not running an OS made by a bunch of random nameless people, you can use banking apps, etc.

The only feature I miss is being able to control background data usage like Android does.

ceejayoz wrote at 2021-11-30 20:35:51:

You can turn it off individually just for Messages, but you're still left not knowing the state of the setting on the other end.

lupire wrote at 2021-11-30 20:11:09:

If you have something to hide, don't give a copy to _any_ third-party.

Even a second party is a risk.

georgyo wrote at 2021-11-30 20:11:54:

It says Telegram has no message content. Isn't Telegram not E2EE by default, instead requiring explicit steps to make a conversation encrypted?

Either way, it looks like Signal wins by a lot. The size of its spot is so small it seems almost squeezed in. But only because they have nothing to share.

nimbius wrote at 2021-11-30 20:42:58:

for signal users this means the messages of course _do_ exist on your phone, which will be the first thing these agencies seek to abscond with once you're detained, as it's infinitely more crackable in their hands.

as a casual reminder: The Fifth Amendment protects your speech, not your biometrics. do not use face or fingerprint to secure your phone. use a strong passphrase, and if in doubt, power down the phone (android), as this offers the greatest protection against offline brute-force and side-channel attacks currently used to exploit running processes in the phone.

leokennis wrote at 2021-11-30 21:13:19:

My advice if you’re not on the level where three letter agencies are actively interested in your comings and goings:

- Use a strong pass phrase

- Enable biometrics so you don’t need to type that pass phrase 100 times per day

- Learn the shortcut to have your phone disable biometrics and require the pass phrase, so you can use it when police are coming for you, you're entering the immigration line at the airport, etc. - on iPhone this is mashing the side button 5 times

ndesaulniers wrote at 2021-11-30 23:22:25:

> Learn the shortcut to have your phone disable biometrics and require the pass phrase

On my Pixel (Android), it's hold the power button for ~2 seconds then select Lockdown.

FactCore wrote at 2021-12-01 01:59:25:

In case anyone with an Android is confused because they don't see the option: I believe that you have to explicitly enable the Lockdown option in Android's system settings before it shows up.

stjohnswarts wrote at 2021-12-01 03:29:06:

There are a couple of apps that will also lock down with a tap instantly. Sorry, I forget the names - I've been using an iPhone too long now to remember them - but they're handy if you have the phone in hand and "open". You can put a shortcut on every "page" of your Android and tap it to force locking the phone by passcode. So on most phones it would be a swipe and a tap, probably less than 200 milliseconds if you practiced it.

wskinner wrote at 2021-11-30 21:33:33:

On recent iPhones, the way to disable biometrics is to hold the side button and either volume button until a prompt appears, then tap cancel. Mashing the side button 5 times does not work.

minhazm wrote at 2021-11-30 21:42:46:

Not sure how recent you're talking but I have an iPhone 11 Pro and I just tested pressing the side button 5 times and it takes me to the power off screen and prompts me for my password the same way that side button + volume does.

Apple's docs also say that pressing the side button 5 times still works.

> If you use the Emergency SOS shortcut, you need to enter your passcode to re-enable Touch ID, even if you don't complete a call to emergency services.

https://support.apple.com/en-us/HT208076

cyral wrote at 2021-11-30 22:08:22:

Pressing it five times starts the emergency SOS countdown (and requires the passcode next time) on my iPhone XS. Maybe you have the auto-calling disabled?

samtheprogram wrote at 2021-11-30 23:25:54:

It doesn't on my 2nd Gen iPhone SE (2020). That said, anything that causes the "swipe to power off" screen to appear has the same effect, so essentially holding down the button for 5 seconds does the trick.

diebeforei485 wrote at 2021-11-30 23:56:46:

The side button 5 times thing is disabled by default, but can be enabled from Settings > Emergency SOS.

I just verified this on iOS 15.1 on an iPhone 12.

croutonwagon wrote at 2021-11-30 22:13:13:

Works fine on my 11, my wifes 12, her backup SE gen 2 and my backup SE gen1.

Just tested all of them

leokennis wrote at 2021-12-01 07:01:00:

I’m on an iPhone 13 and the latest iOS and it does work here. But so does your method…

But I guess yours is the “official” way to do it indeed:

https://www.imore.com/how-quickly-disable-face-id

kingcharles wrote at 2021-11-30 23:29:42:

If you _are_ at the level where TLAs are interested in you they will not give you a chance to mash that button. You will have a loaded gun pointed at your head out of nowhere and you will freeze. From experience.

quenix wrote at 2021-11-30 23:32:11:

Is that a story you mind sharing?

selectodude wrote at 2021-12-01 02:08:21:

He got popped for pedophilia if I remember correctly.

ChrisKnott wrote at 2021-12-01 07:26:53:

Not sure why this is downvoted; you are right.

normaler wrote at 2021-12-01 21:00:54:

I use this app on my phone

https://f-droid.org/en/packages/com.wesaphzt.privatelock/

It locks the phone when a movement threshold is broken, and then requires the password instead of biometrics to unlock the phone.

So the snatch-the-phone-while-it-is-unlocked attack vector gets harder.

upofadown wrote at 2021-11-30 23:09:59:

In most cases you are going to want to separately passphrase your messaging app so it is locked up when you are not using it. That makes everything else a lot easier. For example, there is a Signal fork that supports such operation:

https://github.com/mollyim/mollyim-android

babypuncher wrote at 2021-11-30 23:48:19:

So you're saying I should have to type a secure passcode every single time I want to read or send a message on my phone?

No thanks.

upofadown wrote at 2021-12-01 06:20:15:

I think that it would stay unlocked for a time, possibly until you locked it. Possibly such an arrangement would be more practical for something offline like encrypted email.

A compromise would be to just save the messages under a passphrase. You could use a public key so that you would only need the passphrase to read the old messages. I haven't heard of anything that actually does this.
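A minimal sketch of the symmetric half of that idea, using only the Python standard library (all function names are mine, and the SHA-256 counter-mode "cipher" is a toy stand-in purely for illustration; a real archive would use an authenticated cipher such as AES-GCM):

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Stdlib scrypt KDF: deliberately slow and memory-hard so the
    # passphrase resists offline brute force.
    return hashlib.scrypt(passphrase.encode(), salt=salt,
                          n=2**14, r=8, p=1,
                          maxmem=64 * 1024 * 1024, dklen=32)

def toy_stream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Toy SHA-256 counter-mode keystream, illustration only -
    # NOT an authenticated cipher.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def archive_message(passphrase: str, message: bytes):
    # Fresh salt and nonce per message; the key never needs to be stored.
    salt, nonce = os.urandom(16), os.urandom(16)
    key = derive_key(passphrase, salt)
    return salt, nonce, toy_stream_xor(key, nonce, message)

def read_archive(passphrase: str, salt: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    # Only someone who knows the passphrase can re-derive the key.
    return toy_stream_xor(derive_key(passphrase, salt), nonce, ciphertext)

salt, nonce, ct = archive_message("correct horse battery staple", b"old chat log")
assert read_archive("correct horse battery staple", salt, nonce, ct) == b"old chat log"
assert read_archive("wrong guess", salt, nonce, ct) != b"old chat log"
```

The public-key variant mentioned above would replace the write path with an asymmetric "seal" operation, so new messages could be archived without ever typing the passphrase; the passphrase would only protect the private key needed for reading.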

14 wrote at 2021-11-30 21:26:41:

I just tried this and it does not work on my iPhone. Is it only on a certain iOS? I am a bit behind on updates. Thanks

ribosometronome wrote at 2021-11-30 21:34:19:

That's actually the old method for iPhone 7 and before. Now, you can activate emergency SOS by holding the power button and one of the volume buttons. Assuming you don't need to contact any emergency contacts or services, just cancel out of that and your passcode will be required to unlock.

https://support.apple.com/en-us/HT208076

david_allison wrote at 2021-11-30 21:35:38:

Try: Hold "volume up" and "power" for 2 seconds

You'll feel a vibration, and biometric login will be disabled until you enter your passcode.

14 wrote at 2021-11-30 23:36:16:

That did the trick, thanks. But ultimately I'm behind on updates, so my phone could probably be broken into trivially with the forensic tools available to most law enforcement. I'm going to update soon.

kingcharles wrote at 2021-11-30 23:28:22:

Don't have any family or friends, either. If you refuse to talk and invoke your rights the government will just threaten to hurt those you love until you break and give up your passwords. From experience.

I liked it in Wrath of Man where one guy is acting tough as fuck until they bring his girl into the room.

Also, if you can, if you are encrypting data, use a hidden volume inside the first - that way you can give the government the outer password and they'll be happy thinking they have everything.

timbit42 wrote at 2021-11-30 20:59:01:

Signal recently added 'disappearing messages', which lets you specify how long messages in a chat you initiate remain before being deleted.

bigiain wrote at 2021-11-30 22:13:22:

Not "recently". Disappearing messages have been there for at least 5 years.

Almost _all_ my Signal chats are on 1 week or 1 day disappearing settings. It helps to remind everyone to grab useful info out of the chat (for example, stick dinner plan times/dates/locations into a calendar) rather than hoping everybody on the chat remembers to delete messages intended to be ephemeral.

The "$person set disappearing messages to 5 minutes" has become shorthand for "juicy tidbit that's not to be repeated" amongst quite a few of my circles of friends. Even in face-to-face discussion, someone will occasionally say something like "bigiain has set disappearing messages to five minutes" as a joke/gag way of saying what used to be expressed as "Don't tell anyone, but..."

(I just looked it up,

https://signal.org/blog/disappearing-messages/

from Oct 2016.)

timbit42 wrote at 2021-11-30 22:18:10:

Maybe it was only added recently on the desktop client.

upofadown wrote at 2021-11-30 22:41:37:

Keep in mind that any time a message is on flash storage there might be a hidden copy kept for technical reasons (e.g., wear leveling). It is hard to get to (particularly if the disk is encrypted) but might still be accessible in some cases.

I think encrypted messengers should have a "completely off the record" mode that can easily be switched on and off. Such a mode would guarantee that your messages are never stored anywhere that might become permanent. When you switch it off, everything is wiped from memory. That might be a good time to ensure any keys associated with a forward-secrecy scheme are wiped as well.

noasaservice wrote at 2021-11-30 21:11:37:

And a screenshot, or another camera, or a rooted phone can easily defeat that.

The analog hole ALWAYS exists. Pretending it doesn't is ridiculous.

wizzwizz4 wrote at 2021-11-30 21:14:51:

> _And a screenshot, or another camera, or a rooted phone can easily defeat that._

Not if the message has already been deleted. Auto-deleting messages are so the recipient doesn't have to delete them manually, not so the recipient can't possibly keep a copy.

summm wrote at 2021-11-30 21:37:58:

Exactly this. Even more: auto-deleting messages also mean the sender doesn't have to delete them manually. Most people do not understand this. I even had a discussion with an open source chat app implementer who insisted on not implementing disappearing messages because they couldn't really be enforced.

hiq wrote at 2021-11-30 22:01:46:

That's a different threat model, no messaging app is trying to protect the sender from the receiver. Disappearing messages are meant to protect two parties communicating with each other against a 3rd party who would eventually gain access to the device and its data.

bigiain wrote at 2021-11-30 22:15:14:

Wickr has a "screenshot notification to sender" feature (which of course, can be worked around by taking a pic of the screen without Wickr knowing you've done it).

timbit42 wrote at 2021-11-30 21:30:00:

What made you think I was pretending it doesn't?

croutonwagon wrote at 2021-11-30 22:11:47:

Also, iOS has a panic button. Hit the main/side button (on the right) five times really fast and Face ID/Touch ID is disabled and the passcode is required.

vorpalhex wrote at 2021-11-30 21:28:46:

Your statement on the 5th amendment is no longer accurate broadly, but the matter still has some cross-jurisdictional disagreement:

https://americanlegalnews.com/biometrics-covered-by-fifth-am...

torstenvl wrote at 2021-11-30 21:52:21:

District courts don't make law. Magistrates working for those district courts even less so. The case this news article cites has no precedential value anywhere - not even within N.D.Cal. - and should not be relied upon.

IAAL but IANYL

kingcharles wrote at 2021-11-30 23:37:44:

Agreed. That decision is unlikely to be repeated by any appellate court. IMO, all the rulings on biometrics not being testimonial are constitutionally correct, even if that sucks. A lot of constitutional rulings suck.

The real solution is for a federal statute to require warrants.

FalconSensei wrote at 2021-11-30 21:15:31:

> do not use face or fingerprint to secure your phone

but can't they force you to put your password in that case, instead of your finger?

caseysoftware wrote at 2021-11-30 21:20:38:

In general, no.

The contents of your mind are protected because you must take an active part to disclose them. Of course, they can still order you to give them the password and stick you in jail on Contempt of Court charges if you don't.

Check out Habeas Data. It's a fascinating/horrifying book detailing much of this.

ribosometronome wrote at 2021-11-30 21:40:04:

To err on the side of caution, it's best to make all your passcodes themselves an admission to a crime.

shadowgovt wrote at 2021-11-30 21:55:09:

"Your honor, the state agrees to not prosecute on any information inferrable from the text of the password."

"Understood. The defendant's Fifth Amendment right to protection from self-incrimination is secured. As per the prior ruling, the defendant will remain in custody for contempt of court until such time as they divulge the necessary password to comply with the warrant."

kingcharles wrote at 2021-11-30 23:44:17:

I don't know why you're being downvoted. For a start, if it was a third party that had the passcode and refused to divulge it, they can be held in jail until they reveal it, e.g. if your wife knows it. (There are many cases where people have been sentenced to years or decades in prison for not testifying.)

If it is you not divulging your own passcode, then legally the judge can't give you contempt, but in reality they could give you contempt until you fought it through the appellate court. Contempt is a special type of thing - certainly here in Illinois you have no right to a jury trial on contempt charges. You're just fucked.

shadowgovt wrote at 2021-12-01 02:20:51:

I believe judges can, in fact, hold a defendant for refusing to give up their own passwords, and that the contempt could be indefinite. This is a point of law that is not settled at the federal level yet, and at the state level it varies from jurisdiction to jurisdiction.

In one case, the appellate court at the federal level simply refused to hear a case that had been decided at the state supreme court level.

https://www.reuters.com/business/legal/us-supreme-court-nixe...

emn13 wrote at 2021-11-30 22:01:02:

They don't actually need your passphrase to unlock your phone - they just need somebody with the passphrase to unlock it for them. And if there's any doubt about who that is, then having that passphrase counts as testimonial; but if there's not, it might not count as testimonial.

Although there are apparently a whole bunch of legal details that matter here; courts have in some cases held that defendants can be forced to decrypt a device when the mere act of being able to decrypt it is itself a foregone conclusion.

(If you want to google a few of these cases, the all writs act is a decent keyword to include in the search).

The defendant never needs to divulge the passphrase - they simply need to provide a decrypted laptop.

oceanplexian wrote at 2021-11-30 23:18:18:

We really should up our game on encryption - perhaps some kind of time-based key rotation that inherently self-destructs, rendering the data unusable if you don't authenticate with it every so often. If you are physically unable to unlock a device, you can't be compelled to do so.
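A toy model of that self-destruct logic in Python (all class and method names are mine; the XOR step counts as a one-time pad only because the random key is at least as long as the message - this sketches the timing behavior, not real device crypto):

```python
import os
import time

class ExpiringKey:
    """A key that wipes itself unless the owner re-authenticates in time."""

    def __init__(self, ttl: float):
        self.key = os.urandom(32)                 # pad for a short message
        self.ttl = ttl
        self.deadline = time.monotonic() + ttl

    def _alive(self) -> bool:
        if self.key is not None and time.monotonic() > self.deadline:
            self.key = None                       # deadline missed: wipe key material
        return self.key is not None

    def authenticate(self) -> None:
        # Each successful authentication extends the self-destruct deadline.
        if not self._alive():
            raise PermissionError("key already destroyed")
        self.deadline = time.monotonic() + self.ttl

    def crypt(self, data: bytes) -> bytes:
        # XOR with the key; for messages no longer than the key this is a
        # one-time pad, so losing the key makes the ciphertext unrecoverable.
        if not self._alive():
            raise PermissionError("key already destroyed")
        return bytes(a ^ b for a, b in zip(data, self.key))

k = ExpiringKey(ttl=0.05)
ct = k.crypt(b"secret")
assert k.crypt(ct) == b"secret"   # still inside the authentication window
time.sleep(0.1)                   # owner fails to re-authenticate in time
try:
    k.crypt(ct)
    assert False, "key should be gone"
except PermissionError:
    pass  # key is destroyed; nobody, owner included, can decrypt now
```

A real implementation would need the wipe to happen in tamper-resistant hardware rather than application code, since an adversary with a disk image could simply ignore the deadline check.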

Y_Y wrote at 2021-11-30 21:57:27:

My passwords are so obscene it's a crime to write them down.

dylan604 wrote at 2021-11-30 22:50:40:

great, so they'll just be able to hit you with lewd charges on top of everything else they are filing.

randomluck040 wrote at 2021-11-30 21:22:05:

I think a fingerprint is easier to get if you’re not willing to cooperate. However, I think if they really, I mean really want your password, they will probably find a way to get it out of you. I think it also depends if it’s the local sheriff asking for your password or someone from the FBI while you’re tied up in a bunker somewhere in Nevada.

chiefalchemist wrote at 2021-11-30 22:05:33:

Apple should allow for two passwords: one the real password, the other triggering a "self-destruct" mode.

Knowing that is possible, law enforcement would then hesitate to ask.

detaro wrote at 2021-11-30 22:09:49:

_using_ such a self-destruct mode would be a certain way to get yourself charged with destroying evidence/contempt of court/... though.

kingcharles wrote at 2021-11-30 23:48:47:

This would be difficult to prove. They would have to know for certain the evidence was on there to begin with. I don't see the prosecutor easily meeting their burden of proof on this charge.

This is how the statute is worded here in Illinois:

"A person obstructs justice when, with intent to prevent the apprehension or obstruct the prosecution or defense of any person, he or she knowingly commits any of the following acts: (1) Destroys, alters, conceals or disguises physical evidence."

Ugh. It's a vague law. I don't even know how they would prosecute that for virtual evidence held on a device that they didn't already have a view inside of.

dylan604 wrote at 2021-11-30 22:53:02:

i was under such duress that i was shaking so badly that i made typos in my 30 character password 10 times. the loss of evidence is not my fault as it is the people putting me under that duress. don't think it'll hold up though

chiefalchemist wrote at 2021-12-01 15:31:38:

No 5th Amendment protection? If you spoke the command / "password", would it matter?

oceanplexian wrote at 2021-11-30 23:28:31:

FaceID can already prevent a device from unlocking if someone is sleeping. In theory devices could detect if they were being unlocked "under duress" by using biometrics to look at facial expressions, heartbeat, etc, and then wipe themselves. I don't know how practical in reality but perhaps it could be a feature you turn on in a sensitive environment.

cronix wrote at 2021-11-30 21:25:12:

How? They can physically overpower you and place the sensor against your finger, or in front of your eye and pry it open without your consent and gain access with 0 input from you. How do they similarly force you to type something that requires deliberate, repeated concrete actions on your part?

kingcharles wrote at 2021-11-30 23:50:02:

In my case they threatened to harm my wife if I didn't stop refusing. After my case is over I'll happily release the video tapes so you can see how this shit works.

DoItToMe81 wrote at 2021-12-01 05:34:20:

Please do. Very few people realize just how bad things can get with law enforcement.

xur17 wrote at 2021-11-30 23:27:01:

https://arstechnica.com/tech-policy/2020/02/man-who-refused-...

adgjlsfhk1 wrote at 2021-11-30 21:19:51:

no. The 5th amendment has been read weirdly by the supreme court.

jfrunyon wrote at 2021-11-30 22:09:44:

The fifth amendment doesn't protect either speech or biometrics. Nor does it protect passwords.

kingcharles wrote at 2021-11-30 23:39:05:

You are wrong. It protects passwords as speech, as they are testimonial, per many court rulings. It does not protect biometrics based on law that basically says the police can force you to give up your fingerprints for their records, so they can sure as fuck force your finger onto a reader.

jfrunyon wrote at 2021-12-01 20:51:52:

> It protects passwords as speech, as they are testimonial, per many court rulings.

Not true.

https://www.reuters.com/business/legal/us-supreme-court-nixe...

for example.

xuki wrote at 2021-12-01 00:06:59:

Can they force someone to LOOK at the phone? FaceID with attention check will need you to look before it opens.

fragmede wrote at 2021-12-01 03:58:00:

Arguably, yes. That's why it's important to know the shortcut on iOS to render faceid inoperable until you give it the password - mash the power button five times fast!

skrowl wrote at 2021-11-30 21:23:47:

Telegram is encrypted OVER THE WIRE and AT REST by default with strong encryption no matter what you do. It's E2EE if you select private chat with someone.

Lots of FUD out there about Telegram not being encrypted that's just not true. There's nothing either side can do to send a message in clear text / unencrypted.

Andrew_nenakhov wrote at 2021-11-30 23:04:57:

"Encrypted OVER THE WIRE and AT REST" means that telegram has easy and unfettered access to chat logs. So they can give it up to authorities. (I don't argue that they DO, just that they very much CAN).

This is proven by an extremely simple experiment: you log in on your new phone, enter password and instantly see all chats.

Another simple experiment suggests the chats are unlikely to even be encrypted at rest: Telegram has an extremely fast server-side message search. You log into a web client, and half a second later you can type a search query and uncover chats from years ago.

nicce wrote at 2021-11-30 23:43:17:

It kinda depends on whether images and videos are encrypted separately and only indexed at first.

How much data is there in your chats? 1 megabyte is around one thick book in plaintext.

AES-CBC, as an example method, decrypts at more than 2 gigabits per second with hardware opcodes (on a 2012 processor), for example if we look at this data:

https://www.bearssl.org/speed.html

At that scale, it is impossible to tell from search delay alone whether the plaintext is encrypted underneath.
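
The throughput argument above is easy to sanity-check with back-of-envelope arithmetic. This sketch just converts the ~2 Gbit/s AES-CBC figure quoted above into decryption time per archive size; the constant is the benchmark number from the linked table, not a measurement of Telegram's servers:

```python
# Back-of-envelope: time to decrypt a chat archive at the ~2 Gbit/s
# AES-CBC throughput quoted above (hardware opcodes, 2012-era CPU).
THROUGHPUT_BITS_PER_S = 2e9

def decrypt_time_ms(size_bytes: int) -> float:
    """Milliseconds to decrypt `size_bytes` at the assumed throughput."""
    return size_bytes * 8 / THROUGHPUT_BITS_PER_S * 1000

print(decrypt_time_ms(1_000_000))      # 1 MB ("one thick book") -> ~4 ms
print(decrypt_time_ms(1_000_000_000))  # 1 GB of chat history -> ~4 s
```

At roughly 4 ms per megabyte, decryption overhead would be invisible next to network and indexing latency, which is the point being made.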

shawnz wrote at 2021-12-01 00:09:21:

Encryption over the wire and at rest is a basic expectation of any web service today. They would meet that criterion just by using SSL and disk encryption on their servers. E2EE is a much stronger criterion.

octorian wrote at 2021-12-01 00:28:11:

> It's E2EE if you select private chat with someone.

And it's not E2EE if you fail to select private chat.

What this means is that any conversations where you do select E2EE are the ones the "authorities" will take interest in, even if only to the extent of metadata.

That's the fundamental problem with E2EE-by-exception, rather than by default. It calls attention to specific data, even if it's not cleartext, rather than obscuring everything.

blueprint wrote at 2021-11-30 21:50:34:

(how) does the telegram server prevent unencrypted content?

also curious - how does telegram support encryption for chatrooms without the parties being known in advance? or are those chats not encrypted?

MertsA wrote at 2021-12-01 10:20:12:

Telegram only uses end to end encryption for secret chats. All other chats are only encrypted on the wire with Telegram's keys. Your comment was encrypted on the wire to HN but that's not going to do anything to keep it away from the FBI. The majority of all Telegram messages are only secured by Telegram's unwillingness to cave to outside pressure. It's in plaintext as far as they're concerned.

542458 wrote at 2021-11-30 21:25:20:

For somebody who isn’t super cryptography-savvy, what’s the difference between over the wire and e2ee? Does the former mean that telegram itself can read non-private-chat messages if it so chooses?

skinkestek wrote at 2021-11-30 21:42:24:

> For somebody who isn’t super cryptography-savvy, what’s the difference between over the wire and e2ee?

E2EE: As long as it is correctly set up and no significant breakthroughs happen in math, nobody except the sender and the receiver can read the messages.

> Does the former mean that telegram itself can read non-private-chat messages if it so chooses?

Correct. They say they store messages encrypted and store keys and messages in different jurisdictions, effectively preventing themselves from abusing it or being coerced into giving it away, but this cannot be proven.

If your life depends on it, use Signal; otherwise use the one you prefer and can get your friends to use (preferably not WhatsApp though, as it leaks all your connections to Facebook and uploads your data _unencrypted_ to Google for indexing(!) if you enable backups).

Edited to remove ridiculously wrong statement, thanks kind SquishyPanda23 who pointed it out.

SquishyPanda23 wrote at 2021-11-30 21:49:55:

> nobody except the sender, the receiver and the service provider can read the messages

E2EE means the service provider cannot read the messages.

Only the sender and receiver can.

skinkestek wrote at 2021-11-30 21:54:16:

Thanks! I edited a whole lot and that came out ridiculously wrong! :-)

SquishyPanda23 wrote at 2021-12-02 03:46:55:

Haha, no problem. I do that a lot too :)

skinkestek wrote at 2021-12-01 06:56:31:

Forgot to upvote you yesterday, done now ;-)

loeg wrote at 2021-11-30 21:32:23:

Yeah, if you connect to

https://facebook.com

and use messenger, it's encrypted over the wire because you're using HTTPS (TLS). But it's not E2EE.

Gigachad wrote at 2021-11-30 21:29:15:

Pretty much. End-to-end uses the encryption keys of both _users_ to send. Over the wire has both sides use the platform's keys, so the platform decrypts, stores in plain text, and sends it encrypted again to the other side. Over the wire is basically just HTTPS.

Daegalus wrote at 2021-11-30 21:31:03:

Over the wire is when it's encrypted during transmission between the user and Telegram's servers (HTTPS or SSL/TLS, etc.). At rest is when it's encrypted in their DBs or on their hard drives. Theoretically, Telegram can still read the contents if they wished to, by setting up the appropriate code or tools in between these steps.

E2EE means that the users exchange encryption keys and encrypt the data at the client, so that only the other client can decrypt it. Meaning Telegram can never inspect the data even if they wanted to.
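
The distinction can be sketched with a deliberately toy cipher (a SHA-256 XOR keystream, purely illustrative and not anything you should actually use): with transport-only encryption the server holds the key and can read the plaintext, while with E2EE it only ever relays an opaque blob.

```python
# Toy contrast between "over the wire" encryption and E2EE.
# NOT a real cipher suite; a SHA-256-based XOR keystream stands in
# for TLS / the real messaging crypto.
import hashlib
import itertools
import secrets

def keystream(key: bytes):
    # Infinite pseudo-random byte stream derived from the key.
    for counter in itertools.count():
        yield from hashlib.sha256(key + counter.to_bytes(8, "big")).digest()

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR stream cipher: encrypt and decrypt are the same operation.
    return bytes(a ^ b for a, b in zip(data, keystream(key)))

msg = b"meet at noon"

# Transport-only ("over the wire"): the client encrypts to the *server's* key.
server_key = secrets.token_bytes(32)
on_the_wire = xor_crypt(server_key, msg)
print(xor_crypt(server_key, on_the_wire))  # server reads: b'meet at noon'

# E2EE: the client encrypts to a key only the *recipient* holds; the
# server relays the blob but cannot invert it with its own key.
recipient_key = secrets.token_bytes(32)
blob = xor_crypt(recipient_key, msg)
print(xor_crypt(server_key, blob) == msg)  # almost surely False
print(xor_crypt(recipient_key, blob))      # recipient reads: b'meet at noon'
```

The real difference between the two modes is only *who holds the key*, which is exactly why "encrypted" alone says so little.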

Andrew_nenakhov wrote at 2021-11-30 23:08:27:

I very much doubt that Telegram really does encrypt messages "at rest": their server side full text search works extremely fast.

Daegalus wrote at 2021-11-30 23:50:58:

That's a fair assessment, I didn't make the original claim, just answered the definitions of the encryption states.

I haven't dug enough to know what telegram does or claims to do.

andreyf wrote at 2021-11-30 21:33:16:

yes. worth remembering also that even with e2ee, an ad-tech-driven company could have endpoints determine marketing segments based on the content of conversations and report those to the company to better target ad spend.

skinkestek wrote at 2021-11-30 21:44:51:

Also, as is the case with WhatsApp, they siphon off your metadata and even have the gall to make an agreement with Google to upload message content _unencrypted_ to Google when one enables backups.

farzher wrote at 2021-12-02 07:52:25:

are you trolling? telegram (and therefore the fbi) has full access to the content of every message, unless you use private chat, which nobody does, and which isn't even available on desktop. i use it, but it's about as private as discord. which is to say not at all

makeworld wrote at 2021-11-30 20:43:06:

> the FBI's ability to _legally_ access secure content

Maybe there are laws preventing legal access to message content? Maybe related to wherever Telegram is incorporated.

inetknght wrote at 2021-11-30 21:24:36:

> _Maybe there are laws preventing legal access to message content?_

Well sure. A lot of laws require a court order. In the U.S. that's usually not too difficult.

officeplant wrote at 2021-11-30 21:16:20:

It helps that Telegram is HQ'd in the UK and the operational center is in Dubai.

rootsudo wrote at 2021-11-30 21:24:01:

Does it? The UK and Dubai are US partners in intelligence gathering and have worked together several times.

Biggest example as of late:

https://www.bbc.com/news/world-middle-east-58558690

to11mtm wrote at 2021-11-30 20:17:47:

I don't know whether Telegram is E2EE by default (probably not). When you do a call on Telegram you are given a series of emoji that are supposed to match what the person on the other side has, and that's supposed to indicate E2EE for that call.

RL_Quine wrote at 2021-11-30 20:21:57:

Verification in band seems pretty meaningless, approaching security theatre.

nitrogen wrote at 2021-11-30 20:34:11:

For voice? It's hard to fake the voice of someone you know.

Muromec wrote at 2021-11-30 21:00:14:

you don't have to fake the voice, just mitm and record cleartext

ajsnigrutin wrote at 2021-11-30 21:12:40:

But they have to fake the voice, if I call the other person and say "my emoji sequence is this, this and that" for the other person to verify and vice-versa.

wizzwizz4 wrote at 2021-11-30 21:16:27:

Person A calls you. I intercept the call, so person A is calling _me_, and then I call you (spoofing so I look like Person A). When you pick up, I pick up, then I transmit what you're saying to Person A (and vice versa).

How do you know I'm intercepting the transmission? Does the emoji sequence verify the _call_, perhaps?

tshaddox wrote at 2021-11-30 21:42:27:

The emoji sequence is a hash of the secret key values generated as part of a modified/extended version of the Diffie-Hellman key exchange. The emoji sequence is generated and displayed independently on both devices _before_ the final necessary key exchange message is transmitted over the wire, so a man-in-the-middle has no way of modifying messages in flight to ensure that both parties end up generating the same emoji sequence.

I'm not a cryptographer, but that's what I glean from their explanation:

https://core.telegram.org/api/end-to-end/video-calls#key-ver...

summm wrote at 2021-11-30 21:39:57:

Both connections would show different emojis on both sides then. So you would need to somehow deep fake the voice of the one telling their emojis to the other one.

marcan_42 wrote at 2021-12-01 04:42:42:

The emoji sequence represents the secret key exchange between you and the other party. If you intercept the call, you are making one key exchange with person A, and another key exchange with person B. Due to the mathematics involved, there is no way for you to force both key exchanges to yield the same result.

For a "standard" DH key exchange it would be possible to brute force the emoji sequence to be the same (since it's too short to be resistant to brute forcing), but the protocol that Telegram uses specifically defends against that by having both sides commit to their share of the key ahead of time, so they cannot try different numbers.

https://core.telegram.org/api/end-to-end/video-calls#key-ver...

So person A and person B are going to see different emojis no matter what you do. To fake a phone verification while performing a man-in-the-middle attack you'd also have to fake their voices to each other. That's hard.
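
A toy sketch of why the emoji comparison defeats a silent MITM: a classic finite-field Diffie-Hellman exchange over a small demo prime. This is NOT Telegram's actual protocol (which, per the link above, adds a commitment step to prevent brute-forcing the short fingerprint); the hash-based "fingerprint" here stands in for the emoji sequence.

```python
# Toy DH demo: a man in the middle necessarily runs TWO key exchanges
# with two unrelated secrets, so the fingerprints ("emojis") differ.
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime; fine for a demo, far too small for real use
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def fingerprint(shared_secret: int) -> str:
    # Map the shared secret onto four "emoji" indices out of a 64-emoji set.
    digest = hashlib.sha256(shared_secret.to_bytes(16, "big")).digest()
    return "-".join(str(b % 64) for b in digest[:4])

# Honest call: both ends derive the same secret, hence the same emojis.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
assert fingerprint(pow(b_pub, a_priv, P)) == fingerprint(pow(a_pub, b_priv, P))

# MITM: Mallory substitutes her own public value in each direction.
m_priv, m_pub = keypair()
alice_emojis = fingerprint(pow(m_pub, a_priv, P))  # Alice <-> Mallory secret
bob_emojis = fingerprint(pow(m_pub, b_priv, P))    # Bob <-> Mallory secret
print(alice_emojis == bob_emojis)  # different secrets -> (almost surely) different emojis
```

To make the mismatched emojis go unnoticed, the attacker would have to alter what each party reads aloud, i.e. fake their voices, which is the point made above.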

ajsnigrutin wrote at 2021-12-01 10:54:22:

If i'm talking to a person I know in person, I'd recognise their voice.

Andrew_nenakhov wrote at 2021-11-30 23:14:27:

Real privacy is too burdensome for most users, so they feel just fine if the service owner promises in a stern voice that their chats are _really secure_.

It is not necessary to provide real security, do fingerprint verification, etc., if the users are already happy with the level of security they are promised.

marcan_42 wrote at 2021-12-01 04:47:38:

The emoji comparison thing is mathematically solid. Assuming the clients aren't backdoored (and the Telegram client is open source, so that's not that easy), there is no way for an attacker to make both sides show the same emoji. If they want to convince two users that they have an E2EE connection while performing a man-in-the-middle attack, they'd have to fake their voices to each other to change what emoji sequence they each read out. That's hard, and therefore this is real, meaningful privacy.

Andrew_nenakhov wrote at 2021-12-01 04:55:57:

Telegram can potentially perform a MITM at any time and generate matching emoji images for both sides of the conversation, since you can't really trust the app code to be the same as what they put on GitHub. If you've built it yourself, that'd reduce the risk, but nobody does that because blind trust is much easier.

tptacek wrote at 2021-11-30 20:52:19:

It is not, by default, and none of the group chats are.

racingmars wrote at 2021-11-30 23:00:13:

This chart is showing what messaging providers are _willing_ to give to law enforcement, _not_ a reflection of the technical capabilities of the messaging provider.

I assume what they're showing for Telegram (basically no data except IP/phone data if Telegram decides it's for a legit counter-terrorism activity) is a matter of Telegram business policy.

Signal gives the limited information they do because I assume they are subject to warrants from U.S. courts. Telegram is run, to my understanding, from jurisdictions where enforcing a U.S. court order would be difficult-to-impossible, and they keep the private keys to decrypt their stored message content split between servers in relatively non-overlapping legal jurisdictions, so even a successful seizure of data in one wouldn't be enough to decrypt message content.

That's all well and good -- and I appreciate Telegram for setting things up that way -- but that means at any time Telegram _could_ make a policy decision to cooperate with law enforcement and provide much more than what is shown on this chart. Signal, on the other hand, could choose to cooperate as much as they want but not have the technical capability to provide more information. (Barring them updating their client to intentionally build in a backdoor, etc., but I'm basing this on what the current implementation is.)

The other important thing about this chart: this is the unclassified version. Is there another classified document out there which says "we have a secret relationship with Telegram/whomever and they give us all the message content we want" but they don't advertise to the law enforcement community at large? They secretly use it to aid in parallel construction so they don't ever have to reveal that a messaging vendor is giving them message content in court? We have no idea.

tl;dr: Telegram looks great on this chart because of _policy_, not _technology_. I love Telegram, but I'm under no illusions that it's appropriate for talking about things I wouldn't want law enforcement to have access to. Luckily, I haven't found myself needing to talk to my friends about illegal activity.

MertsA wrote at 2021-12-01 10:34:32:

>Telegram looks great on this chart because of policy, not technology.

This is what puzzles me about Apple: they absolutely have the capability to MITM iMessage pretty discreetly. Because Apple just completely hand-waves away key distribution and can silently add and remove keys at their leisure, it's largely only policy that underpins their security. They're not Telegram; they aren't structured to be in a position to ignore demands from the justice system to assist with some agent's latest fishing expedition. How are they getting away with not providing stuff that they obviously have access to? The PDF lists "Pen Register: no capability"

godelski wrote at 2021-11-30 23:33:16:

TLDR: Telegram depends on trusting Telegram. Signal is trustless.

sgjohnson wrote at 2021-11-30 23:29:14:

Telegram isn’t E2EE by default.

My bet is on the fact that they are based in Russia, so they don’t give a shit about a US warrant or subpoena.

keraf wrote at 2021-12-01 08:19:50:

Telegram isn't based in Russia (anymore). The company is incorporated in Dubai since 2017 [0]. They opposed Russian warrants in the past, resulting in the blocking of the app in the territory for some time [1].

[0]

https://www.bloomberg.com/news/articles/2017-12-12/cryptic-r...

[1]

https://en.wikipedia.org/wiki/Blocking_Telegram_in_Russia

vadfa wrote at 2021-11-30 20:22:37:

That is correct. By default all messages sent over Telegram are stored permanently in their servers unencrypted.

Borgz wrote at 2021-11-30 20:44:49:

Not exactly. Non-secret chats are stored encrypted on Telegram's servers, and separately from keys. The goal seems to be to require multiple jurisdictions to issue a court order before data can be decrypted.

https://telegram.org/privacy#3-3-1-cloud-chats

https://telegram.org/faq#q-do-you-process-data-requests

anon11302100 wrote at 2021-11-30 21:02:25:

"Not exactly" means "completely incorrect" now?

Telegram doesn't store your messages forever, they are encrypted, and seizing the servers won't allow you to decrypt them unless you also seize the correct servers in another country

vadfa wrote at 2021-12-01 14:48:29:

Of course they store your messages forever... They've kept all of my messages for over 7 years now.

vadfa wrote at 2021-12-01 12:31:56:

If you really think that kind of shit will float...

skrowl wrote at 2021-11-30 21:24:58:

Source for Telegram storing the information unencrypted at rest?

pgalvin wrote at 2021-11-30 22:36:12:

It is widely known and confirmed by Telegram themselves that your messages are encrypted at rest by keys they possess.

This is a similar process to what Dropbox, iCloud, Google Drive, and Facebook Messenger do. Your files with cloud services aren’t stored unencrypted on a hard drive - they’re encrypted, with the keys kept somewhere else by the cloud provider. This way somebody can’t walk out with a rack and access user data.

Andrew_nenakhov wrote at 2021-11-30 23:15:58:

How do they provide near-instant full text search on server side if the chats are "encrypted at rest"?

marcan_42 wrote at 2021-12-01 04:51:12:

Encrypted at rest means the data is encrypted as stored on disk, not that they do not have access to the keys. That would be end-to-end encryption.

What Telegram claims to have done is set this up in a way that makes it very hard for a single party/state to get these keys. It's not possible to make this completely impossible (if you have a server processing user data, it will have the keys loaded at some point, and there is always _some_ way to physically attack it), but it is possible to make it very hard (physical tamper detection on the servers, secure boot tied to machine identity credentials required to access key material, etc - it's hard, but not impossible, to make this too difficult for any nation state to bypass). We don't know how good their set-up is, but it's certainly possible to do a good job at doing what they claim to be doing.
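
A minimal sketch of that split architecture, under the assumptions above (toy XOR cipher again, purely illustrative): the storage node and the key service sit in different jurisdictions, so seizing either alone yields nothing, yet the operator who holds both halves can still decrypt, which is exactly what makes fast server-side full-text search possible.

```python
# Toy "encrypted at rest, keys elsewhere" sketch. The cipher is a
# throwaway SHA-256 XOR pad, not a real scheme.
import hashlib
import secrets

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Repeating XOR pad derived from the key (insecure; demo only).
    pad = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, pad))

toy_decrypt = toy_encrypt  # XOR cipher: same operation both ways

key_service = {"chat42": secrets.token_bytes(32)}  # jurisdiction A holds keys
storage = {"chat42": toy_encrypt(key_service["chat42"], b"hello world")}  # jurisdiction B holds ciphertext

# Seizing the storage node alone yields only ciphertext.
print(storage["chat42"] == b"hello world")  # almost surely False

# The operator (or anyone who compels BOTH jurisdictions) recovers
# plaintext, so server-side search over it is trivial.
plaintext = toy_decrypt(key_service["chat42"], storage["chat42"])
print(b"world" in plaintext)  # True once both halves are combined
```

The contrast with E2EE is that here the operator holds both halves by design; "encrypted at rest" constrains *who must cooperate*, not *whether* decryption is possible.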

Andrew_nenakhov wrote at 2021-12-01 08:08:31:

It doesn't matter _at all_ if you consider the risk of the FBI (or FSB) accessing your chat logs. Telegram can produce your unencrypted chats for them, whether the chats are encrypted at rest or not.

I just don't see why they would make life harder for themselves developing stuff, given how often Durov lies. He claimed that all Telegram developers were outside of Russia, but then it turned out that they were working a floor away from his old VK company office, right in Saint Petersburg.

skinkestek wrote at 2021-11-30 21:53:23:

Check the difference between Telegram and WhatsApp.

Add to this the fact that WhatsApp

- uploads messages unencrypted to Google if you or someone you chat with enables backups

- and send all your metadata to Facebook.

Then remember how many people here have tried to tell us that Telegram is unusable and WhatsApp is the bee's knees.

Then think twice before taking security advice from such people again.

PS: as usual, if your life depends on it I recommend using Signal, and also being generally careful. For post card messaging use whatever makes you happy (except WhatsApp ;-)

zer0zzz wrote at 2021-11-30 22:28:08:

This isn’t the case anymore with WhatsApp. Backups to iCloud and Google Drive are optionally fully encrypted. You have the choice of storing the decryption artifacts on Facebook's servers (where they are held in a Secure Enclave) or backing up the 64-character decryption code yourself.

KennyBlanken wrote at 2021-11-30 22:21:11:

Telegram defaults to no encryption, does not do encrypted group chats, and has a home-rolled encryption protocol that almost guarantees it's weak, as nearly every home-rolled encryption system is (if not also backdoored). Coupled with it being headquartered in Russia, that makes it completely untrustworthy.

The only reason Telegram comes out on top of Whatsapp in the document in question is because Telegram is a foreign company with little interest in cooperating with a US domestic police agency; the FBI has no leverage over Russian companies.

What that list doesn't show is what Telegram does when the FSB knocks. By all means, give your potentially embarrassing message content to a hostile nation's intelligence service.

skinkestek wrote at 2021-12-01 06:54:37:

> Telegram defaults to no encryption,

This is plain false as can be verified by anyone who can check Telegram GitHub repos or run the app in a debugging environment.

Telegram defaults to point-to-point encryption, same as banks and Gmail.

Fun fact: back in the day, WhatsApp sent messages unencrypted (i.e. as plain text) over port 443(!).

> does not do encrypted group chats,

again, point-to-point encryption

> has a home-rolled encryption protocol which almost guarantees it's weak as nearly every home-rolled encryption system always is (if not also backdoored).

Earlier versions had serious problems. Newer versions are supposedly better.

Also, there is a lot of difference between home-grown cryptography by a math wizard, made open source for everyone to inspect, and various secret-sauce variants.

HN has a long history of claiming it can be trivially broken, yet despite the source code being available no one has done it. Laziness or incompetence? Or maybe it isn't so simple?

I don't know but if you want to shut me up and make your claim to fame: do break Telegram cryptography. You'll do the world a service both by exposing it and by shutting up people like me.

Meanwhile, stop spreading lies. Telegram is not unencrypted. It is point-to-point encrypted by default.

If the encryption is weak, prove it or shut up.

KennyBlanken wrote at 2021-12-01 13:42:44:

I obviously was referring to e2ee; everything is point to point encrypted these days. e2ee is turned off by default and cannot be enabled for group chats.

I stand by my assertion that Telegram's proprietary secret encryption is nearly guaranteed to be weaker than industry-standard encryption. "Home grown is always weaker" is a well known position of almost the entire crypto community.

I further stand by my assertion that Telegram's encryption is nearly guaranteed to be backdoored, because there is literally zero reason for a startup to invest the massive engineering resources needed to successfully develop and maintain its own encryption algorithms, unless they were being paid to do so.

The NSA has a long history of backdooring private encryption technology through industry "partnerships."

Do you seriously think Putin would allow a domestic company to develop a communication tool that would allow Russians to communicate with each other in complete privacy?

> prove it or shut up.

Go read the HN commenting policy (specifically around civility) or shut up.

skinkestek wrote at 2021-12-01 14:51:59:

> I obviously was referring to e2ee;

So you admit you weren't just repeating inaccuracies you heard from someone else, but knew you were posting disinformation.

> I further stand by my assertion that Telegram's encryption is nearly guaranteed to be backdoored, because there is literally zero reason for a startup to invest the massive engineering resources needed to successfully develop and maintain its own encryption algorithms, unless they were being paid to do so.

This is a good argument.

> Do you seriously think Putin would allow a domestic company to develop a communication tool that would allow Russians to communicate with each other in complete privacy?

Telegram is not a Russian company?

>> prove it or shut up.

> Go read the HN commenting policy (specifically around civility) or shut up.

Sorry. I was too harsh. I actually regret it.

Compared to willfully spreading disinformation however it seems pretty minor though?

-----

A bit more: I know local police used to use Telegram. That worries me.

It is actually even more complicated:

If Putin reads my most personal messages I don't care.

If NSA or even worse, local police actually took their time to read my messages I'd be mad or worried.

However if FSB asked for help they would need a very good reason and I'd try to consult with local law enforcement first.

If local police however asked for help I'd go out of my way to help them.

nicce wrote at 2021-11-30 22:51:47:

That is a lot of speculation. If you read the encryption protocol, the actual methods used for encryption are well known. The client is open source and supports reproducible builds. If there is a backdoor, it is in front of our eyes.

> What that list doesn't show is what Telegram does when the FSB knocks. By all means, give your potentially embarrassing message content to a hostile nation's intelligence service.

Telegram has had a lot of trouble operating in Russia. It was blocked for two years. [1]

If they are so cooperative, why pass up the opportunity to watch their own people? Or did they become cooperative after the unblock? It seems that they help on some level [2], but does this threaten users in other countries? Hard to say.

[1]

https://en.wikipedia.org/wiki/Blocking_Telegram_in_Russia

[2]

https://www.independent.co.uk/news/world/europe/telegram-rus...

Andrew_nenakhov wrote at 2021-11-30 23:19:10:

Telegram's block in Russia was likely a very successful PR action coordinated with authorities.

It was never removed from national app stores, even though Google/Apple usually comply with such requests, and the fact that it was unbanned is unprecedented.

nicce wrote at 2021-12-01 00:08:17:

Apple did stop updates for Telegram. Google and Apple have a weak history of complying with Russian requests. Maybe they comply more with other countries, but not Russia.

https://www.pcmag.com/news/after-almost-2-months-apple-stops...

Andrew_nenakhov wrote at 2021-12-01 04:05:52:

They did have the LinkedIn app removed, and that was a rather mild transgression against Russian laws compared to Telegram.

no_time wrote at 2021-11-30 20:21:37:

LINE, Telegram, Threema and WeChat are not even American companies. Can't they just tell the FBI to suck a fat one when they ask for user data?

colechristensen wrote at 2021-11-30 20:28:31:

Not if they want to operate in the United States or have access to our banking system.

You don’t get to pick your jurisdiction and then operate globally. You’re obligated to follow the laws where you want to operate.

xxpor wrote at 2021-11-30 20:41:43:

Fwiw if you want to do any sort of FX whatsoever or accept credit cards, you need access to the American banking system.

no_time wrote at 2021-11-30 20:37:10:

I wonder how this affects nonprofits like Matrix/Element and Signal. What can they do with them? Gangstalk their developers? Coerce big tech to ban them from their appstores?

ev1 wrote at 2021-11-30 20:39:42:

Doesn't Signal's dev already get bothered every single time they travel?

smoldesu wrote at 2021-11-30 21:01:28:

The design of these decentralized/federated platforms is specifically so that authorities _can't_ easily coerce their operators into disclosing incriminating information. In some sense, it's similar to how BitTorrent implicates its users.

viro wrote at 2021-11-30 21:11:22:

Refusing a valid court issued search warrant/order is a criminal offense. I think 180 days for each refusal of a legal order.

no_time wrote at 2021-11-30 21:22:37:

The issue is a bit more complex. I was thinking more on the lines of "will I get bothered for making crypto available for the masses that nobody can crack?"

millzlane wrote at 2021-11-30 22:42:14:

IIRC, didn't they jail a guy for that?

colechristensen wrote at 2021-11-30 21:21:10:

Well, Signal does not have the data; they comply with such orders using the tiny amount of metadata they have (like a timestamp of when your account was created, and that's about it).

er4hn wrote at 2021-11-30 21:12:55:

Telegram, as I understand it, can access your messages when stored in their Cloud[1]. They just make a choice to not provide the content of those to anyone.

[1] -

https://telegram.org/privacy#4-1-storing-data

cunthorpe wrote at 2021-12-01 02:17:31:

Yes but they can still:

- block the company nationwide, see Huawei (also includes ceasing contracts with app stores, both of which are American)

- block access to your website, see TPB

- harass you when you travel, either to the US or to allied countries.

Decabytes wrote at 2021-11-30 20:28:32:

Depends on whether the countries these companies exist in have agreements with the U.S. for surveillance and stuff

dotancohen wrote at 2021-12-01 06:47:12:

Do you want this to stop? Raise awareness, add this to your mail sig:

> This electronic communication has been processed by the United
> States National Security Agency.

If it makes people uncomfortable, GOOD. Pretending that your mail - and their mail - is not being accessed is not the way to resolve this uncomfortable situation. Ending it is the way. And that demands awareness.

baby wrote at 2021-11-30 20:57:54:

I'm wondering how this was obtained, and how old this is?

For WhatsApp:

> if target is using an iPhone and iCloud backups enabled, iCloud returns may contain WhatsApp data, to include message content

Probably not true since WhatsApp launched encrypted backups.

gnabgib wrote at 2021-11-30 23:59:38:

I mean the document says the data is accurate "as of November/2020", and the slide was prepared 7-Jan-2021

cunthorpe wrote at 2021-12-01 02:14:10:

WhatsApp has its own backups in addition to regular full-phone iCloud backups.

If WhatsApp does not encrypt its content at rest locally, then “iCloud backups” still contain everything unencrypted too.

vorpalhex wrote at 2021-11-30 22:52:05:

Reading the document answers this for you: it is a declassified government document originally produced by the FBI and was prepared on January 7th, 2021.

fractal618 wrote at 2021-11-30 20:38:28:

Now I just have to get my friends and family to use Signal.

yabones wrote at 2021-11-30 21:58:31:

I've had surprisingly good luck with strong-arming people into switching. The important part is having their trust; if they don't believe you, they won't listen. The next part is to make simple, verifiable, and non-technical arguments for switching. Believe it or not, almost everybody is willing to take small steps if they're free.

Instead of rambling on and on about "end to end encryption" or "double-ratchet cryptographic algorithms" or other junk only nerds care about, approach it like this:

* There are no ads, and none of the messages you send can be used for advertising

* It's not owned by Facebook, Google, Microsoft, or any of the other mega-corporations, and you don't need an account on one of their sites to use it

* It will still work great if you travel, change providers, etc

* It's much safer to use on public Wi-Fi than other services or SMS

Honestly, don't even touch on law enforcement access as in the OP. That can strike a nerve for some people. The best appeals are the simple ones.

godelski wrote at 2021-11-30 23:42:45:

Also, a big one that works for me (especially with iPhone users, who are the hardest to convert): "You can send full-quality images and videos to Android users." The fact that Apple shoots themselves in the foot is an advantage for Signal.

FearlessNebula wrote at 2021-12-01 11:52:53:

That's not Apple's flaw; it's a flaw with SMS. It can only handle file sizes up to a certain limit, and during periods of congestion they lower that limit.

ViViDboarder wrote at 2021-12-01 01:13:47:

It’s also an issue the other way around. MMS is the limitation.

godelski wrote at 2021-11-30 23:41:39:

The best advice I have for getting people to switch is showing that you have cross-platform capabilities. Essentially everyone can have the features of iMessage/WA: full-resolution images and videos, responding to messages with emojis (which WA doesn't have), stickers (unfortunately you have to grab them from signalstickers.com instead of in-app), voice and video calling, etc. If Apple didn't have such a closed ecosystem then I think it would be harder to get people to switch. In this respect, Signal is more feature-rich than anything else (except Telegram, but Telegram doesn't have the same security and isn't trustless).

I think the common mistake is trying to convince people with the security. Use that as a bonus, not the main feature. You're talking geek to people who don't speak geek (convince geeks with these arguments, not mom and dad). I also suggest strong-arming people and using momentum (if 4 people in a group of 5 have Signal, switch the group to Signal. Or respond to WA messages on Signal).

djanogo wrote at 2021-11-30 21:31:13:

I switched to Signal and got a few people to switch too; then they started their shitcoin (MOB). IMO Signal Messenger is just a way for that company to reach their shitcoin goals. Uninstalled and never recommending it again.

MatekCopatek wrote at 2021-11-30 22:01:56:

I remember many people being pissed off when these features were announced some months ago.

As far as I can tell, nothing really happened afterwards. I use Signal on a daily basis and haven't noticed any coin-related functionalities. Either they were canceled, haven't been released yet or they're just buried somewhere deep and not advertised.

Do you have a different experience?

godelski wrote at 2021-11-30 23:44:36:

MOB is in beta and I think getting moved (if not already) to main soon. But it is non-intrusive and you won't notice it unless you look for it. People are just complaining about a feature that you have to look for. I'm not a fan of MOB and how the situation was handled, but I also think the reactions people are having are a bit over the top.

fossuser wrote at 2021-11-30 22:39:45:

It's in beta, you can enable it in settings.

It's still a pain to buy MOB in the US so it's not that usable in the states. It would have been interesting to me if they just used Zcash instead of rolling their own, but I'm not sure what's supposed to be special about MOB vs. Zcash.

I also don't think it's that big of a deal.

godelski wrote at 2021-11-30 23:47:01:

I'd love Zcash (forced private transactions). But honestly I'd also like if we could use different currencies. My dream was that you could send cash and they would just use MOB as the intermediate transaction (so your bank would just see a transaction to/from Signal and not who you were sending/receiving to/from). But that also has technical challenges and legal issues so I understand why not. I think a multi-currency wallet is the next best option imo.

fossuser wrote at 2021-12-01 01:08:10:

Yeah, my long term hope for this stuff is that Urbit succeeds and then a lot of the UX here gets fixed by that and all of these apps become redundant and unnecessary. I'm definitely in the minority there but I think there's a future path where that's possible and works well.

arthurcolle wrote at 2021-11-30 22:15:00:

Are you sure you're not thinking of Telegram? They had a thing called Telegram Open Platform or something (TOP rings a bell for some reason)

_-david-_ wrote at 2021-11-30 22:34:08:

Signal has some coin

https://support.signal.org/hc/en-us/articles/360057625692-In...

millzlane wrote at 2021-11-30 22:41:19:

I think it's

https://mobilecoin.com/

catlikesshrimp wrote at 2021-11-30 21:02:10:

I have been trying to get people to install signal for 2 years. No one has budged.

The day Facebook went down for some hours, I got phone calls.

basilgohar wrote at 2021-11-30 21:15:12:

I have had some success. It helps that many of the people I regularly contact were willing to migrate, even after some time. Most already used WhatsApp, so the friction to installing a new app was less than someone not accustomed to using a dedicated app for messaging.

But most of my American friends that don't have international contacts still just use SMS because they are not really accustomed to an app such as WhatsApp and so on.

anonporridge wrote at 2021-11-30 21:09:26:

It's incredibly disheartening how difficult it is to get most people to care about digital privacy.

upofadown wrote at 2021-11-30 22:45:58:

Even if you do it is pretty much impossible to get them to check their safety numbers and keep them checked.

hexis wrote at 2021-12-01 00:10:52:

I wonder where Matrix/Element would fit into this chart.

derbOac wrote at 2021-12-01 02:02:28:

Yeah I was thinking maybe some of the most secure platforms aren't on the list.

Briar was another one.

bredren wrote at 2021-11-30 21:46:48:

The way these became bullet points on the slide is roughly:

An active investigation leads an agent to a suspect known to have used one of these applications

An administrative subpoena is issued to the company asking for what information is available

The company is then ordered by a federal judge to provide information related to a particular account or accounts

The company complies.

This is why it is important to understand how your messaging service handles data and how you can compromise your own safekeeping of all or part of that data.

goatcode wrote at 2021-11-30 22:44:08:

It's kind of funny that WeChat seems pretty locked-down to the FBI, especially for Chinese citizens. Makes sense, really, but still funny.

sgjohnson wrote at 2021-11-30 23:39:27:

Like Telegram, they simply don’t care about US warrants.

stunt wrote at 2021-11-30 20:23:52:

Well, who cares, when all they need is to use something like Pegasus to obtain full access to your phone simply by sending you a WhatsApp message (without you even opening the message).

Knowing how well guarded iOS is against app developers, I wonder what kind of zero-day would suddenly turn a message received in WhatsApp into full system access. I think NSO found a WhatsApp backdoor, not a zero-day bug.

JBiserkov wrote at 2021-11-30 21:37:45:

NSO can't send you a WhatsApp message if you don't have WhatsApp on your iPhone.

catlikesshrimp wrote at 2021-11-30 21:07:52:

WhatsApp is owned by Facebook, not by Apple. I don't think Apple wants to share a backdoor with Facebook.

I don't know any detail of the whatsapp vulnerability that NSO exploited.

mmazing wrote at 2021-12-01 00:20:41:

Or compromise the device in some other way.

At the risk of being cliche here's a relevant xkcd -

https://xkcd.com/538/

b8 wrote at 2021-11-30 23:55:13:

Can't the FBI get chatlogs from WeChat?

https://www.youtube.com/watch?v=N5V7G9IBomQ

In the short documentary that the FBI made about catching Kevin Mallory they mentioned catching him sending classified stuff via WeChat.

Larrikin wrote at 2021-12-01 07:08:35:

I use LINE a fair bit; I have a number of Japanese friends as well as friends who have traveled to Japan. I had no idea they had implemented much better encryption [1]. I'm convincing all my contacts to turn on the option now.

[1]

https://engineering.linecorp.com/en/blog/new-generation-of-s...

timbit42 wrote at 2021-12-01 13:50:37:

It's about time they implemented 1970s encryption technology. It should be on by default.

timbit42 wrote at 2021-11-30 20:37:32:

I'd like to see Tox and Jami.

wizzwizz4 wrote at 2021-11-30 21:19:48:

I read somewhere that Tox's security was compromised.

timbit42 wrote at 2021-11-30 21:33:17:

I'd like to see that. Was it not fixed?

wizzwizz4 wrote at 2021-11-30 22:49:30:

I found

https://media.ccc.de/v/rc3-709912-adopting_the_noise_key_exc...

which might be it.

NmAmDa wrote at 2021-12-01 18:53:53:

The funny thing is that sometimes when I search for Arabic words about Islam, I get results for some old, and usually the most extreme, books in the CIA library (direct links to PDFs), which makes me wonder why.

the_optimist wrote at 2021-11-30 21:00:11:

Isn’t this simply imaginary, where in practice all the FBI has to do to up the ante is to request military-grade interception from a willing foreign counterpart?

anonporridge wrote at 2021-11-30 21:13:53:

The point of promoting and using privacy respecting software is not necessarily to make it _impossible_ for law enforcement to get what they want. It's to make it somewhat expensive and require targeted probes.

You simply want it to be cost-prohibitive to engage in mass surveillance on everyone, because that is an immensely powerful tool of totalitarian oppression that gets really bad if we happen to elect the wrong person once.

wizzwizz4 wrote at 2021-11-30 21:27:28:

I don't care if they spy on _me_; they probably have a good reason to! But I do care if they spy on _everyone_, so I make it hard to spy on me.

the_optimist wrote at 2021-11-30 21:47:04:

I agree with you on the level of my person, and naturally flag that this economic argument is extremely poor policy. It’s quite unclear that the marginal cost is non-zero, or even flat by person. One might reasonably conclude we are already each inside a high-resolution springing trap, waiting for the moment we find ourselves athwart the powers that be. Imagine in physical space where the local police could simply call in foreign air strikes upon domestic citizens, with only economics to prevent otherwise. We must have transparent and firm laws, reformed at a fundamental level.

upofadown wrote at 2021-11-30 22:48:06:

Can't the FBI do a Pegasus style remote access thing on an appropriate warrant themselves?

the_optimist wrote at 2021-11-30 23:04:05:

Seems like it. And can they also do it without an appropriate warrant [by asking someone else]?

tata71 wrote at 2021-11-30 21:07:09:

Tier of which requires... expense.

beervirus wrote at 2021-11-30 20:20:12:

What about regular text messages?

tag2103 wrote at 2021-11-30 20:29:46:

If I remember correctly, standard SMS has no security on it at all and is in the clear during transit. I may be wrong, and I'm never scared of being corrected.

markab21 wrote at 2021-11-30 20:28:25:

Assume anything sent over a cellular network carrier via normal SMS can not only be retrieved, but intercepted.

brink wrote at 2021-11-30 20:43:22:

Thank goodness a lot of companies regularly use it for 2FA.
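For what it's worth, the app-based alternative (TOTP, the rotating six-digit authenticator codes) never touches the carrier network at all, so there is nothing for an SMS interceptor to grab. As an illustration only (not anyone's production code), the whole RFC 4226/6238 scheme fits in a few lines of stdlib Python:

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    # RFC 4226: HMAC-SHA1 over the big-endian counter, then dynamic truncation.
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, at=None, step: int = 30) -> str:
    # RFC 6238: the counter is just the number of 30-second steps since the epoch.
    t = time.time() if at is None else at
    return hotp(key, int(t // step))

# RFC 6238 test vector: at T=59s this key yields a code ending in 287082.
print(totp(b"12345678901234567890", at=59))  # 287082
```

The point is that the shared secret lives only on the server and in your authenticator app; the code it derives is useless thirty seconds later, unlike an SMS that can be silently intercepted or SIM-swapped.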

kf6nux wrote at 2021-12-01 17:26:33:

IIUC, they can get 7 years worth of SMS/MMS (including contents) with very little effort/cost.

stunt wrote at 2021-11-30 20:28:54:

Telecommunication is highly regulated. They have to keep records for a long time and make them available to law enforcement.

bonestamp2 wrote at 2021-11-30 20:23:56:

They've been inside the phone companies for a long time so I assume they have full access to SMS.

richij wrote at 2021-12-01 12:11:27:

PSA: This is a year old.

wwww3ww wrote at 2021-11-30 22:13:15:

I do DIY encryption with enigma reloaded and it works

CookieMon wrote at 2021-12-01 07:32:42:

The list doesn't have ANØM

jareklupinski wrote at 2021-11-30 22:45:19:

is this only page two of an alphabetical list?

or are there no messaging systems with a name before 'i'

rodmena wrote at 2021-11-30 23:17:49:

WhatsApp. -> FBI

Telegram. -> KGB

Signal. -> The rest of us?

anderber wrote at 2021-12-01 00:57:55:

Didn't Russia (KGB) try to block Telegram in the past, unsuccessfully? I feel like they are fairly safe and trustworthy. Of course, I like Signal best, but Telegram has so many nice features.

temptemptemp111 wrote at 2021-11-30 23:29:57:

Signal -> Mossad

dollabills wrote at 2021-12-01 03:46:52:

What about Snapchat?

dollabills wrote at 2021-12-01 03:46:25:

Snapchat?

wwww3ww wrote at 2021-11-30 22:14:08:

I use enigma reloaded to manually encrypt my messages

yownie wrote at 2021-11-30 20:05:24:

link seems to be broken

https://propertyofthepeople.org/document-detail/?doc-id=2111...

mmh0000 wrote at 2021-11-30 20:31:23:

Not only is their main link broken, but their silly PDF reader is broken for me.

Here's a direct link to the PDF:

https://assets.documentcloud.org/documents/21114562/jan-2021...

Molly666 wrote at 2021-12-01 10:21:18:

The FBI monitoring users hasn't been a secret for a long time. Enough information has already leaked online to make it clear that this is not just a conspiracy theory. But, to be honest, I was surprised that every one of the supposedly reliable messengers leaks data to the authorities. I was also surprised that I did not see Utopia P2P

https://u.is/en/

in the document. Maybe the only reliable application is one that stands for freedom of speech and anonymity and does not obey the authorities?

grouphugs wrote at 2021-12-01 03:14:10:

fuck the police

finite_jest wrote at 2021-11-30 20:09:03:

Dupe of

https://news.ycombinator.com/item?id=29394945


fractal618 wrote at 2021-11-30 20:39:32:

Now I just have to get all my friends and family to use Signal.

tacLog wrote at 2021-11-30 20:57:49:

My family has really taken to it. Granted, to them it's mostly just the "message the family" app, but they are not very technically fluent and yet seem to have picked it up just fine.

I really think this is under-discussed when Hacker News brings up secure messaging. The user experience is so much more important than the underlying tech. My family doesn't care about end-to-end encryption. They care about video calling with the press of a button, and easy features that are just there and work, like Zoom or the many other software products they have to use.

Thank you Signal team for focusing so hard on the user experience.

headphoneswater wrote at 2021-11-30 21:00:25:

If this is what it takes to keep us safe, I, and most Americans, are OK with it. We live in dangerous times.

The US has a balanced criminal justice system; as long as due process is preserved, privacy from the state should not be a major issue.

benlivengood wrote at 2021-11-30 22:36:08:

Current U.S. "due process" includes national security letters and other secret legal requests and secret courts to approve those requests. So there are still some checks and balances but it's less clear that they are working well enough or as intended.

Just look at the transparency reports of major Internet companies; they can report numbers of (certain types of) requests and that's about it. Mass surveillance under seal is not a great trend.

When political parties start advocating for jailing political opponents and treating the supreme court as political office for nominations, I find it harder to trust the current due process.

numlock86 wrote at 2021-11-30 21:10:25:

How does them reading my messages keep me safe, though?

headphoneswater wrote at 2021-11-30 22:24:23:

In that example it's keeping me safe from you

numlock86 wrote at 2021-12-01 17:34:41:

Ok, let's turn it around and say they would keep me safe from you. Why would they? What's their motivation to keep ME safe from YOU? Are you even a threat? Am I a threat? And would a real threat even be caught by this system?

wizzwizz4 wrote at 2021-11-30 21:29:52:

> _if it's not we have bigger problems anyway_

Really, we shouldn't do anything; we have the bigger problem of the eventual heat death of the universe.

Take into account not only the size of the problem, but how easy it is to do something about it.

efitz wrote at 2021-11-30 20:26:39:

This discussion is not very interesting from a security perspective. I tuned out at “cloud”.

If it’s not in your physical possession, it’s not your computer. If it’s not your computer, then whoever administers the computer, or whoever [points a gun at/gives enough money to] the administrator of that system can access whatever you put on that system.

If a “cloud” or “service” is involved, then you can trivially use them to move or store data that you encrypted locally on your computer with your key that was generated and stored locally and never left your system. But subject to the limits above, the administrators of the other computers will still be able to see metadata like where the data came from and is going to. And they might be able to see your data too if you ever (even once; ask Ross Ulbricht) failed to follow the basic encryption guidelines above.
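The discipline described above (generate the key locally, hand the cloud only ciphertext) can be sketched in a few lines of Python. This assumes the widely used third-party `cryptography` package; the function names here are illustrative, not from any product discussed in this thread:

```python
from cryptography.fernet import Fernet

def make_local_key() -> bytes:
    # Generated and stored only on your own machine; never uploaded anywhere.
    return Fernet.generate_key()

def encrypt_for_cloud(key: bytes, plaintext: bytes) -> bytes:
    # The cloud provider only ever receives this opaque, authenticated token.
    return Fernet(key).encrypt(plaintext)

def decrypt_from_cloud(key: bytes, token: bytes) -> bytes:
    # Decryption happens locally, after fetching the token back down.
    return Fernet(key).decrypt(token)

key = make_local_key()
token = encrypt_for_cloud(key, b"meet at noon")
print(decrypt_from_cloud(key, token))  # b'meet at noon'
```

Fernet pairs AES-CBC with an HMAC, so the provider can neither read the blob nor silently tamper with it; the metadata caveats above (who uploaded what, when, and to whom) still apply in full.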

You can make metadata access harder via VPNs and Tor, but you CANNOT make it impossible: in the worst case, maybe your adversary is controlling all the Tor nodes and has compromised the software.

Which leads me to my last point, if you did not write (or at least read) the code that you’re using to do all of the above, then you’re at the mercy of whoever wrote it.

And, if you try to follow perfect operational security, you will have a stressful and unpleasant life, as it’s really really hard.

michaelmior wrote at 2021-11-30 20:31:00:

> if you did not write (or at least read) the code that you’re using to do all of the above, then you’re at the mercy of whoever wrote it.

It's worse than that. Even if you read the code, you have to trust that the code you read is the code a service is actually using. Even if you deploy the code yourself, you have to trust that the infrastructure you're running on does not have some type of backdoor. Even if you run your own infrastructure, hardware can still have backdoors. Of course, the likelihood of any of these things actually becoming a problem decreases significantly as you read through the paragraph.

inetknght wrote at 2021-11-30 21:45:43:

> _the likelihood of any of these things actually becoming a problem decreases significantly as you read through the paragraph._

And yet, "likelihood" doesn't necessarily mean "hasn't been done".

Just look at:

* [0]: Intel ME

* [1]: Solarwinds attack and CI systems

* [2]: Ubiquiti attack and complete infrastructure compromise

* [3]: And the famous Ken Thompson statement

[0a]:

https://news.ycombinator.com/item?id=15298833

[0b]:

https://www.blackhat.com/eu-17/briefings/schedule/#how-to-ha...

[1]:

https://www.cisecurity.org/solarwinds/

[2]:

https://krebsonsecurity.com/2021/04/ubiquiti-all-but-confirm...

[3]:

https://users.ece.cmu.edu/~ganger/712.fall02/papers/p761-tho...

michaelmior wrote at 2021-12-01 14:52:02:

Indeed. I mentioned those specific things because it has been done. However, I think the likelihood of the average user being affected by things near the end of the list is generally quite small. If we aren't willing to accept this, at some point, we can't use technology for anything important.

dointheatl wrote at 2021-11-30 21:57:48:

> Even if you read the code, you have to trust that the code you read is the code a service is actually using.

Don't forget to verify the code of the compiler to ensure it hasn't been compromised to inject an exploit into the binary at compile time.

judge2020 wrote at 2021-11-30 20:30:53:

If I'm reading this page correctly, AMD is working on something that would allow you to run trusted code that not even someone with physical access to the hardware could read (without breaking this system).

https://www.amd.com/en/processors/epyc-confidential-computin...

And this tech is already implemented by GCP:

https://cloud.google.com/confidential-computing

> With the confidential execution environments provided by Confidential VM and AMD SEV, Google Cloud keeps customers' sensitive code and other data encrypted in memory during processing. Google does not have access to the encryption keys. In addition, Confidential VM can help alleviate concerns about risk related to either dependency on Google infrastructure or Google insiders' access to customer data in the clear.

efitz wrote at 2021-11-30 20:44:16:

Then you only have to trust that AMD did not accidentally or intentionally introduce a bug in the system. Remember Spectre? Remember all the security bugs in the Intel management code?

You also have to trust that AMD generated and have always managed the encryption keys for that system properly and in accordance with their documentation.

And are you even sure that you’re actually running on an AMD system? If the system is in the cloud, then it’s hard to be sure what is executing your code.

And are you sure that your code didn’t accidentally break the security guarantees of the underlying system?

I have worked on all these problems in my day job, working on HSMs. At the end of the day there are still some leaps of faith.

smoldesu wrote at 2021-11-30 21:04:28:

_puts on tinfoil hat_

You'd also need to consider AMD's management engine, the Platform Security Processor. If we're really slinging conspiracy theories, AMD processors are likely just as backdoored as Intel ones. I don't mean to be grim, but I think it's safe to assume that the US government has direct memory access to the vast majority of computer processors you can buy these days.

[/conspiracy]

123pie123 wrote at 2021-11-30 21:32:00:

if you're going to that level, then have a look at Five Eyes (and its derivatives)

https://en.wikipedia.org/wiki/Five_Eyes

/ Echelon

smoldesu wrote at 2021-11-30 21:35:35:

I probably shouldn't have removed my tinfoil lining yet but yes, you're correct. Any information the US government has access to through these channels is also probably accessible by our surveillance/intelligence allies. It raises a lot of questions about how deep the rabbit hole goes, but I won't elucidate them here since I've been threatened with bans for doing so. I guess it's a do-your-own research situation, but always carry a healthy degree of skepticism when you read about anything government-adjacent.