
Still trying to grasp the idea of archiving messages from an E2E encrypted communication system into storage in a way that entirely breaks the purpose of using something like Signal.

It’s like cashing in on the trust in the Signal protocol and app while breaking its security model so that someone else can search through all the messages.

What am I missing here?



> What am I missing here?

OK, say you're a bank. The SEC states you need to keep archives of every discussion your traders have with anyone at any time (I'm simplifying things, but you get the point). You keep getting massive fines because traders were whatsapping about deals.

So now you've got several options - you can use MS Teams, which of course offers archival, compliance monitoring etc. But that means trusting MSFT, and making sure your traders only use Teams and nothing else. You can use a dedicated application for the financial industry, like Symphony or ICE Chat or Bloomberg, but they're clunkier than B2C apps.

And then the Smarsh (owners of TeleMessage) salesman calls you and says "your users can keep using the apps they love - WhatsApp, Signal - but we make it compliant". And everyone loves it (as long as no one in your Security or Legal teams is looking too hard at the implications of distributing a cracked version of WhatsApp through your MDM...)

Edit: here's the install document for their cracked WhatsApp binary https://smarsh.my.salesforce.com/sfc/p/#30000001FgxH/a/Pb000...


> say you're a bank. The SEC states you need to keep archives of every discussion your traders have with anyone at any time

These records are encrypted in storage.


That is more than a little optimistic given how slow the pace of technical innovation in finance is. The recent and not-so-recent issues with Citi are a good example of that.


Seems like it doesn't resolve the trust issue; it just shifts it to a smaller firm with more to lose.


It definitely doesn't resolve the trust issue! I would trust MSFT a million times more than these cowboys. What it does give you is peace with your traders (who can be real divas...) - they can keep using "WhatsApp" and "Signal" while you monitor everything.


Oh wow! There are other ways to archive WhatsApp messages that don't involve modified WhatsApp APKs. Meta's lawyers do not take kindly to modified WhatsApp APKs.


> There are other ways to archive WhatsApp messages that don't involve modified WhatsApp APKs.

What other ways are there that don't involve WhatsApp's Google Drive backup feature or scraping the web interface?


The clean way to do it (which is how TeleMessage's competitors do it) is to use the WhatsApp Business APIs with dedicated phone numbers.
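
For a rough idea of what that looks like: the archiving side is basically a webhook that receives message events for the dedicated business number and writes them to a retention store. A minimal sketch, assuming the standard WhatsApp Business Cloud API webhook payload shape (the endpoint path, archive file, and field handling here are illustrative, not any vendor's actual integration):

    # Sketch of a compliance-archiving webhook for a WhatsApp Business number.
    # Assumes the usual Cloud API payload shape (entry -> changes -> value.messages);
    # check Meta's current docs before relying on any field names.
    import json, time
    from flask import Flask, request

    app = Flask(__name__)
    ARCHIVE_PATH = "archive.jsonl"  # in practice: a WORM/compliance store, not a local file

    @app.route("/webhook", methods=["POST"])
    def archive_messages():
        payload = request.get_json(force=True)
        for entry in payload.get("entry", []):
            for change in entry.get("changes", []):
                for msg in change.get("value", {}).get("messages", []):
                    record = {
                        "archived_at": time.time(),
                        "from": msg.get("from"),
                        "type": msg.get("type"),
                        "body": (msg.get("text") or {}).get("body"),
                    }
                    with open(ARCHIVE_PATH, "a") as f:
                        f.write(json.dumps(record) + "\n")
        return "ok", 200

No cracked client needed - the trade-off is that the traders have to use the dedicated business number rather than their personal one.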


Most traders I dealt with want to do it on their personal cell phones so they can keep their contacts as they move around. Most of them are like salespeople: they know exactly how much money they bring in, and the successful ones refuse to do anything that will impede THEIR method of working. SEC fines, regulations? Those are for less successful people.

EDIT: There was another post calling them divas; a lot of them do act that way.


Yes, that's why TeleMessage managed to sell such a hacky solution - there was a clear desire for that product.


I mean, if they're going to use their personal devices for this, then a cracked WhatsApp wouldn't help the business anyway.

For devices the company controls, they can of course use the API the above poster mentioned, though.


Yes it does. They convince the traders to let them install MDM and take over the phone; WhatsApp/Telegram/Signal still work, and the business gets the records. If a trader leaves the firm, the phone gets wiped, but they can just reinstall the Store versions, log back in, and everything is back.


I had scraping the web interface in mind.


OK, this absolutely reminds me of using Indian WhatsApp mods years ago: stickers, more features, local and portable backups... wouldn't try that as a member of the government though.


Is it a coincidence that it reads almost exactly like SMERSH?

https://en.wikipedia.org/wiki/SMERSH


Probably coincidence. The founder of the company was named Stephen Marsh.


There's a point at which coincidence and opportunity meet.


Probably not. It's trendy to give edgy names to companies. See: Palintir.


You mean Palantir


And the name is not so much edgy as a pretty exact mission description - it describes exactly what it grew from: seeing stones, aka everyone's cellphone data, collected, analyzed, and turned into predictions for kings.


Not even pretending to not be evil is what makes it edgy.


To be honest, the good guys turned out to be pretty embarrassing too.

The whole "everyone thinks like us" delusion was bought with the surplus of a good-times window distributed all around, and they're still willing to return to that delusional state of affairs.

The obvious plot holes they reveal when it comes to "we do not discuss nature" (the bugs in the human mind are all fixable with education) and "we do not discuss nurture" (all cultures are equal, and equally capable - disregard the evidence before your eyes).

You don't get to juggle and drop so many balls and not massively lose confidence!

The rule of (fingers in ears) "la-la-la" is over - the problem is, the right is a reactionary mess that has no solutions, analysis, or tools to exploit these weaknesses.


> You don't get to juggle and drop so many balls and not massively lose confidence!

You don't get to run all this circus if you don't intend to run it only as a circus. It's time to stop kidding ourselves that anybody mainstream is sincere and smart enough to move anywhere different from where they are all told to go.


Huh? If the goal is compliance, you wouldn't use something that's worse for compliance - which is why Legal and Security wouldn't like it. If it helped with compliance, they'd love it! So the reason can't be compliance.


The goal is the appearance of compliance, not actual compliance. Check the boxes.


Sounds like you've never done compliance.


You can never control what I do on my device with a received message - I can take screenshots or, if the app prevents that, take a picture of the screen.

The goal of Signal is trusted end-to-end encrypted communication. Device/message security on either end is not in scope for Signal's threat model.


TM SGNL changes the security model from "I trust the people in the chat" to "I trust the people in the chat and also the company archiving the chat".

If you don't trust the people in your chat, they shouldn't be in your chat.


> If you don't trust the people in your chat, they shouldn't be in your chat.

I assure you, none of these people trust each other. Backstabbing is normal.

They're also likely using it to talk to foreign counterparts. Again, most of whom they don't trust a bit.

Encryption isn't just about "do I trust the recipient".


You are conflating levels of trust.

The trust level required with Signal is, "do I trust the people in this chat not to share the specific communications I am sending to them with some other party whom I do not want to have a copy".

There are many many situations where this level of trust applies that "trust" in the general sense does not apply. It is a useful property.

And if you don't have that level of trust, don't put it in writing.

TM SGNL changes the trust required to, "do I also trust this 3rd party not to share the contents of any of my communications, possibly inadvertently due to poor security practices".

This is a categorical and demonstrably material difference in security model. I do not understand why so many are claiming it is not.


>TM SGNL changes the trust required to, "do I also trust this 3rd party not to share the contents of any of my communications, possibly inadvertently due to poor security practices".

That's the same level of trust, really. Signal guarantees that the message bearer (i.e. Signal) can't see the contents, but the end users may do whatever they like.

You can't really assume that your counterparty's device isn't rooted by their company, or that they aren't themselves required by law to provide written transcripts to an archive at the end of each day. In fact, it's publicly known and mandated by law when your counterparty happens to be a US government official.

The people who assume they are talking with a government official and expect records not to be kept are probably doing something (borderline) illegal, like discussing treason or bribes.

No, this is not a "nothing to hide" argument, because those people aren't sending dickpics in their private capacity.


If your counterparty is compromised, that still only leaks your communication with that counterparty, but not other, unrelated conversations.


> This is a categorical and demonstrably material difference in security model. I do not understand why so many are claiming it is not.

Because all it takes is one user to decide they trust the third party.

Right now you actually have to do more than trust everyone: you have to trust everyone they trust with their chat history, which can already include this sort of third party.


One of the most popular “e2ee” communication systems, iMessage, does exactly this each night when the iMessage user’s phone backs up its endpoint keys or its iMessage history to Apple in a non-e2ee fashion.

This allows Apple (and the US intelligence community, including FBI/DHS) to surveil approximately 100% of all non-China iMessages in close to realtime (in the usual case where it’s set to backup cross-device iMessage sync keys).

(China, cleverly, requires Apple to not only store all the Chinese iCloud data in China, but also requires that it happen on machines owned and operated by a joint venture with a Chinese-government-controlled entity, keeping them from having to negotiate continued access to the data the way the FBI did.)

https://www.reuters.com/article/us-apple-fbi-icloud-exclusiv...

Yet Apple can still legitimately claim that iMessage is e2ee, even though the plaintext is being backed up in a way that is readable to them. It’s a backdoor by another name.

Everyone wins: Apple gets to say E2EE, the state gets to surveil the texts of everyone in the whole country without a warrant thanks to FISA.


I suppose if both you and the recipient have cloud backups disabled, then Apple can no longer view your messages.

But outside of that scenario, is there any advantage to iMessage using e2ee instead of just regular TLS?

Edit: Apparently it's up to you whether you want your iCloud backups to use e2ee. There's an account setting: https://support.apple.com/en-us/102651. Standard protection is a sensible default for regular users who aren't tech-savvy, since with e2ee they're at risk of losing all their iCloud data if they lose their key.


That's an old article. According to Apple docs, Advanced Data Protection covers Device and Messages backups, which means they are E2EE.


Correct, but nobody turns it on because it’s opt in, and even if you turn it on, 100% of your iMessages will still be escrowed in a form readable to Apple due to the fact that the other ends of your iMessage conversations won’t have ADP enabled because it’s off by default.

Again, Apple gets to say “we have e2ee, any user who wants it can turn it on” and the FBI gets to read 100% of the texts in the country unimpeded.

If Apple really wanted to promote privacy, they’d have deployed the so-called “trust circle” system they designed and implemented which allowed a quorum of trusted contacts to use their own keys to allow you to recover your account e2ee keys without Apple being able to access it, rolled that out, and then slowly migrated their entire user base over to e2ee backups.

They have not, and they will not, because that will compromise the surveillance backdoor, and get them regulated upon, or worse. The current administration has already shown that they are willing to impose insanely steep tariffs on the iPhone.

You can’t fight city hall, you don’t need a weatherman to know which way the wind blows, etc. The US intelligence community has a heart attack gun. Tim Apple does not.

Separately it is an interesting aside that Apple’s 1A rights are being violated here by the presumptive retaliation should they publish such a migration feature (software code being protected speech).


And yet, it's somehow so effective that it's illegal in the UK because it doesn't let the government read everyone's messages.


TBF, governments trying to outlaw some kind of privacy doesn't necessarily mean it's a current impediment to them. They can be planning ahead, securing their position, or just trying to move the window of what is considered acceptable.


Are there any stats as to the percentage of iPhone users that enable Advanced Data Protection? Defaults matter a lot, and I wouldn't be surprised if that number is (well) below 10%.

If you are the only person out of all the people you correspond with who has ADP enabled, then everyone you correspond with is uploading the plaintext of your messages to Apple.


The number is well well below 1%. I would bet six figure sums it is below 0.1%.

Effectively nobody has it on. 99%+ of users aren’t even aware of the feature’s existence.

https://daringfireball.net/linked/2023/12/05/icloud-advanced...

You have to remember that there are something like a billion+ iOS users out there. 100 million people have not written down their 27 character alphanumeric account recovery key.


The same applies to WhatsApp. Message backups are unencrypted by default, and even the whole iPhone backup includes WhatsApp's unencrypted chat history by default. One reason why it was such a big deal when the UK got Apple to disable iCloud's E2EE backups.


There are cases where you want the communications encrypted in flight but need them retained at rest for compliance reasons. Federal record-keeping laws would otherwise prohibit the use of a service like Signal. I'm honestly impressed that the people involved actually took the extra effort for compliance when nothing else they did was above board...


> There are compliance reasons

Makes sense. But it's still debatable whether the compliance requirements are acting against the security model, or whether there are bigger concerns here than just secure communication.


I would not assume the archives were meant for compliance and federal records.


We also have no evidence it was in use back in March. It may be a response to that oops.


Any client-side limitations are not part of the security model because you don't control other people's devices. Even with an unmodified app, they're trivially bypassed using a rooted/jailbroken device.


Not part of Signal's security model, but trusting people in that chat very much can and should be part of the user's security model. If you don't trust them, why are they in the chat in the first place?


It's not a person in the chat, it's an account. The account is usually controlled by the person associated with it, but you can't assume that it's always controlled by that person.


Is it though? I think TM Signal is just emailing the chats to a server from the phone it's installed on.


> If you don't trust them, why are they in the chat in the first place?

Journalist? Taliban negotiator? Ex-wife?


You are conflating "trust in all ways" with "trust to receive the communications in the specific chat they are party to". The former is not relevant.


Well, the ex-wife in question can be trusted to receive it a-okay, and to screenshot it and send it to her lawyer and the cops too, depending on the contents. So can US government officials. Now we just know exactly how they do it.


Or with the more affordable (in terms of skills) method of using another phone to take pictures of key messages on the screen of the first one.


OK, say you're a bank. The SEC does not care what you do and is actively working to make sure nobody else does either. You never get fined, all the traders are whatsapping about deals, and it's awesome. But what if the SEC decides to care in the future? Just mark all your messages as self-deleting. But what if you want to be able to read them in the future?

And then the Smarsh (owners of Telemessage) salesman calls you, and says "your users can keep using the apps they love - WhatsApp, Signal - but we archive the self-deleting messages somewhere you can hide from the SEC if they happen to change their mind". And everyone loves it (you already fired all the Security or Legal teams).


The purpose of using something like Signal is not compatible with the needs of the government or the law.

I’ve worked for non-federal government. Your work product is not your own, and the public interest, as expressed by the law, requires that your communications and decisions can be reviewed by the government you serve.

The US government created the dark web to enable espionage - it's pretty obvious why they need to read their employees' mail.


My guesses:

You want to talk to people who want to use Signal, but you yourself don't care about E2E

You trust TeleMessage, but not Telegram or Meta. And you want convenient archiving.


Maybe someone wanted to satisfy the procedures of the law but also had to please the bros. The result is a hack of a secure program that adds conversation archiving.


My wild speculation is that someone wants to use AI to monitor everyone’s communication.


What they should have done is write a bot that you invite to every conversation for "archival purposes". No new app.
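
As a rough sketch of that idea - the "bot" is just another registered account that sits in the chat and writes what it receives to an archive. A minimal example built on signal-cli, assuming it is installed and registered for a dedicated archival number (the number, file path, and JSON field names are assumptions, not anything TeleMessage ships):

    # Sketch of an "archival bot": an ordinary chat member that logs what it receives.
    # Assumes signal-cli is registered for BOT_NUMBER; the JSON envelope fields are
    # simplified and should be checked against the signal-cli documentation.
    import json, subprocess

    BOT_NUMBER = "+15550001234"        # hypothetical dedicated archival account
    ARCHIVE_PATH = "signal-archive.jsonl"

    def poll_once():
        # signal-cli prints one JSON envelope per line with --output=json
        out = subprocess.run(
            ["signal-cli", "-a", BOT_NUMBER, "--output=json", "receive"],
            capture_output=True, text=True, check=True,
        ).stdout
        with open(ARCHIVE_PATH, "a") as f:
            for line in out.splitlines():
                envelope = json.loads(line).get("envelope", {})
                msg = envelope.get("dataMessage")
                if msg:  # skip receipts, typing notifications, etc.
                    f.write(json.dumps({
                        "from": envelope.get("sourceNumber"),
                        "timestamp": envelope.get("timestamp"),
                        "body": msg.get("message"),
                    }) + "\n")

    if __name__ == "__main__":
        poll_once()

Everyone in the chat can see the archiver is there, which at least makes the change in trust model explicit.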


Then you have a new attack surface. It's still missing the point of Signal.


If your institution has to log the messages, they are a third party to the conversation; I would rather they were "in the chat" than a lowest-bidder third party.

A chat-participant bot would also be handy if you wanted to feed everything through your AI bot at the same time.



