The original headline ends in a question mark, cites one person from Twitter who claims a 2004 (?) conversation, and still offers no explanation of why any government might want to serve TrueCrypt a warrant, given that they don't store user data (unlike Lavabit) and all their source code was open anyway.
Truecrypt is a high quality open source project that has been updated diligently for many years. For its developers to abandon it in such an immature fashion was highly bizarre. The possibility that they were sending a deliberate signal with this act was one of my first thoughts. I don't consider the article posted here to be plausible evidence, but it certainly gives voice to my suspicions.
TrueCrypt's developers, if they indeed live in the U.S., may have done as much as they can. It is now up to the audit team to evaluate TrueCrypt as thoroughly as possible and, should TrueCrypt prove sound, it will be up to international teams to pick up the torch. It's certainly possible that a completely different country is cracking down on TrueCrypt's developers, but this action fits the NSA's modus operandi perfectly.
Just an old, slightly jaded reverser who hung around the right places, and you don't really _need_ to care: it doesn't tell me what happened, just that a 'something bad happened' whistle was blown. It could be something as simple as one dev leaving, or another dev going nuts.
Does it actually change the response, at all, I wonder? 7.1a is known, and being independently audited anyway. (It looked pretty good back when I took a look at it, although that wasn't 7.1a and that was a more cursory look than this thorough audit process.) No duress can change the fact of what the code does (although it could change what the site serves, what the 1024-bit DSA signing key signs...). 7.2 really isn't useful for that, at all.
It'll be interesting to see what ultimately comes of all this mess; maybe someone will fork it from 7.1a (won't be me: the build process is way too hairy and the licence is troublesome, even from back when it was an EFTM fork), or maybe people will move to DiskCryptor or something and improve/audit that (it _is_ under a better licence).
No matter what actually happened - or happens - it's a huge shame. TrueCrypt was practical, effective, easy-to-use, and strong when used properly (as far as I know), with some (now sadly-removed) good documentation on how to use it correctly and what it can and can't protect against well. I really hope this débâcle doesn't drive people to switch to weaker crypto.
As many have pointed out, BitLocker - even if it's faithful - isn't available on many editions of Windows, isn't cross-platform, and only really supports the equivalent of keyfiles if you're not packing a TPM. It's a poor substitute for the ways many people use TrueCrypt.
LUKS is much better, although OS support isn't anywhere near as easy to use, and I'd say it's broadly equivalent with aes-128-xts/aes-256-xts, except that LUKS has a clearly identifiable header (plain dm-crypt doesn't have to) and TrueCrypt (unless you have the passphrase) doesn't. (Whether that matters is doubtful: a disk full of random data looks 'probably encrypted' to anyone who'd care to look, and to the guy with the $5 wrench or jackboots, that's good enough.)
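To make the "clearly identifiable header" point concrete, here's a minimal Python sketch (mine, not from the thread): a LUKS1 volume announces itself with fixed magic bytes at offset 0, so a trivial check flags it, whereas a TrueCrypt header is just high-entropy bytes with nothing to match against. The synthetic buffers below are illustrative, not real volumes.

```python
import os

# First 6 bytes of every LUKS1 header: "LUKS" followed by 0xBA 0xBE.
LUKS_MAGIC = b"LUKS\xba\xbe"

def looks_like_luks(header: bytes) -> bool:
    """Return True if the buffer starts with the LUKS1 magic bytes."""
    return header[:6] == LUKS_MAGIC

# Synthetic examples: a fake LUKS header vs. TrueCrypt-style data,
# which (without the passphrase) is indistinguishable from random bytes.
fake_luks = LUKS_MAGIC + os.urandom(1018)
fake_truecrypt = os.urandom(1024)

print(looks_like_luks(fake_luks))       # True
print(looks_like_luks(fake_truecrypt))  # almost certainly False
```

Of course, as noted above, "no magic bytes" only buys you so much: a partition full of uniformly random data is itself conspicuous to anyone who bothers to look at its entropy.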
I can't comment on the Mac one; I've never really taken a long look.
It would also be interesting to see similarly close audits of the OS built-ins. I know I'm curious about the exact changes Windows 8 made to BitLocker, because they were fairly extensive internally. Maybe I'll take a look. Maybe you should, too.
Could you verify that you are who you say you are? And sorry for not knowing what you're associated with; a Google search for your name isn't very helpful in finding out who you are. What relation do you have with the TrueCrypt devs?
EDIT: I also didn't realize that you had replied to my original comment, sorry about that.
No relation to the devs. Just a decade-old conversation about PGPdisk, IVs, and the trouble export laws might cause, so it's perfectly OK not to put _too_ much stock in it.
I'm curious: do we have any definitive examples of warrant canaries being triggered in public? One prominent canary is rsync.net's [1], which has been published weekly for a long time.
A canary that no one knows about is a <understate>little</understate> pointless... Nonetheless, it feels like a Lavabit-esque shutdown. The intrigue continues.
Well isn't a warrant canary that is public knowledge just considered a method of breaking NSL silence? To be safe, a warrant canary would need to be plausibly deniable.
If I were a three letter agency who illegally threatened some crypto developer with unsavory things, willing to send him to Guantanamo or to outright kill him (as some people on HN obviously can imagine very well)... I'd just shrug my shoulders after this highly conspicuous way of shutting down the project and think "well played, Mr. Developer, no hard feelings".
How do NSLs work in practice in large organizations? This might not make sense for a small project like TrueCrypt, but if someone like Microsoft or Google receives an NSL, presumably multiple people have to see and handle it -- what keeps one of them from taking a picture of it on their cell phone and leaking it anonymously? (Of course they'd need to sanitize the embedded metadata and timestamps.)
If they can't prove which specific employee leaked it, who do they punish?
The question remains: how would they find out the devs' identities in the first place, even if this were true? We still don't even know who the hell Satoshi Nakamoto really is. Would a secret agency know?
If it were a warrant canary, doesn't openly stating that the update is obviously a canary put the TrueCrypt team in legal trouble? Communicating about the warrant, even cryptically, is still communicating about the warrant.