TrueCrypt was much more flexible than anything Windows has to offer.
Bitlocker is great for enterprise-style encryption, in particular on machines with TPM chips. However, many consumer machines do not include a TPM, even in 2015.
TrueCrypt allowed you to encrypt individual drives, even offline drives, with none of Bitlocker's overhead. You also weren't required to decrypt them on every boot, the way Windows' Bitlocker insists.
Additionally, TrueCrypt offered encrypted directories and USB drives, hidden volumes, a choice of encryption algorithms, cascaded (double) encryption, and so on.
Plus it was cross-platform friendly (or at least more so than BitLocker). What are we meant to use to move encrypted data from Linux to Windows now? 7Zip w/AES 256?
Not to mention how much the US government advocates against encryption it can't backdoor. They have a lot more influence over the encryption in Windows than over TrueCrypt. I'm not suggesting they DO have a backdoor into Bitlocker (there's not enough evidence), but the probability is much higher.
The short story is that if you use Windows 8/8.1 and have a Microsoft account, it will upload your BitLocker recovery keys by default. Seems to me like a backdoor if ever I heard one.
I find tarsnap much easier/better than any block-crypto mount: intelligent backups that only sync what you've changed. LUKS containers and tc-play exist if you must use a mounted container.
Both Bitlocker and Filevault allow this same UX. On Windows, you create and encrypt a VHD. On OSX, you create an encrypted DMG. There's even an OSX tool, "Knox", that (when we used it) did a really great job managing lots of little encrypted volumes.
If you want to move an encrypted file from Linux to Windows, though, you should use something like PGP.
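For the Linux-to-Windows case, a symmetric (passphrase-only) GnuPG run is about the simplest "something like PGP" that works on both ends. A minimal sketch, assuming GnuPG is installed on both machines (gpg on Linux, Gpg4win on Windows); the filename is a made-up example:

    import subprocess

    # Linux side: encrypts with a passphrase (no keypair needed) and writes
    # casefiles.tar.gpg next to the original.
    subprocess.run(
        ["gpg", "--symmetric", "--cipher-algo", "AES256", "casefiles.tar"],
        check=True,
    )

    # Windows side (Gpg4win ships the same gpg binary):
    #   gpg --decrypt --output casefiles.tar casefiles.tar.gpg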
So tptacek argues that sector-based full-disk encryption is inherently vulnerable, especially if used on the boot volume, because if someone grabs your laptop, everything's still loaded in memory.
What if my use case is different: keeping just a particular set of documents not in constant use secret? Perhaps stored on a removable drive? Truecrypt is great for this. It does have the risk of information leakage via tempfiles and swap, but it also makes you a lot harder to raid unless you've got the incriminating document open on your screen in a cafe (you fool).
(I was asked by someone I know who works in international human rights "How do I get my case files safely across borders?" and didn't have a good answer.)
If the documents aren't in constant use, the most secure way to encrypt them is with a userland program like PGP. Userland crypto knows where files begin and end, and can store metadata to improve the encryption. They can provide cryptographic integrity --- far more powerful than the incidental integrity check Bitlocker tried to provide, or the virtually zero integrity that XTS provides. They're randomized, so the ciphertext can have semantic security; it reveals nothing at all about the plaintext, even as the files are edited in place under the same key.
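To make the "randomized" and "integrity" points concrete, here is a minimal sketch of userland authenticated encryption using Python's third-party cryptography package; the key handling and file contents are illustrative assumptions, not any particular tool's design:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)  # in practice, derived from a passphrase
    aead = AESGCM(key)
    plaintext = b"contents of a case file"

    # A fresh random nonce per encryption means encrypting the same file twice
    # (or an edited version of it) yields unrelated ciphertexts under the same key.
    nonce = os.urandom(12)
    ciphertext = aead.encrypt(nonce, plaintext, None)  # appends a 16-byte auth tag

    # Decryption verifies the tag; tampering raises InvalidTag rather than
    # silently returning corrupted plaintext.
    assert aead.decrypt(nonce, ciphertext, None) == plaintext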
Sector crypto can't do anything even approximating this without contortions like geli.
If I was trying to protect files from nation state adversaries, I would not consider Truecrypt.
That doesn't mean I think you shouldn't run something like Truecrypt. I think you're better off with whatever your OS provides, but some kind of sector-level crypto, be it Bitlocker, Truecrypt, or Filevault, is still useful.
But if you're serious about protecting a specific set of files, encrypt them manually, no matter what else you do.
Can PGP do true edit-in-place, or do I have to decrypt to local disk first? Because decrypting to local disk is very likely to leave plaintext lying around somewhere unless my primary SSD supports "secure erase free space" and I remember to use it.
Good point. It doesn't; you need to securely delete temporary files. Mitigating that:
* Sector-level crypto is cryptographically incapable of secure in-place editing; it can gradually leak information about the plaintext as edits happen (there's a small sketch of this below). That's not a big deal for a PDF, which isn't edited live in real time, but it can be a big deal for other kinds of files. I tend to err on the side of systems-programming weaknesses rather than crypto weaknesses; we're better at dealing with them.
* No matter what kind of cryptography you're using, the assumption you should be making is that the plaintext is at some point exposed to anyone who owns your live, running system.
I think concern about unlinked plaintext-containing sectors is reasonable, and a good reason to use both sector-level crypto and file-level crypto. I use both, as does everyone at Matasano.
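For contrast, here is a toy illustration (again with the Python cryptography package, using a made-up key and sector number, not TrueCrypt's actual code) of why sector-level XTS is deterministic: the same data written to the same sector under the same key always produces the same ciphertext, so anyone who images the disk at two points in time learns which sectors changed or reverted.

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(64)                   # AES-256-XTS uses a 512-bit (double-length) key
    tweak = (1337).to_bytes(16, "little")  # tweak derived from the sector number

    def encrypt_sector(data: bytes) -> bytes:
        enc = Cipher(algorithms.AES(key), modes.XTS(tweak)).encryptor()
        return enc.update(data) + enc.finalize()

    block = b"attack at dawn!!"            # one 16-byte block standing in for a sector
    # Same data, same sector, same key -> identical ciphertext every time.
    assert encrypt_sector(block) == encrypt_sector(block)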
Not sure why you've been downvoted. There is no active development on the project any more and it doesn't work with UEFI.
There's no free full disk encryption for Windows users with modern (UEFI) boxes. The money would be better spent on that, but returning crowdfunded money... tricky.
This is extremely unlikely. The Windows API doesn't change that drastically between versions, which is why people don't have to recompile their programs for newer or different versions of Windows.
Windows maintains good compatibility for userspace applications, but they don't guarantee driver compatibility. A Windows 95 program might run on Windows 7, but drivers for Windows 95 probably wouldn't!
Truecrypt manages to create virtual disks that look a lot like regular disks, so I assume it does some kernel-level stuff to make that work.
Ah, oops. But yeah, overall I think auditing TrueCrypt is just as relevant today as it was back when development had stalled and they raised the funds.
I don't see why people should spend money auditing it, instead of building maintainable alternatives
When someone builds a new one (as happens every other day), we'll have to start again, right back at the beginning, fixing all the bugs in it and auditing it. About halfway through that process, someone will come along and say "I don't see why people should spend money auditing it, instead of building maintainable alternatives"...
There will always be new projects doing things in different ways. That's no reason not to make sure that something we have now works properly.
People have already spent the money to audit it, back before development was officially discontinued. What do you expect them to do with the money? Return it to the people who donated?
One thing I noticed looking through the code is that the key generation on Windows mixes in a CRC32 of a MOUSEHOOKSTRUCT. If you look at it, there isn't a huge amount of entropy in there. Some fields, such as the window handle, don't change between callbacks. Others, such as the hit-test code, are enums with limited possible values, and the way most people move the mouse around will return the exact same value all the time. The difference in time between two callbacks is run through CRC32 a few times, and then the whole thing is run through a real hash. Most users don't bother adding entropy from the keyboard.
While I don't think any of this is a vulnerability, I think it could be better.
[edit: I'm talking about Common/Random.c in 7.1a. And by better I'm suggesting additional sources of entropy be included in the process]
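As a back-of-the-envelope illustration of the concern (a toy model, not the actual Common/Random.c logic; the field values are invented): if the only thing that really varies between callbacks is a small timing delta, a CRC32 of the sample can only take on a handful of distinct values, i.e. just a few bits of entropy per event.

    import math
    import zlib

    def sample_crc(delta_ms: int) -> int:
        # Stand-in for the hooked struct: a constant window handle and hit-test
        # code, plus a timing delta that in practice spans a narrow range.
        hwnd, hit_test = 0x000A0B2C, 1
        blob = (hwnd.to_bytes(4, "little")
                + hit_test.to_bytes(4, "little")
                + delta_ms.to_bytes(4, "little"))
        return zlib.crc32(blob)

    distinct = {sample_crc(d) for d in range(16)}  # deltas of 0-15 ms between events
    print(f"{len(distinct)} distinct CRC32 values, about {math.log2(len(distinct)):.1f} bits per sample")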
It's pretty easy to explain - and in fact, you got your answer[0]
> Block-level encryption is a terrible, terrible approach for many reasons (which 'tptacek has referenced a million times). However, Truecrypt is the best such implementation, and it's a required approach in certain cases. You should be doing crypto at the application/filesystem level; if you can't, use Truecrypt. This isn't contradictory advice.
The only reason this even seems remotely contradictory is because you've taken Thomas's statement completely out of context (perhaps because it's nested about 50 lines in from the top-level comment that even provided the context in the first place).
Alternatively, it's only contradictory if you take a black-and-white, all-or-nothing interpretation of what Thomas says... which is quite ironic, because one of his key criticisms of Truecrypt is that it is all-or-nothing, as stated in the very same post that you quote[1].
The money was raised to audit TrueCrypt before the developers abandoned it. The money must go toward that audit. The fact that the TrueCrypt devs abandoned the project changes what we should expect from it. Specifically, we should expect that moving forward, it's going to be a bad idea to use it.
This viewpoint isn't just tptacek's. When the TrueCrypt devs shut down TrueCrypt, they posted in big red letters "TRUECRYPT SHOULD BE CONSIDERED INSECURE" or something to that effect. They did that because, like tptacek, they are responsible crypto devs and are doing their duty: when no one is actively maintaining a project, it is inherently insecure because the security landscape changes so rapidly.
If the TrueCrypt devs were to step out of the shadows with some money to audit TrueCrypt's current codebase, yet maintained their stance that not having an active dev team makes the project insecure, would you aggressively badger them for their "contradictory beliefs"?
There is value to be had in auditing an insecure project's open-source codebase. If any security problems are discovered, users will be able to assess their potential impact based on how they were previously using TrueCrypt. If they have old images lying around that turn out to be decryptable, they will be able to delete them before someone else discovers the flaw and steals their data.
Secondly, if the codebase survives the audit relatively unscathed, then it serves as an example of how to write production crypto code (as of the time it was written, not presently!), much as tarsnap is currently such an exemplar. The TrueCrypt code can't be used directly due to licensing issues, but it nonetheless serves as a "here is how to use these arcane Windows APIs in a security context" guide. Such guidance will be extremely valuable for future, similar projects.
Lastly, I am kind of afraid to talk to you at all in case I incur your bullying wrath somehow, because if that were to happen, you'd kill the fun of HN for me. I imagine you're killing the fun of HN for tptacek.
Actually, tptacek isn't suggesting that the murky status of TrueCrypt's source code is why you shouldn't use it, he's suggesting that the disk encryption mode itself (XTS) is why no one should use it.
> Thomas changed his comments about 50 times between when we were replying to one another and now, so in all likelihood what I've written makes no sense in relation to what he's written anymore.
I was watching his comments and your replies back then in real time, and this did not happen at all.
You were seriously watching all of our interactions over the course of the past 6 hours, with enough detail so as to be able to diff individual comments over the entire time period?
Not all of your interactions over the past 6 hours as of that time, just most of the back-and-forth. In particular, with comments being posted up to "10 minutes ago" there's a chance of ninja-edits made to be more polite (something I often do myself), and checking later I recognized all the sentences I'd read.
He added in the positive comments about TrueCrypt. Previously, his comments read as if he were completely opposed to all forms of TrueCrypt's use.
He also removed quite a bit of negativity and snark from his comments. Without those, the tone of the conversation shifts considerably.
What I'm trying to get at here is that there's a need for a TrueCrypt-like product, and Thomas seems to think there isn't, or didn't the last time I read his comments. He was suggesting that anyone who wants to actually encrypt their files should use PGP. I was arguing that such a strategy isn't realistic given the UX/workflow that TrueCrypt provided.
As I said earlier, what do you think the function of this comment is?
Of course you'd say something to this effect. The purpose of a comment would be to elaborate on why what I said was wrong; providing examples or evidence to the contrary is a great way to further any conversation.
I, unfortunately, have no evidence. So it's just my word. I realize that's not super valuable, but that's what I've got.
The point is that you were worsening the quality of HN by prosecuting an off-topic and seemingly personal agenda and taking the thread way into the weeds. Please don't do that. To steal a phrase from the HN guidelines, it never does any good and makes for boring reading.
If you have concerns about other things on HN, we'd be happy to look into them for you, but the way to make that happen is to email hn@ycombinator.com as the guidelines ask.
As a casual user, I would like to know how I am endangering users by encouraging them to use TrueCrypt? And what should I be encouraging them to use instead?
If your disk is mounted when your computer is stolen or confiscated, your data is accessible.
If your adversaries want your data, FDE will help them and not you. If you have secrets that would put you in danger if revealed, you would now be in danger.
The alternative is file-level encryption. The only accessible files at any given time are the ones you're using, so if you are relieved of your laptop on short notice, not all beans will be spilt.
File-level encryption is a pain in the neck to work with. FDE is much more convenient, and a pretty good answer if your threat model doesn't include "drive by laptop snatching" as a major concern. This is most people.
> If your adversaries want your data, FDE will help them and not you.
Sorry to nitpick, but it may confuse a casual user: FDE will not help your adversaries, it just won't help you. That is, in those circumstances (i.e., when your disk is mounted) FDE won't have any effect.
Agreed. The comparison which wasn't clear in that sentence was to file-level encryption.
At the point of seizure, your chosen data protection method is either helping your adversaries by offering all files unimpeded, or helping you by not doing so.
It's equivalent to having no data protection at all, if your adversaries are competent.
Ah, I see. Am I correct then that the cautionary note was just in general about the pitfalls of FDE if a mounted drive is compromised while mounted? I think I misinterpreted it to mean that there was a specific problem with TrueCrypt which leads its use to endangering users.
In my own use case and in that of people I've recommended TrueCrypt to, having the hard drive apprehended while shut down, specifically during customs checks, is a far greater risk than having the computer compromised while the encrypted drive is already mounted.
Given that particular threat model, is it OK for me to continue to use and recommend using TC?
If you cannot be surprised while the drive is mounted, and cannot be compelled to mount it (maybe just by booting the machine) then there is no known specific additional risk to running FDE.
Note that "surprised" might include a networked attack, in addition to being tackled in a coffee shop.
File-level encryption protects your data until you reach the court order level of compulsion, and possibly further. At least in civilized countries.
So, given those caveats, I'd say your answer is "yes", but...the best plan is to do both. FDE as a matter of policy, and file-level on any files of specific value.
Thanks again. Can you please clarify what is meant by networked attack in this context? Someone gaining access to the mounted drive over a network, or something else?
Sure. If your FDE disk is mounted and your machine is susceptible to any kind of remote exploit (OpenSSL, Adobe Flash, weak ssh password, etc.), then the attacker has free rein over your disk when they arrive.
File-level encryption constrains them to just the files you have open at the time, although of course any breach might be persistent, so they could theoretically wait around until supersecret.txt gets opened and grab it then.
Though I'm no tptacek, I think the reasoning here is that even though TrueCrypt is undergoing an audit, it's not under active development and unfortunately due to the licensing, any patches produced by the community could be on uncertain footing legally.
It's dangerous to encourage truecrypt to the exclusion of other options. It's dangerous to use truecrypt in inappropriate situations. At the same time it's good to make sure truecrypt is the best it can be at what it does, even if that category is fundamentally limited.
To get more explicit, it's the difference between "using" truecrypt and "relying" on truecrypt. You want to give people an accurate picture of their options and the tradeoffs. "relying" on truecrypt is dangerous.
"<Popular product> shouldn't be relied on!" What he doesn't say is exactly what you just said, that it works for the majority of use cases. It all hinges on this liberal interpretation of the word "rely".
"Top 10 things your cryptographer doesn't want you to know!"
It's marketing, and it's annoying. TrueCrypt is just fine for the majority use case, and Thomas knows that, but that doesn't get attention. Saying bombastic things like, "Don't use TrueCrypt!" gets attention.
So I gave him some attention. Hope that's what he wanted.
You're the one making a big deal out of a statement as simple as "use something better if possible", trying to turn it into a contradiction so he can be "wrong".
Use cases where the security fails are a huge red flag. Mentioning red flags is not sensationalism.
That's not what he said, he said it's actively harmful to promote the use of TrueCrypt. That's a world of difference from "use something better if possible".
You can be proud: you raised 10000% more money than the TrueCrypt developers made over several years of work, and now you act like nothing happened. This audit was the reason TC went offline.
I think he was getting at a theory that TrueCrypt's development was abandoned because of the audit, for some reason. Maybe out of pride (the audit questioned their integrity by looking for backdoors) or out of fear that something nefarious would be uncovered.
The first is plausible, the second doesn't sound like the reaction an anonymous author would have.
Most plausible reason the devs abandoned it imho is that they got bored.
TrueCrypt is a discontinued source-available freeware utility used for on-the-fly encryption. It can create a virtual encrypted disk within a file or encrypt a partition or the entire storage device.
Since TrueCrypt is dead, what is the purpose of auditing it?
Is there an official team that is ready, willing and able to take over maintenance and development?
If anything should be audited I would think it should be one of the forks.
But the author makes the following comment on the other thread:
----------------------------------
Also: speaking in no "official" capacity whatsoever, I'd advise you to stay away from the forks of Truecrypt.
Unless something new has come to light since last I looked, the licensing situation on the TC code is weird:
http://lists.freedesktop.org/archives/distributions/2008-Oct....
... which means there is a pretty strong disincentive for people with serious crypto and systems expertise to invest their time and energy building on it. You don't want to trust crypto platforms with built-in adverse selection problems.
---------------------------
So the forks have issues and the main project is dead??
But we needz more money to audit it more?
I am also unclear on what the first expensive audit accomplished if it did not cover encryption. It sounds like having an inspection on a house that covers part of the roof and two rooms and nothing else, not even the foundation. Does anyone have a link to the original campaign to raise money?
Where do you see them asking for more money? They talk about how they're going to make what they have last longer, but they don't ask for additional donations.
> Since TrueCrypt is dead, what is the purpose of auditing it?
To access existing volumes. All my backup CDs were TC encrypted.
And I haven't found a replacement that is all of: easy to use, stable, and cross-platform. New encrypted volumes I make are LUKS, but on Windows FreeOTFE is even deader than TrueCrypt. It still works, mostly, but could be a lot better.
If someone needs to access existing truecrypt volumes in order to migrate data, I don't see how the audit helps, since the only way to get at the data is to use truecrypt.
I guess the alternative is to just lose all the data.
If the audit found any problems then it would most likely mean that further use was problematic, not that migrating existing volumes was problematic.
> Since TrueCrypt is dead, what is the purpose of auditing it?
It makes sense to audit the common ancestor rather than a single one of the derivatives because you'll cover more bases. If TC is found to have a critical vulnerability then you'll know all forks have the problem. If one fork is found to be vulnerable then you'll need to figure out whether it's a TC vulnerability or some subtle dependency on the new code.
> Since TrueCrypt is dead, what is the purpose of auditing it?
Because people still use it. There are no good, cross-platform alternatives at the moment. And even though they said it "might" have vulnerabilities, none have ever been found despite the project being open source. So yeah, it's a great thing that the audit is still going ahead.