TAG | cryptography
In the article below, Attorney General Eric Holder said, "It is fully possible to permit law enforcement to do its job while still adequately protecting personal privacy."
This is simply not true, and harks back to the discredited arguments the FBI made in the 1990s about the Clipper Chip. It is hard enough to build secure computing systems, and as all the breaches demonstrate, we are not very good at it. Intentionally introducing a vulnerability, which is the essential nature of back-door or law-enforcement access, is madness. If there is a back door, then keys exist, and they can be compromised or reverse engineered. It also adds complexity to the system, which is almost certain to introduce other vulnerabilities. And its use would not be restricted to the US: once it exists, every government will demand access.
Social media and the cloud have tilted the balance of power absurdly toward law enforcement. The argument that it must also retain access to encrypted cell phones is fatuous.
The Massachusetts High Court recently ruled that a suspect can be compelled to decrypt disks, files, and devices seized by law enforcement. The crux of the question before the court was whether compelling the password for decryption is forbidden by the Fifth Amendment protection against self-incrimination.
The analogy one most often sees is to being compelled to provide the combination to a safe whose contents are subject to a search warrant. That is well-settled law: you can be compelled to do so.
The court said:
We now conclude that the answer to the reported question is, "Yes, where the defendant's compelled decryption would not communicate facts of a testimonial nature to the Commonwealth beyond what the defendant already had admitted to investigators." Accordingly, we reverse the judge's denial of the Commonwealth's motion to compel decryption.
In this case, there was nothing testimonial about decrypting the files because the defendant had already admitted to owning the computers and devices, and to being able to decrypt them.
The much more interesting situation will come in a case where the defendant says he never had, or has forgotten, the password. One cannot be compelled to do something impossible, but the burden of proving impossibility generally falls on the defendant. Here that would mean proving a negative: how could you prove that you don't have the password? The only thing that can be proved is that you do have it, and only by producing it.
This ruling is only binding in the state of Massachusetts, but it is likely to be influential in other jurisdictions.
Steve Gibson shares recent message exchanges with some of the developers of TrueCrypt. These further suggest a boring explanation for the shutdown, as opposed to more nefarious ones.
For years, TrueCrypt has been the gold-standard open source whole-disk encryption solution. Now there is a disturbing announcement on the TrueCrypt website. Right at the top it says, "WARNING: Using TrueCrypt is not secure as it may contain unfixed security issues."
The rest of the page has been replaced with a notice that development on TrueCrypt stopped this May, along with directions for migrating from TrueCrypt to BitLocker, the disk encryption tool built into Windows. Of course, this is of little help to anyone using TrueCrypt on Mac or Linux. It is still possible to download TrueCrypt from the site, but the code will no longer create new vaults, and it warns users to migrate to a new platform.
There are certainly alternatives, but this is a real shock. On the Mac, one can always use the built-in FileVault tool. Linux users may have a harder time finding a good replacement.
The big question is: what the heck is actually going on here? This is all far too cryptic, with nowhere near enough actual information to draw intelligent conclusions.
A recent independent audit of TrueCrypt discovered “no evidence of backdoors or otherwise intentionally malicious code in the assessed areas.”
There are a number of theories about what is going on, ranging from the mundane to the paranoid.
- Like Lavabit, they received a National Security Letter requiring compromise of the code. This is their way of resisting without violating the gag order.
- They have been taken over by the government, and they are trying to force everyone to move to a less secure / more compromised solution.
- There really is a gigantic hole in the code. Releasing a fix would tell attackers the exact nature of the vulnerability, which most users would take a very long time to patch. Having everyone migrate is the safest solution.
- Some personal conflict within the TrueCrypt developers is leading to a “take my ball and go home” action.
- The developers only cared about protecting Windows users running XP or earlier, which lacked built-in disk encryption. Now that XP support has ended, they no longer feel the project is valuable. This is suggested by the full wording of the announcement.
- The website or one of the developer’s computers was compromised, and this is a hack / hoax.
The whole thing is really odd, and it is not yet obvious what the best course of action might be.
The safest option appears to be to remove TrueCrypt and replace it with some other solution, either one built into the OS or one from a third party.
Attorney General's new war on encrypted web services – iTnews.com.au
Australia’s Attorney-General’s department is proposing that all providers of Internet services ensure that they can decrypt user communications when so ordered. Any services where the provider has the keys will obviously be able to do this.
Australians may want to start taking steps to protect themselves now.
End-to-end encryption is your friend. At least that way, you must be informed and compelled before anyone can get access to your data.
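To illustrate the principle, here is a toy end-to-end sketch in Python. The key exists only on the two endpoints, so any relaying service sees nothing but ciphertext. The XOR one-time pad and the names used here are purely illustrative assumptions, not a real protocol; actual end-to-end systems use vetted designs.

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR each plaintext byte with the corresponding key byte.
    # A one-time pad is only secure if the key is truly random,
    # at least as long as the message, and never reused.
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

# The key lives only on the two endpoints; the service provider
# relays ciphertext it cannot read, so it has nothing to hand over.
key = secrets.token_bytes(32)
message = b"meet at noon"
ciphertext = otp_encrypt(message, key)
assert otp_encrypt(ciphertext, key) == message  # XOR is its own inverse
```

The design point is simply that when the provider never holds the key, a decryption order must target the endpoints, not the service.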
Another important step is to get your “in the clear” communications into another jurisdiction using a VPN service like Anonymizer Universal.
Finally, let your voice be heard on this issue by reaching out to your members of parliament.
There is a good analysis of the nature and implications of the latest “Bullrun” leaks over at A Few Thoughts on Cryptographic Engineering. It is worth reading.
Yesterday Google announced that it was updating its certificates to use 2048-bit RSA public keys, replacing the previous 1024-bit keys.
I have always found the short keys used by websites somewhat shocking. I recall discussions back in the early 1990s about whether 1024 bits was good enough for PGP keys. Personally, I liked to go to 4096 bits, although that was not really officially supported.
The fact that, 20 years later, only a fraction of websites have moved up to 2048 bits is incredible to me.
Just as a note, you often see key strengths described by bit length, with RSA being 1024 or 2048 bits and AES being 128 or 256 bits.
This might lead one to assume that RSA is much stronger than AES, but the opposite is true (at these key lengths). The problem is that the two systems are attacked in very different ways. AES is attacked by brute force, searching through all possible keys until the right one is found. If the key is 256 bits long, you need to try, on average, half of the 2^256 possible keys. That is about 10^77 keys (a whole lot). This attack is basically impossible for any computer we can imagine being built, in any amount of time relevant to the human species (let alone any individual human).
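The arithmetic is easy to check for yourself; Python's arbitrary-precision integers make the size of a 256-bit key space concrete:

```python
# Size of the 256-bit AES key space, and the average brute-force effort.
keyspace = 2 ** 256
average_tries = keyspace // 2  # on average, half the keys must be tried

print(len(str(keyspace)))       # 78 decimal digits, i.e. about 1.2 x 10^77
print(f"{average_tries:.3e}")   # still an astronomically large number
```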
By comparison, RSA is broken by factoring the key's 1024- or 2048-bit modulus into its two prime factors. While very hard, factoring is much easier than brute force. It is generally thought that 1024-bit RSA is about as hard to crack as 80-bit symmetric encryption. Not all that hard.
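A toy sketch shows the structure of the RSA attack: recover the two primes from their product. The numbers below are tiny assumptions for illustration; trial division is hopeless at real RSA sizes, where attackers use far more sophisticated algorithms such as the number field sieve, which is still dramatically faster than brute-forcing a comparable symmetric key.

```python
def trial_factor(n: int) -> tuple[int, int]:
    # Naive trial division: fine for toy numbers, hopeless at 1024+ bits,
    # but it shows that factoring is a structured search, not a blind
    # enumeration of every possible key.
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n was prime

# A toy "modulus": the product of two 16-bit primes.
p, q = 65521, 65537
n = p * q
assert trial_factor(n) == (p, q)
```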
Nokia's Asha and Lumia phones come with something the company calls the "Xpress Browser". To improve the browsing experience, web traffic is proxied and cached. That is a fairly common and accepted practice.
Where Nokia has stepped into questionable territory is doing this for secure web traffic (URLs starting with https://). Ordinarily it is impossible to cache secure web pages: the encryption key is unique, used for only a single session, and negotiated directly between the browser and the target website. Even if the traffic were cached, no one would be able to read the cached data.
Nokia is performing a "man-in-the-middle attack" on the user's secure browser traffic. It does this by having all web traffic sent to its proxy servers. The proxy then impersonates the intended website to the phone, and sets up a new secure connection between the proxy and the real website.
Ordinarily this would generate security alerts, because the proxy does not have the real website's cryptographic certificate. Nokia gets around this by creating new certificates signed by a certificate authority it controls, which is pre-installed and automatically trusted by the phone.
So, you try to go to Gmail. The proxy intercepts the connection and gives you a fake Gmail certificate signed by the Nokia certificate authority. Your phone trusts it, so everything goes smoothly. The proxy then securely connects to Gmail using the real certificate. Nokia can cache the data, and the user gets a faster experience.
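This kind of interception is exactly what certificate pinning is designed to catch. A sketch of the idea, with toy byte strings standing in for real DER-encoded certificates (all the names and values here are illustrative assumptions): the client compares the fingerprint of the certificate the connection actually presented against a fingerprint obtained out of band, so a proxy-minted certificate fails the check even though the phone's trust store accepts its signer.

```python
import hashlib

# Hypothetical pinned fingerprint of the site's real certificate,
# obtained out of band (e.g. shipped with the app).
PINNED_SHA256 = hashlib.sha256(b"real-gmail-der-cert").hexdigest()

def pin_matches(der_cert: bytes) -> bool:
    # A forged certificate has different bytes, so its SHA-256
    # fingerprint cannot match the pinned value, regardless of
    # which certificate authority signed it.
    return hashlib.sha256(der_cert).hexdigest() == PINNED_SHA256

assert pin_matches(b"real-gmail-der-cert")
assert not pin_matches(b"forged-cert-from-proxy-ca")
```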
All good right?
The fly in the ointment is that Nokia now has access to all of your secure browser traffic in the clear, including email, banking, etc.
They claim that they don’t look at this information, and I think that is probably true. The problem is that you can’t really rely on that. What if Nokia gets a subpoena? What about hackers? What about accidental storage or logging?
This is a significant breaking of the HTTPS security model without any warning to end users.
Matt Blaze analyzes why the widespread use of cryptography has had almost no impact on our practical ability to do wiretaps and gather information under legitimate court orders. Not too technical, and absolutely worth a read.
Posted by lance in Computer Security, Cryptography, First Amendment, Innovation, Internet, legal, Legislation, National Security, Online Privacy, Personal Privacy, Security Breaches, Surveillance
The EFF has an excellent article on eight reasons why government regulation of cryptography is a bad idea.
The short answer: the bad guys can easily get it and use it anyway, and regulation will make security for the rest of us much worse (and that is before considering the Big Brother surveillance and constitutional issues).