Disturbing Trends Across the Pond


Two convicted for refusal to decrypt data

Since October 2007, when refusing to disclose decryption keys was made a criminal offense in the UK, the buzz around the smallish digital forensics research community has been one of alarm. Security researchers, by definition always on the lookout for failings in a system, immediately proposed a scenario in which encrypted data is present on a system for which the user does not have the decryption key, thus creating a crime through ignorance – not of the law, but of the key. As reported by the Register in the above link, two individuals have now been convicted under this ridiculous law.

Failure to comply with a section 49 notice [request for decryption keys] carries a sentence of up to two years jail plus fines. Failure to comply during a national security investigation carries up to five years jail.

The States, some recent events to the contrary, are far more fanatical in our defense of privacy and take a different position on this matter. The relevant case is United States v. Boucher, in which it was initially ruled that forcing the disclosure of a passphrase which unlocks a decryption key violates the Fifth Amendment protection against self-incrimination; on later appeal, Boucher was required to turn over the encrypted files only because he had previously disclosed them to border agents.

The question really boils down to how much we value privacy over punishing the guilty, and how much we emphasize never punishing the innocent versus allowing the guilty to escape punishment. There are two main situations I see as problematic for the UK approach.

First, how do we determine whether something is encrypted data at all? I will put aside encryption products which are obvious about their contents being encrypted, like password-protected Office documents or PDF files, to concentrate on indeterminable encryption, or plausibly deniable encryption. The first product the average user may come across here is TrueCrypt, which has made great contributions to personal privacy in the last few years. The goal of TrueCrypt encryption, as with any good encryption scheme, is for the encrypted information to be indistinguishable from empty areas of a drive filled with garbage – random garbage. Hard drives generally always have unused space – so how do we determine that a section of seemingly random garbage is indeed encrypted data rather than garbage, with sufficient certainty to justify willingly and intentionally prosecuting, convicting, and imprisoning a person for not disclosing a decryption key which may not exist?

At this point I will adopt an axiom for my argument, to forestall any counterpoints regarding prosecutorial discretion in such cases: any law which can be used to punish an individual unfairly, but which is billed as not intended for such use, or as held in check by discretion where the situation is unclear, eventually will be used to punish an individual for a variety of other reasons – intimidation, retaliation, political reasons, etc. The answer to how we determine it is that we cannot; outside of encryption schemes which are implemented poorly and leak information, encrypted data should be effectively indistinguishable from random noise.
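To make the indistinguishability point concrete, here is a minimal sketch in Python (standard library only) comparing the byte-level Shannon entropy of random drive fill against a stand-in ciphertext. The "cipher" is just SHA-256 run in counter mode as a keystream – a hypothetical proxy, not TrueCrypt's actual scheme – and the function names are mine, but it illustrates why an entropy test alone cannot tell the two apart:

```python
import hashlib
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (8.0 is the maximum, for uniform data)."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

# "Unused disk space" that was wiped with random garbage.
random_fill = os.urandom(64 * 1024)

# Stand-in ciphertext: SHA-256 in counter mode as a keystream XORed over the
# plaintext. Not a real cipher -- just a statistically similar proxy.
key = b"hypothetical-key"
plaintext = (b"attack at dawn " * 5000)[: 64 * 1024]
keystream = b"".join(
    hashlib.sha256(key + i.to_bytes(8, "big")).digest()
    for i in range(len(plaintext) // 32 + 1)
)
ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))

print(f"random fill : {shannon_entropy(random_fill):.4f} bits/byte")  # ~8.0
print(f"ciphertext  : {shannon_entropy(ciphertext):.4f} bits/byte")   # ~8.0
print(f"plaintext   : {shannon_entropy(plaintext):.4f} bits/byte")    # much lower
```

Both the random fill and the ciphertext measure at essentially 8 bits per byte; no statistical test of the bytes themselves will separate them.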

The TrueCrypt example may seem cut and dried – if we cannot determine that data is encrypted, we cannot prosecute, etc. – but maybe not. Steganography is a more opaque example. Digital steganography involves hiding data inside legitimate files; think of it as a digital equivalent of writing with lemon juice on an existing letter. One such technique for bitmaps is called “Least Significant Bit” (LSB) embedding. In bitmaps each pixel is stored as a number, in modern bitmaps as either a 24-bit (RGB) or 32-bit (RGBA, with an alpha channel) value. In LSB steganography the least significant bits of the pixel channels are slightly altered, indiscernibly to the human eye but sufficiently to encode information. The hidden information is stored encrypted and thus looks random. Detection tools such as stegdetect (a companion to the OutGuess embedding tool) attempt to find steganographic information using statistical analysis, but these tools are not 100% accurate, produce false positives, and are dependent on their training sets for accuracy.
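As an illustration of the mechanics, here is a short Python sketch of LSB embedding over raw channel bytes. It is a bare-bones toy under stated assumptions – no image parsing, no encryption of the payload, and the helper names are mine, not any real tool's API – but it shows how one bit per channel can carry data while changing each channel value by at most one:

```python
import os

def embed_lsb(pixels: bytearray, payload: bytes) -> bytearray:
    """Hide payload bits in the least significant bit of each channel byte."""
    bits = [(byte >> (7 - i)) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("payload too large for this image")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # clear the low bit, then set it
    return out

def extract_lsb(pixels: bytearray, n_bytes: int) -> bytes:
    """Recover n_bytes by reassembling the low bit of each channel byte."""
    result = bytearray()
    for i in range(n_bytes):
        byte = 0
        for j in range(8):
            byte = (byte << 1) | (pixels[i * 8 + j] & 1)
        result.append(byte)
    return bytes(result)

cover = bytearray(os.urandom(1024))  # stand-in for raw RGB channel bytes
secret = b"hidden"
stego = embed_lsb(cover, secret)
assert extract_lsb(stego, len(secret)) == secret
# Each channel value changed by at most 1 -- invisible to the eye.
assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))
```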

Let’s assume an individual is suspected by the police of some crime and the individual’s laptop is seized for analysis. Stegdetect or a similar program is run and produces a list of files it suspects contain hidden, encrypted data. The police, not being researchers, are far more credulous of the tool’s output than they should be; they assume the files contain encrypted payloads and demand a decryption key. The individual cannot turn over a key which does not exist, and may then be prosecuted, convicted, and sentenced to a prison term despite their innocence.
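To see how easily that scenario arises, consider a naive detector modeled loosely on the statistical approach: flag any file whose low bits look "too random." This is a hypothetical toy, far cruder than stegdetect's real tests, but it demonstrates the false-positive failure mode – any high-entropy file (compressed, encrypted, or just sensor noise) trips it:

```python
import os

def naive_lsb_detector(data: bytes, threshold: float = 0.02) -> bool:
    """Flag data as 'suspicious' when its low bits are nearly uniform --
    a crude stand-in for the statistical tests real detectors use."""
    ones = sum(b & 1 for b in data)
    bias = abs(ones / len(data) - 0.5)
    return bias < threshold

# Compressed archives, ciphertext, and raw camera sensor noise all have
# balanced low bits, so this perfectly innocent random blob gets flagged.
innocent = os.urandom(4096)
print(naive_lsb_detector(innocent))  # almost certainly True: a false positive
```

Real detectors are more sophisticated, but the underlying problem is the same: the statistical signature of hidden encrypted data overlaps heavily with that of ordinary high-entropy files.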

Second, and a more practical concern, is the effect this has on corporate security. What happens when an IT director or high-level system administrator is traveling with encrypted backup tapes, or a courier is sneakernetting large quantities of information encrypted for transport? The individual very likely will not have the decryption keys on them, and if the keys reside outside the jurisdiction of the UK, say in the US, how will the UK handle the individual if the company refuses to turn over the keys and the individual can’t? It makes the UK a whole lot less appealing as a business destination, and would certainly give me pause before attending any conferences there.

This type of invasive Big Brother law is extremely concerning, not only because it can be so easily abused, but also because of its potential to convict the innocent. I always liked France better anyway, but then I am from New Orleans, so I might be biased.
