Cryptography ... it's a Love v Hate Thing
Introduction
There are few pieces of technology around that manage to be loved and hated at the same time, and cryptography is one of them.
It is one of the most useful things ever ... but also one of the most dangerous ...
it keeps us secure ... it allows some people to hide their tracks when they do bad things ...
it proves our identity ... it allows others to steal our identity (by stealing our keys) ...
it keeps our money safe ... it has flaws which allow intruders to steal our money.
And it's as dangerous as a bomb but as protective as an underground nuclear bunker.
So, as part of the Queen's Speech, the UK Government has outlined plans for surveillance on computer networks, in order to tackle terrorism. The detail has still to be worked out, but one major thing to be sorted is where the UK Government will draw the line on encryption, and how this will contribute to tackling the threats to our society. Some have talked in the past of banning its usage, and while this might be grand in its scope, and makes a great sound bite, it is impossible to implement, and extremely naïve from a technical point-of-view. It is also negligent in protecting users from a range of threats.
There are so many things that need to be addressed, and the usage of the dark web should not be bound up with the usage of encryption, as encryption is a core part of the Internet that we use every day. I'm a technologist, and not a politician, so I can only outline my technical viewpoint. I do understand that there are many risks to our society, and that, increasingly, the Cloud provides a great deal of data for detecting criminal intent.
This article does not focus on the examination of network packets, but purely on the risks around restricting encryption.
Banning the teaching of cryptography?
Education is power, and most developed nations in the world have built an economic advantage by applying science and technology to the development of new ideas. Within science there are things that can do good, and things that can do harm, so we weigh up the advantages of developing knowledge, in order that we understand both sides of the equation.
Cryptography sits in a rather difficult space just now: it is so useful, and at the core of fixing most of the technical problems on the Internet, yet it is seen as being involved in the most evil of acts that the Internet is used for. In Australia, for example, there is talk of criminalising the teaching of encryption under the Defence Trade Controls Act, which restricts the export of cryptography material:
Under these laws, such "supplies of technology" come under a censorship regime involving criminal penalties of up to ten years imprisonment.
In fact, many problems in computer security have been caused by a lack of education in areas like cryptography, and if we are to build an information economy, we need more people, especially software developers, to understand how to use cryptography correctly.
The Trouble Caused by Cryptography
Most encryption uses a secret encryption key, which is used both to encrypt and to decrypt. This is known as private-key (or symmetric) encryption, and the most robust of these methods is AES (Advanced Encryption Standard). The key must be stored somewhere, and is typically placed in a digital certificate which is stored on the computer, and can be backed-up onto a USB device. The encryption key is normally derived from a password chosen by the user.
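As a minimal sketch of this password-to-key process, assuming the third-party Python cryptography package is installed (the password, salt handling and message are all illustrative values):

```python
# Hedged sketch: derive an AES key from a password and encrypt with it,
# using the (assumed installed) 'cryptography' package.
import os
import base64
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.fernet import Fernet

password = b"correct horse battery staple"  # the user's chosen password
salt = os.urandom(16)                       # random salt, stored alongside the data

# Stretch the password into a 256-bit key (PBKDF2 with SHA-256)
kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                 salt=salt, iterations=480_000)
key = base64.urlsafe_b64encode(kdf.derive(password))

f = Fernet(key)                             # AES (CBC mode) plus an HMAC integrity check
ciphertext = f.encrypt(b"a secret message")
assert f.decrypt(ciphertext) == b"a secret message"
```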
Along with this we need to prove the identity of the user, and also that the data has not been changed. For this we use a hash signature, which allows an almost unique code to be created for a block of data. The most popular methods for this are MD5 and SHA (though MD5 is no longer considered collision-resistant).
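For example, with Python's standard hashlib, the same block of data always gives the same digest, while a one-character change gives a completely different one:

```python
# Hash signatures with Python's standard hashlib.
import hashlib

data = b"The quick brown fox jumps over the lazy dog"
print(hashlib.md5(data).hexdigest())        # 128-bit MD5 digest (legacy)
print(hashlib.sha256(data).hexdigest())     # 256-bit SHA-256 digest

tampered = b"The quick brown fox jumps over the lazy cog"
print(hashlib.sha256(tampered).hexdigest()) # bears no resemblance to the original
```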
Encryption is the ultimate nightmare for defence agencies, as it makes it almost impossible to read the messages of their adversaries. The options are to find a weakness in the methods used (such as in OpenSSL), to attack the encryption keys (such as with weak passwords) or, probably the easiest, to insert a backdoor in the software that gives defence agencies a method to read the encrypted files.
There has been a long history of defence agencies blocking the development of high-grade cryptography. In the days before powerful computer hardware there was the Clipper chip, where a company would register to use it and be given a chip, and where government agencies kept a copy of the escrowed key.
In 1977, Ron Rivest, Adi Shamir, and Leonard Adleman at MIT developed the RSA public key method, where one key could be used to encrypt (the public key) and only a special key (the private key) could decrypt the cipher text. Martin Gardner, in his Mathematical Games column in Scientific American, was so impressed with the method that he published an RSA challenge, for which readers could send a stamped addressed envelope for the full details of the method. The open distribution of a method which could be used outside the US worried defence agencies, and representations were made to stop the paper going outside the US, but, unfortunately for them, many copies had gone out before anything could be done about it.
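The elegance of the method can be shown with toy numbers (purely illustrative; real RSA uses primes hundreds of digits long):

```python
# A toy RSA walk-through. Requires Python 3.8+ for pow(e, -1, phi).
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: 2753

m = 65                     # the message, encoded as a number m < n
c = pow(m, e, n)           # anyone can encrypt with the public key (e, n)
assert pow(c, d, n) == m   # only the private key (d, n) decrypts it
```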
Phil Zimmermann was one of the first to face up to defence agencies with his PGP software, which, when published in 1991, allowed users to send encrypted and authenticated emails. For this, the United States Customs Service opened a criminal investigation for a violation of the Arms Export Control Act, under which cryptographic software was seen as a munition. Eventually the charges were dropped. Phil now has a strong viewpoint on the banning of certain types of cryptography:
Intelligence agencies have never had it so good. To complain that end-to-end encryption is crippling them? It’s like having a couple of missing pixels in a large display. They have the rest of the display!
The tension within Governments bubbles to the surface
Governments around the world are struggling with cryptography, and many are considering a knee-jerk reaction, without really understanding the ramifications for the Internet, or whether some of the solutions can actually be implemented. President Obama made the tension clear with his recent statements:
He says that he believes in secure email, and, with the news around the leaked Sony emails, he pushes the viewpoint of being:
... a strong believer in strong encryption... I lean probably further on side of strong encryption than some in law enforcement.
which is a double usage of "strong" ... a "strong believer" and "strong encryption". The depth of his understanding of the rights of individuals to privacy is then put perfectly as:
"Ultimately everybody, and certainly this is true for me and my family, we all want to know that if we’re using a smartphone for transactions, sending messages, having private conversations, that we don’t have a bunch of people compromising that process.
and then goes completely against David Cameron's viewpoint with:
There’s no scenario in which we don’t want really strong encryption."
I must admit this is some of the most informed dialogue I have seen on the subject, and perhaps some of the best advice that anyone has provided on it. Never have two nations been so far apart on a policy.
On the risks to our nation from terrorists, he does highlight that there needs to be monitoring in certain circumstances, equivalent to monitoring the phones of those suspected of serious crimes. He understands, though, that law enforcement alone can't stop terrorism, but the opportunity to scan our digital footprint probably gives law enforcement a better chance to monitor activity than ever before (as clearly outlined by Phil Zimmermann).
It's just not possible ...
The debate around the usage of cryptography in evil things, such as the Dark Web and terrorists passing secret messages, focuses the mind, but there are so many technical holes in the argument for restricting encrypted content, including:
- It is a core part of the Internet, and its usage increases by the day. Every single time we connect to Google, we are using an encrypted stream where it is almost impossible for anyone to decipher the contents of the searches. So every time we see the padlock on a browser connection we know that we have a secure connection, with the traffic encrypted, typically, with an encryption key created purely for that session.
- Much of the content we use is actually stored and processed outwith the UK. Someone who suspects that they are going to be monitored will simply set up a secure connection to a remote Cloud site, and store and process their information there. It is almost impossible to crack the streams of data involved in the connection. With an almost infinite resource for processing and storage on the Cloud, users are increasingly storing their content on remote systems.
- We have a right to some privacy. It is almost impossible to pick off good traffic from bad, and many people would balk at the thought of their letters being examined, or their phones tapped, so the rights we enjoy with traditional communications technologies seem to be the ones that should apply to Internet-based communications.
- We have a right to protect ourselves. As data breaches occur on a regular basis, users need to protect themselves, including using encryption on their stored data and on their communications, so it is difficult to say on the one hand that we should protect ourselves from external hackers, and on the other that we should allow our data to be picked apart by security agencies. It should be remembered that the tools that security agencies have are often the same ones that the hackers have, so reducing encryption levels will expose our own data to malicious parties.
- There are no secrets anymore. In cryptography the methods are well-known, and there is a wide range of software code libraries which are fairly easy to integrate into applications.
- It's unfair to pick off certain applications. PGP and Tor are two of the application areas pinpointed, but there are so many other applications which could be used, so restricting a limited selection of applications seems wrong.
- It's just impossible to ban. There is no way to define a law which constrains the usage of encryption. Would it cover just certain applications (such as email), or certain methods (such as PGP)? Overall it is not possible to draw a line defining what would be allowed and what would not. Would using a Caesar cipher be seen as illegal?
- It would make the UK an unsafe country to do business in. Few free countries would consider switching off encryption, as it creates an environment of insecurity for both consumers and businesses.
- Would it be limited to encryption? Where would the ban end? Do we include the encoding of characters into other formats which are difficult to scan, such as Base-64 or non-English character sets?
- Deep packet inspection at the core of the Internet is not really possible. At the core of this argument is the examination of data packets within the Internet. The deep inspection of data packets might be possible on home networks, but at the core of a network it is almost impossible to examine each of the packets for its contents. With streams now running at over 100Gbps, there are few systems which have the processing capacity to actually read the network packets for threats.
- It's not possible to detect it. The thing about encrypted content is that it looks a lot like random ones and zeros, so it would be almost impossible to detect. Even if a random-looking stream of ones and zeros were detected, it is possible for this random data to be converted into another format which looks like a valid file (see the entropy sketch after this list).
- How would it be policed?
- Who would be allowed to use cryptography?
- and so many more questions.
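On the detection point above, a minimal sketch of why ciphertext is so hard to spot: well-encrypted output is statistically close to random noise, so a simple entropy measure cannot reliably separate it from other high-entropy data (here os.urandom stands in for real ciphertext, and the sample text is illustrative):

```python
# Estimate Shannon entropy per byte: plain text sits well below the
# 8 bits/byte ceiling, while ciphertext-like data sits right at it.
import math
import os
from collections import Counter

def entropy_per_byte(data: bytes) -> float:
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

plain = b"the quick brown fox jumps over the lazy dog " * 200
random_like = os.urandom(len(plain))    # stand-in for encrypted content

print(f"plain text : {entropy_per_byte(plain):.2f} bits/byte")       # well below 8
print(f"ciphertext : {entropy_per_byte(random_like):.2f} bits/byte") # close to 8
```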
The legal system often takes a while to catch up with technology, and generally new laws are created from a foundation built in the past. With the Internet and computer systems, law enforcement has led a privileged life, where they could easily investigate the disks of suspects, as there was little in the way of security. The increasing focus on privacy, especially on encrypting by default, is placing a major challenge on investigators. The increasing use of multi-factor authentication also provides major challenges, and thus there is a major on-going battle between the right to privacy and the right to investigate.
This article outlines some of the tensions being felt on both sides of the argument. For many reasons, many of the rights we have built up in traditional investigations, such as the right to remain silent, are now being challenged in this Cyber Age.
Keeping a secret
The ability of defence agencies to read secret communications and messages gives them a massive advantage over their adversaries, and is at the core of many defence strategies. Most of the protocols used on the Internet are clear-text ones, such as HTTP, Telnet, FTP, and so on, but increasingly we are encrypting our communications (such as with HTTPS, SSH and FTPS), where an extra layer of security (SSL) is added to make it difficult for intruders to read and change our communications. While not perfect, and open to a man-in-the-middle attack, it is a vast improvement on communicating where anyone who can sniff the network packets can read (and change) the communications. The natural step forward, though, is to encrypt the actual data before it is transmitted, and when it is stored. In this way not even a man-in-the-middle can read the communications, and the encryption key only resides with those who have the rights to access it.
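As a small illustration of that extra layer, a hedged sketch using only Python's standard library to open the kind of TLS-protected connection that sits behind the browser padlock (example.com is a placeholder host):

```python
# Open a TLS-protected connection; everything sent after the handshake
# travels encrypted with keys created purely for this session.
import socket
import ssl

ctx = ssl.create_default_context()        # verifies the server's certificate
with socket.create_connection(("example.com", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.version())              # e.g. 'TLSv1.3'
        tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
        print(tls.recv(1024)[:80])        # readable here, ciphertext on the wire
```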
While many defence mechanisms in security have been fairly easy to overcome, cryptography, the process of encrypting and decrypting using electronic keys, has been seen as one of the most difficult to defeat. It has thus been a key target for many defence organisations, with a whole range of conspiracy theories around the presence of backdoors in cryptography software, through which defence agencies have spied on their adversaries. Along with the worry of backdoors within the software, there have been several recent cases of severe bugs in secure software, which can compromise anything that was previously kept secure.
It's open source
Much of the software used within encryption is open source, including OpenSSL for secure communications, and TrueCrypt as a disk cryptography package. TrueCrypt has been around since February 2004 and is maintained by the TrueCrypt Foundation. It has versions for Microsoft Windows, OS X, Linux, and Android, and supports 30 languages. David Tesařík registered the TrueCrypt trademark in the US and Czech Republic, and Ondrej Tesarik registered the not-for-profit TrueCrypt company in the US. It works by creating a virtual drive on a computer, where anything written to the drive is encrypted, and then decrypted when the files are read back.
OpenSSL, the source of the problem around Heartbleed, was started by Eric A Young and Tim Hudson, who created the first version of the library (SSLeay - SSL Eric A Young), which in December 1998 became OpenSSL Version 0.9.1. Eric finished his research and left to do other things, and was involved with Cryptsoft (www.cryptsoft.com) before joining RSA Security, where he is currently a Distinguished Engineer. After Eric left, it fell to Steve Marquess (from the US) and Stephen Henson (from the UK) to continue its development through the OpenSSL Software Foundation (OSF). The code continued to grow, with TLS support added in 1.0.1a on 14 March 2012. Unfortunately, a change committed on 1 Jan 2012, which implemented the Heartbeat protocol (RFC 6520), ended up resulting in Heartbleed. The bug was actually introduced by the same person who wrote the RFC: Robin Seggelmann, a German developer.
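The essence of the flaw is simple to model: the heartbeat reply trusted the length claimed by the sender rather than the actual payload size. The following is an illustrative Python model of that logic, not OpenSSL's actual C code:

```python
# Process memory is mocked as one buffer in which the 7-byte heartbeat
# payload sits next to secret material.
memory = bytearray(b"PAYLOAD" + b"...private key material...")

def heartbeat_reply(claimed_len: int) -> bytes:
    # Vulnerable: trusts the sender's claimed length, so a large value
    # echoes back bytes well beyond the payload.
    return bytes(memory[:claimed_len])

def heartbeat_reply_fixed(claimed_len: int) -> bytes:
    payload = memory[:7]                # the bytes the sender actually sent
    if claimed_len > len(payload):      # the post-Heartbleed bounds check
        raise ValueError("claimed length exceeds actual payload")
    return bytes(payload[:claimed_len])

print(heartbeat_reply(30))              # leaks the secrets after the payload
```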
You must reveal your finger
For years security professionals have been pointing out that usernames and passwords are too insecure on their own, and that new methods are needed to properly authenticate users, especially for high-risk access. This leads to multi-factor authentication, where users use two or more methods to authenticate, typically: something you know (your password), something you have (an access card), and something you are (your fingerprint) - and we can also add somewhere you are (such as your GPS-tracked location). Many users now have fingerprint scanners on their mobile devices, and can even make payments with PayPal using a scan of a fingerprint.
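The "something you have" factor is typically a one-time code generator. A minimal sketch of the standard TOTP method (RFC 6238) used by most authenticator apps, with an illustrative shared secret:

```python
# Time-based one-time passwords (RFC 6238): a code derived from a shared
# secret and the current 30-second time window.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // period              # current time window
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                           # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))   # illustrative secret; matches an authenticator app
```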
So the tensions between the two camps were highlighted this week by a Circuit Court judge in Virginia ruling that fingerprints are not protected by the Fifth Amendment. This means that users who use fingerprints on their mobile devices may have to reveal them when they are being investigated. The judge (Steven C. Fucci) outlined that a user would not have to hand over a passcode, but they would have to give up their fingerprint.
The ruling itself relates to the Fifth Amendment which outlines:
"no person shall be compelled in any criminal case to be a witness against himself,"
and which protects things that are memorised (passwords and passcodes), but not fingerprints, as these are similar to giving a DNA sample or a signature.
The case relates to a person (David Baust) accused of strangling his girlfriend, where it was suspected that he stored a video of the attack on his phone. With the ruling, a mobile device protected by a passcode could not be forcibly unlocked, but a fingerprint-protected device could. It should be said that the ruling may be overturned on appeal or by a higher court.
Encryption-by-default
Google’s Lollipop will be released next week, and security will be at the core of its changes. An important element of this is encryption-by-default, where users will have to opt out of the encryption of their files. Apple, too, with iOS 8, has taken the same route, and users must ask: “Why didn’t it happen before this?”
Our file attributes and content types have developed with little thought about keeping things truly private, and where systems are often still viewed as stand-alone machines. We also created an Internet which is full of the same protocols that we used in the days of text terminals and mainframe computers, where users typed in commands to access data, and where there was little thought about protecting the data as it is stored, analysed and transmitted. As we become increasingly mobile, we are now carrying around sensitive data that was at one time protected behind physical firewalls, and the risks to our data increase by the day.
The major tension, though, is between law enforcement and the right to privacy. The FBI currently sees the status quo as a way of investigating criminals and terrorists, but can see this opportunity reducing with encryption-by-default, such as with the file encryption system used in Apple's iOS 8. With iOS 8 and Google Lollipop there will be no electronic method for existing digital forensics toolkits to recover the encryption keys, and thus users who cannot, or will not, reveal their keys when requested by law enforcement investigators may be breaching current laws in both the US and the UK. The same battle exists with Tor, where law enforcement fears that crime can go unnoticed, whereas privacy advocates promote the right to the privacy that Tor provides.
No right to remain silent with Cryptography
In the UK, citizens have the right to silence (a Fifth Amendment right in the US, related to the right against self-incrimination), but there is an exception to this related to encryption keys: the failure to reveal encryption keys, covered by Section 49 of RIPA, can often be seen as a sign that someone has something to hide. The move by Apple and Google may thus breach the law, as users must be able to hand over their encryption keys when required. This was highlighted in 2014 when Christopher Wilson, from Tyne and Wear, was jailed when he refused to hand over passwords related to investigations into an attack on the Northumbria Police and the Serious Organised Crime Agency’s websites. He handed over 50 passwords, but none of these worked, so a judge ordered him to provide the correct one, and, after failing to do this, he received a jail sentence of six months.
In 2012, Syed Hussain and three other men were jailed for discussing an attack on a TA headquarters using a home-made bomb mounted on a remotely controlled toy car. Syed, who admitted having terrorist sympathies, was jailed for an additional four months for failing to hand over a password for a USB stick.
The Perfect Storm
The main problem we have with computer system security is that, as computer systems have evolved, we created file systems which protect only with file attributes. This works well from a corporate point of view, where we can keep compatibility with previous systems, and also allow system administrators to keep full control of them. The mobile device operating system creators (mainly Google and Apple), though, have different issues from the traditional desktop operating system creators, as their devices are on-the-move, and often stolen or left behind.
As we increasingly integrate the mobile phone into our lives, especially in creating a digital shadow on the Cloud, the devices need to be better protected than our traditional desktops. Along with this, Apple and Google have complete control over their operating systems, and can implement radical changes in a way that Microsoft would have struggled with (while still keeping compatibility with an operating system released over a decade ago: Windows XP). So Apple and Google are not constrained by the past, and find their hardware platforms whizzing along with increased processing speeds and memory capacities, in a way, again, that Microsoft would struggle to match, as they have so much legacy hardware that would struggle with modern cryptography.
So Apple and Google now find themselves with a market that will quickly change its mobile devices and keep up-to-date, and thus they do not have a long tail of devices to support. If users want to stick with a certain operating system, they can, but there's a good chance that their applications won't work. With phone manufacturers pushing new phones all the time, both Apple and Google are keen to plug the gaps in traditional operating systems, especially related to security, and they have the perfect storm with SSDs (rather than horribly slow HDDs) and fast multi-core processors, which together make encryption possible on a device that fits in your hand. Gone are the days when you needed a special maths chip to do complex cryptography.
Conclusions
With data breaches rising by the day, such as the 150 million passwords cracked from the Adobe infrastructure and over 120 million credit card details skimmed from Home Depot and Target, Apple and Google feel they have to build up trust with their users in their operating systems. For this they are looking at encryption-by-default, where they encrypt file data (which is now stored on flash memory), and which may now breach the laws around revealing encryption keys.
The technical case for switching off encryption in certain applications is a non-starter, and is really just a sound-bite. Overall we are getting rid of the "Old Internet", removing the old protocols of Telnet, HTTP, and so on, and moving towards the improved methods of SSH, HTTPS, and so on. These new protocols protect users on-line, and are the natural step forward to protect everyone on the Internet. To say that some applications cannot use these improved protocols would be a massive step backwards, and, knowing the Internet, would cause a backlash from those who can easily create new applications which are even more secure.
As a country, the UK has led the development of the Internet, and has one of the largest on-line communities; a ban, if implemented, would move the Internet back to a time when there was little thought of security.
John Shinal, in USA Today, crystallises the debate around the rights of society against the rights of the individual to privacy with:
Banning encryption is digital equivalent of banning books. Cameron wrapped his proposal in a speech that stated that the most important thing a government can do for its people is to keep them safe. I would argue to Cameron that the most important thing a democracy can do for its people is to keep them free.
Watch the evil ...
So, if you want to understand one of the most beautiful things ever created in technology, and watch pure evil in action, here's a presentation on public key encryption:
I'm a technologist and I don't do politics!