An Introduction to Encryption's Powers

By Steve Hardwick
Certified Information Systems Security Professional

Editor's note: The 3000 community now counts on another provider of commercial encryption software in FluentEdge Technologies, which sells a programmer's toolkit to encrypt sensitive fields in IMAGE/SQL databases. To help explain the technical specifics of encryption and its value to IT, security professional Steve Hardwick of Mobile Armor offers this primer. While the article gets detailed and technical, it can help a manager walk into an audit better prepared, educated in the fundamentals of crypto security.

Over the past 30 years the science of cryptography has been significantly influenced by computers and networking technologies. Not only has this technology driven the demand for solutions, but it has also fueled the ability to crack cryptographic algorithms. This has spawned various forms of encryption and encryption algorithms.

Encryption systems provide a fundamental function: to prevent unauthorized access to secure data. Put simply, you have to keep the hackers out. Standards are crucial to successful encryption. By defining an open standard, security communities can analyze it and look for weaknesses before one of the bad guys exploits them. Another benefit is that less-sophisticated users inherit the results of work by more expert communities. Further, since the standard is made public, knowing the mechanism of how the encryption is accomplished does not help you break it. It is the key that makes it secure.

Evolution to AES
In the 1970s through the early 1990s several different encryption algorithms were in play. Through the Federal Information Processing Standards (FIPS), the US government defined a common algorithm for encryption in 1976 called the Data Encryption Standard (DES). A second landmark FIPS publication, issued in 1994, detailed how DES should be used in a full encryption solution.

This standard remained in effect until the late '90s. As faster and more widely available computer platforms arrived, DES grew vulnerable. The first public crack of a DES key came in 1997, using a network of over 14,000 computers working in parallel. By 1999, DES keys were being publicly cracked in about 22 hours.

A short-term solution was to apply the DES algorithm three times over, called triple DES. This had its shortcomings. In 1997 the search began for a successor to DES, later named the Advanced Encryption Standard (AES). (Ed. note: You hear AES all the time now when vendors describe their encryption engines.)

Not only does this standard outline how encryption should work, but there are independent labs to validate the implementation. Upon successful completion of a validation, a certificate is issued to the encryption product manufacturer to show it has been tested and validated. This gives a public seal of approval for the encryption implementation.

Kinds of encryption
DES, triple DES (3DES) and AES are all examples of symmetric encryption. This means that the same key is used to encrypt and decrypt the data. The big advantage of this type of encryption is that it is very fast. Large volumes of data can be easily and quickly converted into illegible information.
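To make the "same key both ways" property concrete, here is a minimal sketch in Python: a toy stream cipher that derives a keystream from the key with SHA-256 and XORs it with the data. This is an illustration only, not AES, and nothing here should be used to protect real data:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a keystream by hashing the key with a counter (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt: XOR with the keystream is its own inverse."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"shared secret"
ciphertext = xor_cipher(key, b"CONFIDENTIAL PAYROLL DATA")
plaintext = xor_cipher(key, ciphertext)   # the same key reverses the process
```

Note that the second call to `xor_cipher` is identical to the first; that symmetry is exactly what "symmetric encryption" means, and it is also why both parties must somehow obtain the same key.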

Other algorithms are available for symmetric encryption. Twofish, created by cryptography icon Bruce Schneier, was one of the contestants in the AES selection process, but did not make the cut. Its implementation, and that of its predecessor, Blowfish, are still in use today. In other cases, algorithms such as Elliptic Curve Cryptography have found specific niches, smart cards for example, thanks to their design.

One immediate challenge that all symmetric key schemes face is how to distribute the symmetric key. In the early days of symmetric key cryptography, keys were exchanged using physical, or “out of band,” methods. For example, a password could be given to the end user over the phone and the encrypted message sent via a network connection. As the Internet grew, the difficulty of exchanging symmetric keys significantly stifled the growth of encryption.
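The classic answer to this deadlock is Diffie-Hellman key exchange, which lets two parties agree on a shared secret over an open channel without ever transmitting the secret itself. A toy sketch with deliberately tiny numbers (real deployments use primes of 2048 bits or more):

```python
# Toy Diffie-Hellman key exchange with a tiny prime for illustration.
p, g = 23, 5            # public modulus and generator, known to everyone
a, b = 6, 15            # Alice's and Bob's private values, never transmitted

A = pow(g, a, p)        # Alice sends A = g^a mod p over the open channel
B = pow(g, b, p)        # Bob sends B = g^b mod p over the open channel

alice_key = pow(B, a, p)   # Alice computes the shared secret
bob_key = pow(A, b, p)     # Bob arrives at the same value
```

Both sides compute g^(ab) mod p, so the keys match, yet an eavesdropper who sees only p, g, A and B cannot feasibly recover the secret when the numbers are large.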

In 1976 Whitfield Diffie and Martin Hellman published a scheme that allowed two different keys to be used, one for encryption and one for decryption. The initial goal was to create a way to securely pass information between two anonymous parties over an insecure communications channel. The two keys were labeled the public and private keys. A user would generate both keys and then distribute the public key. Anyone with the public key could encrypt data that could only be successfully decrypted using the private key.
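The textbook RSA numbers below sketch this public/private split. The arithmetic is genuine, but real keys are thousands of bits long and use padding schemes, so treat this strictly as an illustration:

```python
# Textbook RSA with tiny primes (real keys are 2048 bits or more).
p, q = 61, 53
n = p * q                 # public modulus: 3233
e = 17                    # public exponent, distributed freely
d = 2753                  # private exponent: e*d = 1 mod (p-1)*(q-1)

message = 65
ciphertext = pow(message, e, n)     # anyone with (e, n) can encrypt
recovered = pow(ciphertext, d, n)   # only the private key holder can decrypt
```

Encrypting is just modular exponentiation with the public pair (e, n); reversing it without d requires factoring n, which is what makes large moduli secure.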

There was one other problem asymmetric key cryptography solved: authentication. Instead of using the public key to encrypt and the private key to decrypt (to preserve the confidentiality of information), the private key can be used to encrypt the data. But of what use is this, since anyone with the public key can read it? The advantage is that if a message can be decrypted with a public key, there is a reasonable chance it was sent by the owner of the matching private key.

By encrypting data with the sender's private key (for authentication) and then encrypting again with the recipient's public key (for confidentiality), it became possible for the recipient to identify the sender of the message. Once that bond of trust has been established, a symmetric key, or session key, can be exchanged to allow data to be securely transferred. This is the basic operation of the Secure Sockets Layer (SSL) exchange that is the backbone of secure communication on the Internet.

Hashing and public keys

One more product of encryption technology is hashing. In this form of algorithm, the encryption is one-way; there is no attempt to reverse the process. The goal is to produce a unique value for a block of data. The result is a technology that can generate a small value, or digest, typically 160 to 256 bits, that uniquely represents the data used to generate it. The recipient can run the same hash method on the received data and compare the result to the hash value that was sent. If they match, there is a very good chance the data's integrity has been maintained.
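Python's standard library makes the digest idea easy to see. SHA-256 below produces a 256-bit digest (64 hex characters) no matter how large the input, and even a small change to the data yields a completely different value:

```python
import hashlib

# Hash a message; the digest is always 256 bits (64 hex characters).
digest = hashlib.sha256(b"The eagle flies at midnight").hexdigest()

# Change even one word and the digest bears no resemblance to the original.
tampered = hashlib.sha256(b"The eagle flies at noon").hexdigest()
```

A recipient who recomputes the digest and gets a different value knows the data was altered in transit; matching digests are strong evidence of integrity.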

An alternative approach is to encrypt the hash value with the sender's private key. The encrypted product is called a digital signature. The recipient decrypts the digital signature with the sender's public key to recover the hash value of the message, then compares a locally generated hash value with the one from the signature. Digital signatures rely on public/private keys, so there is no need to keep a record of unique symmetric keys.
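A toy sketch of that sign-and-verify flow, using textbook RSA numbers and SHA-256. Real signature schemes add padding (such as PKCS#1 or PSS) and use keys thousands of bits long; this shows the mechanics only:

```python
import hashlib

# Textbook RSA parameters (from p=61, q=53); real keys are 2048+ bits.
n, e, d = 3233, 17, 2753

def toy_sign(message: bytes) -> int:
    """'Encrypt' the message hash with the private exponent d."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def toy_verify(message: bytes, signature: int) -> bool:
    """Recover the hash with the public exponent e and compare locally."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

sig = toy_sign(b"wire $500 to account 42")
valid = toy_verify(b"wire $500 to account 42", sig)             # True
forged = toy_verify(b"wire $500 to account 42", (sig + 1) % n)  # False
```

Anyone holding the public key can check the signature, but only the private key holder could have produced it, which is exactly the authentication property described above.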

The final piece of the puzzle was how to determine whether the public key you receive really came from a valid sender. An initial attempt at solving this problem was Pretty Good Privacy (PGP). In this method, a public key was first obtained from a known second party. A third party's public key could then be obtained from that second party, as long as the second party had vouched for its authenticity. This became known as a web of trust and is still in use today.

But soon, trusted suppliers became established that would validate the authenticity of a public key and create a digital certificate, signed with their private key, for dissemination. These trusted third parties are known as certificate authorities (CAs). VeriSign is a good example of a public CA. A standard, X.509, is in place to allow the easy exchange of digital certificates carrying public keys. This arrangement is more commonly called Public Key Infrastructure (PKI). The public keys of trusted CAs are widely distributed in browsers to facilitate the authentication process.
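You can see this distribution of trusted CA certificates from Python's standard library, which loads the operating system's bundled trust store. The counts vary from system to system, so the values are illustrative:

```python
import ssl

# Load the platform's default set of trusted CA certificates.
ctx = ssl.create_default_context()

# Counts of certificates known to this system, e.g. keys
# 'x509', 'crl' and 'x509_ca' (trusted root CAs).
stats = ctx.cert_store_stats()
```

Every TLS connection a browser or script makes is ultimately checked against one of these pre-distributed CA keys, which is what closes the loop on "is this public key really who it claims to be?"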

AES defines a system where symmetric keys are generated to secure bulk data. Asymmetric key exchange can be used to pass the symmetric key between two parties. PKI can be used to ensure that a public key is authentic. Hashing (such as SHA-256) is used in conjunction with private/public keys to generate digital signatures. Secure Sockets Layer can be used to verify senders, exchange symmetric keys and transmit encrypted bulk data. These building blocks are used extensively throughout the information exchange industry to provide confidential, integrity-protected and authenticated communication.

Steve Hardwick is Partner Manager at Mobile Armor, a leader in federal and commercial data protection solutions. He has also held security posts at Dell, VTEL, Infraworks and Message One. A CISSP holder, he coordinates Mobile Armor's Common Criteria evaluation programs and leads CISSP training courses.