When should you encrypt?
July 18, 2007
In yesterday's entry, we tracked the options available for HP 3000 data encryption. None looked simpler than the Orbit Software product, Backup+/iX, now engineered for 256-bit encryption of data during backups only. The backup-triggered encryption minimizes performance drain, a potential pitfall of encrypting data.
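To picture the backup-only approach in generic terms, here is a minimal sketch in Python. This is not Orbit's product, just an illustration using the third-party cryptography package's Fernet recipe with made-up file names: the live data stays readable on the host, and encryption is applied only as the backup copy is written.

    # Illustration only: encrypt data as it is written to a backup copy,
    # leaving the live file untouched on the host. Requires the third-party
    # "cryptography" package; the file names here are hypothetical.
    from cryptography.fernet import Fernet

    def backup_with_encryption(source_path, backup_path, key):
        """Read the live file and write an encrypted backup copy."""
        cipher = Fernet(key)
        with open(source_path, "rb") as src:
            plaintext = src.read()
        with open(backup_path, "wb") as dst:
            dst.write(cipher.encrypt(plaintext))

    key = Fernet.generate_key()  # the key must be kept safe, apart from the backup
    backup_with_encryption("orders.data", "orders.data.bak", key)

Because the encryption work happens only at backup time, day-to-day reads and writes pay no extra cost, which is the performance appeal described above.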
But the question of when to encrypt surfaced just a few hours after the discussion of product and freeware solutions. Tracy Johnson observed
Encryption of data on the host itself is really a waste of time. Why? Unless there is no access control at the host? Encryption during transmission between two computers is usually how it is done because that is when data is vulnerable.
Pete Eggers, whose name has been mentioned as a potential OpenMPE director, replied that the moment of encryption was not clear from the customer's question: how to encrypt a dataset in a TurboIMAGE database.
There is nowhere near enough information presented to say that host data encryption is a waste of time, nor enough information to say that any form of transmission of the data warrants encryption.
Johnson delivered an analogy to explain why host-based encryption appears redundant to him.
I just find it funny that after 60-odd years of computers there is a sudden need for encrypting data where it resides. It still raises the question of a lack of access control. If the hypothetical HR department has its data on a host, and the hypothetical Shipping department has access to HR's data, what kind of access control is that?
I recall receiving my set of Rainbow Books in the early 1980s, and a discussion of the (then theoretical) "Class A1" trusted information system holding the highest levels of classified data:
"A blackboard with something written on it can be a Class A1 trusted information system. All you need to do is put it in a locked room and have users sign in and out at the door where the armed guard is."
Taking away the armed guard and lowering the Trusted Criteria a bit, what I understand is being asked for here is to require users to decode gibberish written on the blackboard after they have already been let in!
If you see my point, it is far more practical (if not as efficient) to encrypt data as it is being transmitted to and from a host, and to decrypt it upon receipt. If a key is lost, you can always transmit again using a new key.
There is also additional risk if the data is encrypted on the host. If you've lost the key, you've lost everything.
Encrypting data at the host does have its uses. On a PC where there is no access control and the hard drive can be compromised easily, such as at home or in airline baggage, host encryption makes sense and the user counts on it. But that user also runs the same risk if he forgets the key.
I think the key here is the difference between multiuser hosts and PCs. The line became blurred when people started using PCs as multiuser servers, and basic concepts of security were lost.
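Johnson's point about lost keys is easy to demonstrate with the same kind of hypothetical Python sketch: data encrypted at rest under one Fernet key cannot be recovered with a freshly generated replacement, whereas data encrypted only in transit can simply be re-sent under a new key.

    # Illustration of the key-loss risk for data encrypted at rest:
    # once the original key is gone, no replacement key will recover the data.
    from cryptography.fernet import Fernet, InvalidToken

    original_key = Fernet.generate_key()
    stored_ciphertext = Fernet(original_key).encrypt(b"payroll records")

    # Simulate losing the original key and generating a new one.
    replacement_key = Fernet.generate_key()
    try:
        Fernet(replacement_key).decrypt(stored_ciphertext)
    except InvalidToken:
        print("Data encrypted at rest is unrecoverable without the original key.")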
Eggers replied with another point, about the changes in computing — and how the new world demands a different standard, one that assumes the worst and demonstrates "due care."
Encrypting data is a tool. Misapplying the tool falls under 'due care,' and not having proper and/or approved procedures in place to safely use it falls under 'due diligence.'
The courts seem to be becoming the school of hard knocks for IT staff and executives alike, as most regard security in a heavily networked society as annoyingly time-consuming and complex, are blissfully ignorant of the consequences, or will cross that bridge when they come to it.
The world has changed a lot in 60 years, and at an increasing rate. Adaptability is essential to survive. The days of multiuser, text-interface database servers connected by simple serial lines to dumb terminals are gone. Even when client/server systems begin to look like the old dinosaurs, the invisible underlying functionality and complexity are increasing dramatically. This hidden complexity needs to be secured.
With worldwide high-speed broadband interconnectivity growing rapidly, the bad guys are probing for holes in the ever-increasing complexity of our interconnected systems, making them harder and harder to secure.
Most government systems with highly sensitive information or functions are disconnected from networks, or connected only to internal, highly secured and controlled networks. There is no way to guarantee their safety if they are connected to public networks. But business depends more and more on interconnectivity every year. Therefore, risk analysis is essential, even if it boils down to just purchasing an insurance policy to cover a vulnerability.
But it's entirely likely that purchasing a commercial encryption solution could be less expensive, and preventative to boot. That's due diligence, if implemented with proper procedures.