
Voting for Security, Obscurity and Propriety

As I write this the polls have closed in the eastern-most time zone for the US elections. Nearly all of the ballots cast in this election have passed through some kind of electronic device, from a touchpad to a click wheel to other, non-uniform interfaces. You might visit a dozen counties in one state alone and see as many proprietary devices. Proprietary carries a negative vibe, this decade as well as this evening. A troubling report in Forbes related how experimental software patches in Ohio might be running on live production voting machines today. Such patches are likely to produce unintended results, just as beta patches often do on HP 3000s.

But the word proprietary has a root of propriety, and that means proper: processing done according to agreed-upon and accepted practice. You'd never sling out beta patches on an HP 3000 because it's just not proper. Your intention is to produce expected, reproducible and fact-checkable results. The fallout from using a proprietary interface, software or patch is simple: someone who's an insider needs to check it. And, in a more sinister aspect, an insider is the one who knows how to crack it.

Decades ago the steady value of the HP 3000 and MPE was its security, which flowed from privileged mode code. Then during the '90s it was the system's obscurity, once open-source and open-system computers took the IT lead. Few people knew the 3000 well enough to organize a serious breach. You were much more likely to be hacked from the inside, according to Eugene Volokh's classic Burn Before Reading. The same might turn out to be true this week, if the worriers from Forbes have conjured up a plausible nightmare about election machines. This evening, the biggest news outlets also fretted about the prospects.

Even during this data revolution, the 3000 remains settled in its nest of propriety as it becomes ever more proprietary. The solution to the balloting mess is to standardize on devices and open the software. Not because open software is harder to hack, but because an opened-up system is easier to scan for malware. The HP 3000 didn't need security patches after 2008 because the systems practiced propriety to earn their keep, and they were secure through their obscurity. National election voting systems don't have to meet that bar today. It costs too much, apparently.

My security expert colleague Steve Hardwick said that he's been involved in developing a secure voting system which could maintain its propriety. But the project didn't see the light of day in any of the 50 US states.

I did some work on a standard that would have provided very secure voting software. The effort was abandoned due to final cost of the units. Any system, including the physical one in place today, can be circumvented. I have not seen anything showing the comparative security to the physical one. At least the lawyers will have a field day. I wonder if this is the "hanging chad" of this election.

Nobody's hoping for such a thing, especially with so much at stake. But an HP 3000 veteran might be wondering, sometime tomorrow, how such a set of incompatible systems, installed entirely at local whim and desire with software so complex it can't be independently verified for security, could be rolled into production for this country's Big Event. It's the kind of thing that would get an IT pro of age 50 or more fired, if the results became improper.

Establishing security on HP 3000s by now is a matter of sticking to a stable environment and limiting access -- just as Eugene wrote in the 1980s. You keep track of who's got access, and when, to determine where theft or malfeasance may have occurred.

The very fact that someone is trying to run payroll across a phone line at 11 P.M. on a Saturday is an indication of unauthorized access. Thus, it is worthwhile to implement some form of security that prohibits access to certain user IDs and accounts at certain times of day, days of week, and/or from certain terminals. Alternatively, you might want to force people to answer an additional password at certain times, or especially when signing on from certain terminals.

This may seem like a poor approach indeed -- after all, if the thief hits the time of day, day of week, or terminal prohibition/password, this means that he has successfully penetrated the rest of your security system, which will never happen -- right? In reality, this is a very potent way of frustrating would-be security violators, especially if the attempted violators are promptly investigated. Thus, another maxim appears:

Some forms of access are inherently suspect (and therefore require extra passwords) or are inherently security violations. Thus, access to certain user IDs at certain times of day, on certain days of the week, and/or from certain terminals should be specially restricted.
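For anyone who wants to see the shape of that maxim in code rather than in MPE terms, here's a minimal sketch in Python. The user IDs, terminal names, and time windows are invented for illustration; this isn't MPE syntax or anything from Eugene's paper, just the rule he describes: deny or challenge a logon that arrives at the wrong time or from the wrong terminal, and log the attempt so it can be investigated promptly.

```python
# Hypothetical sketch of time-of-day / terminal logon restrictions.
# Not MPE code; the user IDs, terminals, and windows are invented examples.
from datetime import datetime

# Per-user policy: allowed weekdays, allowed hours, and trusted terminals.
POLICY = {
    "PAYROLL.FINANCE": {
        "weekdays": {0, 1, 2, 3, 4},        # Monday through Friday
        "hours": range(8, 18),              # 8 A.M. to 6 P.M.
        "terminals": {"LDEV20", "LDEV21"},  # hard-wired office terminals
    },
}

def check_logon(user: str, terminal: str, when: datetime) -> str:
    """Return 'allow', 'extra-password', or 'deny', and leave an audit line."""
    rule = POLICY.get(user)
    if rule is None:
        verdict = "extra-password"   # unknown user ID: inherently suspect
    elif when.weekday() not in rule["weekdays"] or when.hour not in rule["hours"]:
        verdict = "deny"             # payroll over a phone line at 11 P.M. Saturday
    elif terminal not in rule["terminals"]:
        verdict = "extra-password"   # right time of day, unfamiliar terminal
    else:
        verdict = "allow"
    # The audit line is what lets you investigate attempts promptly.
    print(f"{when.isoformat()} {user} {terminal} -> {verdict}")
    return verdict

# A Saturday-night attempt from a dial-up port is refused outright.
check_logon("PAYROLL.FINANCE", "DIALUP07", datetime(2012, 11, 10, 23, 5))
```

The point isn't the language; it's that the policy is explicit, checkable, and produces a trail -- the propriety Eugene was arguing for.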

Eugene Volokh could understand those basics and preach them to his community of computer pros 30 years ago in his paper. By now he's in government work of a sort, in the sense that he's a constitutional law expert and a maven of free speech. We might all hope, before our week is finished, that the philosophy that 3000 users knew in the 1980s about authorized access has traveled from Eugene's first community to his current one: the realm of the US courts.

Nobody ever hopes for a test of security, but it's expected that a system will pass -- or that changes, recalculations and improvements will result from any lack of propriety.

 
