On September 28, 2007, Judge Winifred
Smith of the Superior Court of Alameda County, California, took the
extraordinary measure of invalidating an election result – an event
that has only happened once before in California’s history. Measure
R, originally voted upon in November 2004, was ordered back onto next
year’s ballot not because of electoral fraud or force majeure, but
because 96% of the results from the election had vanished. There was
not any suggestion of dastardly doings; no ballots mysteriously
vanished; no warehouses caught fire under unusual circumstances. These
ballots had vanished because in a very real way they never existed in
the first place. The election deciding Measure R’s fate took place
entirely on computerized voting machines.
In the middle of litigation over the fate of the election, the machines
were returned to the manufacturers, without the data having been backed
up. It is unknown why the county returned the machines, or what
happened to the data once the machines were sent back. Computers
lose data all the time; crashes are a fact of life in the modern world.
It is rare, however, that large, professionally designed computer
systems crash quite so catastrophically. Most people are
familiar with Windows’ infamous Blue Screen of Death, but
companies spending millions (or more) on tabulating data have little
tolerance for errors that destroy the vast majority of the information
they are designed to store. Alameda County, according to its own
statements, rolled out the machines before it knew how to effectively
handle the data they generated.
This is one recent example of the short but extremely tumultuous
history that computerized touch-screen voting has had in the US.
It’s easy to see their attraction – small, sexy information age
devices, they seem a panacea for a voting system that often borders on
antique. But touch-screen voting is a concept whose time has not yet
come; rather, now is the time the nation needs to wake up, take a deep
breath, and stop throwing good money after bad on systems that will
never accomplish the job for which they are designed. This article will
give a geek’s-eye view of the current state of computerized voting,
discussing its history, why it is so problematic, and finally suggesting
the best direction to take in the future.
2. The State of Computerized Voting
Since the ‘butterfly
ballot’ debacle of the 2000 election cycle, the nation has re-examined
how it votes. Currently, a hodge-podge of methods and devices, some
invented more than a century ago, are being used. The Help America
Vote Act of 2002 (HAVA) grants, among other things, money to precincts
to update their voting apparatus if in the November 2000 election that
apparatus used punch cards or levers.  Computerized touch-screen
systems called DREs – Direct Recording Electronic voting machines 
– were a popular choice of many large precincts. Their main
competition in the battle to replace older systems is optical scan
voting technology, similar in many ways to the Scantron sheets used for
standardized tests.
a. Older systems
A central problem of voting in the
United States has been balancing accurate, secure counting of votes
with rapid tabulation. A population of three hundred million and
counting, even if only a fraction can and do vote, requires quite a bit
of counting. For the last hundred years, companies have been designing
systems that are easier and easier to count by machine. The punch-card
ballots of Florida, for instance, evolved from a system for operating
mechanical looms. Lever machines keep an ongoing tally for the
machine, but do not produce actual ballots.  Optical scan machines
have been in use since the 60s, and are an elegant combination of old
and new technology. They are a paper ballot for all intents and
purposes, capable of being reviewed by election personnel during a
recount. However, they are run through a tabulator, which saves almost
all of the time, effort, and expense of an initial hand count.
Yet each of these systems is vulnerable in many ways to the same problem
that occurred in Florida: poor ballot design. Like any ballot, their
layout is designed by election officials who may or may not have
experience or expertise in doing so. A voter using an optical scan
machine does not see how the computer reads their vote, so confusion is
still possible. Consider the following example [figure omitted]: two
copies of the same graphic, the one on the left simply cropped to cut
out one of the circles. The line on the right seems clear, but a voter
in a rush or simply not paying much attention may not see a bubble or
may automatically assume the bubbles are on a specific side. Ballot
design will never be perfect, and people will continue making mistakes.
b. A new choice
DRE voting machines came to the
forefront after Florida because they have the potential to sidestep
many of the issues of other voting systems. Among the most powerful
of these features is the idea of a confirmation screen. No matter how
confusing the ballot might be, the computer presents the voter with a
screen detailing the choices the system has recorded. If the voter made
a mistake, or the computer made an error recording it, the voter can
return to the previous screen to correct the error.
At the same time, computerized systems have all of the other
advantages of being a computer, as well. Access for the disabled is a
huge issue in favor of DREs. These machines are generally built on
Windows, which comes with built-in features such as screen readers for
the blind. In addition to being forgiving of mistakes, DREs do not
require the manual strength to pull a lever or the dexterity to fill
out a bubble. To millions of Americans, these machines make voting
easier; to some, they make it possible to vote unassisted.
DREs do all of this, and make the back-end advantage even more
noticeable. There is no need to even feed forms through an optical
reader; the computer system tabulates the entire precinct at the end of
the day. It can then spit out a disc of the results, which can be
manually carried to a central location, or can upload the results
automatically to another computer compiling the entire district. In
a best case scenario, official results can be ready mere minutes after
the last vote is cast.
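As a rough illustration of that back-end advantage, end-of-day tabulation amounts to summing per-machine tallies at a central location. The sketch below is hypothetical; every number and name is invented for illustration.

```python
# Hypothetical sketch of end-of-day tabulation: each precinct machine
# reports a tally, and a central system sums them. Numbers are invented.
precinct_tallies = [
    {"A": 412, "B": 390},   # machine 1's result disc
    {"A": 287, "B": 301},   # machine 2
    {"A": 150, "B": 149},   # machine 3
]

district_total = {}
for tally in precinct_tallies:
    for candidate, votes in tally.items():
        district_total[candidate] = district_total.get(candidate, 0) + votes

print(district_total)  # {'A': 849, 'B': 840}
```

The aggregation itself is trivial, which is exactly why results can be ready within minutes; the hard questions, as the rest of this article argues, are about whether the inputs can be trusted.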
c. Who builds them?
Most Americans have used automated
teller machines (ATMs) hundreds of times, but have never thought for
even a second about where they come from: which company makes them, who
owns them, who installed them. And that is a good thing; if we needed
to decide at each ATM whether or not we trusted the company that made
it, much of the machines’ convenience would be lost.
ATMs need to be bulletproof, figuratively and quite literally. They
need to be trusted by walk-up clients who want to withdraw or deposit
funds, as well as the banks that purchase them and the banks whose
systems they debit money from. It seems logical, then, that companies
making ATMs would be the perfect choice to design voting machines.
The problem with that logic is that voting machines pose a very
different problem than either ATMs or traditional voting systems do.
ATMs are permanent installations in heavily-surveilled areas – they
even have their own surveillance systems built in. They’re a mature
technology with billions, if not trillions, of individual transactions
to build upon when refining the process. Finally, they’re created
for businesses with experience in the area and incredibly exacting
standards; an exploitable flaw in one model of ATM could potentially
lose millions of dollars, a risk no bank will ever be willing to take.
Voting machines, on the other hand, have none of the features that make
ATMs secure. They’re only used one day a year, which means they need to
be easy to move around. By nature – by the very laws surrounding
their reason for use – they can not be under any kind of
surveillance. They are technologically immature, in wide scale
deployment for less than a decade. As for the target consumer, few
of the officials in charge of buying voting systems in the US are
likely to be qualified to make purchasing decisions for large computer
systems.
With almost two billion dollars in HAVA funds up for grabs since
2002, there’s been no shortage of incentive for companies with
experience making either ATMs or voting machines to throw their hats
into the DRE ring and then try to convince those voting officials to
purchase their products. The challenges inherent to DREs, however,
are significant enough to not only negate any advantages experienced
companies might have, but also to make their ‘expertise’ – not to
mention their reputations – full-on liabilities. The three major
manufacturers of DRE voting systems – Diebold Election Systems, Sequoia
Voting Systems, and Election Systems & Software (ES&S) – have
had every problem imaginable. The list of audits the three companies
have failed between them is long, but more troubling
are accusations of outright malfeasance on the part of these companies.
The FDA wouldn’t allow these companies to sell aspirin, but through a
combination of political connections, lobbying, and simply being the
only options, they are supplying devices intrinsic to the health of a
democracy.
3. Looking forward
Security is a hard problem. There are
references to locks in the Old Testament, and Daedalus’ Labyrinth at
Crete was designed to prevent people from getting at the Minotaur at
the center (and to prevent that pesky Minotaur from getting out, as
well). As long as we’ve had doors, we’ve been trying to figure out
how to keep people from opening them, and failing. Locks still get
picked and spies still steal secrets.
Designing computer systems is even more difficult; as time goes on,
computers get more advanced, programs get more complex, and there are
more and more places for things to go wrong. And this is not likely to
get better any time soon; the trend in computer programs seems to be
towards more flaws, not fewer. Managing security in a computer
system piles these problems on top of each other, and it is nearly
impossible to live in the modern age without hearing a litany of terms
like “virus” and “worm”, or reading how a bank accidentally posted half
a million credit card files on its website. It’s not a problem with an
easy or conventional answer, but it’s a problem which needs to be
solved if people are to trust the democratic process into the 21st
century.
a. Trust, but verify
Suppose there exists a rare
stamp that people would steal if they could find it. After thinking
long and hard, its owner comes up with two alternatives; take it into
the woods behind his house and bury it, or put it in a safety deposit
box in the local bank. Which seems more secure? Few people would choose
to bury the stamp; it seems immediately apparent that the bank is more
secure. But why? Everyone knows that banks can be broken into and
robbed. In fact, the bank on the corner is a target; if the vault were
robbed, the stamp might get stolen even if the thief didn’t know it was
there when planning the robbery. On the other hand, if the owner is
careful, it is likely that no one will know about the box in the woods.
Doesn’t that make it safer?
Again, it is unusual for intuition to suggest that the box in the woods
is safer. However likely it is that no one will know about the stamp,
however careful the owner is, there is always a chance that it will be
found. Someone might see the box being buried; someone might walk by
and notice that a hole had recently been dug; someone else might be
burying their own rare stamp, and happen to do it in the exact same place.
Once that happens, the other person will have total access to it. On
the other hand, the owner probably wouldn’t mind telling someone “My
stamp is in box 145-H at Blackwater Savings Bank, on the corner of
Fifth and Main.” Knowing the stamp is in the bank doesn’t accomplish
very much for a potential thief; they still need the ability to get
into the bank vault. If they had that in the first place, they would
likely rob it regardless of the presence of the stamp.
Hiding your stamp in a box in the woods is what computer security
experts pejoratively call “security by obscurity,” short for “security
by obscurity is no security at all.”
Security by obscurity means that the only thing protecting a secret
is the fact that it’s somewhere out of the way. Think of Edgar Allan
Poe’s story “The Purloined Letter” – a blackmailer had stolen an
extremely important letter, and had hidden it well. Once the detective,
Dupin, deduced the hiding place, he was able to recover the letter
with no trouble at all. Good security can include hiding something, but
with no trouble at all. Good security can include hiding something, but
what is reassuring about a bank vault is that it is secure despite
everyone knowing where it is. A bank which told customers “We take your
money somewhere really secret, then bury it inside of a shoebox” would
quickly go out of business.
When someone is attacking something secure – trying to break into a
computer system, or a house, or a bank vault – the best strategy is
always to go after the weakest point. That’s why most computer systems
won’t let a password be the same as a username, for instance. All
of the security on that system is worthless if it relies on a secret
that’s easy to guess. That maxim can be generalized – the security on a
system is only as strong as its weakest point. There’s no point to
having a $500 lock on your front door if your windows are unprotected.
And if all of your security relies on you having secrets that nobody
else knows, then your security can only be as good as the protection on
that secret. Design a system that doesn’t need secrets, and it may not
be impregnable, but at least you are assured that there are no easy
ways to bypass your security.
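The "weakest point" maxim can be stated in a few lines of code. This is purely an illustrative sketch; the defense names and the 0–10 strength scores are invented.

```python
# Illustrative sketch of the "weakest point" maxim: a set of defenses is
# only as strong as its weakest member. Names and scores are invented.

def weakest_link(defenses):
    """Return the (name, strength) pair with the lowest strength score."""
    return min(defenses.items(), key=lambda item: item[1])

house = {
    "front door lock": 9,       # the $500 lock
    "back door lock": 6,
    "ground-floor window": 2,   # effectively unprotected
}

name, strength = weakest_link(house)
print(name, strength)  # ground-floor window 2
```

A rational attacker evaluates the whole dictionary and ignores everything but the minimum, which is why the $500 lock buys nothing here.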
Given this basic knowledge, it seems logical that companies with a
great deal invested in assuring their clients that their systems are
secure would go out of their way to make their systems as transparent
as possible. Instead, however, the major election system vendors have
spared absolutely nothing to try to keep as many secrets as possible,
including one crucial one – how the programs that run on their machines
work. They argue that their computer code is central to their business,
and that it is a secret they should be able to keep for competitive
reasons. They have used every method possible to make sure this
viewpoint prevails, from lobbying efforts to flat-out refusing to
follow laws which would require them to disclose their code. In
contrast, the Nevada Gaming Control Board requires that the code for
all software installed on slot machines be available for inspection.
Secrets provide security only as long as they are kept. In 2003,
both Diebold and Sequoia suffered similar gaffes – they left their
computer code, supposedly so secret it could not even be shown to the
government, openly available on public web servers. Each company
was supposedly expert enough in computer security to protect an
election, but they could not even protect themselves. Sequoia’s
software was stored in binary code in a directory which allowed people
to write as well as read, leaving the possibility that someone
altered their program unnoticed. Diebold, on the other hand, lost
their source code – the actual work done by their programmers before it
is made into a machine-readable form. Over the next couple of
weeks, media, researchers, and interested amateur computer scientists
picked through the program, exposing dozens of flaws and security
holes. For the first time, the public could see beyond the secrets,
and what it saw was a system that was wide open to malicious users.
Diebold claims that all of the holes are patched in its current
version, but how would anyone know? The current version is, again, a
secret.
b. Security without secrets
Banks certainly use some
secrets – PIN codes, for instance, or combination vaults – but they try
to reduce them as much as possible. If all an intruder needs to get
into a vault are the blueprints, then the architect, or anyone who sees
the plans, can walk right in. True security uses as few secrets as
possible – in fact, truly secure systems want people to be able to see
how they work. Some bad guys will be deterred (why go after a bank
vault you can see is incredibly hard to crack, when there are so many
easier ones?) and some good guys will point out mistakes that were
made, so they can be corrected. Just as importantly, everyone
interested in buying the product, or using the bank, can be confident
in the strength of the security.
A real-world example of this principle was the creation of AES, the
government Advanced Encryption Standard. When the National
Institute of Standards and Technology (NIST) decided that their old
cipher, DES, was getting outdated, they held a public competition to
replace it. Public, in this case, didn’t simply mean that anyone
could enter; it meant that once you did enter, your work was open for
everyone to look at. The security experts who participated in the
process didn’t just submit new algorithms, they actively examined the
others’ work, searching for flaws that would leave it vulnerable to
attack. There were no secrets. Today, AES is used in thousands of
applications because people can trust that it does not have hidden
flaws.
The lesson here cannot be overstated. If your data is protected
using AES, someone who wants to uncover your secrets can go to a public
website and learn everything there is to know about AES itself. They
can see every line of the algorithm and exactly what steps it puts your
data through, but this will not help them get at your data. There’s
only one secret, and that is the password you use to encrypt the data.
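AES itself is not in Python's standard library, but the same principle – a completely public algorithm whose only secret is the key – can be sketched with the standard library's HMAC-SHA256. The key and messages below are, of course, invented for the example.

```python
import hashlib
import hmac

# HMAC-SHA256 is a fully public, standardized algorithm; anyone can
# read its specification. The only secret in this sketch is the key.
def tag(key: bytes, message: bytes) -> str:
    return hmac.new(key, message, hashlib.sha256).hexdigest()

key = b"correct horse battery staple"   # illustrative secret
message = b"vote totals: A=1042, B=998"

real = tag(key, message)
# The attacker can read every line of tag() and still cannot
# reproduce the result without the key.
forged = tag(b"attacker's best guess", message)

print(real == forged)  # False
```

Knowing the algorithm buys the attacker nothing; everything rests on one small, well-understood secret, which is exactly the property the stamp-in-the-bank analogy describes.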
AES’s openness didn’t just make it stronger, it made it much, much
more valuable. Banks, stock markets, and even foreign governments trust
AES because they can look at it themselves, see that there are no
traps, no secret passwords or back doors. Think about the lock on an
average house’s front door. Who might have a key to it, other than the
owner, and anyone else directly authorized? Well, the locksmith who
installed it may, as might the company that made it. Either of them
might also have a master key, which can open all locks of that type.
Generally, people don’t care, because there’s nothing in our
houses worth that much – after all, if it turns out that a lock company
is snooping around in its customers’ houses, the company will not last
long. It’s not worth losing a business simply to peek into a random
house. But what if the owner were storing a bag of diamonds in the house?
Or a file describing a billion dollar invention? All of a sudden, those
locks may not be quite as secure as the owner thinks. All of a sudden
the owner might wonder if anyone other than the locksmith has a copy of
that master key, and even a reputable lock company might be considering
the pros and cons of breaking the law. When guarding something
important, even the company providing the security itself may be a weak
point. What secrets do they have? Does the customer really know who
they are, and who else might know them? Would it not be better if they
had no secrets at all?
How a person votes is supposed to be secret – from everyone but that
person. However, when a vote is entered into a DRE voting system on any
given election day, the simple truth is that it becomes a secret to
everyone – the voter has no idea what has happened to that vote. The
system is a black box, a secret in and of itself, and all she can do is
trust that the vote it spits out on the other side is the same vote she
entered. There is a chain of trust involved in any election; vote
tampering and fraud are problems as old as democracy itself. But up
until the advent of DRE voting, what a citizen’s own ballot said was
never a secret from her, never something she needed to trust.
c. Chains of trust
Think, for a moment, about that
“chain of trust” a voter must have in an election that uses a simple
paper ballot. The voter goes into the booth, checks a box on a ballot
handed to her at the door, and puts it into a ballot box. She then
leaves the voting booth. She trusts that systems are in place to
prevent the workers at the polling place or another voter from stuffing
the ballot box; to prevent the counters from intentionally miscounting
or destroying ballots; to prevent the people tallying the various
precincts from coming up with the wrong numbers. That’s a lot of trust,
to start with, but we think it’s worth it for a system which allows
people to cast secret ballots.
Why is it relatively easy to trust paper ballots? One important
reason is that they are big. True, a piece of paper hardly seems big,
but carrying around enough ballots to actually ‘stuff’ the ballot box
becomes easy to spot at numbers far smaller than those needed to
influence most elections. If the cheating happens during the counting process,
it’s easy to detect with a recount. Election officials know enough to
make the ballots themselves distinctive, so forged ones can’t easily be
prepared ahead of time. But again, merely keeping the ballot secret
isn’t effective security; with many modern voting machines, voters
never get the opportunity to physically handle the ballots, though they
may see them in the machines. What, then, is different about DRE voting
that would make a voter less confident in it? There are several
differences, but the biggest is this: in traditional voting systems,
the ballot box itself can’t cheat.
That may seem an odd, if not
absurd, statement at first. A ballot box is a thing; it has no will
of its own with which to cheat… and that itself is the problem. A
computer program does exactly what it is programmed to, and computers
can be programmed to break the rules. Punch cards and optical scan
machines differ from DREs in one phenomenally critical way: they keep
their memory on paper, which the voter can see. DREs keep their memory
hidden deeply away, but perhaps not deeply enough. And while that big
direct recording machine in the school gymnasium with the giant lever
may be clunky, it has its own key difference from its electronic
descendants. After the election, it can be examined for signs of
tampering. Computers can be programmed to not only cheat, but wipe away
the fingerprints when the job is done.
i. Voting with a DRE system
To really understand
how much more complicated a chain of trust a DRE voting system demands
than a paper ballot does, it is necessary to examine a
theoretical voting procedure using a DRE. Before the voting starts,
technicians from either the locale or the manufacturer show up to set
up the election; this is the digital equivalent of writing the ballot.
They do this on a central server, which all of the voting machines will
then download the ballots from. The poll workers set up the machines,
connecting them over the network to the central server. The workers
test each machine to make sure ballots are recording properly, and
each machine then spits out a “zero tape” which shows that there are no
ballots currently registered at the machine. As voters come in, they
are given a special digital keycard which, when inserted into the
machine, allows the casting of one vote and then disables itself. The
voter returns the keycard to the poll workers, who can then re-enable
it and give it to the next voter. At the end of the day, the poll
workers print a result tape which tallies the total number of votes
cast, check to make sure it matches up with the number of people they
have manually recorded as voting, and upload the vote tallies to a
central location within the city.
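The procedure above can be condensed into a toy simulation. Every class name and rule here is a hypothetical stand-in for the real systems, but it mirrors the steps described: zero tape, one-vote keycards, and an end-of-day result tape.

```python
# Toy model of the DRE polling-place procedure described above.
# All names and rules are hypothetical illustrations.

class Keycard:
    def __init__(self):
        self.enabled = False   # a poll worker must enable it per voter

class Machine:
    def __init__(self):
        self.tallies = {}

    def zero_tape(self):
        """Before polls open: confirm no ballots are registered."""
        return sum(self.tallies.values()) == 0

    def cast(self, card, choice):
        if not card.enabled:
            raise PermissionError("card not enabled")
        self.tallies[choice] = self.tallies.get(choice, 0) + 1
        card.enabled = False   # one vote per activation

    def result_tape(self):
        """End of day: per-candidate tallies plus total ballot count."""
        return dict(self.tallies), sum(self.tallies.values())

machine = Machine()
print(machine.zero_tape())   # True

card = Keycard()
for choice in ["A", "B", "A"]:
    card.enabled = True      # poll worker re-enables between voters
    machine.cast(card, choice)

tallies, total = machine.result_tape()
print(tallies, total)        # {'A': 2, 'B': 1} 3
```

Note how much of the security load rests on two assumptions buried in this sketch: that `cast` really disables the card, and that `result_tape` really reports what was cast. The next sections examine what happens when those assumptions fail.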
Who does a voter need to trust? Where might things go wrong? Data is
transmitted over the internet, so there’s always the possibility that
someone malicious might be able to intercept and change what is being
transmitted. Hundreds of voters might have access to that machine
itself; any one of them could tamper with it. Imagine a specially
designed keycard that did not shut off after use; one voter might be
able to cast a hundred votes in a few minutes. Someone with the
know-how could open the machine and upload a virus, or simply tinker
with the memory. An attacker could simply remove the computer’s memory
card and snap it in half, destroying the votes for that machine so far,
a useful strategy in a precinct expected to vote strongly for the
opposition. None of these attacks are novel, but the sheer clunkiness
of paper ballots makes them difficult; imagine how long it would take
someone to set a stack of a hundred ballots aflame with a lighter, for
instance, or how obvious carrying those votes in under a coat would be.
ii. Trusting the ballot box
If that were the only
problem, though – that the sheer efficiency of computer ballots makes
them more vulnerable to attack – then we could find ways around this.
But we’ve left off the second, more critical assertion – that unlike
other types of voting machines, DREs are capable of putting the fix in
for an election all on their own. Unlike an optical-scan machine, for
instance, or a punch-card ballot, a DRE runs on a general-purpose
computer, similar to a laptop and capable of running any piece of
software its operating system can handle.
In this sample case, the voting machine runs Windows CE 3.0, a
smaller version of Windows built by Microsoft for “embedded devices” –
things like cell phones, palmtops, and, yes, ATMs and voting
machines. The device is programmed so that it does only two things:
first, when booted up, it checks to see whether there are any updates
on its removable memory cards; then it runs a program called
BallotStation.
To try to list the security problems which have been associated with
various versions of Microsoft Windows would be an exercise in futility.
Windows is popular and effective for many reasons, but its security is
not among them. BallotStation is an unknown; it’s not a widely used
program, so there’s not a lot of data on security problems inside of
it. However, just like biological organisms, computer programs develop
resistance to attacks; the more a program is exposed to malicious
behavior, the more defenses and immunities it develops, as programmers
patch holes and create protective software like virus scanners.
BallotStation has never been through that kind of trial by fire.
The other place the voting machine can get software from is the
memory card. Unlike the ballot station software, this is not an
unknown; it is an easily-spotted security nightmare. Imagine a mailbox
– a secret mailbox, admittedly, a locked mailbox. Every morning, the
mailbox’s owner wakes up, opens it, and then does whatever any letter
in that mailbox tells him to do, without question. If a letter managed
to get into the mailbox, it must be important, so its orders get
carried out. That is a more or less accurate description of the voting
computer; once a card is inserted into the slot, it simply does
whatever it’s told. It may ask for confirmation – by flashing a
“yes/no” message on the touch screen – but it seems obvious that anyone
inserting a card with malicious software will be around to press “yes.”
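The mailbox analogy can be made concrete. The sketch below is entirely hypothetical – it is not any vendor's actual boot code – but it captures the logic described: whatever is on the card gets applied, and the only gate is a prompt answered by whoever inserted the card.

```python
# Hypothetical sketch of the "obedient mailbox" boot logic described
# above -- NOT actual vendor code. No signatures, no authentication.

def ask_operator(prompt: str) -> str:
    # Whoever inserted the card is standing at the touch screen.
    return "yes"

def boot(memory_card: dict) -> str:
    # On startup, check the removable card for an update and apply it.
    if "update" in memory_card:
        if ask_operator("Apply update? yes/no") == "yes":
            return "installed: " + memory_card["update"]
    return "running BallotStation"

print(boot({}))                                   # running BallotStation
print(boot({"update": "malicious_payload.bin"}))  # installed: malicious_payload.bin
```

Nothing in `boot` asks where the update came from or who wrote it; possession of the card slot is the entire credential.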
iii. A needle in a million haystacks
On December 7, 2006, seven U.S. attorneys were dismissed without explanation and
replaced with interim appointees. This was made legally possible by
a provision slipped into the USA PATRIOT reauthorization act in the
dead of night by one of Arlen Specter’s assistants. A few short
lines in a bill two hundred pages long; virtually impossible to notice,
even for someone actively looking for it. Now imagine a programmer
working on that BallotStation software. The only other people who will
ever see this program are the few dozen other programmers working for
the company; once it is shipped out, the software itself is turned from
a readable script into the ones and zeroes that computers run on. That
program is another secret; it is incredibly difficult to take those
ones and zeroes and turn them back into something a human can make
sense of, and even if it were possible, doing so is prevented by both
contract and copyright law. The USA PATRIOT reauthorization bill was at
most eight or ten thousand lines, and someone was able to insert a
clause into it without being noticed. Even a moderately-sized computer
program can approach a million lines of code, and larger programs can
be huge – Windows XP, for instance, has an estimated forty million
lines of code. That’s about a million printed pages, or about
thirteen hundred bound volumes at eight hundred pages a book. Even if
it took fifty of those pages to write the code to steal an election
(and there’s no reason to believe it would take nearly that much space),
no one goes through the code line by line before it is delivered. If
the program works, it goes out to customers. And even if a malicious
piece of the program was detected, it would be almost impossible to
track the perpetrator down. Unlike the Senate, most software companies
use programs which track any changes made, but security is not a
primary priority for those applications. In other words, anyone who
could make the change in the first place could easily hide their own
tracks.
Finally, someone falsifying ballots in a traditional
election can’t have them self-destruct after the count is over. On a
computer, such a thing is trivial. A vote-stealing program could delete
itself after the results had been turned in; alternately, it could
delete all the election data, or just crash the DRE itself. Once that’s
done, the only people qualified to do a post-mortem on the machine work
for the company itself. The economic fallout for a company whose
machine was hacked during an actual election would be enormous... and
in any investigation, a strong desire by the examiner for a specific
outcome often slants the investigation towards that outcome. This has
parallels to the situation in Alameda County; there, the vote was close
enough for a recount, but none was possible. If the vote had been a strong
victory for one side, but tampering was suspected, what better way to
head off an investigation than to simply have the systems crash and get
sent back to the manufacturer?
To a hypothetical election-rigger, then, it seems that a precinct
which uses DREs has a lot of advantages. In addition to access, which
any potential rigger needs to have, rigging an election that uses
computerized systems may require more technical know-how than rigging
one with only paper ballots. In return, however, it is nearly
impossible to detect while happening and nearly untraceable afterwards
– ‘virtues’ which traditional methods of electoral fraud do not have.
In an age of close-call elections that seem to be decided as often in
courts as in the voting booths, flipping just one out of every hundred
votes from one candidate to another turns a neck-and-neck race into a
solid 2% lead.
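The arithmetic behind that last claim is worth making explicit: each flipped vote subtracts one from one column and adds one to the other, so flipping 1% of all votes moves the margin by 2 percentage points. A minimal worked example, starting from a dead heat:

```python
# Worked arithmetic for the claim above: flipping one vote in every
# hundred from candidate A to candidate B moves the margin by two points.

def margin_after_flips(total_votes: int, flips_per_hundred: int) -> float:
    a = b = total_votes // 2                        # start from a dead heat
    flipped = total_votes * flips_per_hundred // 100
    a -= flipped                                    # each flip costs A one vote...
    b += flipped                                    # ...and hands B one vote
    return 100 * (b - a) / total_votes              # margin in percentage points

print(margin_after_flips(1_000_000, 1))  # 2.0
```

In a million-vote race, ten thousand quiet flips manufacture a lead that no recount-triggering statute would ever question.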
So how much do you trust the programmers at Diebold
Election Systems, a company with strong political ties, whose managers
have included a man jailed for falsifying computer records? How
much do you trust Microsoft, a company which has recently started
updating its customer’s computers without their permission, and even
when they specifically denied permission, with an update that can break
some of those computers? How much do you trust every single person
who has ever handled that machine, every election worker, every techie?
How much can any voter trust that within that entire chain, there is no
person who is interested in subverting the process? Simply handling a
clunky, old-school lever machine accomplishes little, but anyone who
touches that DRE can potentially install software on it to steal an
election. How much can any voter trust that within that entire chain,
there is no person who can be bribed sufficiently to put the fix in?
How much can you trust that the person in line before you, who took
just a minute more than usual, didn’t pull a memory card out of his
pocket the second he got into the booth?
4. How do we fix it? Legal and policy recommendations
There are many things that are fundamental to the functioning of a democracy
–freedoms of speech, the press, and assembly, as well as an engaged
populace participating in the process. Even more fundamental, however,
is simply that the process work – that when the people cast their
votes, they are counted properly. If voters cannot trust that the
process works on this central level, why should they bother to vote in
the first place? How can people be reassured that their vote will be
counted when they cannot even see it?
a. Voter-verified paper trails
There has been plenty of
writing about this one particular issue, but it is impossible to say it
too many times: a voting system which does not leave any kind of paper
trail is too vulnerable. Many of the problems discussed in the last
few pages can be avoided simply by having the DRE spit out a receipt
which the voter can then put in a secure ballot box just as in
traditional voting. This does not negate the primary benefits of the
DRE system, ease of use to voters and ease of counting to
administrators, but it provides the ability to do a recount independent
of the computer system. It does make the machines significantly more
expensive, and it introduces another potential point of failure on
election day (a printer jam). But these technical hurdles should not be
seen as barriers to using a paper trail; rather, they are barriers to
using DREs at all. If the
systems cannot be brought up to this basic level of reliability, our
nation cannot afford to be using them.
b. Design competition
The idea of a federal
government-mandated rollout of a single, centrally-manufactured voting
machine is not likely to reassure anyone. Any new device designed to
ensure people’s confidence in the voting process cannot come from a
group in which few have confidence in the first place. Large government
procurement efforts, additionally, tend to turn into political
windfalls to specific private companies; this is a big part of the
problem that currently exists. And yet, the assurance that a vote in
precinct A is just as secure as a vote in precinct B, regardless of
which company manufactured the voting machines, is exactly what a
program to generate a new voting system would be designed to ensure.
To resolve this conflict requires a process which takes full
advantage of the benefits of each of the various levels of our society.
Standards are best set at a federal level, to assure unity across the
entire country. Purchasing of the systems themselves belongs at a local
level, so each city can go with a vendor which best meets its own
needs. Manufacturing needs to be open to the market, so that companies
can compete and the people are assured of getting the best devices
possible. To satisfy all of these various ends, NIST should hold a
five-year competition for the creation of a secure voting system, along
the lines of the competition it announced for the creation of AES in
1997. The purpose of the competition is not to award a contract to make
the machine itself, but to get a reference specification – a standard
design, which any company could then manufacture. All of the
specifications for the machine will be open to the public, from the
source code for the programs to the physical design for the circuit
boards. When a company uses this reference specification to produce an
actual machine, it must itself publish everything it does – from the
design of the housing to the serial numbers on the chips it plans to
use. Since the winners will not themselves make money from selling
the design, a substantial prize needs to be awarded; considering the
billions allocated through HAVA, prizes in the ten-thousand-dollar range
for finalists, and in the ten-million-dollar range for the winner, plus
the publicity, should be enough to motivate broad participation.
c. Bounty hunting
An idea which is gaining strength in
tech circles is that of a free market for computer bugs. The
concept is simple; someone who discovers a new way to penetrate a web
site (for instance) may divulge it to the company that built the web
server, for which they might possibly get a bit of recognition. On the
other hand, there are an increasing number of criminal groups willing
to pay a good deal of cold cash for ways to exploit web sites. In the
time before a fix is found, they can steal credit card numbers, learn
corporate secrets, or even hold entire businesses hostage. Criminals
have been quick to seize opportunities in this area, and now pay good
money for so-called “zero day exploits”, bugs that are unknown and
unprepared for. If one such attack can cause millions of dollars in
damage or more, it makes economic sense for companies to compete with
the criminals for these bugs. A systems administrator who has found
something new may be much happier taking ten thousand legitimate
dollars from IBM than five times that from an Eastern European criminal
syndicate. This is an economist's dream: organizations making the
rational choice to remove the incentive someone might have to commit a
criminal act which the normal justice system is virtually unable to
prevent.
This is a perfect model for a secure voting system. Once the
reference specification is finished, money should continue to be put
aside to reward the first person who can demonstrate a specific bug in
the system. Prizes would be awarded in varying amounts, based on the
severity of the bug – perhaps ten thousand dollars for a minor bug
which required a lot of time and effort to exploit, up to a hundred
thousand or more for someone who found a flaw which allowed someone to
enter a voting booth and change the voting tallies. Based on the
heartening results of the AES development process, severe bugs should
be relatively rare, if not squashed entirely by the time the system
goes into production. Again, however, a hundred thousand dollars is a
pittance compared to a stolen election.
d. Competition on quality
Business competition is one of
the cornerstones of our economy, driving prices down and putting goods
in the hands of all. However, this can be a huge problem as well, when
companies reduce the quality of those goods in order to make them
affordable. Since this is a good that should not differ significantly
from manufacturer to manufacturer, companies should compete on quality
rather than price. Companies should pledge when bidding for a contract
that their machines will make no more than so many errors per million
votes cast,
and then be bound to that number just as a contractor who bids on a
construction project is bound to the budget bid.
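How such a pledge might be enforced can be sketched numerically. The function below is a hypothetical illustration, not part of any existing certification regime: it treats errors in a test run as a Poisson process and flags a vendor whose observed error count is statistically inconsistent with its pledged rate.

```python
import math

def pledge_violated(pledged_per_million, ballots_tested, errors_seen,
                    alpha=0.01):
    """Hypothetical acceptance test for a vendor's error-rate pledge.

    Under the pledge, errors in a run of `ballots_tested` ballots
    follow (approximately) a Poisson distribution with mean
    lam = pledged rate * ballots tested.  The pledge is flagged as
    violated when the chance of seeing `errors_seen` or more errors
    is below `alpha`.
    """
    lam = pledged_per_million / 1_000_000 * ballots_tested
    # P(X >= k) = 1 - P(X <= k - 1) for X ~ Poisson(lam)
    p_fewer = sum(math.exp(-lam) * lam**i / math.factorial(i)
                  for i in range(errors_seen))
    return (1 - p_fewer) < alpha

# A vendor pledging 10 errors per million, tested on 100,000 ballots,
# should average one error per run; ten errors is damning evidence.
print(pledge_violated(10, 100_000, 10))  # True
print(pledge_violated(10, 100_000, 1))   # False
```

A real testing regime would need far more care (sampling design, repeated runs, defined error categories), but the point stands: a numeric pledge is only meaningful if it comes with a pre-agreed statistical test for breach.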
At the same time, there should be quite simply zero tolerance for
major errors. A company which delivers a product shown not to meet the
quality standards must be fined, and in serious cases, barred from
receiving further government money for manufacturing voting systems.
After putting millions of public dollars into development of a system
all voters can trust, and rewarding pledges of quality, that quality
must be fully delivered on. The delivered machines need to be available
for testing at any point during the year, not simply in the run-up to
an election.
Computerized voting systems allow nearly
untraceable electoral fraud on a massive level. Building computer
systems is not an easy task in the best of circumstances, and
computerized voting systems combine many of the hardest problems facing
programmers today. The companies currently involved have shown time and
time again that they are simply not up to the task of producing
cost-effective, secure voting terminals. Part of the problem is the
perceived business need for secrecy, but more often this merely serves
to cloak the truth of embarrassingly insecure systems from the public,
while leaving those security flaws open to attackers. Open systems are
secure in ways closed, privately held systems can never be; they do not
rely on people not knowing how they are built to remain secure, and the
more people checking and testing a piece of software, the more likely
it is to be free of bugs.
Recent history has shown several extremely promising examples of
open public competition spurring great technical strides. Such
contests generate not only return for those willing to invest in
development which might otherwise be unprofitable, but public awareness
and excitement. By sponsoring such a competition at the federal level,
the government can begin to restore faith in a process which far too
many people are starting to believe is no longer effective.
 Henry K. Lee, Judge Voids Results of Berkeley Measure on Medicinal Pot, S.F. CHRON., Sept. 28, 2007, at B1.
“Blue Screen of Death” is the nickname for the blue-colored screen
that appears when the various versions of Microsoft Windows crash. http://www.catb.org/~esr/jargon/html/B/Blue-Screen-of-Death.html.
 Lee, supra note 1.
 E.g. David Cho & Lisa Rein, Fairfax To Probe Voting Machines, WASH. POST, Nov. 18, 2003, at B01; Greg Lucas, State bans electronic balloting in 4 counties, S.F. CHRON., May 1, 2004, at A1; Diebold to Settle E-Voting Suit, http://www.wired.com/politics/security/news/2004/11/65674. Alameda County, Ca., is one of the counties Diebold settled with, referred to in the third cited article.
My pre-law professional background includes ten years doing systems
design, management, and support for IT departments of all shapes and
sizes.
 See Douglas Jones, A Brief Illustrated History of Voting, http://www.cs.uiowa.edu/~jones/voting/pictures/.
 42 U.S.C. § 15302 (2006).
Direct recording voting systems do not use traditional ballots, but
instead use a machine which tabulates the votes as they are being cast.
Lever machines are the original direct recording systems. Direct
recording electronic voting systems simply replace the recording
machine with an electronic device.
 All Levels of
Government Are Needed to Address Electronic Voting System Challenges:
Testimony before the Subcomm. on Financial Services and General
Government of the H. Comm. on Appropriations, 110th Cong. 13 (2007) (statement of Randolph Hite, Dir. Info. Tech. Architecture and Sys.) available at http://www.gao.gov/new.items/d07576t.pdf.
 Jones, supra note 9.
 Catherine Seelye, County in California Touches Future of Voting, N.Y. Times, Feb. 12, 2001.
 E.g. IVOTRONIC TOUCH SCREEN VOTING SYSTEM PRODUCT OVERVIEW 2, http://www.essvote.com/HTML/docs/iVotronic.pdf.
 Kim Zetter, E-Voting Fans: The Disabled, WIRED, Oct. 4, 2004, http://www.wired.com/politics/security/news/2004/10/65206.
 Accessibility Tutorials: Hear text read aloud with Narrator, http://www.microsoft.com/enable/training/windowsvista/narrator.aspx (last visited October 28, 2007).
 Zetter, supra note 18.
 E.g. DATA ACQUISITION MANAGER PRODUCT OVERVIEW, http://www.essvote.com/HTML/docs/ESS_DataAquiMngr.pdf.
Using the 2000 census data and defining “adult” as a person aged 20
or above, there are approximately 125 million ATM users in the US,
making each responsible for an average of 81 ATM transactions per year.
Census 2000 data for the United States, http://www.census.gov/census2000/states/us.html
(follow “General Demographic Characteristics” hyperlink) (201 million
Americans aged 20 or older as of 2000); Standard Register Announces
Results of National Consumer Survey of Plastic Card Usage, http://www.icma.com/info/survey91099.htm
(In 1999, 61% of American adults owned ATM cards); Jack Plunkett,
PLUNKETT'S BANKING, MORTGAGES & CREDIT INDUSTRY ALMANAC 30 (2007)
(In 2006, there were 10.1 billion ATM transactions in the USA).
 At least in the UK. ATMs added to U.K. phone booths, http://www.selfservice.org/article_1860_23.php (last visited Oct. 28, 2007).
 Plunkett, supra note 23.
Todd Jackson, New high-tech purchase will make voting easier, THE
ROANOKE TIMES, Mar. 3, 2004, at B1 (warehousing and transportation
costs for new DRE voting systems represent a significant savings over
old lever machines).
 “Each voting system used in an election
for Federal office shall meet the following requirements. . . permit
the voter to verify (in a private and independent manner) the votes
selected by the voter on the ballot before the ballot is cast and
counted. . .” 42 U.S.C. § 15481(a)(1)(A)(i).
 In 1996, only
7.7% of voters in the US used a DRE system, according to the Federal
Election Commission. Direct Recording Electronic, http://www.fec.gov/pages/dre.htm (last visited Oct. 26, 2007).
For example, the Acting Director of the NJ Division of Elections is a
Deputy Attorney General with no listed experience either in IT or
purchasing. NJ Division of Elections, http://nj.gov/oag/elections/dir_bio.html (Biography of Acting Director Maria Del Valle Koch).
A partial list of companies currently involved in some facet of the
business of creating DRE machines can be found at BlackBoxVoting.org, http://www.bbvforums.org/forums/messages/7659/7659.html?1187057600.
In 2007, after failing to sell the unit off, Diebold Inc. distanced
itself by changing the name of Diebold Election Systems to Premier
Election Services. Grant Gross, Diebold Can't Sell E-Voting Subsidiary, PC WORLD, Aug. 16, 2007, http://www.pcworld.com/article/id,136044-c,companynews/article.html.
Even a brief run-down of the scandals, questionable tactics (or
election results!) and unseemly or illegal doings by the major DRE
suppliers is beyond the scope of this article. I will provide one of
the most powerful examples as an illustration. Walden O’Dell was,
until he resigned in 2005, the head of Diebold Election Systems. Mr.
O’Dell is also one of President Bush’s “Rangers and Pioneers”, marking
him as a fund raiser who has collected more than $100,000 for the Bush
campaign. In 2003, while still head of Diebold, an Ohio company, he
wrote a letter to other wealthy and high-ranking Republicans stating
that he was “committed to helping Ohio deliver its electoral votes to
the president next year.” A predictable furor followed. Two years
later, as his company was hit with class-action lawsuits alleging
insider trading and misrepresentation, he resigned. Melanie Warner, Machine Politics In the Digital Age, N.Y. TIMES, Nov. 9, 2003; Editorial, The Business of Voting, N.Y. TIMES, Dec. 18, 2005.
Nehemiah 3:6 – “Moreover the old gate repaired Jehoiada the son of
Paseah, and Meshullam the son of Besodeiah; they laid the beams
thereof, and set up the doors thereof, and the locks thereof, and the bars thereof.”
 Charles Mann, Why Software Is So Bad, TECHNOLOGY REVIEW, July 2002, http://www.technologyreview.com/Infotech/12887/?a=f.
 For an excellent discussion of this idea, also known as Kerckhoffs' Principle, see Bruce Schneier, Secrecy, Security, and Obscurity, CRYPTO-GRAM NEWSLETTER, May 15, 2002, http://www.schneier.com/crypto-gram-0205.html#1.
For example, neither Hotmail nor Gmail will allow a new user for
their email systems to choose their username as a password. Virtually
every article returned by a Google search for “Password best practices”
mentions not to use an account’s username as its password; it is the
most obvious guess for anyone trying to access the account illicitly.
 Or at least a system that needs as few secrets as possible. Again, see Schneier, supra note 35.
Press Release, California Secretary of State, Secretary of State
Debra Bowen Moves To Tap ES&S’s Escrowed Source Code After Vendor
Violates Conditions of Its State Certification (June 21, 2007)
available at http://www.sos.ca.gov/executive/press_releases/2007/DB07_029.pdf.
NGC Reg. 14.110(3) (2005). A sobering comparison of the regulatory
schemes which control slot machines with those which control voting
machines is available at Slot Machines vs. Electronic Voting Machines,
http://www.votingmachinesprocon.org/slotschart.htm (showing point by
point how much easier it is to be certified to provide voting machines
than slot machines).
 Kim Zetter, E-Vote Software Leaked Online, WIRED, Oct. 29, 2003, http://www.wired.com/politics/onlinerights/news/2003/10/61014.
Binary code is a string of ones and zeroes that can only be read by
computers. To turn it into something which humans can interpret
requires a lengthy process called decompiling.
Even after decompiling, binary code gives a viewer only the
instructions to be carried out by the program. Source code, on the
other hand, includes any commentary or annotation the programmers
included. This is similar to the difference between reading a printed
copy of a book and a manuscript with notes in the author’s own hand.
 E.g. Aviel Rubin et al., Analysis of an Electronic Voting System, IEEE SYMPOSIUM ON SECURITY AND PRIVACY (2004).
The public, being law-abiding, couldn’t see the code before this;
contract and copyright law prevent people from attempting to examine or
decompile the computer code. The only people who could see the flaws,
therefore, were people already willing to break the law.
AES is the standard cipher that the government uses for encoding
information, including secrets. The NIST keeps an archive of the
process at http://csrc.nist.gov/CryptoToolkit/aes/.
I refer to AES alternately as both an algorithm and a computer program
throughout; while this is not technically precise, it is workable for
the purposes of this article. More technically, AES is an algorithm
which is implemented by computer programs; a badly designed program can
still leave data encrypted by AES vulnerable if it implements it
improperly. AES may be the lock on your front door, but if the
programmer leaves a window open, people can still get in. It cannot be
stated enough: computer security is difficult.
Development of a Federal Information Processing Standard for Advanced
Encryption Standard, 62 Fed. Reg. 32,494 (Jan. 2, 1997).
Participants had to submit not only a full specification for their
algorithm, but an example implementation in the C programming language.
 IAIK Crypto Group – AES Lounge, http://www.iaik.tu-graz.ac.at/research/krypto/AES/ (surveying available literature on implementing, optimizing, and attacking the AES cipher).
This example uses the Diebold AccuVote-TS machine. This is a slightly
older model voting machine, but one in wide use; in 2006, 10% of
America cast its votes on an AccuVote-TS. It is also one of the few
DREs which has been subjected to exhaustive independent scrutiny.
Computer scientists at Princeton acquired one of the devices and
learned a great deal about what makes it tick. The description of the
procedure here is taken from that paper. Feldman, Halderman, and
Felten, Security Analysis of the Diebold AccuVote-TS Voting Machine, Sept. 13, 2006, http://itpolicy.princeton.edu/voting/ts-paper.pdf.
 Id. at 7.
 Id. Windows CE is now part of Microsoft’s Windows Embedded product line. Windows Embedded, http://www.microsoft.com/windows/embedded/default.mspx.
 Id. at 9.
 Dan Eggen & Paul Kane, Gonzales: “Mistakes Were Made”, WASH. POST, Mar. 14, 2007, at A01.
 Paul Keil, Specter: “I do not slip things in.”, TPM, Feb. 6, 2007, http://www.tpmmuckraker.com/archives/002487.php.
 VINCENT MARAIA, THE BUILD MASTER (2005).
 David A. Wheeler, Software Configuration Management (SCM) Security, March 13, 2004, http://www.dwheeler.com/essays/scm-security.html.
 Associated Press, Con Job at Diebold Subsidiary, WIRED, Dec. 17, 2003, http://www.wired.com/politics/security/news/2003/12/61640.
 Scott Dunn, Microsoft updates Windows without users’ consent, WINDOWS SECRETS, Sept. 13, 2007, http://windowssecrets.com/2007/09/13/01-Microsoft-updates-Windows-without-users-consent; Scott Dunn, Stealth Windows update prevents XP repair, WINDOWS SECRETS, Sept. 27, 2007, http://www.windowssecrets.com/2007/09/27/03-Stealth-Windows-update-prevents-XP-repair.
is one of many groups working on making paper trails for DREs a
reality. It is a critical start, but not nearly enough to make DRE
voting secure.
 Full discussion of what such a competition
might look like is outside of the scope of this article. Briefly,
however, a competition along the lines of the AES competition might
look like this:
The competition would take place in five phases,
including several conferences open to the public, with invitations
going out to academics and relevant players in the industry. In the
first phase, the minimum requirements would be hammered out – what, at
least, must any secure voting system have? Once those specifications
are finalized and published, phase two is a request for submissions.
Like all phases of the competition, this would be open to anyone, be
they industry player or home inventor. The length of this phase would
be one of the things determined during phase one; once it is finished,
all of the submissions are opened to public viewing. Phase three is a
review period, during which each of the entries is examined and rated
by four separate groups – industry, academia, NIST itself, and the
interested public not aligned with any of those groups. At the end of
phase three, the top five rated submissions become the finalists, and
again, a conference is held for direct public discussion. This gives
each of the finalists an opportunity to present their system and argue
why their specification should be chosen. Based on commentary received,
each finalist may then make improvements and fix bugs. During phase
four, the five finalists are subject to intense scrutiny, but can no
longer make changes. A winner is selected, and during phase five final
commentary is given, final tweaks are made on the chosen system, and a
final specification is published.
 Ryan Naraine, Punditry: Will Microsoft buy flaws?, ZDNET, Mar. 19, 2007, http://blogs.zdnet.com/security/?p=130 (opinions from six security experts).
3Com runs one of the highest-profile of such systems through its
Tipping Point division, called the Zero Day Initiative. http://www.zerodayinitiative.com/.
 Victoria Ho, Asia finds security in open source, ZDNET ASIA, Sept. 28, 2007, http://www.zdnetasia.com/news/software/0,39044164,62032771,00.htm.
In addition to AES’ creation, a similar example of a competition to
encourage private innovation is the Ansari X Prize, which awarded ten
million dollars to the first private company capable of achieving
manned suborbital spaceflight twice within two weeks.