December 12, 2007
This annotation of expert reports on computerized voting systems should be appended to my January 2007 annotation. These two annotations are combined in a PDF at Where’sThePaper.org. A strict bibliography (no annotations) can be found at Re-Media ETC in PDF form. Over three dozen experts and policy makers are cited. The bibliography also includes reports that were once posted on the web, but which have since been removed (Florida and Maryland). Together, these two documents should serve well those who seek to rid the U.S. of secret vote counting, which is anathema to democracy.
Optical scan and touch screen voting systems manufactured by Diebold (now known as Premier), Election Systems & Software (ES&S), Hart InterCivic, Sequoia, and Nedap (also known as Liberty) are reviewed.
REPORTS ANNOTATED IN THIS SUPPLEMENT:
California Top-To-Bottom Review of Diebold/Premier, Hart InterCivic, and Sequoia touch screen (DRE) and optical scan systems, 2007
3May2012 UPDATED link: http://www.sos.ca.gov/voting-systems/oversight/top-to-bottom-review.htm (from which the TTBR reports can be found; the following links are probably also no longer valid.)
Cleveland State University Diebold’s GEMS Study (Diebold’s Global Election Management System), 2007
Kentucky Voting Expert Letter on Review of Diebold/Premier, Hart InterCivic, and ES&S, 2007
Netherlands, Review of Nedap (marketed in the U.S. as “Liberty”) system, 2006
As with the prior annotation, this one quotes some but not all vulnerabilities reported. Emphasis that appears in this annotation appeared in the original report (except where noted).
As with most prior studies, most of these reports also offer solutions to enhance security at ever-increasing expense to the public. (The Netherlands review herein is an exception, demanding transparent vote counting.) However, if the Pentagon is unable to deter hackers from its computers, surely our less-protected and less-funded election systems are much more vulnerable to attack. There is no doubt that winning elections in the most powerful nation in the world is strong motivation for anyone willing to do what it takes to win.
This supplement should serve to further inform the lay public about the continuing failure of computerized election systems to provide a basis for confidence in reported results.
*Computerized election systems can be remotely accessed and results changed without detection.
*Results transmitted via phone lines can be changed without detection.
*Memory cards containing malicious software can infect an election system countywide, without detection.
*Memory cards can be fraudulently authenticated.
*Voter privacy is lost on systems that use a continuous roll of paper to record voters’ selections, or on systems with radio emanations.
Given that Cuyahoga County, Ohio “lost” hundreds of memory cards for its Diebold touch screen systems in the May 2006 primary, citizens can have no basis for confidence in results reported on these machines.
The cost of high-tech systems continues to drain scarce public resources, requiring expensive experts, expensive environmental controls, and expensive testing. The ongoing training required for our nation’s poll workers also costs far more than training for simpler, preferable election systems.
Occam’s Razor applies: the simplest solution is the best. Hand-counted paper ballots (HCPB) are used around the world and cost about $4 per voter, while computerized systems in Ohio run as high as $18 per voter. [Ohio’s election costs derived from county Boards of Election annual expense reports provided by the Ohio Secretary of State in response to a public records request. Final “cost-per-voter” derived by dividing the reported total annual expenditure by number of registered voters, and then dividing that quotient by number of elections held in that year in that county. Ohio counties hold two or three elections every year.]
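The cost-per-voter derivation described in the bracketed note above can be sketched as a simple calculation. The county figures below are hypothetical, chosen only to illustrate the arithmetic; actual Ohio figures come from the Boards of Election expense reports cited.

```python
# Sketch of the cost-per-voter derivation described above:
# annual expenditure / registered voters, then divided by the
# number of elections held that year. Numbers are hypothetical.

def cost_per_voter(annual_expense, registered_voters, elections_per_year):
    """Cost per voter per election, as derived in the text."""
    return annual_expense / registered_voters / elections_per_year

# Hypothetical county: $3.6M annual expense, 100,000 registered
# voters, 2 elections held that year.
print(cost_per_voter(3_600_000, 100_000, 2))  # 18.0
```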
But cost and lack of securability are not our only considerations when contemplating HCPB. The use of any machinery results in a secret vote count, yet transparent vote counting is a necessity of democratic elections. Josef Stalin warned, “it’s not who votes that counts; it’s who counts the votes.” Abbie Hoffman advised, “Democracy is something you do.” The more citizens become involved in counting the vote, the more confidence we can have in reported results.
HCPB represents the best system for democratic elections. It is the least expensive, the easiest to secure from fraud, and the most transparent. Paper ballots should be hand-counted at the polling site on election night before all who wish to observe. The count could be videotaped and web-streamed to ensure greater access in observing the vote count. Precinct level (polling site) results should be immediately posted at the polling site for public inspection over the next several days, to ensure that county level reporting matches polling site reports.
As the nation continues to move toward a third questionable presidential election, ignoring the science, the cost, and the objections of informed citizens, we feel grave concern for our democracy. The solution is simple, though. Citizens who want to be assured election results are accurate can demand hand-counted paper ballots. Advocates of transparent vote counting have an ideal opportunity to demand this.
New York, right now, is facing a lawsuit by the U.S. Department of Justice, seeking to force NY to use these scientifically condemned machines in the 2008 election. An amicus brief is being contemplated, which offers to hand count the two federal races on the 2008 NY ballot. Andi Novick, an attorney in New York, will be filing the brief on behalf of the people, since the NY Attorney General represents the interests of the State Board of Elections.
Novick provides this legal research:
The right of an elector to vote is conferred by the Constitution…. [the elector] is entitled to see that his vote has been given full force and effect…. any method of holding an election which would deprive the electors…. of the right of casting their ballots and having effect given to the votes so cast would plainly be unconstitutional. (Emphasis supplied) See Deister v Wintermute, 194 NY 99, 108
It is our hope that Americans recognize the only way to ensure honest elections is by our direct observation of them. As long as we continue to vote on systems which count the vote in secret, we lack democracy. Without transparent elections, we are no longer a free people. But by direct participation in a hand-counted process, we quickly move toward the democratic ideal of a free people.
CALIFORNIA 2007 Top-To-Bottom Review (TTBR) http://www.sos.ca.gov/elections/elections_vsr.htm The full Red Team reports are at:
- Diebold Elections Systems, Inc. (.pdf, 498KB) http://www.sos.ca.gov/elections/voting_systems/ttbr/red_diebold.pdf
- Hart InterCivic (.pdf, 376KB) http://www.sos.ca.gov/elections/voting_systems/ttbr/red_hart_final.pdf
- Sequoia Voting Systems (.pdf, 108KB) http://www.sos.ca.gov/elections/voting_systems/ttbr/red_sequoia.pdf
The full Source Code reports are at:
- Diebold Elections Systems, Inc. (.pdf, 561KB) http://www.sos.ca.gov/elections/voting_systems/ttbr/diebold-source-public-jul29.pdf
- Hart InterCivic (.pdf, 573KB) http://www.sos.ca.gov/elections/voting_systems/ttbr/Hart-source-public.pdf
- Sequoia Voting Systems (.pdf, 831KB) http://www.sos.ca.gov/elections/voting_systems/ttbr/sequoia-source-public-jul26.pdf
California’s Red Team reports have been summarized by Cleveland State University Center for Election Integrity chief, Dr. Candice Hoke. Immediately below is Dr. Hoke’s statement from a personal email:
“Full disclosure: I was the team leader for the TTBR Diebold Documentation assessment. The TTBR study’s lead scientists provided suggestions for this short summary but it is ultimately my work.
“To reduce over 500 pages to two pages, at least a few important findings — especially about design flaws not relating to security issues — had to be sidestepped.”
Below is a partial reproduction of Dr. Hoke’s summary of California’s TTBR:
Election management/tabulation software
For all voting systems (“VS”), the system architecture depends on a commercial operating system known to have security vulnerabilities. All vendors failed to secure this system properly. System architecture had not been designed with either basic or sophisticated security protections. All systems failed to follow standard security design principles.
All systems were susceptible to viruses that could be introduced from a number of vectors, including from voting device memory cards. (Viruses and other rogue programming can, e.g., “flip” votes among candidates, scramble tabulation data, delete voting data, and cause system programming to fail.)
Viruses could infect the central computer and then be spread to all the voting devices when their memory cards are prepared for the next election.
System logs of operator activity (“audit logs”) could be overwritten or erased, meaning that insider attackers could manipulate voting data and results, and then erase the logging inventories that would show the access and activity; or, could be used to frame a different employee.
Systems permitted relatively easy bypassing of passwords, thus permitting broader access than authorized.
In each VS, many other security holes exist that could compromise the system’s ability to report accurate election results — or any results.
All systems failed to follow standard security design principles, and lacked even basic security protections. All systems’ devices (DREs and precinct-based optical scanners) were subject to easy, undetectable attacks that could occur during the normal time that a voter would be at a voting machine casting a ballot.
Some devices permitted the researchers to introduce malicious code onto a voting machine in under a minute, while appearing to be in the process of voting.
All DRE touchscreen voting units permit a voter to generate and cast multiple ballots during a normal time voting could occur, in ways that would be largely undetectable to poll workers unless they were specially trained and closely supervising the voter’s activity at the unit (voter privacy might still be compromised).
Some DRE devices permitted the researchers to damage the Voter-Verified Paper Audit Trail (VVPAT) covertly, so the voters could verify that their votes were printed correctly, but after the election the VVPAT could not be read.
Other DRE devices could be modified to store votes incorrectly, but print them on the VVPAT correctly (for example, a voter’s choice of John Adams results in the VVPAT printing “John Adams” but the DRE stores the vote as a vote for “Thomas Jefferson”).
The NASED “qualification” (certification) of all systems was based on testing lab (“ITA”) studies that were seriously flawed. While the ITA reports varied significantly, generally it was not possible to ascertain whether the lab had conducted the independent tests needed to determine VS satisfaction of FEC 2002 standards.
Often the ITA would test a device but not the voting system as a whole, despite the guidelines’ requirements for system testing to determine whether the various components worked accurately and reliably in concert.
Documentation was uniformly seriously deficient in alerting officials to security vulnerabilities and the management and training strategies so that election officials could protect the voting systems and accuracy of results.
The VS vendors varied significantly in the adequacy of the documentation they provided to local election officials. Some documentation was clear and well-written for support; other manuals were vague, contradictory and confusing.
Poor quality in a vendor’s documentation for election officials can lead to a series of expensive technical services contracts with the vendors, so that a jurisdiction can run the systems.
Although some voting systems could be used by some voters with certain disabilities, each of the tested systems has accessibility design limitations that will not allow independent voting by voters with other disabilities.
Support stands for all the voting systems impeded physical access by most voters in wheelchairs.
The paper trail printouts of the tested systems cannot be directly read and verified by blind voters, and were also found to be difficult or impossible to read and verify for many other voters with disabilities.
New concerns have arisen over the VS regulatory system, for it did not weed out these seriously flawed systems. Despite regulatory changes, these studies raise concerns about the new regulatory system and standards as well.
[End of summary extraction]
Dr. Hoke continues in her email, “We will be summarizing other independent voting systems studies, including those convened by the Secretaries of State in Florida, Connecticut, and Ohio, and by the Kentucky Attorney General, to facilitate these findings also becoming easily accessible to policy makers, election officials, media, and the public.
“For those concerned with the Diebold/Premier GEMS problems, here’s a NY Times article on the Cuyahoga audit and a GEMS capacity problem, plus a relatively short scholarly article on Diebold GEMS software deficiencies, peer-reviewed and presented this summer at a computer science voting systems conference.”
GEMS Tabulation Database Design Issues in Relation to Voting Systems Certification Standards, Thomas P. Ryan and Candice Hoke, Cleveland State University
Abstract: This paper analyzes the Diebold Election Systems, Inc. election management software (GEMS) using publicly accessible postings of GEMS election databases.
It finds that the GEMS architecture fails to conform to fundamental database design principles and software industry standards for ensuring accurate data. Thus, in election tabulations, aspects of the GEMS design can lead to, or fail to protect against, erroneous reporting of election results. Further, GEMS’ dependence on Microsoft’s JET technology introduces additional risks to data accuracy and security.
Despite these technical and systemic deficiencies, GEMS received approval as complying with Federal Voting System 2002 standards. Questions then arise concerning the adequacy of the 2002 and 2005 regulatory standards.
The paper concludes that the standards structurally encourage and reward election system vendors for using less exacting database design standards.
FLORIDA: Software Review and Security Analysis of the Diebold Voting Machine Software, Security and Assurance in Information Technology (SAIT) Laboratory Florida State University, July 2007.
The two primary systems analyzed consist of the Diebold Optical Scan, firmware version 1.96.8, and Touch Screen, firmware version 4.6.5. We also examined the Diebold Touch Screen bootloader version 1.3.6 as well as GEMS server software version 1.18.25.
We considered flaws in previous versions of the software for all parts of the system, including those found in the AccuBasic interpreters.
Our analysis focuses on two attacker categories… voters and poll workers. Attacks by elections officials and voting system vendors are largely outside the scope of this review. We did not conduct penetration or red team testing for these systems.
Our analysis examined only those flaws previously reported in the cited literature.
Flaws in the Optical Scan software enable an unofficial memory card to be inserted into an active terminal. Such a card can be preprogrammed to swap the electronically tabulated votes for two candidates, reroute all of a candidate’s votes to a different candidate, or tabulate votes for several candidates of choice toward a different candidate.
Data on optical scan memory cards is neither encrypted nor authenticated, leading to many potential attacks that could manipulate vote counts on a memory card prior to or during the voting day.
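To illustrate the risk described above for lay readers: a record that is neither encrypted nor authenticated can be rewritten without any trace, whereas a keyed message authentication code (MAC), with the key held off the card, would at least expose the change. The record layout and key below are hypothetical, not Diebold's actual card format.

```python
import hashlib
import hmac

# Sketch of the unauthenticated-memory-card risk. The vote record
# and key are hypothetical illustrations, not the real card format.

record   = b"candidate_A=500;candidate_B=300"
tampered = b"candidate_A=300;candidate_B=500"  # totals swapped

key = b"per-election secret"  # hypothetical; must not be stored on the card
tag = hmac.new(key, record, hashlib.sha256).digest()

def verify(data, tag, key):
    """True only if `data` matches the tag computed with `key`."""
    expected = hmac.new(key, data, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

assert verify(record, tag, key)        # genuine record passes
assert not verify(tampered, tag, key)  # the swap is detected
```

Without any such check, as the report notes, nothing distinguishes the genuine record from the tampered one.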
Unsupervised access allows an attacker to place the Optical Scan terminal into diagnostics mode and obtain all or most of the data on the memory card, or to reset the machine clock.
The hand-coded RSA signature verification is insecure and can be forged. This applies to both the optical scan and touch screen systems. With technical knowledge and unsupervised access, an attacker can copy or dump the memory card contents by connecting a laptop or modem to the optical scanner.
The system uses the same cryptographic key for multiple purposes and is tied to publicly-known machine serial numbers. Its value is never changed after being created. The security key cards are insecurely protected, the same as all other smart cards, which allows anyone to read all data from them.
The public key is hard-coded into the source code. Such key-reuse is discouraged by the cryptographic community since such reuse introduces vulnerability. Supervisor PIN is not cryptographically protected.
System configuration information is unprotected. The “protected” counter is stored in a mutable file, and the ballot definition file is unprotected. Since stored votes are only associated with a candidate number and not a name, the ability to create custom ballot definition files allows one to alter or switch candidate names without any record in the vote counts or electronically stored ballots.
In the Touch Screen software, flaws allow an adversary to prepare official, activated voter smart cards that would enable voters to cast multiple ballots in a ballot-stuffing attack. Once an adversary obtained the necessary information, smart cards could be created and used in any precinct throughout a county. Even if detected, this attack is not correctable: the malicious ballots, either in electronic or paper form, are essentially unidentifiable and thus cannot be removed.
Memory card update file is unprotected. The file assure.ini remains unencrypted and unauthenticated and is subject to malicious manipulation. Removal of a memory card allows an attacker to create valid voter cards.
If the authentication key necessary to validate voter cards is the same across precincts, as we understand to be common practice in Florida, these cards could easily be modified to be used at any other precinct within a county.
Data and smart card passwords can now be set by election workers. The authentication protocol is not secure, allowing an attacker to create counterfeit, validating smart cards, including voter cards.
There is no integrity protection of stored electronic ballots and ballots are stored sequentially. This defeats voter privacy by allowing a voter’s selections to be tied to a voter’s name.
Audit logs are not cryptographically protected and data transmitted over communication lines is neither authenticated nor encrypted.
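A standard remedy for unprotected audit logs, absent here, is a hash chain: each entry's hash folds in the previous hash, so altering any entry changes every hash after it and truncation is equally visible. A minimal sketch (the log entries are hypothetical):

```python
import hashlib

# Sketch of a hash-chained (tamper-evident) audit log -- the kind of
# cryptographic protection the report says these logs lack. Altering
# any entry changes every hash from that point onward.

def chain_log(entries):
    """Return the running hash chain for a list of log entries."""
    h = hashlib.sha256(b"genesis").hexdigest()
    chain = []
    for entry in entries:
        h = hashlib.sha256((h + entry).encode()).hexdigest()
        chain.append(h)
    return chain

honest  = chain_log(["open_polls", "cast_ballot", "close_polls"])
altered = chain_log(["open_polls", "cast_ballot_twice", "close_polls"])

assert honest[0] == altered[0]   # entries before the edit match
assert honest[1] != altered[1]   # the edit is visible here...
assert honest[2] != altered[2]   # ...and in every later hash
```

With plain, unchained logs, an insider can rewrite history with no comparable evidence left behind.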
A custom, malicious bootloader is possible if the terminal is delivered to a polling place in “debug mode.” If not in debug mode, an attacker can open the case and move a hardware switch to enable this attack. An attacker can hide preloaded votes on a forged memory card that the terminal will recognize.
FLORIDA: Software Review and Security Analysis of the Diebold Voting Machine Software Supplemental Report, Security and Assurance in Information Technology (SAIT) Laboratory Florida State University, August 2007.
This report reflects the narrow investigative scope requested by FLDoS (Florida Department of State). These results are not comprehensive in any sense, nor is this report an endorsement of the system’s overall security. We examined only a small subset of the flaws from the SAIT Diebold Report.
All other flaws identified in that report remain in the code base, including vulnerability to a sleepover attack that may allow an intruder to manipulate vote computation or worse.
Significant, critical vulnerability remains in this code base independent of repairs documented in this report.
Until voting systems are developed for “high assurance”, election officials face an unnecessarily high risk and must exercise significantly expanded election security procedures to mitigate known and unknown software vulnerability.
The signature flaw was fixed. This makes it much more difficult for preloaded votes to be hidden.
(Note: Other flaws reported to have been fixed were not detailed above. ~ RA)
KENTUCKY 2007 Voting Expert Letter to KY Attorney General, public version posted at Review of Diebold/Premier, Hart InterCivic, and ES&S.
The review relies on the completeness and accuracy of the testing by the Independent Testing Authorities (ITA) for conformance to voluntary Federal guidelines (Voting Systems Standards 2002). However, it has been well established that the ITAs do not adequately perform this role.
The ITA reports used for Federal certification and included in the review packages used by the SBE certifiers are cursory…. (as) reinforced by the fact that none of the ITAs identified the flaws found by the California or Florida source code review teams.
Because the ITA reports are of limited value, the quality examination of the machines as part of the certification processes is crucial, but it too can best be described as cursory.
The security of all of the machines appears to be extremely dependent on their never coming in contact with malicious code, as once that occurs there are few defenses or recovery mechanisms. This is sometimes referred to as the “M&M model of security”: there is a hard crunchy exterior that protects a soft chewy interior.
Short-term recommendations include developing written rules and procedures avoiding network connectivity and using “sniffers” to detect same, changing and properly storing all encryption keys and passwords, checking that physical seals are unbroken, and checking that the version of hardware and software being used is that which was certified.
Some long-term recommendations include a more thorough certification process, additional security measures, avoiding use of continuous tape so that voter privacy is better protected, and review of software source code for all machines used in Kentucky.
NETHERLANDS Review of Nedap Touch Screen system (marketed as Liberty DRE in the U.S.), October 2006 by independent computer experts without the consent of the manufacturer.
90% of the votes in The Netherlands are cast on the Nedap/Groenendaal ES3B voting computer. With very minor modifications, the same computer is also being used in parts of Germany and France.
The Nedap ES3B electronic voting computer is a touch screen system that only records votes in memory. The system requires ultimate trust, since it produces an election outcome that cannot be independently verified.
Anyone with brief access to the device at any time before an election can gain complete and virtually undetectable control over election results.
Radio emanations from an unmodified Nedap can be received at several meters distance and be used to tell who votes what.
The overall security design relies almost solely on the near-universally deprecated concept of ‘security by obscurity.’ Since the problems we found stem from the very design, we see no quick fixes that could make this device sufficiently secure.
We conclude that the Nedap ES3B is unsuitable for use in elections, that the Dutch regulatory framework surrounding electronic voting insufficiently addresses security, and we pose that not enough thought has been given to the trust relationships and verifiability issues inherent in DRE class voting systems.
Given the fact that technical specifications and source code to most electronic voting systems are not publicly available, we see grave danger to our democracy by the use of secret voting technology.
The password is stored in the code and quickly found, allowing attacks that read and modify election results.
Software code could be inserted, and in response to Nedap’s challenge, this team programmed the machine to play chess. (Emphasis added. ~RA)
Software could be manipulated to steal a certain percentage of votes, for a given party. In this way, elections could be predetermined without knowing candidate names.
Parallel testing is ineffective, and only tests for outside threats – not insider attacks. The Brennan Center (2006) reached the same conclusion:
“Even under the best of circumstances, Parallel Testing is an imperfect security measure. The testing creates an ‘arms race’ between the testers and the attacker, but the race is one in which the testers can never be certain that they have prevailed.”
In the case of voting systems, the only meaningful security against insider attacks is to have a voting mechanism of which all the details are published and that a substantial portion of the general public is capable of comprehending in-depth.
By adding extra security measures against the over-emphasized threat posed by outsiders, one can actually increase the risk posed by insiders.
For example, today’s mobile phones often combine a processor, execution memory and tamper-resistant key storage to make sure only the manufacturer (who has the cryptographic signing keys) can update the software. These mechanisms can sometimes still be circumvented, but at least they offer a layer of security that is completely absent in the Nedap ES3B. But by adding ‘security’ in this way, the device could also resist any attempts by independent inspectors to see what code it is actually running.
UCONN University of Connecticut Security Assessment of Diebold Optical Scan system, 2006 Abstract:
We identify a number of new vulnerabilities of this system which, if exploited maliciously, can invalidate the results of an election process utilizing the terminal.
An Accu-Vote Optical Scan can be compromised with off-the-shelf equipment in a matter of minutes even if the machine has its removable memory card sealed in place. The basic attack can be applied to effect a variety of results, including entirely neutralizing one candidate so that their votes are not counted, swapping the votes of two candidates, or biasing the results by shifting some votes from one candidate to another.
Such vote tabulation corruptions can lie dormant until Election Day, thus avoiding detection through pre-election tests.
UCONN University of Connecticut Security Assessment of Diebold Touch Screen (TSx) system, 2007.
The attacks presented in this report were discovered through direct experimentation with the voting terminal and without access to any internal documentation or the source code from the manufacturer.
We present two attacks based on these vulnerabilities: one attack swaps the votes of two candidates and another erases the name of one candidate from the slate.
These attacks do not require the modification of the operating system of the voting terminal, and can be launched in a matter of minutes, requiring only a computer with the capability to mount a PCMCIA card file system (a default capability in current operating systems).
Security problems are present in the system despite the fact that a cryptographic integrity check appears to be employed in the voting system’s memory card.