Voting Systems Standards and Certifications

Surveying the effort to create a new implementation-independent voting system standard.

In the U.S., voting systems in each state are purchased by the designated Election Authority, which may be a city election commission, a county, or the state itself. However, voting system vendors cannot merely present voting systems to the election authority for sale: these systems must first be certified by the state before they may be sold, as has been the case for decades. As early as the 1970s, certification requirements varied drastically from state to state. Some states required a demonstration of the proposed equipment before the State Election Board, while others required only an application for certification. Still others required a full set of hardware qualification tests to be performed by an Independent Testing Authority (ITA), including many of the hardware reliability shake-and-bake tests that are performed today.

As the variety of devices and sophistication of voting equipment increased and microprocessors were incorporated into the hardware platforms, it became increasingly clear that uniform standards were required for the evaluation and qualification of voting equipment. The first national standard in this area was published in 1990 by the Office of Elections Administration (OEA) of the U.S. Federal Election Commission (FEC), an office that has since been incorporated into the U.S. Election Assistance Commission (EAC). Although voluntary in nature, the 1990 FEC standard introduced the first set of national requirements for voting equipment. Several years after its introduction, the National Association of State Election Directors (NASED) established a uniform system for testing to the FEC standard. NASED certified and supervised a set of ITAs, which provided a central evaluation of voting systems that states could use as a baseline prerequisite for meeting their certification requirements. Currently, over 40 states require vendors to have their equipment NASED-certified through the ITA testing process before they can market it in the state.

While laudable, the 1990 FEC standard was limited in many ways and, over time, became somewhat technically dated. It represented the first step in bringing consistency to what had previously been an industry regulated independently by each state. The FEC went a long way but did not accomplish everything in this first step, and in 1997 it began a revision of the 1990 standard. This revision received final approval in May 2002 and is the current standard used by the NASED-certified ITAs to qualify equipment. The 2002 FEC standard makes tremendous progress in the development of voting standards: human factors, disability access, and security are all treated in greater detail or, in some instances, introduced for the first time.

In 2001, the IEEE Standards Association inaugurated Project 1583. P1583 is the first voting equipment standard to be developed in an open, consensus process under ANSI rules. Participation in P1583 is open to all interest groups and has drawn individuals representing a wide variety of interests. For a standard to be adopted by the IEEE, it must first be accepted by a formal vote of the active membership of the working group. It must then successfully pass through a series of reviews and ballots designed to ensure the standard is technically correct and reflects the consensus of a balanced representation of all interests affected by the standard. Once published, IEEE standards are technical recommendations and have no enforcement power on their own. However, these standards are routinely adopted by governmental authorities and often become mandatory through a regulatory process. For IEEE P1583 to become mandatory would require its adoption for use in the certification process.


Requirements for certification still vary from state to state. Some states, such as Florida, have their own Voting System Standard. Even though the FEC Voting System Standards are a "voluntary" recommendation of a federal agency, once they are adopted by a state they become required for certification in that state. Because so many states now require that voting systems be tested to them, at least as a prerequisite for state certification if not as the full requirement, the FEC standards are becoming a de facto required standard.

Under the 1990 standard, each subsystem of an end-to-end voting system was tested and independently certified. The certification identified the model and version/revision of each component. The 2002 standard, however, requires that an entire voting system be tested and certified as an entity. This means any piece of a voting system must be ITA-tested with all associated components of the voting system, with the version of each component recorded. This is true whether the entire system is being certified or just an upgrade to a single subsystem. Tabulator and/or direct-recording electronic (DRE) election definitions must be created with the front-end ballot layout software under the control of the Software/System ITA and provided to the ITA that performs all of the hardware and functional testing of the tabulators/DREs. The Tabulator/DRE ITA then provides the Software/System ITA the results of all election functional testing to be uploaded into the system's accumulation and reporting software. The creation of election definitions and the uploading of results are just some of the functional and reliability testing steps required as part of certification testing. Once this process is successfully completed, a certification number is assigned to the system by NASED.
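To make the version-tracking implication concrete, the following minimal sketch models a certified configuration as the complete list of components with their exact models and versions; the component and product names are hypothetical, invented only for illustration and not drawn from any real system or ITA toolchain. The point is that certification attaches to the whole versioned configuration, so changing any one subsystem yields a new configuration that must be tested as a whole.

```python
# Illustrative sketch only: component and product names below are invented
# for the example and do not refer to real voting products or ITA tools.
from dataclasses import dataclass


@dataclass(frozen=True)
class Component:
    name: str     # e.g., "DRE terminal", "ballot layout software"
    model: str
    version: str  # the exact version/revision recorded in the certification


@dataclass(frozen=True)
class SystemConfiguration:
    components: tuple  # the complete set of components certified together

    def upgrade(self, name: str, model: str, version: str) -> "SystemConfiguration":
        # Replacing even one subsystem produces a new configuration; under the
        # 2002 standard the whole configuration, not the changed piece alone,
        # is what the ITAs test and NASED certifies.
        return SystemConfiguration(tuple(
            Component(name, model, version) if c.name == name else c
            for c in self.components
        ))


baseline = SystemConfiguration((
    Component("ballot layout software", "LayoutPro", "3.1"),
    Component("DRE terminal", "TouchVote", "2.0"),
    Component("accumulation and reporting software", "TallyCentral", "1.4"),
))

# A firmware upgrade to one component still yields a new end-to-end configuration.
upgraded = baseline.upgrade("DRE terminal", "TouchVote", "2.1")
```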

With the passage of the Help America Vote Act (HAVA), this responsibility has changed. The EAC has been created and is now responsible for the entire process. The National Institute of Standards and Technology (NIST), under the auspices of the EAC, is responsible for certifying testing laboratories and adopting standards for use by these laboratories. Until this process is fully implemented, the NASED Technical Committee will continue to supervise the ITA testing laboratories, approve the reports, and issue the NASED equipment certification numbers. This testing will continue to be done to the FEC 2002 Voting System Standards until a new standard is adopted by the EAC. (See the article "Independent Testing of Voting Systems" in this section for a more detailed discussion of the ITA testing laboratories and procedures.)

The FEC 2002 Voting System Standard (VSS) is a single document composed of two volumes, with sections covering all aspects of the voting system, including security, usability and accessibility, software/firmware source code, functional and audit requirements, environmental and reliability requirements for hardware, and the accuracy and integrity of all components. The VSS also covers all equipment and software used in the end-to-end election process, including ballot layout, programming of the tabulators and DREs, the tabulators and voting equipment themselves, and the accumulation and reporting of results. Volume I of the VSS encompasses the performance standards for all categories; Volume II defines the testing requirements and methods used to ensure that the election system meets the standards defined in Volume I.
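One way to picture the relationship between the two volumes is as a traceability mapping in which every Volume I requirement category must be covered by at least one Volume II test method. The sketch below is a rough illustration under that reading; the category labels paraphrase the topics above and the test names are invented, not quoted from the VSS.

```python
# Rough illustration only: category labels paraphrase Volume I topics and the
# test names are invented; neither is quoted from the VSS.
VOLUME_I_CATEGORIES = [
    "security",
    "usability and accessibility",
    "software/firmware source code",
    "functional and audit requirements",
    "hardware environment and reliability",
    "accuracy and integrity",
]

VOLUME_II_COVERAGE = {
    "security": ["access-control functional tests"],
    "hardware environment and reliability": ["temperature/humidity cycling",
                                             "vibration (shake-and-bake)"],
    # ...remaining mappings omitted in this sketch
}

uncovered = [c for c in VOLUME_I_CATEGORIES if c not in VOLUME_II_COVERAGE]
print("Volume I categories without a mapped Volume II test:", uncovered)
```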

Standards are living documents, constantly undergoing review and periodically updated to address new technology, refine test methods, and clarify requirements. However, until a new standard is approved by the EAC, the FEC 2002 standard will continue to be the basis used by the ITAs for the evaluation and certification of voting systems. This provides an opportunity for the IEEE P1583 Voting System Standard effort to be a major contribution to the ITA testing and certification process. Although the P1583 scope is limited to the equipment used by voters in polling places and in-person absentee sites, a standard developed under the IEEE open process would provide an immediate vehicle that could be adopted by the EAC for use by the ITA that performs tabulator/DRE testing and certification.

The IEEE has been involved in the standards development and certification process for several years. Stephen Berger, the chair of IEEE Standards Coordinating Committee 38 (SCC38), is the appointed, ex officio IEEE representative to the NASED technical committee. SCC38 is also the sponsor of the P1583 working group. When development of the IEEE P1583 standard was initiated in 2001, the committee was divided into Task Groups (TGs) corresponding to the major technical sections to be addressed in the standard: Security and Confidentiality, Reliability and Accuracy, Usability and Accessibility, Environmental, Electromagnetic Compatibility, and Software. Each TG was to address both the recommended specifications for its topic and the corresponding testing. A subject matter expert was appointed to lead each group. Since its inception, the working group has met approximately every three months, both to discuss issues as a committee of the whole and to hold TG meetings.

The core of the P1583 standard comprises major sections defining performance requirements and testing requirements. Each of these major sections is divided into corresponding subsections that relate to the TG topics. Each TG created a section document for its respective subject matter, for the most part using the FEC 2002 material as a starting point. These documents were circulated to the entire TG membership for formal comment and vote. All section comments were resolved, and once a section document was approved it was included in the master document. In late August 2003, the first version of the full standard was distributed concurrently for formal voting and commenting to the full P1583 membership and to the sponsor ballot group, those IEEE-SA members who had previously signed up for balloting. This first balloting failed at both the committee and sponsor levels, and over 1,000 specific comments were received on the document. Many of these were related to such controversial topics as the Voter Verified Audit Trail (VVAT) and the handling of commercial off-the-shelf (COTS) hardware and software.

Due to the importance of the new VVAT and COTS issues, the new HAVA-defined requirement for provisional balloting, and the fact that these issues cross multiple TG sections, three Special Task Groups (STGs) have been created to address their impact and ensure they are accommodated. Two other STGs have been created to handle considerations not encompassed by the TGs: the Technical Data Package (TDP) requirements, and a cross-reference between the FEC 2002 and P1583 standards as well as between the HAVA requirements and P1583. All comments are in the process of being resolved by the five TGs and the five STGs in accordance with their areas of responsibility. Once comments are resolved and the TG members' approved changes are incorporated into the document sections, a new draft will be compiled and circulated to the P1583 committee membership.

At the time this article was written in July 2004, the P1583 working group intended to have the new draft available for consideration by late summer. Once it passes committee approval, it will be submitted for sponsor ballot and ultimately balloting by the IEEE SA. The goal of the committee is that an IEEE standard be issued and adopted by the EAC in time to have an impact on voting system testing and certification.
