
Independent Testing of Voting Systems


Independent qualification testing is a prerequisite in over 40 U.S. states for seeking state certification of a direct-recording electronic (DRE) or paper-based voting system. While individual state laws vary, this is generally interpreted as testing by National Association of State Election Directors (NASED)-accredited hardware and software Independent Testing Authorities (ITAs) against the voluntary FEC Voting System Standards (VSS). Election directors require a NASED Qualification Number and Report before accepting systems for state certification testing.

In 1975, the National Bureau of Standards issued the report Effective Use of Computing Technology in Vote-Tallying, which determined that a basic problem in computerized elections was the lack of technical skills at the state and local level for developing standards and performing testing. This report and other studies conducted in the 1980s were the impetus for the ultimate development of federal minimum voluntary standards.

The 1990 VSS (Performance and Test Standards for Punchcard, Marksense, and Direct Recording Electronic Voting Systems) addressed risks to security, privacy, and accuracy, as well as problems plaguing jurisdictions, including: orphaned systems left behind as vendors disappeared or went out of business; systems that didn’t perform as expected, carried no guarantee of minimum functionality, or lacked adequate instructions; and systems that broke after a few elections (for example, systems stored in Phoenix, AZ, and Hilo, HI, warehouses were adversely affected by environmental conditions).

By 2000, advances in voting technology, legislative changes, and the Americans with Disabilities Act warranted updating the 1990 VSS. Technical consultants, election officials, and testing lab representatives volunteered their time to assist the FEC in producing the 2002 VSS. In defining the scope of changes, the authors noted, “Voting systems marketed today are dramatically improved. Election officials are better assured that the voting systems they procure will work accurately and reliably. Voting system failures are declining, and now primarily involve pre-Standard equipment, untested equipment configurations, or the mismanagement of tested equipment. Overall, systems integrity and the election processes have improved markedly.” The VSS update focused on these areas of change, along with strengthening requirements for code review and vendor documentation.

Back to Top

Accreditation of ITAs

While the 1990 VSS called for independent testing, it did not explicitly define ITAs. In 1992, NASED volunteered to create and administer a program to accredit ITAs for qualification testing of voting systems. Participation by laboratories was solicited, and auditors volunteered their time to review the labs’ quality systems and test methods, verifying conformance to the NASED Program Handbook during onsite visits. The accredited labs include:

  • Hardware: Wyle Laboratories, Huntsville, AL, accredited in 1994; www.wylelabs.com.
  • Software: CIBER, Huntsville, AL, accredited in 1996 as Nichols Research, which was ultimately acquired by CIBER; www.ciber.com.
  • Full (hardware and software): SysTest Labs, Denver, CO, software accredited in 2001, hardware in 2004; www.systest.com.

Back to Top

Voting System Definition

A voting system isn’t just the equipment necessary to cast a vote. The VSS has two definitions, addressing the physical and functional components of a voting system. The physical aspect defines a voting system as comprising all the hardware and software, plus the procedures, manuals, and specifications that describe the physical and functional characteristics. The functional aspect states that a voting system has three functions: pre-vote, voting, and post-vote. The Election Management System (EMS) includes the pre-vote and post-vote functions, which occur outside the polls, including ballot preparation and central count. The Polling Place System includes all three functions occurring in the polling place.
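To make the two-part definition concrete, the functional split can be pictured as a small data model. The Python sketch below is purely illustrative; apart from the three phase names, the identifiers are invented for this sketch rather than drawn from the VSS.

```python
from dataclasses import dataclass

# The three functional phases the VSS names: pre-vote, voting, post-vote.
PRE_VOTE, VOTING, POST_VOTE = "pre-vote", "voting", "post-vote"

@dataclass(frozen=True)
class Subsystem:
    name: str
    phases: frozenset

# The EMS covers the pre-vote and post-vote functions performed
# outside the polls (ballot preparation, central count).
EMS = Subsystem("Election Management System", frozenset({PRE_VOTE, POST_VOTE}))

# The Polling Place System covers all three phases as they occur
# in the polling place.
POLLING_PLACE = Subsystem("Polling Place System",
                          frozenset({PRE_VOTE, VOTING, POST_VOTE}))
```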


Back to Top

ITA Scope

The terms hardware and software are misleading in describing the scope of the ITAs, because software is addressed by both ITAs. The hardware ITA performs the physical and functional testing of the Polling Place System, including:

  • Review of the polling place technical documentation, testing, and source code.
  • Environmental testing of the polling place hardware including humidity, temperature, power variation, vibration, and various electrical and magnetic tests.
  • Functional and accessibility testing of the polling place software, hardware, and user manuals.
  • Operating tests for Polling Place System data accuracy, maintainability, reliability, and availability.

The software ITA performs the physical and functional testing of the EMS and end-to-end testing of the integrated EMS and Polling Place System, including:

  • Review of the EMS technical documentation, source code, functional testing, and end-to-end testing of the full system.
  • Testing of the EMS software, hardware, and user manuals.
  • End-to-end testing of the integrated voting system.
  • Operating tests for Central Count System data accuracy, maintainability, reliability, and availability.

As part of the vendor’s cost of doing business, ITA testing is built into the price of voting systems and maintenance contracts. The vendor contracts separately with the hardware and software ITAs, and each lab establishes its own price and project structure. Once any deliverable is tendered to a lab, the vendor must complete qualification with that lab, with the cost of qualification depending upon the quality of the system submitted; if problems are found, regression tests and reviews increase the cost. Submission of a single voting system for all ITA testing ranges from $150,000 to $400,000 and can take two months to a year. Several vendors have single Ballot Preparation and Central Count Reporting programs that support multiple DREs and optical scanners; these may be submitted in a single ITA effort, which pushes the overall cost substantially higher.

Back to Top

Testing

ITA testing follows the principles of quality assurance: the ITAs must assure test consistency and repeatability across greatly differing systems. This is accomplished by having a defined, documented, standardized test process easily adapted to various voting systems, regardless of function and configuration. Within these principles, the ITA’s test objective is to verify that the system meets the VSS requirements and can be recommended for NASED qualification. ITAs may appreciate elegance in design, function, and execution, but an ugly voting system that meets VSS requirements merits recommendation. The VSS identifies two processes, as illustrated in the accompanying figure: the Physical Configuration Audit (PCA, shown in blue) and the Functional Configuration Audit (FCA, shown in green).

PCA Technical Data Package (TDP) Review: The TDP is reviewed to confirm that the required documentation is present, conforms in content and format, and is sufficient to install, validate, operate, and maintain the voting system, and to establish the system hardware baseline associated with the software baseline. Results of the review are provided to the vendor in a Pre-Qualification Report.

PCA Source Code Review: The voting system source code is reviewed for:

  • Maintainability: including naming, coding, and comment conventions; adherence to coding standards; and clear commenting.
  • Control Constructs: determining that the logic flow uses the standard constructs of the development language, uses them consistently, keeps the logic structure from becoming overly complex, and makes acceptable use of error handlers. Where possible, automated tools are used (a minimal example follows this list).
  • Modularity: confirming that each module has a single testable function, a unique name, a single entry and exit point, error handling, and an acceptable size.
  • Security and Integrity of the Code: including controls to prevent deliberate or accidental attempts to replace code, such as unbounded arrays or strings (including buffers used to move data), pointer variables, and dynamic memory allocation and management; and other security risks, such as hard-coded passwords.
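As one hypothetical illustration of such tooling (not an actual ITA tool), a reviewer might script simple checks for hard-coded passwords and oversized modules. The limits and patterns below are invented for the sketch; real review criteria come from the VSS and each lab’s procedures.

```python
import re
import sys

# Hypothetical limits; actual criteria come from the VSS and lab procedures.
MAX_MODULE_LINES = 240
PASSWORD_PATTERN = re.compile(r"""(password|passwd|pwd)\s*=\s*["'][^"']+["']""",
                              re.IGNORECASE)

def review_file(path: str) -> list[str]:
    """Run two simple static checks on one source file."""
    findings = []
    with open(path, encoding="utf-8", errors="replace") as f:
        lines = f.readlines()

    # Flag likely hard-coded passwords.
    for lineno, line in enumerate(lines, start=1):
        if PASSWORD_PATTERN.search(line):
            findings.append(f"{path}:{lineno}: possible hard-coded password")

    # Flag modules that exceed an acceptable size.
    if len(lines) > MAX_MODULE_LINES:
        findings.append(f"{path}: {len(lines)} lines exceeds module size limit")

    return findings

if __name__ == "__main__":
    for finding in (f for p in sys.argv[1:] for f in review_file(p)):
        print(finding)
```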

PCA Test Environment: The hardware and software ITAs document the setup of the voting system configuration to assure a consistent test environment. The ITAs observe the building of the executable from reviewed source code and work together to confirm that all testing is performed only on ITA-reviewed code built under ITA observation.
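One way to picture this trusted-build step is a manifest of hashes recorded over the reviewed source tree and re-checked at build time. The sketch below is schematic, assuming SHA-256 hashes and a JSON manifest; it is not either lab’s actual procedure.

```python
import hashlib
import json
from pathlib import Path

def source_manifest(src_dir: str) -> dict:
    """Hash every source file so the reviewed tree can be re-verified later."""
    return {
        str(p.relative_to(src_dir)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(Path(src_dir).rglob("*"))
        if p.is_file()
    }

# At review time: record the manifest of the reviewed code, e.g.:
# json.dump(source_manifest("reviewed_src"), open("manifest.json", "w"), indent=2)

def verify(src_dir: str, manifest_path: str) -> bool:
    """At build time, under ITA observation: confirm the tree being
    compiled is byte-for-byte the tree that was reviewed."""
    with open(manifest_path) as f:
        recorded = json.load(f)
    return source_manifest(src_dir) == recorded
```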

FCA Test Documentation Review: The ITA reviews and assesses prior testing performed by the vendor. Based upon that assessment, the ITA identifies the scope, designs the testing, and creates the Qualification Test Plan.

FCA Testing: Each ITA tests to its particular identified scope, using its own internal processes:

  • Polling Place System Testing: The Hardware ITA initiates environmental operating and non-operating tests; functional testing of polling place hardware, software, and user manuals for all VSS-required and optional vendor-supported functionality; testing of the voting system’s capability to assist voters with disabilities or alternative-language needs; and accuracy and reliability testing.
  • EMS Testing: The Software ITA initiates functional testing of the Ballot Preparation and Central Count hardware, software, and user manuals for all VSS-required and optional vendor-supported functionality.
  • System Level Testing: The Software ITA initiates end-to-end testing of the integrated EMS and Polling Place System, including testing of the system capabilities and safeguards claimed by the vendor in its TDP (a miniature version of such a check follows this list).
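At its core, end-to-end testing compares a scripted deck of ballots with the tallies the integrated system reports. The miniature comparison below illustrates the shape of such a check; the ballot representation and function names are invented for illustration.

```python
from collections import Counter

def expected_tally(ballots):
    """Ground-truth tally from a scripted deck: each ballot maps
    contest -> chosen candidate."""
    tally = Counter()
    for ballot in ballots:
        for contest, choice in ballot.items():
            tally[(contest, choice)] += 1
    return tally

def end_to_end_mismatches(ballots, reported):
    """Compare the system's reported results against the scripted deck."""
    expected = expected_tally(ballots)
    keys = set(expected) | set(reported)
    return {k: (expected[k], reported.get(k, 0))
            for k in keys if expected[k] != reported.get(k, 0)}

# A two-ballot deck and a matching reported tally: no mismatches.
deck = [{"Mayor": "A"}, {"Mayor": "B"}]
assert end_to_end_mismatches(deck, {("Mayor", "A"): 1, ("Mayor", "B"): 1}) == {}
```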

Back to Top

Qualification of Voting Systems and Reporting

Each ITA produces a Qualification Report identifying the voting system tested, scope, approach, environment, and optional functionality supported by the system. Detailed results include documentation of the reviews (TDP and source code) and all testing performed, with description of the issues encountered and resolved. Risks that do not violate the VSS are disclosed in the conclusion along with the recommendation for NASED qualification.

The Qualification Report is released to the vendor and to the NASED Technical Committee for acceptance or rejection. If a report is rejected, NASED provides the reason, and it is the vendor’s option to resolve the objection. Both the hardware and software ITA reports must be accepted by NASED before a qualification number is issued.

Back to Top

State/Jurisdiction Verification

NASED advises states and jurisdictions to implement policies that validate that the executables, documentation, and reports they receive are the ITA-qualified versions. It suggests that states obtain the qualified versions directly from the ITA for all certification testing. Jurisdictions can likewise require that the executable be obtained from the ITA for installation. Where logistics make that impractical, a validation of the version and checksum can be incorporated into the election database installation process.
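Such a version-and-checksum validation can be as simple as hashing the installed executable and comparing the result against the value published for the ITA-qualified build. A minimal sketch, assuming a SHA-256 checksum is supplied with the qualification paperwork:

```python
import hashlib
import sys

def file_sha256(path: str) -> str:
    """Hash the installed executable in fixed-size chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    # Usage: python verify.py <installed_executable> <published_checksum>
    installed, published = sys.argv[1], sys.argv[2]
    if file_sha256(installed) == published.lower():
        print("checksum matches the qualified build")
    else:
        sys.exit("checksum MISMATCH: do not install")
```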

Back to Top

The Future

With the Help America Vote Act’s (HAVA) creation of the Election Assistance Commission (EAC) to perform regular audits, the EAC and the National Institute of Standards and Technology are studying the current NASED model for ITA accreditation and qualification. Whether this program will change or stay the same will be determined by the results of their study.

Back to Top


Figures

Figure. Overview of the ITA process.

Back to Top
