South Carolina voting machines miscounted hundreds of ballots, report finds

Counting errors that were reported as official results stem from bad software design, a computer science professor tells StateScoop.

An analysis of South Carolina’s voting equipment found that state election officials miscounted hundreds of ballots during the primary and general elections in 2018 because of “continued software deficiencies.”

Conducted on behalf of the League of Women Voters by Duncan Buell, a computer science professor at the University of South Carolina, the study published last week found that in one primary race, voting machines in one precinct double-counted 148 votes. During the general election in another precinct, more than 400 votes were awarded in the wrong county board race.

In both instances, Buell found, the improperly counted votes were logged by the South Carolina State Election Commission as official results. Neither case involved enough votes to swing the outcome of an election, but Buell told StateScoop the incidents demonstrate the state continues to use poorly designed software that poll workers, many of whom are volunteers working long shifts, struggle to operate correctly.

“People are tired and they don’t check the 95 things you need to check,” Buell said. “Poll workers are supposed to arrive at 6 or 6:30 a.m., and polls close at 7 p.m. If you’re going to have software used by volunteers, you’ve got to bulletproof the hell out of that software.”


But Buell said his analysis of the equipment revealed few safeguards against such errors.

South Carolina conducts its elections using the iVotronic model of voting computer sold by Election Systems & Software, the United States’ largest manufacturer of balloting equipment. The state purchased its inventory of machines between 2004 and 2006, making even its newest computers more than a decade old. Further complicating matters, the iVotronic platform does not produce paper records of votes cast on it, making post-election audits difficult.

Anatomy of an error

Buell discovered the 148 double-counted ballots in a precinct in rural Marlboro County in the results of a June 2018 primary race. Of that precinct’s five iVotronic machines, four performed as designed, but poll workers noticed a fifth machine was malfunctioning after it had been used by five voters. Their votes were not altered, but the machine’s malfunction introduced a complication to the vote tallying process.

When it came time for the precinct to tally its votes, the four working machines had counted 148 total ballots, according to the handheld devices poll workers normally use to open iVotronic computers and collect their results. But the handful of votes from the faulty machine was counted by physically removing its memory card and plugging the card into a chip reader — a process that was then repeated for the four machines that had already been counted, giving the precinct an erroneous total of 301 votes, which was then reported up to state officials as the actual count.
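The double count described above can be sketched in a few lines of Python. This is a hypothetical illustration, not actual iVotronic code or real precinct data: the machine IDs and per-machine vote counts are invented, chosen only so the totals match the figures in the report (148 votes on the four working machines, 5 on the faulty one).

```python
# Results collected once via the handheld device...
handheld_collection = {"m1": 40, "m2": 38, "m3": 35, "m4": 35}       # 148 votes
# ...and again via the memory-card reader, plus the faulty fifth machine.
card_collection = {"m1": 40, "m2": 38, "m3": 35, "m4": 35, "m5": 5}

# Naive tally: every batch is summed, so the four working machines count twice.
naive_total = sum(handheld_collection.values()) + sum(card_collection.values())
print(naive_total)  # 301 -- the erroneous figure reported as official

# A tally keyed by machine ID deduplicates, so re-reading a card is harmless.
deduped = {**handheld_collection, **card_collection}
correct_total = sum(deduped.values())
print(correct_total)  # 153
```

The point of the sketch is that a tallying system which tracks *which machine* each batch came from can safely ingest the same results twice, while one that blindly adds batches cannot.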


“I am not aware the election commission knows,” Buell said. “The state’s programs have 301 voters in that precinct and not 153.”

The wrongly awarded votes in the November general election, which Buell found in Bamberg County, about 70 miles south of the state capital, Columbia, occurred because of a limitation in how the iVotronic equipment tallies votes, he said. Rather than assigning an identifying code to each race or candidate name, each machine produces a spreadsheet, and the corresponding cells of each machine’s spreadsheet are then added together for a total. The mistake occurred because residents of that Bamberg precinct voted in two county council races last year, but the iVotronic spreadsheet only contained space for a single contest. As a result, 420 votes that should’ve been counted for one race were added to the other.

“It’s totally stupid to add things together without a key,” Buell said.
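The merge flaw Buell describes can be illustrated with a short Python sketch. Everything here is hypothetical — the race names, cell layouts, and vote counts are invented for illustration, with 420 chosen to echo the misassigned total in the report — but it shows why adding spreadsheet cells by position, rather than by a race key, sends votes to the wrong contest.

```python
from collections import Counter

# Machine A's layout has two county council races, one per cell.
machine_a = [("Council Dist 1", 210), ("Council Dist 2", 190)]
# Machine B's layout has room for only one council contest, so its
# 420 Dist 2 votes sit in cell 0.
machine_b = [("Council Dist 2", 420)]

# Positional merge: add corresponding cells, ignoring what each cell holds.
positional = [machine_a[i][1] + (machine_b[i][1] if i < len(machine_b) else 0)
              for i in range(len(machine_a))]
print(positional)  # [630, 190] -- 420 Dist 2 votes credited to Dist 1

# Keyed merge: aggregate by race name instead of cell position.
keyed = Counter()
for race, votes in machine_a + machine_b:
    keyed[race] += votes
print(dict(keyed))  # {'Council Dist 1': 210, 'Council Dist 2': 610}
```

With a key attached to every total, mismatched layouts are harmless; without one, the correctness of the sum depends entirely on every machine’s spreadsheet having an identical shape.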

‘How does one write software that makes that happen?’

These kinds of bugs could’ve been avoided, Buell said. Similar errors had been detected in his studies of previous elections going back to 2010. ES&S issued a software update before the June primary, but the misaligned spreadsheets still occurred in November.


“This is version 2 and you would hope that fixes bugs you saw in version 1, but we still see a couple of things that just shouldn’t happen,” he said.

Other errors in iVotronic machines abounded statewide, Buell said, especially in how the devices logged events — from votes being cast to system errors — using indecipherable numerical codes. Machines also started listing events out of chronological order.

“How does one write software that makes that happen?” Buell said. “Why is the thing suddenly going from the November election day to the June primary day and back? That’s really curious.”

Buell said he suspects the errors he found were the products of shoddy programming, not foul play. Still, the report again raises questions about the trustworthiness of South Carolina’s voting equipment. The state is one of five in the country to exclusively use direct-recording electronic machines, or DREs — as paper-free touchscreen ballot devices are known — at a time when many states are trending back toward paper-based voting.

A group of South Carolina voters filed a federal lawsuit last summer seeking to force the state to adopt paper-based machines, and a hearing is scheduled for later this month. But Buell, who is an expert witness in the case, said he’s optimistic the state legislature will move quickly on replacing its voting devices with hand-marked paper ballots that can be read with optical scanners.


“We’ve learned there’s such a thing as too much technology,” he said. “And the solution is to go back to the technology that works best. If we go back to hand-marked paper, that’ll take a lot of the complexity out.”

Chris Wlaschin, ES&S’ vice president for security, told StateScoop the company is reviewing Buell’s report. Last August, Wlaschin told CyberScoop that ES&S has vetted researchers test its equipment to “alert us to vulnerabilities so that we can patch them and get certified if we need to.”

But Buell said he is not convinced ES&S’s software is up to snuff, pointing to the errors his report calls out.

“It calls into question the process, and some of it really is a software quality control issue,” he said. “I would not let my students get away with these kinds of errors.”

Written by Benjamin Freed

Benjamin Freed was the managing editor of StateScoop and EdScoop, covering cybersecurity issues affecting state and local governments across the country. He wrote extensively about ransomware, election security and the federal government’s role in assisting states and cities with information security.