http://verifiedvoting.org/verifier/index.php
http://accurate-voting.org/
DEFCON WHAT?
http://www.sos.state.oh.us/SOS/elections/voterInformation/equipment/VotingSystemReviewFindings.aspx
http://www.sos.state.oh.us/SOS/elections/voterInformation/equipment/VotingSystemReviewFindings/EVERESTtestingReports.aspx
Ohio Secretary of State Hires Hackers To Test Voting Machines
[youtube=http://www.youtube.com/watch?v=GIBNYMWwLog]
THE FINDINGS (WORST NEWS FIRST):
MINIBAR KEY CAN OPEN ANY DIEBOLD MACHINE [VIDEO]
https://media.defcon.org/dc-16/video/Defcon16-Sandy_Mouse_Clark-Climbing_Everest.m4v
http://www.defcon.org/html/defcon-16/dc-16-speakers.html#Clark
UNIVERSAL PASSWORD WAS ‘DIEBOLD’
(THEY’VE CHANGED THEIR NAME TO ‘PREMIER’, SO…) [VIDEO]
http://www.wired.com/threatlevel/2009/03/diebold-admits/
http://www.wired.com/threatlevel/2009/03/video-ca-hearin/
Diebold Acknowledges Audit Log Flaws / March 20, 2009
“Earlier this week Premier Election Solutions (formerly Diebold Election Systems) admitted in a hearing that the audit logs on its tabulation software fail to record significant events that occur on the machines — such as when an error in the software deletes votes or when election officials intentionally delete ballots from the system. These, of course, are the most basic events that an audit log should record.”
ALSO HAS DELETE BUTTON
http://www.wired.com/threatlevel/2009/03/ca-report-finds/
According to the California Sec.of State’s report: “The Clear buttons … allow inadvertent or malicious destruction of critical audit trail records in all Gems version 1.18.19 jurisdictions, risking the accuracy and integrity of elections conducted using this voting system. Five years after the company recognized the need to remove the Clear buttons from the GEMS audit log screens, not only Humboldt, San Luis Obispo and Santa Barbara Counties in California but jurisdictions in other parts of the country, including several counties in Texas and Florida, continue to use Gems version 1.18.19….”
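The “Clear button” failure above is a design flaw, not just a UI mistake: an audit log that can be silently truncated offers no integrity guarantee at all. A minimal sketch (hypothetical, not GEMS code) of a hash-chained append-only log illustrates the property GEMS lacked: altering or deleting any interior record breaks the chain and is detectable on verification.

```python
import hashlib
import json

GENESIS = "0" * 64

def append_entry(log, event):
    """Append an event, chaining it to the digest of the previous entry."""
    prev = log[-1]["digest"] if log else GENESIS
    body = json.dumps({"event": event, "prev": prev}, sort_keys=True)
    log.append({"event": event, "prev": prev,
                "digest": hashlib.sha256(body.encode()).hexdigest()})

def verify(log):
    """Recompute the chain; any edit or interior deletion changes a digest.
    (Detecting truncation of the *tail* additionally requires publishing
    the latest digest somewhere outside the machine.)"""
    prev = GENESIS
    for entry in log:
        body = json.dumps({"event": entry["event"], "prev": prev},
                          sort_keys=True)
        if entry["prev"] != prev or \
           entry["digest"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = entry["digest"]
    return True

log = []
for e in ["open polls", "ballot deleted", "close polls"]:
    append_entry(log, e)
assert verify(log)
del log[1]               # a "Clear button" dropping the middle record...
assert not verify(log)   # ...is now detectable
```

The point of the sketch is that tamper evidence has to be built into the log's structure; a log stored as an ordinary editable table, with or without a Clear button, cannot provide it.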
ONGOING PROBLEMS
http://www.wired.com/threatlevel/2009/06/voting-machine-company-agrees-to-hand-over-source-code/
http://www.wired.com/threatlevel/2009/06/voting-machine-adds-nearly-5000-ballots-to-tally/
http://www.wired.com/threatlevel/2008/12/dieboldpremier/
http://www.wired.com/threatlevel/2008/12/straight-party/
OHIO SECURITY REVIEW
http://www.crypto.com/blog/?n=10&offset=14
Ohio Voting Security Review Released / 14 December 2007
Today Ohio Secretary of State Jennifer Brunner released the results of a comprehensive security review of the electronic voting systems used in her state. The study was similar in scope to the California top-to-bottom review conducted this summer (in which I was also involved), covering the systems used in Ohio. The project contracted several academic teams and others to examine the election procedures, equipment and source code used in that state, with the aim of identifying any problems that might render elections vulnerable to tampering under operational conditions.
The ten-week project examined in detail the touch-screen, optical scan, and election management technology from e-voting vendors ES&S, Hart InterCivic and Premier Election Solutions (formerly Diebold). Project PI Patrick McDaniel (of Penn State) coordinated the academic teams and led the study of the Hart and Premier systems (parts of which had already been reviewed in the California study). Giovanni Vigna (of WebWise Security and UCSB) led the team that did penetration testing of the ES&S system.
I led the University of Pennsylvania-based team, which examined the ES&S source code. This was particularly interesting, because, unlike Hart and Premier, the full ES&S source code suite hadn’t previously been studied by the academic security community, although ES&S products are used by voters in 43 US states and elsewhere around the world. The study represented a rare opportunity to contribute to our understanding of e-voting security in practice, both inside and outside Ohio.
My group — Adam Aviv, Pavol Cerny, Sandy Clark, Eric Cronin, Gaurav Shah, and Micah Sherr — worked full-time with the source code and sample voting machines in a secure room on the Penn campus, trying to find ways to defeat security mechanisms under various kinds of real-world conditions. (Our confidentiality agreement prevented us from saying anything about the project until today, which is why we may have seemed especially unsociable for the last few months.)
As our report describes, we largely succeeded at finding exploitable vulnerabilities that could affect the integrity of elections that use this equipment. The report is long and detailed, and speaks for itself far better than I can here. A brief statement from Patrick McDaniel and me can be found here:
http://www.crypto.com/papers/ohio-stmt.pdf
Our full 334 page report can be downloaded (11MB, PDF format) from the Ohio Secretary of State’s web site at:
http://www.sos.state.oh.us/SOS/upload/everest/14-AcademicFinalEVERESTReport.pdf
There were other parts to the study (called “Project EVEREST”) than just the source code analysis, and, of course, there is also the question of how to actually secure elections in practice given the problems we found. The Ohio Secretary of State’s web site has a nice summary of the review and of the Secretary’s recommendations.
CONTACT
Sandy Clark
http://www.aos.princeton.edu/WWWPUBLIC/sandy/
email : clarks [at] seas.upenn [dot] edu
Matt Blaze
http://www.crypto.com/
email : blaze [at] cis.upenn [dot] edu
CALIFORNIA CODE REVIEW
http://www.crypto.com/blog/ca_voting_report/
California voting systems code review now released / 2 August 2007
Readers of this blog may recall that for the last two months I’ve been part of a security review of the electronic voting systems used in California. Researchers from around the country (42 of us in all) worked in teams that examined source code and documents and performed “red team” penetration tests of election systems made by Diebold Election Systems, Hart InterCivic and Sequoia Voting Systems.
The red team reports were released by the California Secretary of State last week, and have been the subject of much attention in the nationwide press (and much criticism from the voting machine vendors in whose systems vulnerabilities were found). But there was more to the study than the red team exercises.
Today the three reports from the source code analysis teams were released. Because I was participating in that part of the study, I’d been unable to comment on the review before today. (Actually, there’s still more to come. The documentation reviews haven’t been released yet, for some reason.) Our reports can now be downloaded from http://www.sos.ca.gov/elections/elections_vsr.htm .
I led the group that reviewed the Sequoia system’s code (that report is here):
http://www.sos.ca.gov/elections/voting_systems/ttbr/sequoia-source-public-jul26.pdf
The California study was, as far as I know, the most comprehensive independent security evaluation of electronic voting technologies ever conducted, covering products from three major vendors and investigating not only the voting machines themselves, but also the back-end systems that create ballots and tally votes. I believe our reports now constitute the most detailed published information available about how these systems work and the specific risks entailed by their use in elections.
My hat’s off to principal investigators Matt Bishop (of UC Davis) and David Wagner (of UC Berkeley) for their tireless skill in putting together and managing this complex, difficult — and I think terribly important — project. By law, California Secretary of State Debra Bowen must decide by tomorrow (August 3rd, 2007) whether the reviewed systems will continue to be certified for use throughout the state in next year’s elections, and, if so, whether to require special security procedures where they are deployed.
We found significant, deeply-rooted security weaknesses in all three vendors’ software. Our newly-released source code analyses address many of the supposed shortcomings of the red team studies, which have been (quite unfairly, I think) criticized as being “unrealistic”. It should now be clear that the red teams were successful not because they somehow “cheated,” but rather because the built-in security mechanisms they were up against simply don’t work properly. Reliably protecting these systems under operational conditions will likely be very hard. The problems we found in the code were far more pervasive, and much more easily exploitable, than I had ever imagined they would be.
Our reports speak for themselves (and they do a lot of speaking, I’m afraid, with over 300 pages released so far), so I won’t try to repeat what’s in them here. What follows are strictly my own thoughts about what we learned and how we did what we did. My group, which was based in Berkeley, looked at the source code of the Sequoia system. That system includes touch-screen and optical scan voting machines used at polling places and a back-end ballot preparation and vote tallying database at the elections headquarters. At over 800K lines of code, Sequoia’s was the largest of the three codebases reviewed, and ours was the largest team in the project (Arel Cordero, Sophie Engle, Chris Karlof, Naveen Sastry, Micah Sherr, Till Stegers and Ka-Ping Yee — a group of extraordinary talent and energy if ever there was one).
Reviewing that much code in less than two months was, to say the least, a huge undertaking. We spent our first week (while we were waiting for the code to arrive) setting up infrastructure, including a Trac Wiki on the internal network that proved invaluable for keeping everyone up to speed as we dug deeper and deeper into the system. By the end of the project, we were literally working around the clock. To protect the vendor’s proprietary software, our lab was in a small room on the UC Berkeley campus equipped with a lock not on the building master key, a monitored alarm, a safe in which we stored our disk drives when no one was there and an air-gapped isolated network of dedicated workstations. The ventilation in our small windowless room never quite worked, and whenever anyone had a cold we’d all eventually catch it. (Not that I’m complaining; the minor physical discomforts and long hours were tiny prices to pay for the opportunity to study the inner workings of something rarely exposed to outside scrutiny, and yet so central to our democracy.)
Because of the way the project was organized we didn’t have any actual voting machines at Berkeley, only source code. All the vendor hardware was in another secure room at the Secretary of State’s office in Sacramento and was intended primarily for use by the penetration test red teams. We ended up collaborating closely with the red team (based at UC Santa Barbara) that was working on our system (and who issued their own report).
So what can we learn from all this?
In spite of the short time and other sub-optimal conditions, the project found deeply-rooted security weaknesses in the software of all three voting systems reviewed. I was especially struck by the utter banality of most of the flaws we discovered. Exploitable vulnerabilities arose not so much from esoteric weaknesses that taxed our ingenuity, but rather from the garden-variety design and implementation blunders that plague any system not built with security as a central requirement. There was a pervasive lack of good security engineering across all three systems, and I’m at a loss to explain how any of them survived whatever process certified them as secure in the first place. Our hard work notwithstanding, unearthing exploitable deficiencies was surprisingly — and disturbingly — easy.
Much of the controversy around electronic voting concerns the possibility of hidden “backdoors” incorporated by a nefarious vendor. Properly obfuscated, such mischief would be almost impossible to detect. Yet our reports chronicle software weakened not by apparent malice but by a litany of elementary mistakes: static cryptographic keys, unsecured interfaces, poorly validated inputs, buffer overflows, and basic programming errors in security-critical modules. Deliberate backdoors in these systems, if any existed, would be largely superfluous.
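One item in that litany, the static cryptographic key (and its cousin, the universal ‘diebold’ password noted earlier), is worth a concrete sketch. The snippet below is hypothetical, not vendor code: it contrasts the flawed pattern of a credential compiled into every unit with the routine fix of a per-device salted hash set at provisioning time.

```python
import hashlib
import hmac
import os

# The flawed pattern: one credential baked into every machine shipped.
HARDCODED = "diebold"

def login_flawed(password):
    # A single leak (or a single curious reviewer) compromises the fleet.
    return password == HARDCODED

# The routine fix: per-device random salt plus a hashed credential,
# established when the machine is provisioned, not when it is compiled.
def provision(password):
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def login_fixed(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time compare

salt, digest = provision("per-county-secret")
assert login_flawed("diebold")                    # works on every machine
assert login_fixed("per-county-secret", salt, digest)
assert not login_fixed("diebold", salt, digest)
```

None of this is exotic; that is precisely the reviewers’ point about the banality of the flaws they found.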
Unfortunately, while finding many of the vulnerabilities may have been straightforward enough, fixing them won’t be. The root problems are architectural. All three reviewed products are, in effect, large-scale distributed systems that have many of their security-critical functions performed by equipment sent out into the field. In particular, the integrity of the vote tallies depends not only on the central computers at the county elections offices, but also on the voting machines (and software) at the polling places, removable media that pass through multiple hands, and complex human processes whose security implications may not be clear to the people who perform them. In other words, the designs of these systems expose generously wide “attack surfaces” to anyone who seeks to compromise them. And the defenses are dangerously fragile — almost any bug, anywhere, has potential security implications.
This means that strengthening these systems will involve more than repairing a few programming errors. They need to be re-engineered from the ground up. No code review can ever hope to identify every bug, and so we can never be sure that the last one has been fixed. A high assurance of security requires robust designs where we don’t need to find every bug, where the security doesn’t depend on the quixotic goal of creating perfect software everywhere. In the short term, election administrators will likely be looking for ways to salvage their equipment with beefed up physical security and procedural controls. That’s a natural response, but I wish I could be more optimistic about their chances for success. Without radical changes to the software and architecture, it’s not clear that a practical strategy that provides acceptable security even exists. There’s just not a lot to work with. I don’t envy the officials who need to run elections next year.
NO LONGER HYPOTHETICAL: KENTUCKY INDICTMENTS
http://www.schneier.com/blog/archives/2009/03/election_fraud.html
http://www.lex18.com/Global/story.asp?S=10037216&nav=menu203_2
http://www.bradblog.com/?p=7001
http://media.kentucky.com/smedia/2009/03/19/17/clayindict.source.prod_affiliate.79.pdf
http://www.crypto.com/blog/vote_fraud_in_kentucky/
Is the e-voting honeymoon over?
Electronic Vote Rigging in Kentucky
Eight Clay County, Kentucky election officials were charged last week with conspiring to alter ballots cast on electronic voting machines in several recent elections. The story was first reported on a local TV station and was featured on the election integrity site BradBlog. According to the indictment, the conspiracy allegedly included, among other things, altering ballots cast on the county’s ES&S iVotronic touchscreen voting machines.
So how could this have happened?
The iVotronic is a popular Direct Recording Electronic (DRE) voting machine. It displays the ballot on a computer screen and records voters’ choices in internal memory. Voting officials and machine manufacturers cite the user interface as a major selling point for DRE machines — it’s already familiar to voters used to navigating touchscreen ATMs, computerized gas pumps, and so on, and thus should avoid problems like the infamous “butterfly ballot”. Voters interact with the iVotronic primarily by touching the display screen itself. But there’s an important exception: above the display is an illuminated red button labeled “VOTE”. Pressing the VOTE button is supposed to be the final step of a voter’s session; it adds their selections to their candidates’ totals and resets the machine for the next voter.
The Kentucky officials are accused of taking advantage of a somewhat confusing aspect of the way the iVotronic interface was implemented. In particular, the behavior (as described in the indictment) of the version of the iVotronic used in Clay County apparently differs a bit from the behavior described in ES&S’s standard instruction sheet for voters. A flash-based iVotronic demo available from ES&S shows the same procedure, with the VOTE button as the last step. But evidently there’s another version of the iVotronic interface in which pressing the VOTE button is only the second to last step. In those machines, pressing VOTE invokes an extra “confirmation” screen. The vote is only actually finalized after a “confirm vote” box is touched on that screen. (A different flash demo that shows this behavior with the version of the iVotronic equipped with a printer is available from ES&S here). So the iVotronic VOTE button doesn’t necessarily work the way a voter who read the standard instructions might expect it to.
The indictment describes a conspiracy to exploit this ambiguity in the iVotronic user interface by having pollworkers systematically (and incorrectly) tell voters that pressing the VOTE button is the last step. When a misled voter would leave the machine with the extra “confirm vote” screen still displayed, a pollworker would quietly “correct” the not-yet-finalized ballot before casting it. It’s a pretty elegant attack, exploiting little more than a poorly designed, ambiguous user interface, printed instructions that conflict with actual machine behavior, and public unfamiliarity with equipment that most citizens use at most once or twice each year. And once done, it leaves behind little forensic evidence to expose the deed.
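The ambiguity the indictment describes can be modeled as two slightly different state machines. This is a deliberately simplified sketch (the state names and session flow are illustrative, not taken from ES&S documentation): in variant A, pressing VOTE casts the ballot, as the printed instructions say; in variant B, VOTE only raises a confirmation screen, and the ballot is cast by touching “confirm vote”.

```python
# Two simplified models of the touchscreen session flow described above.
# Variant A: pressing VOTE finalizes the ballot (the documented behavior).
# Variant B: VOTE raises a confirmation screen; only "confirm vote"
# finalizes (the behavior the indictment attributes to Clay County).

def run_session(variant, inputs):
    """Return the state a voter leaves the machine in."""
    state = "selecting"
    for action in inputs:
        if state == "selecting" and action == "VOTE":
            state = "cast" if variant == "A" else "confirm_screen"
        elif state == "confirm_screen" and action == "confirm vote":
            state = "cast"
    return state

# A voter following the printed instructions ("press VOTE last"):
instructions = ["VOTE"]
assert run_session("A", instructions) == "cast"
# On variant B, the same instructions leave the ballot unfinalized,
# still editable by whoever approaches the machine next:
assert run_session("B", instructions) == "confirm_screen"
assert run_session("B", ["VOTE", "confirm vote"]) == "cast"
```

The attack lives entirely in the gap between the two models: the pollworkers supplied inputs for variant A to voters sitting at variant B.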
Current electronic voting systems have been widely — and justifiably — criticized for being insufficiently secure against vote tampering and other kinds of election fraud. I led the team at U. Penn that examined the ES&S iVotronic — the same machine used in Kentucky — as part of the Ohio EVEREST voting systems study in 2007. We found numerous exploitable security weaknesses in these machines, many of which would make it easy for a corrupt voter, pollworker, or election official to tamper with election results. Other studies have reached similarly grim conclusions about most of the other e-voting products used in the US and elsewhere. But these results, alarming as they are, also raise a perplexing question: if the technology is so vulnerable, why have there been so few (if any) substantiated cases of these systems being attacked and manipulated in actual elections?
A plausible explanation is simply that the bad guys haven’t yet caught up with the rich opportunities for mischief that these systems provide. It takes time for attackers to recognize and learn to exploit security weaknesses in new devices, and touchscreen voting machines have been in wide use for only a few years (most US counties purchased their current systems after 2002, with funding from the Help America Vote Act). For example, computers connected to the Internet were for a long time largely vulnerable to network-based attack, but it took several years before viruses, worms, and botnets became serious threats in practice. In other words, new technologies sometimes enjoy an initial relatively crime-free “attack honeymoon” in which even very weak defenses seem to be sufficient. But eventually, the criminals arrive, and, once they climb the learning curve, the world becomes a much more hostile place very quickly.
We might ask, then, what the (alleged) Kentucky conspiracy tells us about the e-voting attack honeymoon. Are the bad guys catching up? On the one hand, we might be comforted by the relatively “low tech” nature of the attack — no software modifications, altered electronic records, or buffer overflow exploits were involved, even though the machines are, in fact, quite vulnerable to such things. But a close examination of the timeline in the indictment suggests that even these “simple” user interface exploits might well portend more technically sophisticated attacks sooner, rather than later.
Count 9 of the Kentucky indictment alleges that the Clay County officials first discovered and conspired to exploit the iVotronic “confirm screen” ambiguity around June 2004. But Kentucky didn’t get iVotronics until late 2003 at the earliest; according to the state’s 2003 HAVA Compliance Plan, no Kentucky county used the machines as of mid-2003. That means that the officials involved in the conspiracy managed to discover and work out the operational details of the attack soon after first getting the machines, and were able to use it to alter votes in the next election.
Yes, the technique is low-tech, but it’s also very clever, and not at all obvious. The only way for them to have discovered it would have been to think hard and long about how the machines work, how voters would use them, and how they could subvert the process with the access they had. And that’s just what they did. They found the leverage they needed quickly, succeeding at using their discovery to steal real votes, and apparently went for several years without getting caught. It seems reasonable to suspect that if a user interface ambiguity couldn’t have been exploited, they would have looked for — and perhaps found — one of the many other exploitable weaknesses present in the ES&S system.
But that’s not the worst news in this story. Even more unsettling is the fact that none of the published security analyses of the iVotronic — including the one we did at Penn — had noticed the user interface weakness. The first people to have discovered this flaw, it seems, didn’t publish or report it. Instead, they kept it to themselves and used it to steal votes.
–
MARYLAND SUES
http://www.truthout.org/122608VA
Maryland Files Claim to Recover Voting Machine Expenses
BY Laura Smitherman / Baltimore Sun / 25 December 2008
After years of problems with the state’s touch-screen voting system, Maryland has filed a claim to recover $8.5 million from the maker of the machines, Premier Election Solutions, Attorney General Douglas F. Gansler announced yesterday. The claim seeks costs the state incurred to correct security gaps in the voting system that were uncovered several years ago by independent investigations. The state has paid $90 million under a contract with Premier, formerly known as Diebold, since 2001. During that time, the two parties have had a sometimes-rocky relationship as hitches in the voting system surfaced. “Under basic contract law, this is money that should be paid by Diebold or its successor and not by the taxpayers,” Gansler said in an interview. “This is sort of the final chapter of the touch-screen machines that we’ve had issues with in Maryland since we’ve gotten them.”
Last year, Gov. Martin O’Malley and the General Assembly decided to eventually dump the touch-screen equipment and instead move toward buying new optical-scan machines, which read paper ballots filled in by voters with pencil or pen and allow for a manual recount. The new system is expected to cost about $20 million. Premier President Dave Byrd said in a statement that the state’s claim appears to be based on “inaccurate and unfounded assumptions.” He also said the 2008 election, in which Premier’s machines were used, was one of the “smoothest” in the state’s history, culminating what he called a “seven-year track record of success.” The “claim may be an attempt to retroactively change the rules of the contracts, but it does not change or reflect the actual record of successful performance,” Byrd said.
State officials contend, however, that the November election came off with few glitches precisely because they had spent so much money on upgrades and technical fixes. According to the claim, the state Board of Elections has implemented, largely at its own expense, measures to correct flaws uncovered by assessments ordered by former Gov. Robert L. Ehrlich Jr. and by the General Assembly. Premier and the state haven’t always been on the outs. After warnings about security vulnerabilities from three computer experts – Johns Hopkins University professor Avi Rubin and the two hired by the state – a voter advocacy group sued in 2004. The group alleged that the state should not have certified Premier’s machines for use in elections. The state defended Premier at the time, and won.
That history is not lost on Premier, which said its good relations with the state made the attorney general’s recent claim “all the more of a surprise,” according to the company’s written response. The company said its system satisfies contractual security requirements and that the state decided to incorporate additional measures based on the reports it commissioned. The company’s response relied in part on the state’s legal defense from four years ago that contended no system is perfect and pointed out that there had not been a single report of a security breach. Premier also said that it has provided additional services and materials beyond what was required under the contract at no additional charge.
Other problems have surfaced that aren’t addressed in the state’s claim. Diebold had to replace parts in voting machines used in the 2004 election because of glitches in the “motherboard,” the main circuit board, that could cause the machines to freeze. Then in the 2006 primary election, the state’s new “e-poll books,” electronic check-in terminals made by Diebold that are distinct from the touch-screen voting units, crashed repeatedly. “Voter confidence and the integrity of the process were undermined by the use of these machines,” Gansler said. “It took nearly 10 years for us to figure out we shouldn’t be using them, but during the course of that time we did everything we could to ensure reliability.” The claim now goes before a state procurement officer, whose decision on the matter could then be petitioned to the Maryland State Board of Contract Appeals. Until the dispute is settled, the state is withholding payment on $4 million in bills for services Premier provided for the 2008 elections.
–
GHOSTSCRIPT COPYLEFT INFRINGEMENT
http://www.truthout.org/110508VA
Diebold Faces GPL Infringement Lawsuit Over Voting Machines
BY Ryan Paul / Ars Technica / 04 November 2008
Artifex Software, the company behind the open source Ghostscript PDF processing software, has filed a lawsuit against voting machine vendor Diebold and its subsidiary Premier Election Solutions. Artifex says that Diebold violated the GPL by incorporating Ghostscript into commercial electronic voting machine systems. Ghostscript, which was originally developed in the late 80s, is distributed for free under the GNU General Public License (GPL). This license permits developers to study, modify, use, and redistribute the software but requires that derivatives be made available under the same terms. Companies that want to use Ghostscript in closed-source proprietary software projects can avoid the copyleft requirement by purchasing a commercial license from Artifex. Among commercial Ghostscript users who have purchased licenses from Artifex are some of the biggest names in the printing and technology industries, including HP, IBM, Kodak, Siemens, SGI, and Xerox.
Evidence of Diebold’s Ghostscript use first emerged last year when electronic voting machine critic Jim March was conducting analysis of Pima County voting irregularities. He brought a technical question to the Ghostscript mailing list relating to his investigation and mentioned in passing that Diebold’s use of Ghostscript could potentially fall afoul of the GPL. This view was shared by Ghostscript developer Ralph Giles, who referred the matter to the Artifex business staff so that it could evaluate the legal implications. “Seems likely that they are not respecting our software license in this case. We do not consider bundling as an integrated component intended to work with other software as ‘mere aggregation’ under the GPL,” wrote Giles in a mailing list post. According to InformationWeek, Artifex is seeking over $150,000 in damages and is calling for the court to block usage of the equipment. Security researchers have uncovered numerous security vulnerabilities in voting machines produced by several major vendors, including Diebold. The voting machine company has faced several high-profile lawsuits in the past, including one filed by the state of California, where Diebold machines were subsequently banned over fraudulent claims.
–
VOLUNTARY REFORM
http://www.securityfocus.com/brief/968
U.S. issues revised e-voting standards
BY Robert Lemos / 2009-06-01
“The National Institute of Standards and Technology (NIST) delivered an update on Monday to the United States’ electronic voting standards, adding more requirements to test systems for accuracy and reliability and additional rules to make paper audit trails easier to review. The draft revision, known as the Voluntary Voting System Guidelines (VVSG) version 1.1, adds more stringent recommendations for testing and auditing as well as requirements that election software and updates be digitally signed and improved ease-of-use for poll workers. The U.S. Election Assistance Commission (EAC) announced on Monday that the draft revision will be available for public comment for the next 120 days. “The guidelines announced today are designed to further improve the quality and efficiency of the testing conducted on voting systems,” John Wack, NIST voting team manager, said in a statement. “This enables improvements to be made sooner rather than later when the next full set of standards is finalized.”
Election systems have come under scrutiny following errors that have led to lost votes and software glitches that have shut down machines on voting day. In 2007, an election system failure may have resulted in a loss for the Democratic challenger in a contest for one of Florida’s seats in the U.S. House of Representatives, when the configuration of the electronic ballot likely resulted in a large number of people in a Democratic-leaning county failing to vote. In midterm elections the prior year, many states took extra security precautions after researchers found that Diebold’s election systems contained a serious flaw.”
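The draft guidelines’ signed-update requirement boils down to a simple rule: a machine installs a software image only if its signature verifies. A toy sketch follows; the key, image, and function names are hypothetical, and an HMAC with a shared key stands in here for the public-key signature a real system would use, since the verification logic is the same shape.

```python
import hashlib
import hmac

# Toy model of the signed-update rule in VVSG 1.1: install an image
# only if its authentication tag verifies. HMAC is a stand-in for a
# real public-key signature; the accept/reject logic is identical.

def sign_image(key: bytes, image: bytes) -> bytes:
    return hmac.new(key, image, hashlib.sha256).digest()

def install(key: bytes, image: bytes, tag: bytes) -> bool:
    if not hmac.compare_digest(sign_image(key, image), tag):
        return False        # reject tampered or unsigned software
    # ... proceed to flash the image ...
    return True

key = b"election-authority-signing-key"   # hypothetical
firmware = b"tabulator build 5.2.1"       # hypothetical
tag = sign_image(key, firmware)

assert install(key, firmware, tag)
assert not install(key, firmware + b" (patched)", tag)
```

The guideline’s value is not the cryptography itself but the policy it forces: any change to fielded election software must trace back to a key the election authority controls.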
VVSG 1.1
Proposed Draft Revisions to 2005 Voluntary Voting System Guidelines
http://www.eac.gov/program-areas/voting-systems/voting-system-certification/2005-vvsg/draft-revisions-to-the-2005-voluntary-voting-system-guidelines-vvsg-v-1-1
http://www.eac.gov/program-areas/voting-systems/voting-system-certification/2005-vvsg/faqs-on-proposed-vvsg-1-1-the-revision-to-the-2005-vvsg
–
E-VOTING RULED UNCONSTITUTIONAL IN GERMANY
DUE TO LACK OF TRANSPARENCY, POSSIBILITY OF FRAUD
http://www.bradblog.com/?p=6961
http://www.dw-world.de/dw/article/0,,4069101,00.html?maca=en-tagesschau_englisch-335-rdf-mp
http://en.wikipedia.org/wiki/Chaos_Computer_Club
–
ANY GOOD NEWS AT ALL? (YES, ACTUALLY)
http://news.cnet.com/8301-1009_3-10258634-83.html
Hacker named to Homeland Security Advisory Council
BY Elinor Mills / June 5, 2009
Jeff Moss, founder of the Black Hat and Defcon hacker and security conferences, was among 16 people sworn in on Friday to the Homeland Security Advisory Council. The HSAC members will provide recommendations and advice directly to Secretary of Homeland Security Janet Napolitano. Moss’ background as a computer hacker (aka “Dark Tangent”) and role as a luminary among young hackers who flock to Defcon in Las Vegas every summer might seem to make him an odd choice to swear allegiance to the government. (Although before running his computer conferences, Moss also worked in the information system security division at Ernst & Young.) I’d like to hear some of the banter as he rubs elbows with the likes of former CIA (Bill Webster) and FBI (Louis Freeh) directors, the Los Angeles County sheriff, the Miami mayor, the New York police commissioner, the governors of Maryland and Georgia, former Colorado Sen. Gary Hart, and the president of the Navajo Nation.
In an interview late on Friday, Moss, who is 39, said he was surprised when he got the call and was asked to join the group.
“I know there is a newfound emphasis on cybersecurity and they’re looking to diversify the members and to have alternative viewpoints,” he said. “I think they needed a skeptical outsider’s view because that has been missing.” Asked if there was anything in particular he would advocate, Moss said: “There will be more cyber announcements in coming weeks and once that happens my role will become more clear. This meeting was focused on Southwest border protection… With things like Fastpass and Safe Flight, everything they are doing has some kind of technology component.”
Moss, who is genuinely humble, said he was “fantastically honored and excited to contribute” to the HSAC and not concerned with losing any street cred among what some would call his fan base. He did concede that his new position would give him an unfair advantage in Defcon’s “Spot The Fed” contest in which people win prizes for successfully outing undercover government agents. Security consultant Kevin Mitnick, who spent five years in prison on computer-related charges and was once the FBI’s most-wanted cybercriminal, praised Moss’ diplomacy, but said: “I’m surprised to see Jeff on the list. I would have expected (crypto/security guru and author) Bruce Schneier to be on the council.” Moss “is a great crowd pleaser” and “he’s just bad enough for them to say ‘we’re crossing the ranks,'” said journalist and threat analyst Adrian Lamo, who served two years of probation for breaking into computer networks. “But the reality is he’s as corporate as hiring someone out of Microsoft.”
http://www.defcon.org/
http://www.blackhat.com/
BONUS: DEFCON 16 BADGE HACKS
http://www.grandideastudio.com/portfolio/defcon-16-badge/
http://www.youtube.com/profile?user=kingpinempire&view=videos&query=DC16
–
D.C.’s CTO TAPPED AS NATION’S FIRST CIO
http://www.infoworld.com/d/adventures-in-it/meet-nations-first-cio-836
Meet the nation’s first CIO
BY Galen Gruman / 2009-03-06
In a surprise announcement, President Obama has named the nation’s first federal CIO: Vivek Kundra, CTO of the District of Columbia. (He has yet to fill the other position he promised to create: the first national CTO.) So who is Kundra, and what might his appointment mean for the federal government’s direction for and spending on technology? As the federal CIO, he will oversee a $71 billion IT budget and manage technology interoperability among agencies. Kundra told a press conference that he will investigate how the government might improve its technology investments and make more information accessible to citizens through the Internet. He’s done both as D.C.’s CTO.
The District of Columbia has been a leader in smart deployment of technology for years, boasting a succession of strong CTOs. Under Suzanne Peck’s tenure, previous to Kundra’s, D.C. was among the first to use SOA to rationalize software development efforts, to use XML to make government operational data open for mashups, and to deploy next-gen wireless technology for public safety and other agency usage. Kundra became CTO in 2006 and quickly staked out his own innovation focus. As D.C.’s CTO, Kundra has emphasized what he calls a stock-market approach to IT project management and the adoption of consumer technologies in business. Both approaches come from the same epiphany he recalls having: the technology most users employ at work is kludgy compared to what they use in their daily routines, even though consumer technologies are often less expensive or even free. “For some weird reason I cannot understand, the way we organize ourselves at work is so much less agile than what we do in our personal lives,” Kundra told InfoWorld. “Why not use consumer technology at work?”
The IT “stock market”
As D.C.’s CTO, Kundra hired a team of analysts to track projects — in the style of a financial analyst — on a daily basis. Smaller projects get bundled into “funds” of related efforts. Pretty quickly, the successes and failures were obvious. For example, the analysts discovered that a three-year enterprise content management project had made little progress and was run by project managers who had four previous failures. “It was not going anywhere. So I decided to ‘sell’ the stock — I killed the project — and put that capital elsewhere,” Kundra recalls. In this case, he redirected the money to add mobile laptops to police cars. The stock metaphor made sense to more business-minded leaders at the district, but Kundra admits he had to really sell the concept to most employees and the 87 agency heads served by his team. “It was an education,” he notes drily. What really sold the concept was the result: lower cost due to fewer long-burning misfires.
The stock approach also supplanted the traditional project management mentality of creating specifications and periodically assessing progress against them subjectively. “I wanted a more data-driven model — after all, the data is the data. If you’re over budget for two or three quarters, you can’t avoid being exposed,” Kundra says. “People don’t make tough decisions easily, so you have to show them the data. [As government leaders,] it’s our duty to make sure they’re not failing,” he adds. Objective measurements make that assessment easier. For Kundra, the stock-market approach is really just a metaphor for a technique driven by ongoing analytics. “You can use a different metaphor if that works better in your industry,” he says. But essential to success is a “ruthless discipline” in your data collection, analysis, and consequent management decisions.
Freeing up resources for meaningful innovation
Kundra was not focused solely on weeding out bad “stocks.” He also used this approach to free up capital for innovative bets. For example, he’s initiated a project that combines YouTube with Wikipedia to increase government’s accountability to citizens. All requests for proposals (RFPs) for city contracts are posted on a Web site in a wiki, with all bids available as PDF attachments. Attendee lists from public hearings are scanned and posted as well, as are videos of hearings and even RFP presentations. Also posted or linked are any district communications with the potential vendors on the RFPs. If this effort succeeds, “no one can say that there are deals done behind closed doors,” he says. “It’s tough in tight budgets to find the innovative path,” Kundra notes, which is why he was so focused on gaining stock-market-like efficiencies in weeding out wasteful projects and identifying strong ones. Thanks to the savings already established from this approach, he was able to set up an R&D lab to test new ideas. The two areas that have caught Kundra’s fancy are new-generation mobile devices — “I believe the iPhone is the future for integrated voice, data and video” — and Web 2.0 technologies, thus the experiments using wikis and YouTube.
http://data.octo.dc.gov/
http://www.appsfordemocracy.org/
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
OPEN SOURCE VOTING MACHINE EFFORTS
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
BALLOT BROWSER [SOURCE CODE]
http://democracycounts.blogspot.com/2008/12/wheres-that-counting-software-youre.html
http://www.tevsystems.com/warning.html
THE HUMBOLDT ANOMALY
http://www.wired.com/threatlevel/2008/12/unique-election/
Error in Diebold Voting Software Caused Lost Ballots in California
BY Kim Zetter / December 8, 2008
“Humboldt County election director Carolyn Crnich discovered the missing ballots only because she happened to implement a new and innovative auditing system this year that was spearheaded by members of the public who helped her develop it. Humboldt County, which is a small county located in northern California near the Oregon border, implemented the Transparency Project, whereby every paper ballot (Humboldt uses only paper ballots) gets digitally scanned by a separate commercial scanner, not made by a voting machine company, so that the ballot images can then be posted on the internet for anyone to examine and conduct their own independent recounts. (See this post for more about how the Transparency Project works.)
It was through the Transparency Project that Crnich and Mitch Trachtenberg, a volunteer who helped design part of the project, discovered the problem with the Premier software on November 30th, after they finished scanning all of the ballots through the Transparency Project’s commercial scanner two days before the county was required to certify its election results. After the county had already scanned and tabulated the 60,000+ ballots with the Premier voting system and created the official tally, the Transparency Project workers spent 65 hours scanning the ballots into a Fujitsu scanner and creating digital images of each ballot. In doing so, they discovered that they had 216 more ballots recorded than the number of ballots counted by the Premier tabulation system.
Parke Bostrom, one of the Transparency Project volunteers, wrote in a blog post about the issue, “The audit log is not truly a ‘log’ in the classical computer program sense, but is rather a ‘re-imagining’ of what GEMS would like the audit log to be, based on whatever information GEMS happens to remember at the end of the vote counting process.”
HUMBOLDT TRANSPARENCY PROJECT
http://humtp.com/
http://www.bradblog.com/?p=6733
http://www.wired.com/threatlevel/2008/12/unique-transpar/
“Under the Transparency Project, after the ballots are officially scanned and tabulated by the Premier system, they’re scanned a second time by a separate commercial scanner, not made by any voting machine company, so that the ballot images can then be posted on the internet for the public to examine and conduct independent recounts. Every ballot image is imprinted with a unique serial number as it’s scanned through the commercial scanner to verify its authenticity, and batches of ballot images are hashed to verify that they haven’t been altered before they’re posted online or saved to DVDs.
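The batch-hashing step described above maps naturally to a few lines of code. This is only an illustrative sketch (the function name and the choice of a single SHA-256 digest over the sorted files are my assumptions, not the Transparency Project’s documented scheme):

```python
import hashlib

def hash_ballot_batch(image_paths):
    """Compute one SHA-256 digest over a batch of ballot image files.

    Publishing the digest alongside the images lets anyone re-run the
    hash and confirm the batch was not altered after scanning.
    Illustrative only; the project's real hashing scheme may differ.
    """
    h = hashlib.sha256()
    # Hash files in sorted order so the same set of images always
    # produces the same digest, regardless of how the list was built.
    for path in sorted(image_paths):
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
    return h.hexdigest()
```

Anyone with the published DVDs could recompute the digest and compare it to the posted value; a single changed byte in any image yields a completely different hash.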
To make it easier for the public to tally the votes, Trachtenberg, an independent programmer who has launched a company called Trachtenberg Election Verification Software, wrote a program pro bono to allow anyone to sort through the Humboldt ballots by precinct or race. The sorting software, called Ballot Browser (image above right shows the software’s user interface), is an open source program written in Python to run on a Windows or Linux platform. The Humboldt version runs on Debian Linux Etch and uses a Fujitsu high-speed scanner, also driven by Debian Linux.
Ballot Browser displays each ballot in a window and highlights the spot where it thinks the voter has made his choice. The display can be turned off to speed up scanning to 1,000 ballots an hour. Crnich said she got the idea for the project from Kevin Collins, who expressed concerns during a public meeting about the trustworthiness of proprietary voting systems. He wanted to know why it wasn’t possible for everyone to examine every ballot. “That was the seed,” Crnich told Threat Level.
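Ballot Browser’s core task, deciding whether a voter darkened a bubble, can be approximated by counting dark pixels inside a known region of the scanned image. The sketch below is a hypothetical toy, not Trachtenberg’s actual algorithm; the region format, the 128 grayscale cutoff, and the 40% fill threshold are all invented for illustration:

```python
def bubble_filled(pixels, region, threshold=0.4):
    """Decide whether one ballot bubble is marked.

    `pixels` is a grayscale image as a list of rows (0 = black,
    255 = white); `region` is a hypothetical (top, left, height, width)
    box around one bubble, taken from the ballot layout definition.
    The bubble counts as filled when the fraction of dark pixels inside
    the box exceeds `threshold`. Toy version of the mark-detection idea;
    Ballot Browser's real logic is more involved.
    """
    top, left, h, w = region
    dark = sum(
        1
        for row in pixels[top:top + h]
        for value in row[left:left + w]
        if value < 128  # treat anything darker than mid-gray as ink
    )
    return dark / (h * w) > threshold
```

The threshold is the interesting design choice: set it too low and stray pen marks register as votes; too high and light pencil marks are missed, which is exactly the voter-intent ambiguity the project was built to surface for human review.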
But getting ballots into the hands of the public presented a problem. California’s election law says that once ballots are scanned and sealed in containers by poll workers after an election, they can’t be re-opened except to be recycled or destroyed or unless officials suspect there might be something wrong with the ballots. Crnich determined that the latter provided a possible loophole. Since there was already a strong public perception that there was something wrong with the ballots, she concluded that this was the permission she needed to make the ballots available for public perusal.
Trachtenberg said he came away from his first meeting with Crnich feeling very pleasantly surprised. “I just thought, ’she gets it, she gets it,’” he said. “I had contacted the previous election staff years ago complaining in particular that the [voting] system wouldn’t leave a paper audit trail. So I was really pleasantly surprised when I discovered that Carolyn was behind the idea of election transparency and wanted to get on board.”
Crnich convinced her board of supervisors to purchase an off-the-shelf Fujitsu high-speed scanner and the group launched the project in limited form during the June primary. Trachtenberg said before the launch they had trouble getting the scanner to work with their Linux scanning program, but contacted M. Allen Noah, administrator of the SANE Project (the open scanning protocol known as Scanner Access Now Easy that works with Linux), who advised them on how to make it work.
It took about five days to scan about 32,000 ballots cast in the primary election. The ballot scans amounted to about 8.2 GB of data and filled up 3 DVDs. They didn’t actually do a re-count of the ballots in June, however. They just did random spot checks to establish that their system worked. The number of ballots they scanned with their Fujitsu matched very closely the number they had scanned with the Premier system, with the exception of one or two ballots.
The November election, by contrast, was more complicated and took 65 hours to scan because the election involved 64,161 ballots that were double-sided. The volunteers were deputized before they started the project, and the chain of custody on the ballots was carefully controlled throughout the process. A county worker removed the ballots from secure storage, and the ballots were never left alone with any one person. The workers had to fill out forms carefully tracking the time the ballots left secure storage, the time at which they were unsealed from containers, and other steps.
Crnich said the partnership of technical experts with election staff turned out to be the perfect combination. “With my willingness to say yes let’s do what we can to make this a transparent and trusted election and with Mitch’s ability to develop the software in open source and make it available, it’s worked out I think to the advantage of voters in Humboldt County,” Crnich said. “[The point] was not to catch anybody or anything, it was just to make the information available to the public. Here it is. If you question our results, please look at it yourself.”
Once they’d finished scanning the November ballots, they knew immediately they had a problem because the number of ballots they scanned through the Fujitsu scanner didn’t match the number of ballots that had been tabulated by the Premier system. They discovered that the Premier system had dropped a batch of 197 ballots from its tabulation software. The voting company has acknowledged that a problem with its software caused the system to drop the ballots and that the software has contained the error since 2004.
Trachtenberg said the problem they discovered underscored for him that proprietary voting systems and “secret counting” methods aren’t in the best interest of democracy. “Without any allegation of fraud, programmers make mistakes. And sometimes people like to hide their mistakes,” Trachtenberg said. “If it’s possible for people to do an independent count, they should be allowed, and we’re very fortunate in Humboldt that we had a registrar who not only allowed us to do an independent count but made it as easy as it could be. I think what you’ll find is a couple of years from now, this is going to be thought to be just common sense.”
The ballots from the November election haven’t been placed online yet because they’re still looking for a volunteer with sufficient bandwidth who is willing to host the data. In the meantime, members of the public can request DVD copies of the ballots by contacting the Humboldt County elections office. Here’s a video of Trachtenberg discussing how his open-source ballot software works.
[youtube=http://www.youtube.com/watch?v=UYAb560TpRE]
CONTACT
Mitch Trachtenberg
http://democracycounts.blogspot.com/
http://www.tevsystems.com/about.html
email : mitch [at] tevsystems [dot] com
–
SCANTEGRITY [SOURCE CODE]
http://www.scantegrity.org/wiki/index.php/Getting_the_Source
http://www.scantegrity.org/learnmore.php
http://scantegrity.org/blog/
BETTER LIVING THROUGH CHEMISTRY
http://www.economist.com/science/tm/displaystory.cfm?story_id=12455414
“[One] approach to the idea of encrypted ballots is Scantegrity II, designed by David Chaum, a computer scientist and cryptographer who, among many other things, invented the idea of digital cash. Instead of putting a cross next to the candidate’s name, a voter fills in an oval-shaped space, known as a bubble, next to the name. So far, that is similar to one widely used American system. However, in the case of Scantegrity the voter uses not an ordinary pen but a special one whose “ink” reacts with a pattern of two chemicals that has been printed inside the bubble. One of these chemicals darkens the whole bubble, so that its position (and thus the candidate voted for) can be recorded by a standard optical-reader. The other becomes visible in a contrasting colour to reveal a previously invisible three-character code, derived from a pseudorandom number generator. Since the optical readers employed by this system do not have character-recognition software, this code cannot be read by the vote-counting machine. But it can be noted by the voter on a detachable receipt at the bottom of the ballot paper. He can then, if he wishes, check things are in order by entering the serial number of his ballot paper into a website set up for the election. He should see in return the letter code he noted. If the code does not match, something is awry, and an investigation can start.”
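To make the voter’s check concrete: the noted three-character code is only useful if the election authority can later reproduce (or has pre-committed to) the code for that ballot serial. The sketch below derives codes from an HMAC purely for compactness; real Scantegrity II instead publishes cryptographic commitments to pre-generated code tables before the election, so every name and parameter here is a hypothetical stand-in for the idea, not the actual protocol:

```python
import hmac
import hashlib
import string

# Hypothetical code alphabet: uppercase letters and digits.
ALPHABET = string.ascii_uppercase + string.digits

def confirmation_code(election_key, ballot_serial, bubble_index, length=3):
    """Derive a short per-bubble confirmation code from a secret key.

    Each (ballot, bubble) pair gets its own unpredictable code, which
    is the property that lets a voter later check their receipt on a
    website without revealing how they voted.
    """
    digest = hmac.new(
        election_key,
        f"{ballot_serial}:{bubble_index}".encode(),
        hashlib.sha256,
    ).digest()
    return "".join(ALPHABET[b % len(ALPHABET)] for b in digest[:length])

def check_receipt(election_key, ballot_serial, bubble_index, noted_code):
    """Re-derive the code for a serial number and compare to the receipt."""
    return confirmation_code(election_key, ballot_serial, bubble_index) == noted_code
```

The crucial point the article makes survives in the sketch: the code identifies the *position* the voter marked without the counting machine ever reading it, so a mismatch on the website is evidence of tampering without compromising ballot secrecy.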
CONTACT
David Chaum
http://www.chaum.com/
email : david [at] chaum [dot] com
http://www.punchscan.org/
http://en.wikipedia.org/wiki/David_Chaum
–
VOTEBOX [SOURCE CODE]
http://votebox.cs.rice.edu/
http://code.google.com/p/votebox/wiki/FAQ
CONTACT
Dan Wallach
http://seclab.cs.rice.edu/lab/
http://www.cs.rice.edu/~dwallach/
email : dwallach [at] cs.rice [dot] edu
U.S. INDUSTRY BLUFFS
http://freedom-to-tinker.com/blog/dwallach/open-source-vs-disclosed-source-voting-systems
http://arstechnica.com/open-source/news/2009/04/voting-machine-expert-criticizes-clueless-industry-report.ars
http://techdirt.com/articles/20090417/0214474537.shtml
E-Voting Firms Recognize That Open Source Software Exists…
But Seem Confused About What It Means
BY Mike Masnick / Apr 20th 2009
We’ve never quite understood why e-voting software shouldn’t be required to be public information. For the sake of actually allowing an open and transparent voting system, it’s hard to understand how any governing body would allow proprietary software to be used. There’s simply no way you can prove that the system is fair and transparent if the counting mechanism is totally hidden away. For years, the big e-voting firms have simply shrugged this off, but it looks like they’re at least open to discussing it. A trade group representing the big e-voting firms has put out a whitepaper discussing open source voting systems, where all they really do is show that they don’t actually understand much about open source technologies.
First, they claim that, even though they understand that “security through obscurity” isn’t effective, “there remains some underlying truths to the idea that software does maintain a level of security through the lack of available public knowledge of the inner workings of a software program.” Computer Science professor Dan Wallach does a nice job responding to that claim:
“Really? No. Disclosing the source code only results in a complete forfeiture of the software’s security if there was never any security there in the first place. If the product is well-engineered, then disclosing the software will cause no additional security problems. If the product is poorly-engineered, then the lack of disclosure only serves the purpose of delaying the inevitable. What we learned from the California Top-to-Bottom Review and the Ohio EVEREST study was that, indeed, these systems are unquestionably and unconscionably insecure. The authors of those reports (including yours truly) read the source code, which certainly made it easier to identify just how bad these systems were, but it’s fallacious to assume that a prospective attacker, lacking the source code and even lacking our reports, is somehow any less able to identify and exploit the flaws. The wide diversity of security flaws exploited on a regular basis in Microsoft Windows completely undercuts the ETC paper’s argument. The bad guys who build these attacks have no access to Windows’s source code, but they don’t need it. With common debugging tools (as well as customized attacking tools), they can tease apart the operation of the compiled, executable binary applications and engineer all sorts of malware. Voting systems, in this regard, are just like Microsoft Windows. We have to assume, since voting machines are widely dispersed around the country, that attackers will have the opportunity to tear them apart and extract the machine code. Therefore, it’s fair to argue that source disclosure, or the lack thereof, has no meaningful impact on the operational security of our electronic voting machines. They’re broken. They need to be repaired.”
The next oddity is the claim that if a problem is found in open source software, it won’t get fixed as quickly, because you have to wait for “the community” to fix it. That completely misunderstands how open source software works. Again, Wallach points out how silly that is, noting that plenty of commercially-focused companies run open source projects, including maintaining and contributing code to the project. If these companies were to open source their code, there’s nothing stopping them from continuing to improve the security of the code. There’s no need to wait around… The paper has other problems as well, which Wallach discusses at the link above. To be honest, though, it’s quite telling that these firms don’t even seem to understand some of the basics of how open source software works.
–
OPEN SOURCE USED IN AUSTRALIA SINCE 2001 [SOURCE CODE]
http://www.elections.act.gov.au/elections/electronicvoting.html
http://www.elections.act.gov.au/downloads/evacs2008.zip
–
OPEN VOTING CONSORTIUM [SOURCE CODE]
http://www.openvotingconsortium.org/faq
http://evm2003.sourceforge.net/index.html
[youtube=http://www.youtube.com/watch?v=q8CSKdMTARY]
OVC TEST RUN
http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9111820&intsrc=hm_ts_head
Open-source e-voting gets LinuxWorld test run
BY Todd R. Weiss / August 6, 2008
Computer engineer Alan Dechert didn’t like what he saw during the controversial vote tallying in Florida in 2000’s presidential election. That was when he decided that there had to be a better way for U.S. citizens to safely and accurately cast their ballots. More than seven years later, Dechert is here at the LinuxWorld Conference & Expo, publicly displaying the open-source e-voting system he helped develop that fixes some of the problems that he and other critics found in the nation’s voting systems almost a decade ago. “I watched the 2000 election, and I was stunned that we didn’t know how to count ballots,” Dechert said.
In Florida, where paper punch-card ballots were used at the time in many counties, the nation watched in disbelief for weeks as the presidential election came down to the wire over punch cards that were analyzed individually and manually by voting officials. At issue was voter intent, as officials tried to decipher who voters had selected on the ballots, which often weren’t fully punched out by the machines that were supposed to mark the ballots. It took analysis of those ballots and a U.S. Supreme Court decision to finally decide the winner of that election, almost a month after the last polling place closed.
That December, Dechert co-founded the Granite Bay, Calif.-based Open Voting Consortium to try to help come up with a better way to vote in this country. “This was conceived as a pilot project for Sacramento County [Calif.] in December 2000,” he said. The idea was to create an electronic voting system that allows voters to make their candidate selections on a screen, then clearly print their ballots and have them scanned and tallied by reliable machines. By creating such a system, Dechert said, “there’s no ambiguity about what the voter intended,” fixing one of the most glaring problems of the old punch-card systems and poorly designed ballot layouts. The system, which was set up here at LinuxWorld for show attendees to view and vote in mock elections, runs on PCs loaded with Ubuntu Linux and the free, open source e-voting application created by the consortium.
For election officials, the system is a simple one that would allow voters to be sure of their choices before they leave the ballot-casting area, Dechert said. Officials could intuitively set up and create the ballot for any election with a special software tool that would add candidate names, office titles and other relevant information without requiring major computing skills. The application runs on standard PC architecture and requires no specialized equipment. “They don’t have to do anything special,” Dechert said of local election officials who would use the system. “They don’t have to know anything special.”
By going to an open-source system, he said, the application’s code could be carefully and publicly analyzed for flaws and security issues, then could be fixed and made trustworthy for use. At least, that’s the position of open-source advocates who think they can build a better system than those created by proprietary vendors across the nation. “What we’re trying to advance is full public scrutiny, with many eyes on the code,” Dechert said.
The open-source system aims to address several concerns about traditional vendor-supplied e-voting systems in use across the U.S., he said, including the following:
* By being open source, the code can be checked at any time for flaws or problems by any qualified programmer or developer, making it more transparent and trustworthy.
* By using off-the-shelf PC hardware and printers and other peripherals, it’s much cheaper than custom, purpose-built e-voting consoles and equipment.
* It’s usable by handicapped voters and by voters who speak languages other than English.
* It contains a voter-verifiable and fully auditable paper record that can be preserved and is recountable.
“It could be used now,” Dechert said. Some local voting jurisdictions are in talks with the group now about looking further at the system, including local officials in at least one Maryland county, he said. For use in national elections, the system would have to be heavily analyzed and eventually certified as an election system, Dechert said. That process is part of the group’s future goals, he said.
Here in San Francisco, for the system on display on the show floor, mock voters entered a booth and stood in front of a computer screen that lay flat in front of them on a table. The voters then used a traditional computer mouse to make their selections on the one-screen ballot and then advanced the ballot selections with on-screen arrows. Voters could also choose to go back to check or change their selections. After completing the ballots, participants were asked to confirm their candidate or referendum-question selections several times, then were able to print their ballots on a printer also in the voting cubicle. Each voter then put the printed paper ballot in a manila folder and walked it over to a nearby election official, who electronically tallied and scanned it in front of the voter.
More than 300 people tried out the system yesterday. Project organizers set up a ballot with the three major party candidates in this year’s presidential election, as well as several referendum questions about e-voting and other topical public issues. Dick Turnquist, an IT manager at the Association of California Water Agencies in Sacramento, test-voted on the proposed system and said he liked what he experienced. “It certainly was easy enough to use. I probably would prefer it” to existing e-voting systems, Turnquist said. Greg Simonoff, an engineer at the California Department of Transportation, said he liked using the system but would prefer a touch-screen voting mechanism rather than a mouse-based system. Dechert said the mouse-based system is being used in the demonstration phase of the project to cut costs but would be replaced with a touch-screen system in production.
CONTACT
Alan Dechert
http://www.openvotingconsortium.org/about_ovc
email : dechert [at] gmail [dot] com
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
MEANWHILE
GROSS NEGLIGENCE PAYS OFF
http://www.guardian.co.uk/technology/2009/apr/30/e-voting-electronic-polling-systems
Why machines are bad at counting votes
BY Wendy M Grossman / 30 April 2009
It’s commonly said that insanity is doing the same thing over and over again while expecting different results. Yet this is what we keep doing with electronic voting machines – find flaws and try again. It should therefore have been no surprise when, at the end of March, California’s secretary of state’s office of voting system technology assessment decertified older voting systems from Diebold’s Premier Election Solutions division. The reason: a security flaw that erased 197 votes in Humboldt county, California, in last November’s presidential election.
Clearly, 197 votes would not have changed the national result. But the loss, which exceeds the error rate allowed under the Help America Vote Act of 2002, was only spotted because a local citizen group, the Humboldt County Election Transparency Project (humtp.com) monitored the vote using a ballot-imaging scanner to create an independent record. How many votes were lost elsewhere? Humboldt county used Diebold’s GEMS operating system version 1.18.19 to tally postal ballots scanned in batches, or “decks”. The omission of votes was a result of a flaw in the system, where, given particular circumstances, it deletes the first deck, named “Deck Zero”, without noting it in the system’s audit logs.
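The class of flaw described, a tally loop that silently skips its first batch, is easy to illustrate. This is invented demonstration code, not GEMS’s actual logic; it only shows why an independent ballot count (like the Transparency Project’s) exposes the discrepancy immediately:

```python
def tally_decks_buggy(decks):
    """Tally votes per candidate, but silently skip "Deck Zero".

    A deliberately broken sketch of the *class* of flaw described
    above: under some condition the first batch is dropped from the
    total, and nothing is written to the audit log.
    """
    totals = {}
    for deck_number, ballots in enumerate(decks):
        if deck_number == 0:  # the bug: the first deck is never counted
            continue
        for choice in ballots:
            totals[choice] = totals.get(choice, 0) + 1
    return totals

def tally_decks(decks):
    """Correct tally over every deck, for comparison."""
    totals = {}
    for ballots in decks:
        for choice in ballots:
            totals[choice] = totals.get(choice, 0) + 1
    return totals
```

An independent rescan doesn’t need to know where the bug is: it only needs the total ballot count from both systems to disagree, which is exactly how Humboldt’s 197 missing ballots surfaced.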
Voting slips
Diebold recommended decertification of its older version, which should force precincts to upgrade and eliminate the flaw. But the secretary of state’s report notes flaws in the audit logs that will be harder to erase: wrongly recorded entry dates and times, and silent deletions of audit logs. “It’s nothing new,” says Rebecca Mercuri, a security consultant who studied voting systems for her 1999 doctoral dissertation. “These are all security flaws that are well known in the industry. Why are they acting as if this is the first time they’ve heard this?” The audit log problems were documented in Bev Harris’s 2004 book, Black Box Voting. Mercuri explains that election software belongs to the class of problems known as “NP-complete” – that is, problems for which no known algorithm produces an answer in a feasible amount of time. How much time have you got to test that a given voting system will function perfectly under all possible circumstances? “What are people going to do about it?” she asks. “Say we fixed it when it’s theoretically not possible to fix these things at any real level?”
So, it’s not fair just to pick on Diebold. Last month, election officials in Clay county, Kentucky, were charged with conspiring to alter ballots cast on ES&S iVotronic election machines in recent elections. The key: interface design. In most cases, voters cast ballots by pressing a big red button labelled “VOTE”. But some versions of the system require touching a “confirm vote” box on the screen to complete the ballot. It is alleged officials hid this fact from voters and would then “correct” and confirm the ballot after the voter had left. The officials have pleaded not guilty. Matt Blaze, a security researcher at the University of Pennsylvania, writes in his blog that if this were a strategy, “it’s a pretty elegant attack, exploiting little more than a poorly designed, ambiguous user interface, printed instructions that conflict with actual machine behaviour, and public unfamiliarity with equipment that most citizens use at most once or twice each year. And once done, it leaves behind little forensic evidence to expose the deed.”
But Diebold’s current problems aren’t limited to voting machines. More startling was the discovery of malware designed to attack its ATMs. Graham Cluley, a senior technology consultant for the security company Sophos, says the company found a sample in its archives. “If [the malware] were planted on the version of Windows on those Diebold machines,” Cluley says, “you could actually steal information from the cards being used on the device, and hackers with a specially crafted card would get a receipt with people’s information.” Diebold sent out a customer warning in January and provided a software update. As in the Kentucky voting machine case, the attack on Diebold’s ATMs requires inside access. “We’re seeing more and more organised criminal gangs because of the money they can make,” says Cluley, pointing out how difficult it would be to spot a legitimate maintenance engineer, bought off by a gang, installing an extra patch from a USB stick carried in a back pocket.
Black box recorder
For consumers, the problem is that both ATMs and voting machines are black-box technologies. You can count your cash and keep the receipt; but if someone else withdrew the money you can’t prove it wasn’t you. “It’s the same with voting,” Mercuri says. “You have no way to prove or disprove how you voted.” At least with voting, citizen groups are motivated to push for greater transparency. Jason Kitcat, Green councillor for Brighton and Hove, organised volunteers to observe e-voting trials in the 2007 local government elections in England and Scotland on behalf of the Open Rights Group. “We saw the same audit log issues,” he says. “We know from a computer science point of view that making an audit log that can’t be changed is impossible. But it seems as if there’s a huge disconnect between people who are computer-science literate, and the people delivering the policy.”
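Kitcat is right that no log can be made physically undeletable, but the GEMS complaints above (unrecorded deletions, a Clear button) describe something weaker: logs that can be silently altered in place. Hash chaining addresses that weaker problem by making tampering detectable. A minimal sketch of the idea in Python – my own construction, not GEMS’s or any vendor’s design:

```python
import hashlib
import json

class HashChainedLog:
    """Append-only audit log in which each entry's hash commits to the
    previous entry's hash, so any in-place edit breaks the chain."""

    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []          # list of (record, chained_hash)
        self._last = self.GENESIS

    def append(self, record: dict) -> str:
        payload = json.dumps(record, sort_keys=True)
        h = hashlib.sha256((self._last + payload).encode()).hexdigest()
        self.entries.append((record, h))
        self._last = h
        return h

    def verify(self) -> bool:
        """Recompute the chain; False means some entry was altered."""
        prev = self.GENESIS
        for record, h in self.entries:
            payload = json.dumps(record, sort_keys=True)
            if hashlib.sha256((prev + payload).encode()).hexdigest() != h:
                return False
            prev = h
        return True

log = HashChainedLog()
log.append({"event": "ballot_cast", "machine": "A1"})
log.append({"event": "ballot_deleted", "machine": "A1"})  # the kind of event GEMS failed to record
print(log.verify())   # True: chain intact

# Tamper with the second record while keeping its stored hash:
log.entries[1] = ({"event": "nothing_happened"}, log.entries[1][1])
print(log.verify())   # False: tampering detected
```

Wholesale deletion of the entire file remains possible, which is why such schemes are paired with off-machine copies of the latest hash; the point is that a “Clear” button or a quiet edit could no longer pass unnoticed.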
Besides, politicians like making uncontroversial decisions. Who could fault them for trusting a company that makes ATMs worldwide? Again, it comes back to humans. “The folks who buy ATMs [bank managers] and voting machines [election officials] don’t really want to pay for a facility that will make it easier for people to challenge them,” says Ross Anderson, a professor of security engineering at Cambridge University. “In the long run, of course, this ends up costing them more: fraud can lead to challenges that are systemic rather than local. Nevertheless, the purchasers may be rational. Most of the bank managers who bought crap ATM systems in the 80s are retired now – they got away with it. With voting machines, some vendors have been discredited in some countries, but lots of money has still been made.” That is, from us – the taxpayer and the bank customer.
–
R.I.P. JOHN GIDEON of DAILY VOTING NEWS + VOTERS UNITE
http://ncvoters.blogspot.com/2009/04/tributes-to-john-gideon-rip-national.html
http://www.votersunite.org/info/johngideon.asp