Politics in Chicago is a rough-and-tumble sport, rivaled for mindshare only by the city's beloved Bears. Since Mayor Richard M. Daley announced that after 21 years in office he wouldn't seek re-election, the ensuing mayoral race has proven wildly contentious, with both outsiders and machine politicians vying to fill the power vacuum left by the departing imperial mayor. Frustrated with the existing online news and information sources, Dan Sinker of Columbia College Chicago built the Chicago Mayoral Election Scorecard, aggregating news, social media chatter, and data about the candidates in one central location. With frontrunner Rahm Emanuel's eligibility up for debate in the Illinois Supreme Court, the site has become an essential online resource for voters. I interviewed Dan (disclosure: a friend and former editor) about the genesis of the project and the open-source technologies he used to build it.

What inspired the idea for the Chicago Mayoral Election Scorecard?

Purely one of those things that I really wanted to exist. In the days following Daley's announcement that he wasn't going to run for reelection, there was a continual stream of news about people announcing they were in, or thinking about getting in, or rumored to be in. It was really, really confusing, and hard to keep everything straight. There was one news organization, a small outfit called Progress Illinois, that was keeping track, but just as an embedded Google Doc, and it was really hard to parse as the list grew longer and longer. So I thought, "well, there's the raw data, what can I do with it?" And I figured out how to import their document and present it in a way that was a lot easier to read: color-coded based on who was in, considering, rumored, or out. That was version one of the scorecard. It was pretty straightforward then.
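The version-one approach Dan describes can be sketched in a few lines: take candidate rows pulled from the shared spreadsheet and color-code each one by status. The status labels match the four he mentions, but the field names and hex colors here are illustrative assumptions, not the Scorecard's actual values.

```javascript
// Map each candidate status to a display color (colors are placeholders).
const STATUS_COLORS = {
  in: "#2e7d32",          // officially running
  considering: "#f9a825", // weighing a bid
  rumored: "#90a4ae",     // speculation only
  out: "#c62828",         // declined or withdrawn
};

// Normalize a spreadsheet row into a renderable record,
// tolerating stray whitespace and inconsistent capitalization.
function classifyCandidate(row) {
  const status = String(row.status || "").trim().toLowerCase();
  return {
    name: row.name,
    status,
    color: STATUS_COLORS[status] || "#cccccc", // fallback for unknown labels
  };
}
```

Keeping the spreadsheet as the source of truth means updating the site is just a matter of editing rows; the page re-derives the color coding on every load.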

What service does the Scorecard provide? Is it a streamlined public data resource? An exercise in data-driven journalism? What would you like users to get out of it?

It gives a heads-up display on the election based on information that exists on the Internet, but collected in one place. It allows for a quick look at who is actually in the race and how they're polling, and then, through some interface niceties, it lets you take a deeper dive into each candidate: see their social stream, get more news about them, etc. It's not public data per se; it's just collecting a lot of disparate threads from around the Internet into one place, and doing it in a way that's kind of pretty. Hopefully users get decent information easily and engage more fully in what is a very important race.

Photo of Dan Sinker by Rachel Morris, courtesy of Columbia College Chicago

What technologies/platforms are you using on the backend? Where is the data coming from and how are you pulling it all into the site?

The whole thing is built on top of a single Google Doc; that's what's driving the entire site (actually, the polls are a separate Doc now, and the contribution information is a whole different thing, more on that in a bit). Updating the site is just a matter of updating the Doc. The site itself is pretty simple: it's a single HTML page bolstered with *a lot* of JavaScript to import the Doc (actually, it's passed through Yahoo Query Language just to clean up the formatting), then parse it and operate on it in different ways. The page eventually pulls from Twitter, Google News, and Facebook to bring in different bits of information for each candidate, as well as a separate spreadsheet chart for the new polling box and a separate Google News search for general election news in the new news box. But at its core, it's a single page. It's hosted on Google's App Engine infrastructure, so it's entirely free to operate. Every single tool used is free, and all the JavaScript is open source.
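The YQL step Dan describes amounts to wrapping the spreadsheet's public feed in a query against Yahoo's endpoint so the page gets back clean JSON. A minimal sketch of building that request URL follows; the spreadsheet key is a placeholder, and since YQL has since been retired by Yahoo, treat this as a historical illustration rather than a working endpoint.

```javascript
// Build a YQL request URL that fetches and re-serves a JSON feed.
// The YQL public endpoint shown here was retired by Yahoo in 2019.
function buildYqlUrl(feedUrl) {
  const query = 'select * from json where url="' + feedUrl + '"';
  return (
    "http://query.yahooapis.com/v1/public/yql?q=" +
    encodeURIComponent(query) +
    "&format=json"
  );
}

// Placeholder key: a published Google Spreadsheet's list feed.
const feed =
  "https://spreadsheets.google.com/feeds/list/SPREADSHEET_KEY/od6/public/values?alt=json";
const url = buildYqlUrl(feed);
```

The page could then request `url` with a plain XMLHttpRequest or JSONP callback and parse the normalized result, rather than wrestling with the spreadsheet feed's raw shape directly.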

One of the latest features on the site is a searchable Google Map overlay of campaign contributions. How did you build that?

It was a little complicated. I had to download the contribution databases for each candidate from the Illinois Board of Elections, then upload them to Google Docs and reformat them so that the addresses (originally broken into five separate columns) could be geocoded. From there, the Doc was imported into Google Fusion Tables and geocoded. That created the raw maps. Then it was back to HTML and JavaScript to build the actual page. The page actually loads and unloads four separate maps dynamically, but it all looks like one. Tricky, that.
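The reformatting step, collapsing five address columns into one string a geocoder can resolve, can be sketched like this. The column names are assumptions for illustration; the Board of Elections export uses its own headers.

```javascript
// Join split address columns into a single geocodable string,
// skipping any column that is empty (e.g. a blank second address line).
function joinAddress(record) {
  return [record.street, record.street2, record.city, record.state, record.zip]
    .map((part) => (part || "").trim())
    .filter((part) => part.length > 0)
    .join(", ");
}
```

Run over every row before import, this yields one clean address column that Fusion Tables can geocode in a single pass.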

Say a coder wants to build something like this in their own community–where would they start?

The easiest thing to do would simply be to view source on the page: it's all there! The document is also linked at the bottom, so it's easy enough to see what's being pulled as well.

You've made some comments on Twitter about the…rather lacking…online presence of some of the candidates. What were those candidates missing about the importance of engaging with voters online?

The final six candidates are all pretty good on the web at this point. But back when there were 20, it was kind of staggering how little web presence some of the top-tier candidates had, particularly Danny Davis, who was in the race until New Year's Eve and never had a working website, Twitter account, or official Facebook page.

Paul M. Davis

Paul M. Davis tells stories online and off, exploring the spaces where data, art, and civics intersect. He currently works with a number of organizations including Pivotal and
