Chris Essig

Walkthroughs, tips and tricks from a data journalist in eastern Iowa


Final infographics project


For the last six weeks, I've taken a very awesome online course on data visualization called "Introduction to Infographics and Data Visualization." It is sponsored by the Knight Center for Journalism in the Americas and taught by the extremely talented Alberto Cairo. If you have a quick second, check out his website because it's phenomenal.

Anyways, we have reached the final week of the course, and Cairo had us all make a final interactive project. He gave us free rein to do whatever we wanted; we just had to pick a topic we were passionate about. Given that I was a cops and courts reporter for a year in Galesburg, Illinois before moving to Waterloo, Iowa, I am passionate about crime reporting. So I decided for my final project I'd examine states with high violent crime rates and see what other characteristics they have. Do they have higher unemployment rates? Or lower education rates? What about wage rates?

Obviously, this is the type of project that could be expanded upon. I limited my final project to just four topics, mostly because of time constraints. I work a full-time job, you know! Anyways, here's my final project. Let me know if you have any suggestions for improvement.


About:

Data: Information for the graphic was collected from four different sources, which are all listed when you click on the graphic. I took the spreadsheets from the listed websites, pulled out the figures I wanted and made a CSV for each of the four categories broken down in the interactive.

Map: The shapefile for the United States was taken from the U.S. Census Bureau's website. Find it by going to this link and selecting "States (and equivalent)" from the dropdown menu. I then simplified the shapefile by about 90 percent using this website. Simplifying makes the lines of the states' polygons less precise but dramatically reduces the size of the file. This is important because people aren't going to wait all day for your map to load.

Merging data with map: I then opened that shapefile with an awesome, open-source mapping program called QGIS. I loaded up the four spreadsheets of data in QGIS as well using the green “Add vector layer” button (This is important! Don’t use the blue “Create a Layer from a Delimited Text File” button). The shapefile and all the spreadsheets will now show up on the right side of the screen in QGIS under “Layers.”

Each spreadsheet had a column of state names, which needed to match the state names in the shapefile exactly. For instance, the FBI's crime data listed states in all caps, so I converted "IOWA" to "Iowa" before loading it into QGIS, because that is how the states are capitalized in the shapefile.
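Just to illustrate, the case fix is a one-liner in most languages. Here's a rough sketch in Javascript (the function name is mine, and you can just as easily do this cleanup in a spreadsheet):

// Convert an all-caps state name like "IOWA" or "NEW YORK" into
// "Iowa" / "New York" so it matches the shapefile's rows exactly.
function normalizeStateName(name) {
  return name.toLowerCase().replace(/\b[a-z]/g, function (c) {
    return c.toUpperCase();
  });
}

console.log(normalizeStateName("IOWA"));     // "Iowa"
console.log(normalizeStateName("NEW YORK")); // "New York"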

Then you can open the shapefile's properties in QGIS and merge the data from the spreadsheets with the data in the shapefile using the "Joins" tab. Finally, right click on the shapefile's layer, select "Save As" and export it as a GeoJSON file. We'll use this with the wonderful mapping library Leaflet.


Leaflet: I used Leaflet to make the map. It's awesome. Check it out. I won't get into how I made the map interactive with Javascript because it's copied very heavily from this tutorial put out by Leaflet. Also check it out. The only thing I did differently was make separate functions (mentioned in the tutorial) for each of my four maps, as sketched below. There is probably (definitely) a better way to do this, but I kind of ran out of time and went with what I had. If you're looking for the full code, go here and click "View Page Source."
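To give you a flavor of what those separate functions look like, here's a minimal sketch patterned on Leaflet's choropleth tutorial. The property name and color breaks are made up for illustration, not the actual values from my GeoJSON, and the sketch assumes a Leaflet map object and the GeoJSON data already exist:

// One of the four map functions, patterned on Leaflet's choropleth
// tutorial. The property name and breakpoints are hypothetical.
function getCrimeColor(rate) {
  return rate > 500 ? '#800026' :
         rate > 300 ? '#FC4E2A' :
                      '#FFEDA0';
}

function styleCrime(feature) {
  return {
    fillColor: getCrimeColor(feature.properties.violent_crime),
    weight: 1,
    color: '#fff',
    fillOpacity: 0.7
  };
}

// Each topic gets its own style function like this one; switching
// maps means removing one L.geoJson layer and adding another.
var crimeLayer = L.geoJson(statesData, { style: styleCrime }).addTo(map);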

Design: The buttons used are from Twitter's Bootstrap. I used jQuery's show/hide functions to show and hide all the elements on the page, including the DIVs for the legend, map and header.
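The show/hide logic is about as simple as jQuery gets. A sketch, with made-up element IDs:

// Show the crime view and hide the other three. The IDs here are
// hypothetical; the real page has DIVs for the legend, map and header.
$('#crime-view').show();
$('#unemployment-view, #education-view, #wages-view').hide();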

GeoJSON: The last thing I did was modify my GeoJSON file. You'll notice the top 10 states for violent crime rates are highlighted in black on the maps to make it easier to compare their characteristics across the maps. I went into the GeoJSON and moved those 10 states' features to the bottom of the file. That way they are loaded last on the map and thus appear on top of the other states. If you don't do this, the black outlines for the states don't show up very well and look like crap. Here's the GeoJSON file for reference.
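I reordered the file by hand, but the same trick can be scripted by sorting the features array before handing it to Leaflet. A sketch, assuming a hypothetical "top10" property on each feature (my file doesn't actually store one):

// Push the top-10 states to the end of the features array so their
// black outlines are drawn last and sit on top of the other states.
statesData.features.sort(function (a, b) {
  return (a.properties.top10 ? 1 : 0) - (b.properties.top10 ? 1 : 0);
});
L.geoJson(statesData, { style: styleCrime }).addTo(map); // style function from the sketch above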

Hopefully that will help guide others. If you have any questions, feel free to drop me a line. Thanks!

 

Written by csessig

December 11, 2012 at 12:13 am

Going mobile with our news apps


It's been about two months since we launched our crime map for the city of Waterloo, and so far, the response has been overwhelmingly positive.

I’ve been proud of our fans on Facebook, in particular, who are using the map to make informed opinions on crime in the area. One of our fans actually counted every dot on the map to counter another fan’s claim that crime only happens on the east side of town. He found crime was almost equally spread across the town.

In hopes that this data will be digested by an even larger audience, I launched the mobile/tablet equivalent of the map last week. The new app is fully responsive thanks to Twitter's Bootstrap framework. If you haven't checked out Bootstrap, you really should. It makes designing websites for all platforms — desktops, tablets, mobile phones — about as easy as it's going to get.

I also went in and modified a lot of the CSS, changing things like the widths of objects on the page from pixels to percentages. This ensures the app looks great no matter how wide the screen you're viewing it on.

Serving up the tablet version of the map wasn't particularly difficult. It's basically the same map on our site, minus the header and footer, which seemed to slow loading on iPads. It's also designed to be flexible regardless of how wide your tablet screen is.

The mobile version was more difficult. At this time, the mobile app does not have a map of the crimes. Instead, it’s just the color charts comparing different crimes and a table of all the crime data. I stripped out the map mostly because it’s difficult to click individual points on the map on small mobile screens. But screens continue to get bigger and nicer so hopefully this won’t be a problem in the future.

One pro tip: I set the padding-right CSS property on the table of crimes to 10 percent. This gives a nice cushion to the right of the table, making it easier for people to scroll past it on smartphones with touch screens.

For this project, I went about making the mobile version the completely wrong way: I opted to create just the desktop version at first and then go back and make one for tablets and phones.

Ideally I would have done both at the same time, which is easy to do with Bootstrap. And that's exactly what I did for another project we launched this weekend on campaign finance reports. The project examines the finance reports for candidates running in four local races. The reports are made available by the Iowa Ethics and Campaign Disclosure Board, which fortunately allows users to export the data into CSV files.

We broke down the data and created bar charts using Bootstrap to compare the figures. The framework has several progress bar options that make creating the charts easy, and they are worth checking out. The best part is they are designed to look good on all platforms.

We also have searchable databases of all the contributions. This allows readers to see exactly where the money is coming from.

For this project, I created the mobile and tablet equivalents of the app as I was creating the desktop version. When viewing it on a desktop computer, the app is embedded on our website, so it has the election header, footer and colors. But the app was created with responsive design in mind, so it will look just as good on a tablet or smartphone as it does on a desktop computer.

Many studies show that more and more people are getting their news on smartphones. It is imperative that we keep those readers in mind when designing full-scale apps for our websites.

Hopefully we can continue this trend at the Courier and make sure our projects are reaching our full audience.

Written by csessig

October 21, 2012 at 1:58 pm

How We Did It: Waterloo crime map


Note: This is cross-posted from Lee’s data journalism blog. Reporters at Lee newspapers can read my blog over there by clicking here.

Last week we launched a new feature on the Courier’s website: A crime map for the city of Waterloo that will be updated daily Monday through Friday.

The map uses data provided by the Waterloo police department. It’s presented in a way to allow readers to make their own stories out of the data.

(Note: The full code for this project is available here.)

Here’s a quick run-through of what we did to get the map up and running:

1. Turning a PDF into manageable data

The hardest part of this project was the first step: Turning a PDF into something usable. Every morning, the Waterloo police department updates their calls for service PDF with the latest service calls. It’s a rolling PDF that keeps track of about a week of calls.

The first step I took was turning the PDF into an HTML document using the command line tool pdftohtml. Mac users can install it by going to the command line and typing "brew install pdftohtml." Then run "pdftohtml -c (ENTER NAME OF PDF HERE)" to turn the PDF into an HTML document.

The PDF we are converting is basically a spreadsheet. Each cell of the spreadsheet is turned into a DIV by pdftohtml, and each page of the PDF is turned into its own HTML document. We will then scrape these HTML documents using the programming language Python, which I have blogged about before. The Python library that will allow us to scrape the information is Beautiful Soup.

The "-c" flag adds a bunch of inline CSS properties to these DIVs based on where they sit on the page. These inline properties are important because they help us pull the information we want out of the spreadsheet.

All dates and times, for instance, are located in the second column. As a result, all the dates and times have the exact same inline left CSS property of “107” because they are all the same distance from the left side of the page.

The same goes for the dispositions. They are in the fifth column and are farther from the left side of the page so they have an inline left CSS property of “677.”

We use these properties to find the columns of information we want. The first thing we want is the dates. With our Python scraper, we’ll grab all the data in the second column, which is all the DIVs that have an inline left CSS property of “107.”

We then have a second argument that uses regular expressions to make sure the data is in the correct format, i.e. numbers and not letters. We do this to make sure we are pulling dates and not text accidentally.

The second argument is basically an insurance policy. Everything we pull with the CSS property of "107" should be a date. But we want to be 100 percent sure, so we'll use regular expressions to confirm we're getting numbers and not a string of text.

The third column is the reported crimes. But in our converted HTML document, crimes are actually located in the DIV previous to the date + time DIV. So once we have grabbed a date + time DIV with our Python scraper, we will check the previous DIV to see if it matches one of the seven crimes we are going to map. For this project, we decided not to map minor reports like business checks and traffic stops. Instead we are mapping the seven most serious reports.

If it is one of our seven crimes, we will run one final check to make sure it’s not a cancelled call, an unfounded call, etc. We do this by checking the disposition DIVs (column five in the spreadsheet), which are located before the crime DIVs. Also remember that all these have an inline left CSS property of “677”.

So we check these DIVs with our dispositions to make sure they don’t contain words like “NOT NEEDED” or “NO REPORT” or “CALL CANCELLED.”

Once we know it’s a crime that fits into one of our seven categories and it wasn’t a cancelled call, we add the crime, the date, the time, the disposition and the location to a CSV spreadsheet.
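The real scraper is written in Python with Beautiful Soup (linked below), but the selection logic translates to any DOM library. Here's a rough sketch of the same idea in Javascript, using Node.js and the cheerio library; the file name, crime list and sanity-check regex are all placeholders:

// Sketch of the scraper's logic in Node.js with cheerio -- for
// illustration only; the real scraper is Python + Beautiful Soup.
var fs = require('fs');
var cheerio = require('cheerio');

var CRIMES = ['ASSAULT', 'BURGLARY', 'ROBBERY']; // placeholder list of the seven crimes
var SKIP = /NOT NEEDED|NO REPORT|CALL CANCELLED/;

var $ = cheerio.load(fs.readFileSync('calls-page1.html', 'utf8'));
var rows = [];

$('div').each(function () {
  var style = $(this).attr('style') || '';
  if (style.indexOf('left:107px') === -1) return; // column 2: date + time

  var dateTime = $(this).text().trim();
  if (!/\d/.test(dateTime)) return; // insurance policy: numbers, not letters

  var crime = $(this).prev().text().trim(); // column 3 sits in the previous DIV
  if (CRIMES.indexOf(crime) === -1) return; // only the seven crimes we map

  var disposition = $(this).prev().prev().text(); // column 5, left:677px
  if (SKIP.test(disposition)) return; // skip cancelled and unfounded calls

  rows.push([crime, dateTime, disposition]); // plus the location, in the real thing
});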

The full Python scraper is available here.

2. Using Google to get latitude, longitude and JSON

The mapping service I used was Leaflet, as opposed to Google Maps. But we need to geocode our addresses to get latitude and longitude coordinates for each point to use with Leaflet. We also need to convert our spreadsheet into JSON (JavaScript Object Notation), the format our map's Javascript will read.

Fortunately that is an easy and quick process thanks to two gadgets available to us using Google Docs.

The first thing we need to do is upload our CSV to Google Docs. Then we can use this gadget to get latitude and longitude points for each address. Then we can use this gadget to get the JSON file we will use with the map.

3. Powering the map with Leaflet, jQRangeSlider, DataTables and Bootstrap

As I mentioned, Leaflet powers the map. It uses the latitude and longitude points from the JSON file to map our crimes.
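Plotting the points is the easy part. A minimal sketch, assuming the JSON file is an array of objects with lat, lng, crime and date fields (the actual field names in my file may differ) and that a Leaflet map object already exists:

// Drop a marker on the map for every crime in the JSON file.
crimes.forEach(function (c) {
  L.marker([c.lat, c.lng])
    .bindPopup(c.crime + '<br>' + c.date)
    .addTo(map);
});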

For this map, I created my own icons. I used a free image editor known as Seashore, which is a fantastic program for those who are too cheap to shell out the dough for Adobe’s Photoshop.

The date range slider below the map is a very awesome tool called jQRangeSlider. Basically every time the date range is moved, a Javascript function is called that will go through the JSON file and see if the crimes are between those two dates.

This Javascript function also checks to see if the crime has been selected by the user. Notice on the map the check boxes next to each crime logo under “Types of Crimes.”

If the crime is both between the dates on the slider and checked by the users, it is mapped.
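Here's a rough sketch of that logic. The element IDs and field names are made up, and the sketch assumes the markers live in a Leaflet layer group; the real function (linked below) also rebuilds the table and bar charts:

// Re-plot only the crimes that fall inside the slider's date range
// AND have their checkbox ticked. IDs and field names are hypothetical.
var markerLayer = L.layerGroup().addTo(map);

function redrawMap(minDate, maxDate) {
  markerLayer.clearLayers();
  crimes.forEach(function (c) {
    var when = new Date(c.date);
    var checked = $('#check-' + c.type).is(':checked');
    if (checked && when >= minDate && when <= maxDate) {
      markerLayer.addLayer(L.marker([c.lat, c.lng]));
    }
  });
}

// jQRangeSlider fires a "valuesChanged" event whenever the handles move.
$('#slider').bind('valuesChanged', function (e, data) {
  redrawMap(data.values.min, data.values.max);
});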

While this is going on, an HTML table of this information is being created below the map. We use another awesome tool called DataTables to make that table of crimes interactive. With it, readers can display up to 100 records on the page or search through the records.
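Wiring DataTables to the table is basically a one-liner. The table ID is made up, and "iDisplayLength" is the record-count option from the old 1.x DataTables API:

// Make the plain HTML table sortable and searchable, showing up to
// 100 records per page. The table ID is hypothetical.
$('#crime-table').dataTable({
  iDisplayLength: 100
});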

Finally, we create a pretty basic bar chart using the Progress Bars made available by Bootstrap, an awesome front-end framework released by the people who brought us Twitter.

Creating these bars is easy: We just need to create DIVs and give them a certain class so Bootstrap knows how to style them. We create a bar for each crime that is automatically updated when we tweak the map.
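A sketch of what updating one of those bars looks like (the ID scheme and percentage math are mine; Bootstrap just needs a DIV with class "progress" wrapping a DIV with class "bar"):

// Resize one crime's bar as the map is tweaked. Bootstrap styles the
// markup; we just set the inner DIV's width as a percentage.
// Markup per bar: <div class="progress" id="bar-assault"><div class="bar"></div></div>
function updateBar(crime, count, total) {
  var pct = total ? Math.round(100 * count / total) : 0;
  $('#bar-' + crime + ' .bar').css('width', pct + '%');
}

updateBar('assault', 12, 48); // bar fills to 25%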

For more information on progress bars, check out the documentation from Bootstrap. I also want to thank the app team at the Chicago Tribune for providing the inspiration behind the bar chart with their 2012 primary election app.

The full Javascript file is available here.

4. Daily upkeep

This map is not updated automatically so every day, Monday through Friday, I will be adding new crimes to our map.

Fortunately, this only takes about 5-10 minutes of work. Basically I scrape the last few pages of the police's crime log PDF, pull out the new crimes, load them into Google Docs, get the latitude and longitude information, output the JSON file and put that new file on our FTP server.

Trust me, it doesn’t take nearly as long as it sounds to do.

5. What’s next?

Besides minor tweaks and possible design improvements, I have two main goals for this project in the future:

A. Create a crime map for Cedar Falls – Cedar Falls is Waterloo's sister city, and like the Waterloo police department, the Cedar Falls police department keeps a daily log of calls for service. They also post PDFs, so I'm hoping the process of pulling out the data won't be drastically different than what I did for the Waterloo map.

B. Create a mobile version for both crime maps – Maps don't work tremendously well on mobile phones, so I'd like to develop some sort of alternative for mobile users. Fortunately, we have all the data. We just need to figure out how best to display it for smartphones.

Have any questions? Feel free to e-mail me at chris.essig@wcfcourier.com.

Data journalism resources


Note: This is cross-posted from Lee’s data journalism blog. Reporters at Lee newspapers can read my blog over there by clicking here.

Okay, this is going to be a quick blog post since we and every other Lee newspaper are switching our sites over to Templates 2.0. Here are a couple of neat websites I've run across in the past couple of weeks that may help with your next data story and/or visualization:

1. Data Journalism Handbook

Want to know about the importance of data journalism today and what is possible for you to do at your own newspaper? Then this handbook is a must read. One of the great aspects of the book is it shows how data can be used not only to make impressive graphics but also great stories. And best of all it is free and growing.

2. DataVisualization: Selected tools

Here’s a great resource for many of the tools used by data journalists across the world. For instance, Google Fusion Tables is the go-to service for map-makers, while Google Refine is a great program for cleaning data to use with maps, visualizations or just stories. Also check out ColorBrewer to help you find great color patterns.

One Javascript library worth checking out is D3.js, which has been used by the New York Times to create some jaw-dropping visualizations. Also worth noting are two blog posts from software developer Jim Vallandingham. One deals with making bubble charts with D3.js. The second shows how to make the library work with past versions of Internet Explorer. This is a must-read for sites like ours that have a high percentage of readers who use Internet Explorer.

3. Data Stories

If you are like me, you love listening to podcasts, especially after work when your eyes can no longer stare at your computer screen. This site features a series of podcasts on data visualization. Topics include how to learn data visualization, when to use animated graphics and other advice from people smarter than I will ever be. So check it out if you are looking for something new to listen to while you hit the gym, bike trail or couch.

Written by csessig

June 5, 2012 at 2:00 pm

Courses, tutorials and more for those looking to code


Note: This is cross-posted from Lee’s data journalism blog. Reporters at Lee newspapers can read my blog over there by clicking here.

Without a doubt, there is an abundance of resources online for programmers and non-programmers alike to learn to code.

This, of course, is great news for journalists like us who are looking to use programming to make visualizations, scrape websites or simply pick up a new skill.

Here’s a list of courses and tutorials I’ve found in the last couple months that have either helped me personally or look very promising:

1. Codecademy

Is 2012 the year of code? The startup service Codecademy sure thinks it is. They have made it their mission to teach everyone who is willing how to code within one year. The idea was so intriguing that the New York Times ran a front page story (at least online) on it.

Basically, users create an account with the service, and every week they are sent new exercises that will teach them how to code. The first exercises focused on Javascript. Now, users are moving into HTML and CSS. Each exercise takes a couple hours to complete and builds off the previous week's exercises. And best of all, it's FREE.

If you are a huge nerd like me, you’ll gladly spend your free time completing the courses.

2. Coursera

Want to take courses from Stanford University, Princeton University, University of Michigan and University of Pennsylvania for free? Yeah, I didn’t really think it was possible either until I found Coursera, which offers a wide variety of courses in computer science and other topics.

Right now, I am enrolled in Computer Science 101, which is a six-week course that focuses on learning the basics. Each week, you are e-mailed about an hour of video lectures, as well as exercises based on those lectures. There is also a discussion forum where you can meet your peers. This isn't nearly as time-consuming as Codecademy, which might be appealing to some.

3. Udacity

Like Coursera, Udacity offers a number of computer science classes on beginner, intermediate and advanced topics. The classes are also based on video lectures put together by some very, very smart people. I have not used this service, however, so I can't speak to it too much. It looks promising, though. And who wouldn't want to learn how to program a robotic car?

4. Code School

This service offers screencasts on a host of topics like Javascript, jQuery, Ruby, HTML, CSS and more. The downside, however, is that this service costs money: $20 a month or $55 a screencast. If you are looking to try it out, check out their free beginner's screencast on the Javascript library jQuery, which is the best beginner's introduction to jQuery I've seen. They also have a free screencast for the Ruby programming language.

5. PeepCode

If you are looking for screencasts but are on a tighter budget, check out PeepCode and their list of programming screencasts. Each is about $12, is downloadable and typically includes source code for the programs to help you follow along at home. One of my favorites is "Meet the Command Line," which will get you started with the Unix command line. Be warned, though: some of their screencasts are geared towards more advanced users, and a good understanding of programming is recommended before diving in (an exception is the command line tutorial mentioned above).

6. Net Tuts+

Many of the tutorials on this site are geared towards programmers wanting to learn very specific things or solve specific problems. This tutorial, for instance, runs through how to make borders in CSS. And this one deals with Vim, a text editor that runs in the command line. So if you have a particular problem but don't have a ton of time to sit through video tutorials, you might want to check out this site's extensive catalog.

7. ScraperWiki

Web scraping is a great skill for journalists to have because it can help us pull a large amount of information from websites in a matter of seconds. If you are looking for a place to start, check out some of the screencasts offered by ScraperWiki, a service that specializes in — you guessed it — web scraping.

8. Coding blogs

The number of blogs out there devoted to coding and programming is both vast and impressive. Two of my favorites are Life and Code and Baby Steps in Data Journalism. Both are geared towards journalists. In fact, many of the sites I listed here were initially posted on one of these blogs.

– Got a cool website that has helped you out?

I’d love to hear about it! Feel free to leave a comment or e-mail me at chris.essig@wcfcourier.com

Written by csessig

May 3, 2012 at 8:22 am