July 9, 2020

This piece originally appeared in Local Edition, our newsletter devoted to telling the stories of local journalists. Want to be part of the conversation? You can subscribe here.

Every morning, I open up the Tampa Bay Times on my phone and scan the headlines on my way to the same place – the latest Florida coronavirus data by ZIP code.

I hold my breath, type in my ZIP code and gasp when I see the day’s numbers. Today, one in every 224 people has been infected with the coronavirus. The distance between those two numbers keeps shrinking.

When I drove back and forth to the house my sister rented in a nearby county last month, I used it. When I think about going to a beach or park, I use it. And as I’m trying to figure out what on earth to do with my kids when school starts, I’ll keep using it.

I’ve seen projects like this from local newsrooms across the country. They harness data and information and offer context. They make it easy to understand the river of numbers running past us all the time. They push local and state authorities to offer more and better information. And, for me at least, they keep me connected with the reality of what’s happening at the local level. I spoke with the creators of a few of those projects this week via email about what they made, how they made them and what they’ll build next.

Here’s what I heard from the Tampa Bay Times, which Poynter owns and which is a partner in a fellowship I’m working on; from CalMatters, with a hospitals tracker; and, because the coronavirus has changed everything about how we live, from The Boston Globe, with a brilliant way to keep people connected when they can’t be together.


Screen shot, Tampa Bay Times

Tampa Bay Times’ ZIP code tracker 

Created by Eli Murray and Langston Taylor, with design by Eli Zhang and translation by Juan Carlos Chavez

How did you decide to build this?

Murray: It started as a scraper we built to keep track internally of the changing number of cases and deaths. It was simple in the beginning; the scraper just went out and downloaded the data from the state dashboard every 10 minutes and would send an alert to the newsroom Slack channel whenever the data changed. After the scraper had been running for a month or so, Langston and I decided that we wanted to make our data available to everyone and began putting together our own dashboard off the state’s data.
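The workflow Murray describes — download the state data on a fixed interval, detect a change, ping a Slack channel — can be sketched in a few lines. This is a minimal illustration, not the Times’ actual code; the data URL and Slack webhook below are placeholders.

```python
import hashlib
import json
import time
import urllib.request

# Placeholder endpoints -- the real scraper pointed at Florida's state
# dashboard and the newsroom's own Slack webhook.
DATA_URL = "https://example.com/florida-covid-data.json"
SLACK_WEBHOOK = "https://hooks.slack.com/services/PLACEHOLDER"


def fingerprint(payload: bytes) -> str:
    """Hash the raw payload so any change in the data is detectable."""
    return hashlib.sha256(payload).hexdigest()


def notify_slack(message: str) -> None:
    """Post a short alert to the newsroom Slack channel."""
    body = json.dumps({"text": message}).encode()
    req = urllib.request.Request(
        SLACK_WEBHOOK, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)


def poll_forever(interval_seconds: int = 600) -> None:
    """Download the dashboard data every 10 minutes; alert on change."""
    last_seen = None
    while True:
        payload = urllib.request.urlopen(DATA_URL).read()
        current = fingerprint(payload)
        if last_seen is not None and current != last_seen:
            notify_slack("State COVID-19 data changed -- new numbers are up.")
        last_seen = current
        time.sleep(interval_seconds)
```

Hashing the raw payload rather than parsing it first is a common shortcut for change detection: any edit to the source file, even a reordered column, flips the hash.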

What did it take?

Aside from the custom code we had to write to scrape the data, the biggest challenge for me has been cleaning up the data. The state’s dashboard has been shaky at various times over the past few months — columns being added or removed from the source data, the site going down for a day, the site being put behind a login that we don’t have access to, etc. — so writing the scraper so that it doesn’t fall over when there’s an issue with the state dashboard has been its own full-time job.
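Making a scraper survive those failure modes mostly means refusing to crash: catch network errors, validate that the columns you depend on are still there, and fall back to the last good data when anything looks wrong. A minimal sketch of that defensive pattern, with illustrative column names rather than the state’s actual schema:

```python
import csv
import io
import logging
import urllib.error
import urllib.request

# Columns the downstream dashboard relies on; these names are
# illustrative, not Florida's actual schema.
REQUIRED_COLUMNS = {"zip_code", "cases", "deaths"}


def fetch_rows(url: str):
    """Fetch the state CSV, returning None instead of crashing when the
    source is down, moved behind a login, or missing expected columns.
    The caller keeps serving the last good data on a None result."""
    try:
        with urllib.request.urlopen(url, timeout=30) as resp:
            text = resp.read().decode("utf-8")
    except (urllib.error.URLError, TimeoutError) as exc:
        logging.warning("Fetch failed (%s); keeping last good data", exc)
        return None

    rows = list(csv.DictReader(io.StringIO(text)))
    if not rows or not REQUIRED_COLUMNS <= rows[0].keys():
        logging.warning("Schema changed or empty file; keeping last good data")
        return None
    return rows
```

The key design choice is that every failure path returns `None` rather than raising, so one bad morning at the state dashboard never takes the tracker down with it.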

What’s been the response/result?

The feedback I’ve gotten so far has been positive. We’ve got some loyal readers who have bookmarked the site and like to check back daily, which is awesome and exactly what I had hoped our dashboard would become.

What are you working on next?

We’ve got a few more features we’re working on adding to the tracker. First, some backend upgrades to make it quicker to update (right now it takes about an hour for the tracker to update with new data from the state). We have also been tracking hospital and ICU beds in 10-minute intervals since early April, so we are going to be adding pages that show historical and current capacity trends for each hospital in the state.


Screen shot, CalMatters

CalMatters’ coronavirus hospitalizations tracker

Created by John Osborn D’Agostino and Lo Bénichou

How did you decide to build this? / What did it take?

Bénichou and D’Agostino: When the pandemic first started, many of the COVID-19 dashboards and analyses examined cases, deaths and testing. There wasn’t a good dashboard analyzing hospitalizations at the county level. In fact, the state made it difficult in the beginning to do any meaningful data analysis of hospitalization rates since, at the time, the only data source was a Tableau dashboard where you couldn’t download the data. 

On the first day the state released its hospitalization dashboard, we decided this information was incredibly important to understanding the spread of COVID-19, especially as testing was sporadic and unreliable at the time. We were ramping up our virus coverage and wanted to approach angles other publications weren’t yet pursuing, so we decided to take ownership of the hospitalization data early on.

So we started manually collecting the hospitalization data each day, publishing it in a machine-readable format through our GitHub account, and began constructing a dashboard.

Version 1.0 of the dashboard used JavaScript to pipe data manually entered into a Google spreadsheet into a JSON endpoint for our visualizations. The initial graphics were built with Highcharts.js, with iteration in mind.
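The spreadsheet-to-endpoint step of that version 1.0 pipeline is easy to sketch: read the hand-entered rows out of a published sheet’s CSV export and reshape them into the JSON the charts consume. This is a hypothetical illustration, not CalMatters’ code; the sheet URL and column names are invented.

```python
import csv
import io
import json
import urllib.request

# Hypothetical published-sheet URL; a real sheet ID would go here.
SHEET_CSV_URL = "https://docs.google.com/spreadsheets/d/SHEET_ID/export?format=csv"


def sheet_to_json(csv_text: str) -> str:
    """Turn hand-entered spreadsheet rows into chart-ready JSON,
    keeping confirmed and suspected patients as separate series."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    series = {
        "dates": [r["date"] for r in rows],
        "confirmed": [int(r["confirmed_patients"]) for r in rows],
        "suspected": [int(r["suspected_patients"]) for r in rows],
    }
    return json.dumps(series)


def build_endpoint(path: str = "hospitalizations.json") -> None:
    """Fetch the sheet and write the JSON file the dashboard loads."""
    csv_text = urllib.request.urlopen(SHEET_CSV_URL).read().decode("utf-8")
    with open(path, "w") as f:
        f.write(sheet_to_json(csv_text))
```

Keeping confirmed and suspected patients as separate series in the JSON is what lets the front end give each its own color scheme, as the version 2.0 redesign did.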

For version 2.0 of the dashboard, we revamped the design and created a new visual hierarchy by differentiating confirmed and suspected patients. We used two color schemes to distinguish the two categories and added new visual elements like a heatmap and in-depth county charts. We also updated the data pipeline to use the state’s API rather than manually inputting the data from the Tableau dashboard.

What’s been the response/result?

Really positive. The dashboard has been in our top three most trafficked pages since we released it in early April. We’ve gotten so much constructive feedback from readers, doctors and data scientists, which has shaped iterations of the dashboard over time.

In the early days, we had many doctors and public health officials tell us how invaluable the dashboard had been, since no one else was tackling that data at the time, and we had several instances of people wanting to feed our data into models they were working on.

What are you working on next?

The dashboard is always a work in progress. We’re hoping to eventually add bed capacity per county and will be updating the visualizations as new data is available.


Screen shot, The Boston Globe

The Boston Globe’s LiveGuide 

Created by Matt Stempeck, Jon Jandoc, Stefan Fox and Margo Dunlap

How did you decide to build this?

Veronica Chao, deputy managing editor, living/arts: Matt Stempeck, formerly of MIT’s Media Lab, had done some work with the Globe. In early May, Matt pitched the idea of building a guide to all the digital events popping up while venues were closed due to the pandemic. He was inspired by the old scrolling screens you used to see on the TV Guide network and Prevue Channel. The vision was a list of channels, each devoted to a different interest area, from sports to arts to politics, populated with links to “platform-agnostic livestreams from around the Internet.” We liked the idea of helping our site visitors navigate their viewing experience during the pandemic. The LiveGuide, as we came to call it, would also give them another way to find our reporters’ and critics’ picks. We decided to test it.

What did it take?

Matt built the tool quickly, and Globe staff from various beats and departments took responsibility for filling some of the channels daily with as much recommended content as we could gather. We tested it and worked with Matt to figure out some additional sites to scrape for content and which channels made the most sense to maintain. Matt made the whole thing flexible so that the channels can change from day to day, depending on where we have recommendations. Then we launched, all in about a month.

What’s been the response/result?

We’ve seen some nice engagement with it. It was particularly useful when everyone was inside, staring at their screens much of the time. Now that it’s summer, with more of Massachusetts opening up and people eager to spend time outside, we’ll decide how long it makes sense to keep it going. It was proof that we could get a new feature up and running relatively quickly, working across beats, to respond to how readers are living now.


Kristen Hare covers the business and people of local news for Poynter.org and is the editor of Locally. You can subscribe to her weekly newsletter here. Kristen can be reached at khare@poynter.org or on Twitter at @kristenhare.

