The FBI will release its first full report on crime trends in the United States today based exclusively on its new incident reporting system. The report traditionally drops the last Monday of September, but it did not appear last week.
This new National Incident-Based Reporting System will be fertile ground for those who want to distort or exaggerate crime trends for political or commercial reasons. And it lays bare a dirty secret about counting crime in the United States: As a nation, we keep horribly incomplete data that makes it impossible to get an accurate sense of the scope or impact of crime.
Describing local crime trends with authority and context is the single biggest challenge for newsrooms across the United States seeking to improve their coverage of public safety. I’ve worked with more than 50 newsrooms attempting to make their reporting more helpful to news consumers. These newsrooms serve large metropolitan markets and tiny rural communities. They include newspapers, local TV, public radio, and digital startups.
None of them, not one, felt like they could get consistent crime data from the agencies they cover.
Take, for example, The Philadelphia Inquirer, which is trying to reimagine its crime coverage in the midst of a rise in gun violence. Like most big-city departments, the Philadelphia Police Department posts current crime stats, going back to 2007.
But those are raw counts that do not account for Philadelphia’s population. Nor do they go back to the 1990s, when crime was much higher.
On top of that, the Inquirer covers seven counties in Pennsylvania and New Jersey where most agencies are not as committed to publishing statistics.
“The end of Uniform Crime Reporting (UCR) blinded us to much of what was happening in many of them,” said Dylan Purcell, data reporter at the Inquirer. “The state police recently relaunched their lookup tool but it’s got issues. The transition from UCR to NIBRS is a setback for crime reporters all around the country.”
Crime data in the U.S. sometimes makes cities seem more violent than they are by excluding suburbs that sit just across municipal borders. St. Louis, Missouri, is almost always at the top of the list of cities with the highest murder rate, usually around 60 murders for every 100,000 residents. But when you look at the metropolitan area, St. Louis’ rate falls to 13 murders for every 100,000 residents. (This New York Times article explains the misleading phenomenon.)
Getting accurate insights into other crimes that frequently hit the news cycle is even harder. As an example, Inquirer researcher Ryan Briggs said he was hoping to bring more insight to a sensational story last week about a flash mob ransacking a Wawa. Briggs could not quantify the extent of the problem in Philadelphia, because the data on shoplifting and other theft trends was opaque.
“The summary data we have access to doesn’t differentiate between high-value and low-value thefts,” he said. “It’s all just ‘theft.’”
This is a great example of the failure of law enforcement to provide more detail about crime trends, said Laura Bennett, director of the Center for Just Journalism, a nonprofit organization dedicated to reforming the way journalists cover public safety. Bennett is a criminal legal policy expert who has led data analysis projects.
Across the country, journalists have done standalone stories about brazen, large-scale shoplifting, mostly after police release videos or other evidence. But figuring out whether smash and grabs are truly on the rise is harder.
“The deeper you search for real, objective evidence of an accelerating retail crime wave, the more difficult it is to be sure that you know anything at all,” Amanda Mull reported in The Atlantic last December.
That doesn’t mean that large smash and grabs aren’t concerning. But the lack of data makes it easy for police to get a lot of coverage of the sensational crime without proper context.
Fear sells. It is used by politicians, influencers, even journalists to convince people they are less safe than they really are. It falls on journalists to fact-check distorted claims, rather than amplify them.
All this puts a disproportionate weight on the annual FBI data release, Bennett said. The Center for Just Journalism just released a guide to help reporters see the data more clearly and use it responsibly.
Here are just a few of the problems journalists should be aware of as they use the FBI data:
A new system: For several years the FBI has maintained two reporting systems when releasing its annual Uniform Crime Report — the old Summary Reporting System (SRS) and the newer National Incident-Based Reporting System (NIBRS) — while encouraging law enforcement agencies to move to the newer system. SRS required that law enforcement count a series of related crimes under the single, most serious crime. If someone robbed a bank, stole a car and assaulted a witness, only the most serious crime (robbery) would be counted. The new system allows up to 10 crimes to be logged.
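The difference between the two counting rules can be sketched in a few lines. The severity ranking below is a simplified stand-in for the FBI’s actual hierarchy rule, for illustration only:

```python
# Simplified severity ranking (1 = most serious). The real SRS
# hierarchy rule is more detailed; this is just an illustration.
SEVERITY = {"murder": 1, "rape": 2, "robbery": 3, "aggravated assault": 4,
            "burglary": 5, "motor vehicle theft": 6, "larceny": 7}

def srs_count(incident_offenses: list[str]) -> list[str]:
    """SRS hierarchy rule: report only the single most serious offense."""
    return [min(incident_offenses, key=SEVERITY.get)]

def nibrs_count(incident_offenses: list[str]) -> list[str]:
    """NIBRS: log each offense in the incident, up to 10."""
    return incident_offenses[:10]

incident = ["robbery", "motor vehicle theft", "aggravated assault"]
srs_count(incident)    # only 'robbery' is counted
nibrs_count(incident)  # all three offenses are counted
```

One consequence: moving an agency from SRS to NIBRS can raise its offense totals even if nothing about actual crime changed.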
This pending report will be the first time the FBI releases data exclusively from the newer reporting system, even though some agencies have been using it since its debut in 1991. As a result, it may seem like crime is rising, and it will be tricky, at best, to compare the new data to the previous year for cities that were previously submitting data in the now obsolete format. It will also be easy for those who want to confuse the public or sow fear to create disinformation.
Incomplete data: A full third of all the police agencies in the U.S. did not submit any data to the FBI for this report, including some of the largest cities in the country. As a result, the FBI will fill in the blanks with estimates as it calculates national, state and regional crime trends and provide confidence intervals for those estimates. In short, each crime statistic published by the FBI will be presented as a range of possible values rather than as a specific number. Reporters using the data will have to check the notes to see if trends for their local areas are based on actual data or estimates. If estimates are involved, it will be critical to include a clear caveat describing the margin of error.
Old data: The FBI data reflects crimes reported in 2021. That means the freshest data reflected in the report is nine months old and the oldest data is closer to two years old. Yet it is often presented as current, because it is the most current available data.
Narrow definition of crime: The FBI data release focuses on seven crimes: murder, rape, robbery, aggravated assault, burglary, larceny and motor vehicle theft. These are the traditional crimes around which most police agencies are organized. But they are not the crimes that affect the most people, Bennett said. Wage theft and environmental crimes affect at least as many people. But because they are not investigated by traditional police, they are not counted. Likewise, tax evasion accounts for more stolen dollars than any other kind of theft, but it’s investigated by the IRS, not local law enforcement.
Also missing from the FBI numbers are most crimes committed by members of law enforcement. “It’s not a small thing,” Bennett said. “There are many acts of illegal use of force. If you look at DOJ investigations, it shows really widespread assault by police. But it doesn’t show up in crime data.”
Finally, most of the crimes committed in prisons and jails do not show up in the FBI data.
Unaudited and unreliable: There are no checks and balances on the gathering of this data. No third party routinely audits law enforcement agencies to ensure they are counting crimes accurately. As a result, there are inconsistencies from agency to agency about what counts as a reported crime. And there are numerous examples of police using bureaucratic smokescreens to deliberately manipulate crime numbers — sometimes to make crime seem scarier, sometimes to make it seem like less of a problem, sometimes to make it seem like they are solving more crimes, and sometimes to undercount specific types of crimes.
All of these flaws in the data put journalists in a tough spot. This past summer, Poynter led 44 U.S. newsrooms through a 14-week course to improve their coverage of public safety. The newsrooms sought out the course and had permission from their top leadership to reimagine their coverage.
The journalists described being caught between opposing forces: an audience that is highly interested in understanding the implications of local crime on their personal safety and a culture of law enforcement that is eager to release information about individual crimes but reluctant to compile contextual data.
This near-universal combination — a big audience appetite for any information about crime and a dearth of meaningful data — makes the annual FBI report both valuable and ripe for misuse. In addition to paying attention to all of the shortcomings detailed above, here are three more tips for using the crime data:
- Framing stories accurately includes using historical context and noting the shortcomings and limitations of the data. This means double-checking your conclusions with independent researchers like Princeton University’s Jacob Kaplan.
- Include the voices of those most impacted by crime. So many crime trend stories are either deliberately or inadvertently framed as fearmongering that encourages people to avoid certain neighborhoods or communities. Rather than talking about those places, talk to the people in those neighborhoods who are organizing the community’s response or trying to solve the problems.
- Research and include information about solutions that are known to work. The Research and Evaluation Center at the John Jay College of Criminal Justice compiles studies about crime solutions that work.
Among local news organizations, crime coverage is often profoundly flawed. It amplifies bias, focuses on specific instances of violence and ignores or exaggerates overall trends. In short, it causes more harm than good.