

The DIY Effort to Count Who Police Kill

Police-tracking sites put their official counterparts to shame. Can the DOJ learn from them?


Katelyn Fossett is a web producer for POLITICO Magazine.

In the ten months since a national debate over policing erupted following officer Darren Wilson’s fatal shooting of Michael Brown in Ferguson last August, perhaps the most surprising development has been the revelation that the nation has no idea how many citizens are killed—justifiably or not—by law enforcement every year. “It’s ridiculous that I can’t tell you how many people were shot by the police last week, last month, last year,” FBI Director James Comey told a group of Georgetown students in February, in a rare moment of candor from America’s top cop.

The FBI keeps track of all kinds of specific crime stats: It can tell you how many coin-operated machines were broken into and the number of bicycles stolen each year, for instance. So why not police shootings?

Into this void has stepped a home-grown group of bloggers and online data analysts who track police shootings better than the government does—a group that includes the Killed by Police Facebook page, the Cato Institute’s policemisconduct.net, Truth Not Tasers (which tallies Taser-related deaths), the Houston-area Civilians Down page and the Wikipedia page of officer-involved shootings. There is even a “Puppycide Database,” an attempt to count all pets killed by police.

What they’ve found can be summed up pretty neatly: There are far more officer-involved shootings than show up in the official counts.

An exact year-to-year comparison between official and unofficial statistics is not possible because the most authoritative police tracking websites began operations after 2012, the most recent year for which government data is available. But using the numbers at hand, it’s clear these official statistics fail to record half or more of all fatal police encounters.

Between 2008 and 2012, the FBI recorded about 400 justifiable homicides per year. Between 2003 and 2009, the Department of Justice counted roughly 488 arrest-related deaths per year—pretty similar to the FBI’s numbers. By contrast, for 2013, which Fatal Encounters editor Brian Burghart considers the most fully accounted-for year on his website, he counted 1,142 deadly police shootings. The Killed by Police Facebook page, cited frequently in media reports and referred to by Burghart and others as one of the most reliable unofficial sources, counted 1,102 such deaths in 2014.

Police shootings have not doubled in recent years—the discrepancy between official and unofficial sources is because, by the government’s own admission, its data collection system is broken.

“At best, the [Department of Justice’s Arrest Related Deaths] program captured approximately half of the estimated law enforcement homicides in the United States during 2003–09 and 2011,” concluded a DOJ report in early March about the quality of its own arrest-related death data.

“It was like they read our website,” Burghart joked recently, referring to what he considered striking similarities between the shortcomings mentioned by the Department of Justice report and those he has highlighted on his blog.

In fact, government officials trying to take stock of the country’s police shootings do read websites like Burghart’s. While he wasn’t at liberty to talk specifics, Bill Sabol, the head of the Bureau of Justice Statistics, said his agency was taking a serious look at how to incorporate open-source, “big data” efforts.

“I can’t imagine that BJS would rely on open-source without review,” he said, pointing to the fact that media reports, from which these databases pull their numbers, can change by the day. But, he said, these big-data efforts could aid most in “hypothesis generation” or “nomination,” in helping the BJS identify possible incidents. “A simple model would be to use the open-source to nominate but then confirm and verify” with local agencies, he said.
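Sabol’s “nominate, but then confirm and verify” model can be pictured as a two-stage filter: open-source reports propose candidate incidents, and only those confirmed by a local agency enter the official count. The sketch below is purely illustrative—the incident IDs, field names and data are all hypothetical, not drawn from any real BJS system:

```python
# Illustrative two-stage "nominate then verify" pipeline.
# Stage 1 nominates candidates from open-source (media-derived) reports;
# stage 2 keeps only those a local agency has confirmed.

def nominate(open_source_reports):
    """Stage 1: treat every fatal media-sourced report as a candidate."""
    return [r for r in open_source_reports if r.get("fatal")]

def verify(candidates, confirmed_by_agency):
    """Stage 2: retain only candidates confirmed by the local agency."""
    return [c for c in candidates if c["id"] in confirmed_by_agency]

reports = [
    {"id": "INC-101", "fatal": True},   # from a tracking site
    {"id": "INC-102", "fatal": True},   # reported but unconfirmed
    {"id": "INC-103", "fatal": False},  # nonfatal; never nominated
]
agency_confirmed = {"INC-101"}  # only one incident verified so far

candidates = nominate(reports)
official = verify(candidates, agency_confirmed)
print(len(candidates), len(official))  # 2 nominated, 1 verified
```

The point of the split is the one Sabol makes: the open-source stage casts a wide net, while the verification stage guards against counting media reports that later change.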

An approach like that sounds like something the police-tracking bloggers and aggregators might propose themselves. But there is still a lot of room for data to fall through the cracks.

***

Until about a year ago, Americans, on the whole, gave little thought to how accurate or comprehensive data on local police departments was, no matter the federal agency it came from. Then, in the wake of the Ferguson shooting and the riots it set off, news reports highlighted the problem of undercounting arrest-related deaths, and in March, the Justice Department’s report gave the fullest insight yet into its own failings in coming up with an accurate, official tally.

An official solution to this problem is beginning to take shape. Late last year, Congress renewed the Death in Custody Reporting Act, which mandates that states receiving criminal justice assistance grants report deaths in custody—that is, in prisons, detention centers or in the process of arrest—along with race and gender information, to the DOJ, which will then compile the information into one data set.

And last week, Democratic Senators Barbara Boxer and Cory Booker introduced the Police Reporting Information, Data, and Evidence Act of 2015 (PRIDE Act), which requires that local police departments report officer-involved use-of-force incidents to the Department of Justice. (It requires some information in addition to that required by the Death in Custody Reporting Act, like the number of fatally shot police officers and the number of nonfatal shootings.)

Whether these new laws can live up to their promise depends on whether officials are able to correct a muddled, lightly overseen data-reporting process spread across all fifty states.

If the new version of the Death in Custody Reporting Act is to succeed, it will need to avoid the mistakes of the previous version, which expired in 2006 and was characterized by widespread disorganization. The data the program collected could come to the Bureau of Justice Statistics from such varied sources as incident reports, use-of-force reports, death certificates, the mayor’s office—even from just asking individual officers. It was a formula for inaccuracies.

“Because the [Death in Custody Reporting Program] was an unfunded program, we did not specify that data were going to be collected any specific way. We left it up to the state to determine their method of collecting data,” says Andrea Burch, a statistician at the DOJ’s Bureau of Justice Statistics who helmed the Arrest-Related Death program when it was active. “Some of them are more proactive about identifying deaths than others.”

After the previous Death in Custody Reporting Act expired in 2006, local police departments were reporting their information to the BJS completely voluntarily, which made the statistics even worse. The gradual degeneration of the data helps explain why, when Burch started managing the Arrest-Related Deaths program in 2010, she was crosschecking that deeply flawed information against some of the aforementioned police-tracking websites: Truth Not Tasers, Civilians Down and the officer-involved shootings Wikipedia page. It was the last line of defense for an essentially defunct program.

“We knew about a year ago that we were missing a significant portion of the data,” Burch admitted. “We are a statistical agency, and the data we were collecting didn’t meet our standards.” That recognition prompted the agency to suspend the program.

The Centers for Disease Control and the FBI collect their own numbers on police shootings, but both have their own shortcomings. The Achilles’ heel of the CDC’s Fatal Injury Reports is that the medical examiners and coroners who provide these numbers are not explicitly instructed to mention police involvement; many of these homicides are, as a result, misclassified. The FBI’s reporting of so-called “justifiable homicides” in its yearly Supplementary Homicide Report (SHR), part of its broader Uniform Crime Report, is little better, by its own admission.

By comparison, the recently renewed Death in Custody Reporting Act seems the most promising avenue for real reform. Its reporting program uses Google searches, news reports and—both crucially and problematically—agencies’ self-reporting to come up with an official tally for each state. As a result, only the DOJ’s program can incorporate the practices of the independently run police-tracking blogs—casting a wide net for all coverage—and patch up the government’s reporting holes.

What are the practices of the unofficial police trackers?

The process used by Burghart’s website takes information from three different sources: outside submissions that are cross-referenced with media reports; public records requests; and analysis from individual researchers who tackle big knots of data. Because the information is either official or checked against readily available media sources, it’s safe to say websites like this offer a fuller picture than the data from government sources. (At the same time, it’s unwise to assume 100 percent accuracy from such sites; initial media reports often turn out to have facts wrong.)

Using a process that instead relies on crowdsourced submissions, Deadspin writer Kyle Wagner is assembling his own database of all officer-involved shootings. He asks submitters to do a Google search of officer-involved shootings for one particular day and then enter all the results they find from the first ten pages of results. His database includes fatal and nonfatal shootings and, like Burghart’s site, the sex and race of the victims. Burghart is confident that the combination of Wagner’s database and his own will make for the most comprehensive list of police shootings in existence.
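Combining two independently gathered databases like Wagner’s and Burghart’s means deduplicating incidents that appear in both. A minimal sketch of that merge—with hypothetical field names, and keying naively on date and victim name—might look like this:

```python
def merge_databases(db_a, db_b):
    """Union of two incident lists, deduplicated on (date, victim name).

    A sketch only: real records would need fuzzier matching (name
    variants, dates that differ by a day between reports, unnamed
    victims), which is exactly where hand verification comes in.
    """
    def key(rec):
        return (rec["date"], rec["name"].strip().lower())

    merged = {key(r): r for r in db_a}
    for r in db_b:
        merged.setdefault(key(r), r)  # keep first-seen record on collision
    return list(merged.values())

fatal_encounters = [{"date": "2014-08-09", "name": "Michael Brown"}]
deadspin = [
    {"date": "2014-08-09", "name": "michael brown"},  # duplicate entry
    {"date": "2014-07-17", "name": "Eric Garner"},
]
print(len(merge_databases(fatal_encounters, deadspin)))  # 2 unique incidents
```

The hard part of such a merge is not the union itself but deciding when two differently worded media reports describe the same death—a judgment call no simple key can fully automate.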

How do we know these websites aren’t simply over-counting?

FiveThirtyEight ran through one random sample of 146 incidents from the Killed by Police Facebook page, and writers Reuben Fischer-Baum and Al Johri found that all of the links went back to established media sources. “By the narrowest measure possible—in which we give police every benefit of the ‘cause of death’ doubt in incidents where they Tasered or restrained suspects,” they wrote, “85 percent of the sampled incidents were the sort of police killings the government might be expected to keep track of.” According to their calculations, that translates to about 1,000 such cases per year that the government should be counting. It currently falls far short of that number.
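The back-of-the-envelope arithmetic behind that figure is simple: apply the qualifying rate from the sample to the tracker’s annual total. This is a rough reconstruction, not FiveThirtyEight’s exact calculation:

```python
# Rough reconstruction of the sample-based estimate: scale the tracker's
# annual total by the share of sampled incidents that qualified.
sample_size = 146       # incidents FiveThirtyEight audited
qualifying_rate = 0.85  # share the government "might be expected" to count
annual_total = 1102     # Killed by Police's count for 2014

estimate = qualifying_rate * annual_total
print(f"~{estimate:.0f} countable deaths per year")  # on the order of 1,000
```

Even under that narrowest measure, the result is more than double the roughly 400 justifiable homicides the FBI records annually.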

These sites’ methodologies are hardly airtight. Regardless, Burghart’s Fatal Encounters, the officer-involved shootings Wikipedia page and the Killed by Police Facebook page are sources frequently mentioned in news reports as the most reliable alternatives to official data.

The Department of Justice seems to have taken note. By combining some of the best practices of unofficial trackers with the statistical know-how needed to conduct the official process, there might be a real chance for reform.

***

Virginia Rep. Bobby Scott, who sponsored the new version of the Death in Custody Reporting Act that passed last December, hopes we’re entering an era of increased transparency. But it’s not clear yet how well founded those hopes are.

The new Death in Custody Reporting Act is very similar to the old one. As in the past, it allows each state’s coordinator—who can be plucked from a state criminal justice agency, a university, a department of corrections, a medical examiner’s office and many other places—to cast a wide net for data on police shootings. There’s really only one significant tweak: the mechanism through which local agencies will be compelled by the federal government to report their data, what can be described as the “jurisdictional hook.” The old jurisdictional hook, under the 2000 version of the Act, was eligibility for funds under law enforcement grant programs, while the new one is a 10 percent funding penalty enforced at the attorney general’s discretion. Even though the penalty is discretionary, some hope it will be more effective in forcing compliance than the previous incentive.

Burghart is skeptical. “I can’t imagine the political fallout if an attorney general says, ‘Oh we’re not going to give law enforcement any federal dollars,’” he remarked about the legislation.  

Perhaps the biggest problem is that fact-checking open-source data with 18,000 local law enforcement agencies would likely be nearly as labor-intensive as compiling the self-reported data from those same 18,000 agencies. That task would require a word that kept appearing in interviews: “resources.” Another word that kept popping up: “ideal world.”

The Department of Justice, it seems, agrees with those assessments. “When choosing solutions for a complete and accurate data collection,” the March report read, “BJS should consider ways to … dedicate resources and funding to support data collection efforts at the state and local levels.”

For Rep. Scott, the sponsor of the legislation, the real test is time. The new numbers, he says, “will add statistical substance to the debate” that has erupted since Ferguson. “Once the data comes in, we’ll know what to do,” he says. “It’s really simple and straightforward.”

For those who’ve been keeping closer watch, of course, it’s anything but.
