What I've Learned from Two Years Collecting Data on Police Killings

A few days ago, Deadspin's Kyle Wagner began to compile a list of all police-involved shootings in the U.S. He's not the only one to undertake such a project: D. Brian Burghart, editor of the Reno News & Review, has been attempting a crowdsourced national database of deadly police violence. We asked Brian to write about what he's learned from his project.

It began simply enough. Commuting home from my work at Reno's alt-weekly newspaper, the News & Review, on May 18, 2012, I drove past the aftermath of a police shooting—in this case, that of a man named Jace Herndon. It was a chaotic scene, and I couldn't help but wonder how often it happened.

I went home and grabbed my laptop and a glass of wine and tried to find out. I found nothing—a failure I simply chalked up to incompetent local media.


A few months later I read about the Dec. 6, 2012, killing of a naked and unarmed 18-year-old college student, Gil Collar, by University of South Alabama police. The killing had attracted national coverage—The New York Times, the Associated Press, CNN—but there was still no context being provided—no figures examining how many people are killed by police.

I started to search in earnest. Nowhere could I find out how many people died during interactions with police in the United States. Try as I might, I just couldn't wrap my head around that idea. How was it that, in the 21st century, this data wasn't being tracked, compiled, and made available to the public? How could journalists know if police were killing too many people in their town if they didn't have a way to compare to other cities? Hell, how could citizens or police? How could cops possibly know "best practices" for dealing with any fluid situation? They couldn't.


The bottom line was that I found the absence of such a library of police killings offensive. And so I decided to build it. I'm still building it. But I could use some help. You can find my growing database of deadly police violence here, at Fatal Encounters, and I invite you to go here, research one of the listed shootings, fill out the row, and change its background color. It'll take you about 25 minutes. There are thousands to choose from, and another 2,000 or so on my cloud drive that I haven't even added yet. After I fact-check and fill in the cracks, your contribution will be added to the largest database about police violence in the country. Feel free to check out what has been collected about your locale here.

The biggest thing I've taken away from this project is something I'll never be able to prove, but I'm convinced to my core: The lack of such a database is intentional. No government—not the federal government, and not the thousands of municipalities that give their police forces license to use deadly force—wants you to know how many people it kills and why.

It's the only conclusion that can be drawn from the evidence. What evidence? In attempting to collect this information, I was lied to and delayed by the FBI, even when I was only trying to find out the addresses of police departments to make public records requests. The government collects millions of bits of data annually about law enforcement in its Uniform Crime Report, but it doesn't collect information about the most consequential act a law enforcer can take.

I've been lied to and delayed by state, county and local law enforcement agencies—almost every time. They've blatantly broken public records laws, and then thumbed their authoritarian noses at the temerity of a citizen asking for information that might embarrass the agency. And these are the people in charge of enforcing the law.

The second biggest thing I learned is that bad journalism colludes with police to hide this information. The primary reason for this is that police will cut off information to reporters who tell tales. And a reporter can't work if he or she can't talk to sources. It happened to me on almost every level as I advanced this year-long Fatal Encounters series through the News & Review. First they talk; then they stop; then they roadblock. (www.newsreview.com/reno/about-this-series/content?oid=12858771)

Take Philadelphia for example. In Philadelphia, the police generally don't disclose the names of victims of police violence, and they don't disclose the names of police officers who kill people. What reporter has time to go to the most dangerous sections of town to try to find someone who knows the name of the victim or the details of a killing? At night, on deadline, are you kidding? So with no victim and no officer, there's no real story, but the information is known, consumed and mulled over in an ever-darkening cloud of neighborhood anger.

Many Gawker readers watched in horror as Albuquerque police killed James Boyd, a homeless man, for illegal camping. Look at these stats, though (I don't know if they're comprehensive; I believe they are): In Bernalillo County, N.M., three people were killed by police in 2012; in 2013, five. In Shelby County, Tenn., nine people were killed by police in 2012; in 2013, 11.

Who the hell knew Memphis Police were killing men at more than double the rate the cops were killing people in Albuquerque? But when I emailed the reporter at the Memphis Commercial Appeal to track the numbers back further, I got no response. I bought a subscription, but haven't been able to return to research in that region. (Why don't you help me out? Just do a last name search here before you dig in.)

There are many other ways that bad or sloppy journalism undermines the ability of researchers to gather data on police shootings. Reporters make fundamental errors or typos; they accept police excuses for not releasing names of the dead or the shooters, or don't publish the decedents' names even if they're released; they don't publish police or coroner's reports. Sometimes they don't show their work: This otherwise excellent St. Louis Post-Dispatch article claims there were 15 fatal shooting cases involving law enforcement agencies between January 2007 and September 30, 2011—but provides few names and dates for further research efforts.

And that list doesn't even get into fundamental errors in attitude toward police killing—for example, the tendency of large outlets and wire services to treat killings as local matters, not worth tracking widely, even though police brutality is a national crisis. Journalists also don't generally report the race of the person killed. Why? It's unethical to report it unless it's germane to the story. But race is always germane when police kill somebody.

This is the most heinous thing I've learned in my two years compiling Fatal Encounters. You know who dies in the most population-dense areas? Black men. You know who dies in the least population-dense areas? Mentally ill men. That's not to say there aren't dangerous and desperate criminals killed across the line. But African-Americans and mentally ill people make up a huge percentage of people killed by police.

And if you want to get down to nut-cuttin' time, across the board, it's poor people who are killed by police. (And by the way, around 96 percent of people killed by police are men.)

But maybe the most important thing I learned is that collecting this information is hard. I still firmly believe that having a large, searchable database will give us not just a better understanding of these incidents, but better training, policies and protocols for police, and consequently fewer dead people and police. But normal people don't much care about numbers. Trolls intentionally try to pollute the data. Subterranean disinformationists routinely put out fake numbers. I try to take advantage of the public passion when an incendiary event happens, like the death of Kelly Thomas, James Boyd, Eric Garner or Michael Brown. Or when a Deadspin writer decides to get involved. My girlfriend calls this "riding the spike." I call it journalism. Or maybe, obsession.

Fatal Encounters can be found here, and is on Twitter at @FatalEncounters. Deadspin's submission form can be found here.