Crime Stats Charged with Fraud

Criminologists are questioning the value of some crime statistics in New York:

In interviews with the criminologists, other retired senior officers cited examples of what the researchers believe was a periodic practice among some precinct commanders and supervisors: checking eBay, other Web sites, catalogs or other sources to find prices for items that had been reported stolen that were lower than the value provided by the crime victim. They would then use the lower values to reduce reported grand larcenies — felony thefts valued at more than $1,000, which are recorded as index crimes under CompStat — to misdemeanors, which are not, the researchers said.
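To see how sharp the incentive is, here is a quick sketch of the arithmetic (mine, not the researchers’): under the $1,000 grand-larceny line described above, a supervisor only has to “find” a lower price to move an incident off the index-crime books entirely. The classify_theft helper and the dollar figures below are hypothetical, for illustration only.

```python
# The $1,000 grand-larceny line comes from the quote above; everything
# else here (function name, example values) is hypothetical.
FELONY_THRESHOLD = 1_000

def classify_theft(reported_value: float) -> str:
    """Thefts valued above the threshold are grand larcenies, which count
    as index crimes under CompStat; anything at or below it is a
    misdemeanor, which never enters the index-crime tally."""
    if reported_value > FELONY_THRESHOLD:
        return "grand larceny (index crime)"
    return "misdemeanor (not counted)"

# The victim says the stolen laptop was worth $1,200...
print(classify_theft(1200))  # grand larceny (index crime)

# ...but a supervisor finds a $950 listing for the same model online.
print(classify_theft(950))   # misdemeanor (not counted)
```

Same theft, different paperwork, and a cleaner-looking crime rate.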

Just last week Elinor Ostrom (Nobel 2009) spoke at the Mercatus Center. In her chat, she stressed that stats sometimes lie. And they sometimes lie in predictable ways. Her example (video is here, though the audio is of poor quality):

… we eventually, again by field work, decided that FBI crime data was not reliable. That happened when I was sitting in a [police] department in Wisconsin on a related project and I was doing archival work. So I was back in the back corner and no one was paying attention to me. …

A sergeant called in a cadet and, I won’t use the bad language that the sergeant used, but he bawled out that cadet … “You’ve just reported a $54 bicycle theft! In this department, we don’t have $54 bicycle thefts! No bicycle theft is more than $49!”

Why? In those days grand larceny was a theft of more than $50. You observe that and you realize that crime statistics may not always be the most accurate way to measure performance. For those of us who want to do good empirical work, we’ve got to ask, “What are the incentives of those filling out the forms?”

My transcription is not perfect here, but you get the idea.

Ostrom’s comments remind me that the early reports on “jobs saved or created” with stimulus funds were famously awful. Part of the problem was that the people filling out the forms (and everyone else) had no idea how to measure a job “saved or created.” But even that error wasn’t truly random. From the start, the Obama Administration chose the very term “saved or created” to tilt the error toward inflating the jobs numbers. The executive branch ultimately had to temper its numbers with reality.
