Posted 8/14/14, updated 7/8/18
COOKING THE BOOKS
Has LAPD been using whiteout to fight crime?
By Julius (Jay) Wachtel. Six years ago, a post entitled “Why the Drop?” posed a question about Los Angeles’ crime statistics: “Crime has been falling. Does anyone know why?” Thanks to some intrepid reporting by the Los Angeles Times, we might finally have our answer. And it’s not pretty.
In 2001 the violent crime rate in the City of Angels reached a historic high of 1388.2 per 100,000 population. By 2007, the tally had plunged to 718.4. This startling reduction of roughly 48 percent meant that even as the population increased, there were 24,437 fewer violent crimes. True enough, crime had eased throughout the U.S. But even as the national trend line flattened, L.A.’s Part I violent crime rate (murder, forcible rape, robbery and aggravated assault) kept falling. In 2012 violent crime in the U.S. increased by seven-tenths of one percent. But L.A. reported yet another single-year drop, in this case of eight percent.
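As a sanity check, the decline implied by those two rates can be computed directly (the rates come from the paragraph above; the snippet itself is just illustrative arithmetic):

```python
rate_2001 = 1388.2  # violent crimes per 100,000 residents (from the post)
rate_2007 = 718.4

pct_drop = (rate_2001 - rate_2007) / rate_2001 * 100
print(f"{pct_drop:.1f}% reduction")  # prints "48.2% reduction"
```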
Considering its burgeoning population and thin police coverage, L.A.’s unbroken string of victories seemed remarkable. So we wondered. After considering possible causal factors such as demographics and harsh sentencing, our speculation took what may have been a prophetic turn:
National crime stats come from the police, the same agencies whose effectiveness the data supposedly measures. Many reporting problems have surfaced over the years. Bookkeeping errors (unsurprisingly, usually leading to undercounts), differences in categorization, even purposeful jiggling – they’ve all taken place. Suffice it to say that cooking the books is eminently possible, and no one’s watching.
Each year the FBI publishes crime statistics, by city and state. According to the Times, the decline in L.A.’s crime rate is attributable, at least in part, to a practice of purposely downgrading incidents so they don’t reach the Part I threshold. In fact, police departments throughout the U.S. have been cooking the books for years. Want to keep an aggravated assault – the most common Part I violent crime – off the FBI tally? Easy. Simply discourage reporting. Or if a victim refuses to play ball, downplay their account, minimize their injuries or ignore the use of a weapon. Presto! You now have a simple assault, which is not included in the FBI’s report.
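The downgrading mechanism described above is easy to sketch. The toy Python below uses hypothetical incident records (the offense categories are the pre-2013 Part I violent crimes named in this post) to show how a single reclassification quietly shrinks the tally reported to the FBI:

```python
# UCR Part I violent offenses, as listed in the post
PART_I_VIOLENT = {"murder", "forcible rape", "robbery", "aggravated assault"}

def part_i_count(incidents):
    """Count incidents whose classification falls in the Part I set."""
    return sum(1 for i in incidents if i["offense"] in PART_I_VIOLENT)

# Hypothetical incident log
incidents = [
    {"id": 1, "offense": "aggravated assault"},  # victim beaten with a pipe
    {"id": 2, "offense": "robbery"},
]
print(part_i_count(incidents))  # 2

# Downgrade the pipe beating to simple assault: it vanishes from the tally,
# since simple assault is not a Part I offense
incidents[0]["offense"] = "simple assault"
print(part_i_count(incidents))  # 1
```

Nothing about the victim or the injury changed; only the label did, and the reported violent-crime count fell by one.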
Don’t believe it? Here are a few examples:
- In 1998 the U.S. Justice Department opened an inquiry into fudged crime statistics in Philadelphia. As a local reporter later said, “The phony stats were known for many years. Aggravated assaults were easily changed to simple assaults…Precinct commanders used to joke about this, but behind those statistics are real victims.”
- Detroit chief James Barren was fired in 2009 when his department and the medical examiner were caught misclassifying homicides as self-defense and suicide.
- In the same year a Dallas newspaper investigation revealed that police were reporting only half the crimes called for in FBI guidelines. Although use of a weapon (not just a gun) makes assaults “aggravated,” pipe beatings, to give one example, were being recorded as simple assaults.
- Also in 2009 the Florida Department of Law Enforcement attributed chronic under-reporting of serious crime by Miami police to “a self-imposed pressure that certain [officers] felt as a result of the implementation of Compstat.” One of the examples cited was a carjacking that police downgraded to an “information report.”
- Sometimes crimes can’t be easily downgraded. But Baltimore found an ingenious way to make it seem as though fewer citizens were being shot. How? By reporting shootings with multiple victims as a single crime.
For possibly the longest running and most systematic manipulation of crime data look to the Big Apple. NYPD officers have been accusing their agency of undercounting serious crime for years. As one cop said, “If it’s a robbery, they’ll make it a petty larceny...a civilian punched in the face, menaced with a gun, and his wallet was removed, and they wrote ‘lost property’.” Indeed, some cops got so angry that they secretly taped superiors telling them to downgrade reports. By 2010 the department had no choice but to formally investigate. It concluded that, yes, a few rogue managers were purposely downgrading crimes. Orders were duly issued banning the practice.
Yet the problem apparently persisted. In The Crime Numbers Game: Management by Manipulation, a stinging exposé published in 2012, two criminal justice professors (one, a retired NYPD captain) alleged that these unsavory practices have not only continued but are embedded in the troubled agency’s DNA.
Compstat, NYPD’s vaunted number-crunching tool, likely deserves much of the blame. Brought to Los Angeles by former (and current) NYPD Commissioner Bill Bratton, it measures officer performance by tallying enforcement activity – stops, tickets and arrests – and the agency’s success by counting crimes. Of course, once NYPD started bragging about its success, crime rates had to keep going down. And even if crime really was falling, cops (at least those seeking good evaluations) remained under instructions to make as many stops and arrests as possible. (Thanks to the law of unintended consequences, high levels of police activity can have negative effects. New York’s stop and frisk campaign seemed like a great idea – until it didn’t.)
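A toy sketch of a Compstat-style activity tally (hypothetical officers and numbers) shows why measuring performance this way rewards sheer volume: the metric has no term for whether the stops were justified or productive.

```python
# Hypothetical activity log: performance reduced to raw enforcement counts
activity_log = [
    {"officer": "A", "stops": 42, "tickets": 10, "arrests": 3},
    {"officer": "B", "stops": 5,  "tickets": 2,  "arrests": 4},
]

def activity_score(row):
    # More of anything always scores higher; quality never enters the formula
    return row["stops"] + row["tickets"] + row["arrests"]

ranked = sorted(activity_log, key=activity_score, reverse=True)
print([r["officer"] for r in ranked])  # ['A', 'B'] -- volume wins
```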
As we’ve repeatedly said, what really “counts” in policing can be impossible to adequately express with numbers. Police departments aren’t factories, and officers aren’t assembly-line workers. Adopting programs such as Compstat can push aside worthy objectives and distort what actually gets done. And while relying on numbers alone to form public policy is a bad idea, fudging them is unforgivable. It turns cops into liars. It misleads policymakers and the public. Granting offenders undeserved breaks also shortchanges victims and increases everyone’s risk of becoming the next casualty.
Hopefully the Times’ jaw-dropping findings will lead LAPD to reassess both the value and accuracy of its statistics. Coincidentally, just as this post was going to press, the California State Board of Equalization issued an alert warning that some businesses were gaming tax collectors with “illegal sales suppression software” that automatically understates sales volume. While there is no known application that does that for city crime statistics, one can only imagine the possibilities!
UPDATE (8/24/18): In June 2017 NYPD paid out $75 million to settle a lawsuit charging that a quota system had led officers to issue more than one million legally insufficient summonses between 2007 and 2015. That settlement led NYPD to formally train officers about its “no quota” policy, and NYPD’s commissioner recently threatened to discipline supervisors who put quantity over quality. But problems apparently persist. Twelve minority officers – the “NYPD 12” – have an active lawsuit charging that they were punished for not meeting arrest expectations. Their action is the subject of a new, award-winning documentary: “Crime + Punishment.”
UPDATE (11/9/17): After supposedly being told that her “meddling” would cost her a promotion, LAPD Captain Lillian Carranza went public with claims that, despite the establishment of a “Data Integrity Unit,” aggravated assaults continued to be purposely undercounted by ten percent in various field divisions. Chief Charlie Beck promptly shot back, calling the claim “outrageous.” But the officers’ union rebuked the chief and supports Carranza, who has filed a lawsuit alleging retaliation.