
In two Loop office buildings about eight blocks apart, a pair of University of Chicago research teams are analyzing big data to answer a thorny question that has become especially charged in recent months: Will a police officer have an adverse interaction with a citizen?
The team from the university’s Crime Lab is in the first stages of working with the Chicago Police Department to build a predictive data program to improve the department’s Early Intervention System, which is designed to determine if an officer is likely to engage in aggressive, improper conduct with a civilian.
The other team, part of U. of C.’s Center for Data Science & Public Policy, is expected to launch a data-driven pilot of an Early Intervention System with the Charlotte-Mecklenburg Police Department in North Carolina by the end of the summer. The center is working on similar efforts with the Los Angeles County sheriff’s office and the Nashville and Knoxville police departments in Tennessee.
Data crunching has been used in policing since the late 1970s. But applying this level of big-data processing — similar to techniques that help determine email spam, a person’s movie preferences or advertisements on a social media page — to predict police misconduct is new, experts say. In this foray, data scientists are encountering deep suspicion from officers concerned about the system’s fairness and effectiveness. The new approach also raises the complex issue of what to do once the system predicts an officer is likely to misbehave.
The efforts come at a volatile time in Chicago and around the country. The Chicago Police Department is under a federal probe after last year’s release of video showing an officer fatally shooting Laquan McDonald 16 times in October 2014. The release earlier this month of another video, from the scene of a July incident in which police fatally shot 18-year-old Paul O’Neal in the back after a stolen car crashed, further roiled relations between the community and its police force.
Those incidents were followed by weekend rioting in Milwaukee after a police officer shot and killed a man who reportedly refused to drop his gun during a foot chase.
While the police misconduct application is one of the more controversial elements of this version of big-data processing, the researchers say their goal is broader.
“The thing we’re finding is that using it (big data) to predict officer adverse incidents is just one use,” said Rayid Ghani, director of the Center for Data Science & Public Policy and previously chief data scientist for President Barack Obama’s 2012 campaign. “Inside police departments, they are doing a lot of other things — performance management, other safety things, training. This is easily extensible to all those things.”
Jens Ludwig, director of the Crime Lab, added: “Ultimately the goal here is that you want to train and retain the very highest-quality police force that you can.”
Lingering concerns
Most departments, including Charlotte-Mecklenburg, use a threshold system to determine if an officer is likely to have an adverse interaction with a citizen and needs intervention. That system typically flags an officer if he or she has been involved in multiple worrisome incidents — citizen complaints, vehicle accidents, on-the-job chases and injuries, or uses of excessive force — in a short time period.
The problem with threshold systems, experts say, is that they flag an inordinately large number of officers as at-risk, many of them incorrectly, while letting other officers who need intervention slip by undetected.
The advantage of data-driven analysis is that it can take mounds of law enforcement information and look for patterns that lead to misconduct and those that lead to exemplary performance, supporters say.
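To make the distinction concrete, the rough Python sketch below contrasts a simple threshold rule with a learned risk score. It is purely illustrative: the field names, the three-incident cutoff, the synthetic data and the logistic-regression model are assumptions for demonstration, not the actual logic used by CPD, Charlotte-Mecklenburg or the University of Chicago teams.

```python
# Hypothetical sketch: threshold-based flagging vs. a learned risk score.
# All field names, thresholds and the synthetic data are illustrative
# assumptions, not the real Early Intervention System logic.
from dataclasses import dataclass

import numpy as np
from sklearn.linear_model import LogisticRegression


@dataclass
class OfficerRecord:
    complaints_90d: int      # citizen complaints in the last 90 days
    use_of_force_90d: int    # use-of-force reports in the last 90 days
    pursuits_90d: int        # vehicle/foot pursuits in the last 90 days


def threshold_flag(rec: OfficerRecord) -> bool:
    """Classic threshold rule: flag any officer with several recent incidents."""
    return (rec.complaints_90d + rec.use_of_force_90d + rec.pursuits_90d) >= 3


# A data-driven system instead learns a risk score from historical outcomes.
rng = np.random.default_rng(0)
X = rng.poisson(lam=[1.0, 0.5, 0.8], size=(500, 3))   # synthetic incident counts
# Synthetic "adverse interaction" labels for illustration only.
y = (X @ np.array([0.6, 1.2, 0.3]) + rng.normal(size=500) > 2.5).astype(int)

model = LogisticRegression().fit(X, y)

officer = OfficerRecord(complaints_90d=1, use_of_force_90d=1, pursuits_90d=0)
features = np.array([[officer.complaints_90d,
                      officer.use_of_force_90d,
                      officer.pursuits_90d]])

print("Threshold flag:", threshold_flag(officer))            # hard yes/no
print("Modeled risk:  %.2f" % model.predict_proba(features)[0, 1])  # graded score
```

The point of the contrast is that the threshold rule treats every officer above the cutoff identically, while a trained model can weigh different kinds of incidents differently and rank officers by estimated risk rather than sorting them into a single flagged group.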
Chicago police in 1994 became one of the first departments in the country to start a pilot Early Intervention System using data analysis. The software program, called BrainMaker, was started partly in response to police union criticism that the existing human supervisors were too arbitrary and subjective.
It was abandoned less than two years later amid Fraternal Order of Police contentions that the system was too intrusive, unable to accurately assess the nuance of police work and would set up an officer for punishment even though he or she had not misbehaved.
Those concerns linger to this day.
In Charlotte-Mecklenburg, which has a nonunionized department that serves a steadily growing, diverse region of more than 1 million, some officers are nervous about sharing extensive personal information with researchers, said Capt. Stella Patterson, who leads the department’s professional standards unit. Those fears have eased somewhat since department administrators showed officers that the information is anonymized before it is turned over to researchers, she added.
Experts who have studied law enforcement for decades raise concerns that echo the earlier objections: an officer flagged as high-risk for an adverse interaction could be stigmatized and damaged professionally before he or she has actually misbehaved.
“There’s just kind of a discomfort for anybody who’s involved in criminal justice about singling out and punishing people without basis of anything that they’ve done, but based on attributes that they have,” said Locke Bowman, Northwestern University clinical law professor and one of the lawyers who successfully pushed for a special prosecutor in the McDonald shooting case.
Related to that is the question of what to do with the information suggesting an officer may be headed for an adverse interaction.
Finding effective interventions — such as additional training or counseling — is challenging, experts say. Those efforts are complicated by a lack of credible data on citizen complaints and what experts see as an inefficient process to adjudicate police officer discipline. Both are common problems in departments across the country, including CPD.
Tribune examinations of the department have found a police union contract that shields officers from scrutiny in exchange for the union’s concession on pay. In addition, the city’s police oversight agency, the Independent Police Review Authority, has been slow and often superficial in its investigations into alleged police abuse. When wrongdoing is confirmed, the agency often recommends light punishment, the Tribune has found.
Chicago police Superintendent Eddie Johnson has characterized the department’s early warning system as “not effective.” In a March interview with the Tribune, Johnson said part of the reason for its ineffectiveness “is because too many people had a hand in it.” An excessive number of supervisors would review individual issues with an officer but often would be unaware of the officer’s entire record, Johnson said.
“The mechanism we’re putting into place is so there’s a few supervisors that would review that, and then they’ll see a pattern of behavior,” he added.
On Wednesday, CPD spokesman Anthony Guglielmi said in an email that the new system will “spot problems before they exist and support police officers with training and peer-to-peer mentoring on the front end.”
Context may help accuracy
Advocates of the data-driven approach agree that its success depends on reliable and extensive data. The quality of that data is improving, and the capacity to process it is rapidly becoming more sophisticated, supporters say.
In addition, the U. of C.’s data science teams have visited Charlotte-Mecklenburg police several times, participating in ride-alongs and officer focus groups to gather officers’ input on what factors may predict an adverse interaction. Blending that context with higher-quality data processing has made the newer system even more accurate, U. of C.’s analysts say.
“There’s a lot of human intuition in it,” said Lauren Haynes, senior project manager at the Center for Data Science & Public Policy. She added that Charlotte-Mecklenburg officers who once were suspicious of the data program have welcomed the chance to share their perspective.
Early tests from modeling have yielded encouraging results. Compared with the Charlotte-Mecklenburg Police Department’s existing threshold-based system, the Data Science & Public Policy system accurately flagged more officers who went on to have adverse interactions, Patterson said.
“That was an indication that we’re going in the right direction,” she said. She emphasized that the proposed system “is not punitive in any fashion. They’re early warnings that alert us.”
Guglielmi also stressed the new system’s “nonpunitive” structure would be “based on industry best practices.”
Researchers from the university last visited Charlotte-Mecklenburg in mid-July, and the department is making final tweaks to the model, Patterson said.
“And then we’ll see if it works,” she said. “If it doesn’t, we’ll go back to the drawing board.”
Those running the University of Chicago projects and the collaborating law enforcement agencies across the country describe the new approach as the pursuit of a well-rounded personnel system that would help officers perform a tough job in the most effective way.
At the same time, the system can identify certain strengths of officers and place them in capacities that allow them to succeed, supporters said.
“We are looking for an early warning system that would allow the department to intervene with an employee to help ensure they are the best they can be,” Nashville police spokesman Don Aaron said in an email.
The university’s Crime Lab endeavor with CPD has recently cleared “legal and bureaucratic issues and is just now getting going,” said Ludwig, the Crime Lab director. “Everyone is eager to get something in place that is high-quality as soon as possible,” he added.
Guglielmi said the Police Department expects to roll out the new system next year.
This article was originally published on www.chicagotribune.com.

