Christopher Beam, writing for Slate, explains research on predictive policing being done at UCLA in collaboration with the LAPD:
Predictive policing is based on the idea that some crime is random—but a lot isn’t. For example, home burglaries are relatively predictable. When a house gets robbed, the likelihood of that house or houses near it getting robbed again spikes in the following days. Most people expect the exact opposite, figuring that if lightning strikes once, it won’t strike again. “This type of lightning does strike more than once,” says Brantingham. Other crimes, like murder or rape, are harder to predict. They’re rarer, for one thing, and the crime scene isn’t always stationary, like a house. But they do tend to follow the same general pattern. If one gang member shoots another, for example, the likelihood of reprisal goes up.
This happened in my neighborhood when I was in fifth grade. We lived in a pretty quiet neighborhood, but one morning we found a window open. Someone had come into the house while we were sleeping and taken whatever was within immediate reach. They also took my dad’s brand new bicycle from the garage. The same thing happened to my neighbor two days later.
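The article doesn’t get into the math, but the near-repeat pattern in the excerpt is the kind of thing a self-exciting point process captures: each burglary adds a temporary bump to the risk at nearby homes, and the bump decays over days and distance. Here’s a rough sketch of that idea in Python. The baseline rate, weights, and decay scales are made up for illustration and aren’t the parameters of the actual UCLA/LAPD model.

```python
import math

def near_repeat_risk(events, location, t, mu=0.1, alpha=0.5,
                     decay_days=7.0, decay_km=0.3):
    """Toy near-repeat risk score: a background rate plus a bump for each
    past burglary that fades with time and distance (illustrative values)."""
    risk = mu  # background burglary rate at any location
    for (x, y, t_i) in events:
        if t_i >= t:
            continue  # only past events raise current risk
        dt = t - t_i                                          # days since that burglary
        dist = math.hypot(location[0] - x, location[1] - y)   # distance in km
        risk += alpha * math.exp(-dt / decay_days) * math.exp(-dist / decay_km)
    return risk

# A burglary at (0, 0) on day 0: the risk next door spikes in the
# following days, then decays back toward the background rate.
events = [(0.0, 0.0, 0.0)]
for day in (1, 3, 7, 14):
    print(day, round(near_repeat_risk(events, (0.05, 0.0), day), 3))
```

Run it and the score drops steadily from day 1 to day 14, which is the “lightning does strike twice, but mostly in the first few days” effect the researchers describe.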
[Slate via @amstatnews]
I remember a movie about predicting crime where a *minor* 10 percent error is not *reported*. I think it was called “The Police Who Predicted Wrong.” (Yes, this is a Simpsons reference.)
Just noticed that Slate corrected its story. Homes can’t be robbed because robbery involves a threat to a person. Homes can be burglarized.
You may want to recapture the quote.
Nothing new here. This type of collaboration between the police and researchers has been going on for a while now. This blurb also suggests that crime data can simply be mined — but a lot of theory goes into explaining crime trends, not just looking for hot spots.