the interrelationships between specific operations 

 of the criminal justice system and crime incidence 

 that most attempts to assess any program impact 

 on crime have been open to serious challenge. 



The analytic problem can be paraphrased quite 

 simply: To measure the success of a crime inter- 

 vention program, one must simply count the num- 

 ber of crimes that didn't happen because of the 

 program. Of course, what this means is that one 

 measures the difference between the observed 

 crime rate in the target jurisdiction and some mathe- 

 matical projection of the rate expected in the ab- 

 sence of the intervention. The degree of con- 

 fidence in the result is obviously highly dependent 

 on the confidence one has in the projection tech- 

 nique. 



One example of an effort now underway to ad- 

 dress this problem is the thorough and systematic 

 exploration of stochastic modeling as applied to 

 monthly crime rates. When thoroughly developed 

 and validated, the technique should be able to de- 

 tect with greater precision changes in the crime 

 rate that result from programs or new approaches, 

 distinguishing such changes from random month- 

 to-month fluctuations and assessing the probability 

 that observed deviations from projected rates are 

 in fact statistically significant evidence of some 

 change in the crime rate generating process. 
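The kind of test described above can be sketched in a few lines. In this illustration a naive projection (the historical mean) stands in for the stochastic model's expected value, and all monthly counts are invented; a real application would substitute the model's forecast and its error variance.

```python
import statistics

def deviation_z_score(history, observed):
    """Score how far an observed monthly count falls from the rate
    projected from history, in units of the ordinary month-to-month
    spread.  The projection here is just the historical mean, a
    placeholder for the model's expected value."""
    projected = statistics.mean(history)
    spread = statistics.stdev(history)  # routine month-to-month fluctuation
    return (observed - projected) / spread

# Hypothetical monthly burglary counts before a program began:
pre_program = [410, 395, 430, 405, 420, 415, 400, 425, 410, 405, 418, 407]

# One post-program month:
z = deviation_z_score(pre_program, 340)
# A |z| well beyond about 2 suggests the deviation is not a routine
# fluctuation but evidence of a change in the generating process.
```
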



The project evolved from a basic model that was 

 developed and tested using police data from Atlan- 

 ta covering a 52-month period. That project, un- 

 dertaken as part of the local evaluation of the 

 LEAA high impact anticrime program in Atlanta, 

 demonstrated the feasibility of the analytic ap- 

 proach — time series analysis resulting in a model 

 of the autoregressive, moving average type. 



In the Atlanta model, 84 percent of the variance 

 in the month-to-month distribution of serious 

 crime could be "explained" by examining the 

 structure of the distribution itself. For burglary 

 alone this "forecasting efficiency" was 74 percent. 

 Although there is no reason to suspect that Atlanta 

 crime data are statistically distributed in any parti- 

 cularly fortunate way, it is obviously desirable to 

 subject this assumption to empirical verification. 

 Consequently, the basic model is now undergoing 

 intensive validation through construction of simi- 

 lar models for approximately 15 cities. 
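A "forecasting efficiency" figure of the kind quoted for Atlanta is simply the fraction of the series' variance accounted for by the model's one-step predictions. The sketch below shows the computation; the counts and predictions are invented for illustration, not Atlanta data.

```python
def variance_explained(actual, predicted):
    """Fraction of the variance in a monthly series 'explained' by a
    model's one-step predictions -- the 'forecasting efficiency'
    figure quoted in the text (e.g. 84 percent in the Atlanta model)."""
    n = len(actual)
    mean = sum(actual) / n
    total = sum((a - mean) ** 2 for a in actual)          # total variation
    residual = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    return 1.0 - residual / total

# Illustrative monthly counts and a model's one-step predictions:
actual    = [100, 112, 95, 130, 124, 90, 105, 118]
predicted = [102, 108, 99, 125, 120, 96, 104, 115]

eff = variance_explained(actual, predicted)   # between 0 and 1
```
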



Concurrently, developmental work is underway 

 to extend the predictive power of the model and 

 enhance its usefulness for criminal justice planning 

 and evaluation. This work will investigate: 



• The introduction of explanatory, socioeco- 

 nomic variables into the model's structure 



• Disaggregation to jurisdictions smaller than 

 entire cities 



• Integration of such disaggregated submodels 

 to analyze crime displacement 



• The applicability of these models to evalua- 

 tion through the introduction of causal varia- 

 bles that reflect particular programs or strate- 

 gies. 



Methodologically, the assumption is that, by 

 mathematical analysis of what has gone on in the 

 past, a model can be developed for reliable predic- 

 tion of the future, provided there is no change in 

 any of the (unknown) processes whose dynamics 

 are reflected in crime rate data. 



The most primitive form of the model builds 

 from the assumption that, in a given series of 

 monthly crime rates, the variations and changes 

 observed contain a component that is "caused" 

 and a component that is truly random. The 

 "caused" part may contain long-term trends (not 

 necessarily linear) and seasonal variations, but no 

 assumptions are made about the real nature of 

 these underlying causes. The random component 

 is, of course, constrained to have a zero average. 

 Through analytic optimization and iteration tech- 

 niques, a model is developed in the sense that an 

 optimum form emerges which specifies not only 

 the number and function of the necessary parame- 

 ters but also their values. Predictions are then made 

 as expected values. This means that the model's 

 best estimate of the crime rate in any future month 

 is projected to be made up of a component whose 

 value is a relatively simple function of the crime 

 rates in certain previous months (or their expected 

 values) plus a random component, whose expected 

 value is zero. 
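The decomposition just described can be made concrete with the simplest case. The sketch below fits a first-order autoregressive form by least squares: next month's deviation from the mean is a multiple of this month's deviation, plus a random term whose expected value is zero. The actual project used richer autoregressive, moving average forms, and the counts below are invented.

```python
def fit_ar1(series):
    """Least-squares fit of a minimal 'caused plus random' model:
    (x[t] - mean) = phi * (x[t-1] - mean) + noise, E[noise] = 0.
    Returns the series mean and the autoregressive parameter phi."""
    mean = sum(series) / len(series)
    dev = [x - mean for x in series]
    num = sum(dev[t] * dev[t - 1] for t in range(1, len(dev)))
    den = sum(d * d for d in dev[:-1])
    return mean, num / den

def forecast(series, mean, phi):
    """Expected value of next month's count: the random component
    contributes zero, so only the 'caused' part remains."""
    return mean + phi * (series[-1] - mean)

# Hypothetical monthly counts for a single jurisdiction:
counts = [120, 132, 118, 140, 135, 150, 142, 138, 155, 149]

mean, phi = fit_ar1(counts)
next_month = forecast(counts, mean, phi)
```
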



In addition to serving as the basis for the 15-city 

 model, these data will be used to attempt to con- 

 struct an aggregated model as a first step toward 

 providing an analytic tool for projecting national 

 crime rates. More important for this project will be 

 the work designed to introduce causal variables 

 into a dynamically more sophisticated version of 

 the basic model. In essence, this will be done by 

 examining relationships that might exist between 

 different values taken on by the parameters that 

 drive the individual city models and some of the 

 city's socioeconomic characteristics. 
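One simple way to probe such relationships is to correlate the fitted parameters across cities with a candidate socioeconomic indicator. The sketch below uses a Pearson correlation; the per-city parameters and the indicator values are invented for illustration.

```python
def correlation(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical fitted autoregressive parameters for five city models:
phi_by_city = [0.31, 0.45, 0.28, 0.52, 0.39]
# A matching, equally hypothetical socioeconomic indicator
# (e.g. an unemployment rate) for the same five cities:
indicator = [5.1, 7.2, 4.8, 8.0, 6.3]

r = correlation(phi_by_city, indicator)  # near +1 here: strong association
```

A correlation by itself establishes association, not cause; the project's stated aim of introducing causal variables would require the richer dynamic models described above.
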



Model building also will be attempted at a lower 

 level of aggregation (census tract, police precinct) 

 in selected cities in an attempt to establish the fea- 

 sibility of the approach at this level and the poten- 

 tially greatly enhanced utility such models might 

 then have for planning and evaluation. 



Research agreements program. Another basic 

 research program was developed by the Institute in 

 1974. The research agreements program (RAP) 

 links the Institute to selected universities and re- 

 search organizations on a long-term basis. With 

 funds from the Institute, these research bodies are 

 conducting basic research in broad areas of crime 

 and criminal justice, and their experience should 






