BAR Analysis 



The BAR method is an alternative to the hot-spot method. The BAR data reduction method is also described by an algorithm rather than by an optimization process. The geometry of directional beams where lower-than-average received noise is observed defines directions from which high-intensity noise cannot originate. These constraints, applied to the observations of higher-than-average received intensity, in most cases tell which of the two ambiguous directions the bulk of the noise must have come from. The rest, usually a small part of the total, is spread around rather than attributed to a unique direction as in the hot-spot method. (The reader seeking more detailed information should consult Ref. 5, as well as Refs. 6 and 7.)
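The exclusion logic described above can be illustrated with a toy sketch. Everything here is an illustrative assumption, not the algorithm of Ref. 5: azimuth is coarsely binned, each beam is modeled as the sum of two left/right-ambiguous bins about the array axis, and two array headings supply the constraints.

```python
import numpy as np

N = 12  # azimuth bins (coarse, for illustration only)

def mirror(i, heading):
    """Bin that is left/right-ambiguous with bin i about the array axis."""
    return (2 * heading - i) % N

def beam_levels(field, heading):
    """Simulated beam outputs: each beam sums its two ambiguous bins."""
    return np.array([field[i] + field[mirror(i, heading)] for i in range(N)])

def bar_reduce(field, headings):
    # Quiet beams (below-average level) exclude both of their ambiguous
    # directions as possible origins of high-intensity noise.
    excluded = np.zeros(N, dtype=bool)
    for h in headings:
        obs = beam_levels(field, h)
        for i in np.where(obs < obs.mean())[0]:
            excluded[i] = excluded[mirror(i, h)] = True
    # Loud beams attribute their level to the non-excluded direction of the
    # pair, or spread it over both when the constraints are not decisive.
    est = np.zeros(N)
    h = headings[0]
    obs = beam_levels(field, h)
    for i in np.where(obs >= obs.mean())[0]:
        m = mirror(i, h)
        if i > m:
            continue  # visit each ambiguous pair once
        if excluded[m] and not excluded[i]:
            est[i] += obs[i]
        elif excluded[i] and not excluded[m]:
            est[m] += obs[i]
        else:
            est[i] += obs[i] / 2  # not decisive: spread over both bins
            est[m] += obs[i] / 2

    return est

field = np.ones(N)
field[3] = 10.0                            # one strong source at bin 3
est = bar_reduce(field, headings=[0, 1])   # two array headings
print(int(np.argmax(est)))                 # the quiet-beam constraints pick bin 3
```

In this toy case the quiet beams of the second heading exclude the mirror bin, so the loud pair is resolved uniquely; with a single heading the same pair would have been split between both bins.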



DISCUSSION 



All linear processing methods are easily generalized from a single component of nth order to a complete Fourier series. This, in turn, is sufficient to characterize an arbitrary field. If more than two sets of observations are made, the solution will be overspecified when the measurements are totally free from error. In the presence of finite error, the redundant measurements can be used to reduce the expected error in the estimates. All of these analytic methods use assumptions about the field to be measured and the kind of measurement errors which will be encountered. These assumptions may be verifiable facts, but they may be chosen for simplicity and plausibility, or they may simply be wishful thinking.
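The redundancy argument can be made concrete with a small least-squares sketch. The truncation order, field coefficients, and noise level below are illustrative assumptions, not values from the report:

```python
import numpy as np

rng = np.random.default_rng(7)

# Design matrix for a truncated Fourier series a0 + sum(a_n cos + b_n sin).
def design(theta, order):
    cols = [np.ones_like(theta)]
    for n in range(1, order + 1):
        cols += [np.cos(n * theta), np.sin(n * theta)]
    return np.column_stack(cols)

order = 3                              # series truncated at n = 3: 7 coefficients
theta = rng.uniform(0, 2 * np.pi, 40)  # 40 observation directions (redundant)
coeffs_true = np.array([1.0, 0.5, -0.3, 0.2, 0.0, 0.1, -0.4])  # assumed field

A = design(theta, order)
obs = A @ coeffs_true + rng.normal(0, 0.1, theta.size)  # finite measurement error

# With 40 equations and 7 unknowns the system is overspecified; the
# least-squares solution uses the redundancy to average down the error.
coeffs_hat, *_ = np.linalg.lstsq(A, obs, rcond=None)
print(np.max(np.abs(coeffs_hat - coeffs_true)))  # small relative to the 0.1 noise
```

With error-free measurements any 7 of the 40 equations would already determine the coefficients; the remaining equations matter only because the errors are finite.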



We have proceeded stubbornly on the assumption that it is necessary to make an estimate of the function I(θ) which describes the distribution of noise in azimuth. But this may not be necessary. For example, suppose the task at hand is to validate a model of ambient noise distribution. If the model yields a computed value of I(θ), it is extremely simple to calculate the symmetric part of the model prediction by Eq. (1). An eminently satisfactory way of validating the model is to compare observations with the symmetric part of the prediction. If the symmetric part of the prediction is matched by a number of independent observations made with the array pointing in arbitrarily assigned directions, this will result in a convincing model validation.
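This comparison can be sketched in a few lines. As a stand-in for Eq. (1), the symmetric part is taken here to be the average of the model field and its left/right mirror image about the array axis; the model itself is a hypothetical example:

```python
import numpy as np

# Hypothetical azimuthal model: isotropic background plus a lobe near 60 deg.
def model(theta):
    return 1.0 + 2.0 * np.exp(-0.5 * ((np.degrees(theta) % 360 - 60) / 20) ** 2)

def symmetric_part(model, heading, thetas):
    """Stand-in for Eq. (1): average of I(theta) and its mirror image."""
    return 0.5 * (model(thetas) + model(2 * heading - thetas))

thetas = np.linspace(0, 2 * np.pi, 360, endpoint=False)
heading = np.radians(45.0)              # one arbitrarily assigned heading
pred = symmetric_part(model, heading, thetas)

# By construction the prediction is invariant under left/right mirroring,
# so it is directly comparable with ambiguous line-array observations.
print(np.allclose(pred, symmetric_part(model, heading, 2 * heading - thetas)))  # True
```

Repeating the comparison at several independent headings, each with its own symmetric part, is what makes the validation convincing without ever resolving the ambiguity.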



Similarly, experimental studies to support the need for narrower search beams or for adaptive beam 

 forming can be framed in such a way that it is not necessary to determine the noise field distribution 

 unambiguously. 



On the other hand, for most kinds of linear array design studies and many kinds of surveillance system performance studies, an approximation to the ambient noise field in all directions is necessary, and some general estimator like that in Eq. (18) is highly desirable.



Choice of error criteria depends on end use. If many measurements will be made, and the reduced estimates may be averaged, smoothed, or interpolated among, then a bias-free estimate is desirable, because averaging reduces the error below the bias. On the other hand, if only one measurement will be made, the minimization of error independent of bias may be appropriate. But why the mean-square error of Eq. (22)? Admittedly it is a simple and plausible error criterion, but the true criterion of performance of some system under investigation may be more closely represented by an error weighting other than the square.
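A small illustration of how the weighting can change which estimator looks best (the noise distribution and estimators here are illustrative, not from the report): the sample mean is the minimizer under squared-error weighting and the sample median under absolute-error weighting, and heavy-tailed noise separates the two criteria sharply.

```python
import numpy as np

rng = np.random.default_rng(42)
true_level = 5.0
# Heavy-tailed measurement noise: mostly small errors, occasional outliers.
samples = true_level + rng.standard_t(df=2, size=(2000, 15))

mean_est = samples.mean(axis=1)          # optimal under squared-error weighting
median_est = np.median(samples, axis=1)  # optimal under absolute-error weighting

mse = lambda e: np.mean((e - true_level) ** 2)
mae = lambda e: np.mean(np.abs(e - true_level))

# With heavy tails the squared-error criterion punishes the mean's
# occasional large misses far more than the absolute-error criterion does.
print(mse(mean_est), mse(median_est))
print(mae(mean_est), mae(median_est))
```

A system whose cost grows linearly (or saturates) with the error would therefore rank these estimators differently than Eq. (22) does.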



As shown in the references, it is difficult and expensive to attempt to describe the distribution of ambient noise completely, and we suspect it may be unnecessary. On the other hand, raw data from a linear array is ambiguous and may be of limited utility. A compromise is needed between the quality of ambient noise directionality specification useful in a particular application and the cost and complexity of measurement and analysis.



Most of the ambiguity in ambient noise directionality measurement can be removed with reasonable amounts of measurement and analysis. However, as residual ambiguity is reduced, improvement gets more and more difficult. Also, the lower bounds of random errors in the results, as distinguished from






