


INTENSITY FLUCTUATIONS 



of the median point and by the slope with which the 

 curve passes through the median point. The central 

 portion of an integrated distribution function gives, 

 de facto, no more information than is contained in the 

 statement of two parameters of the distribution, such 

as the mean and the standard deviation. The addi-

 tional information which is represented by the shapes

 of the two "tails" must frequently be discounted

 because of the small number of signals which de- 

 termine these shapes. 



It is true that the mere existence of a tail at the 

 high-amplitude end permits certain conclusions, al-

 though these conclusions are mostly negative. If 

 fluctuation were brought about exclusively by the 

 interference of two signals, each having a fixed ampli- 

 tude, then there should be a cutoff at an amplitude 

 equal to the algebraic sum of the amplitudes of the 

 components, corresponding to constructive inter- 

 ference. According to equation (25), for instance, 

 P(a) should reach its maximum of 1 at an amplitude 

 twice a₁. The fact that there is a percentage of ampli-

 tudes, however small, exceeding that value proves 

 that interference between two signals of equal ampli- 

 tude cannot be the only cause of fluctuation. 
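This cutoff argument can be checked with a quick Monte Carlo sketch. Everything in the example is assumed for illustration (the component amplitude a₁ = 1, the sample size, and all names are not from the report): the resultant of two equal-amplitude signals with a random phase difference never exceeds the algebraic sum 2a₁.

```python
import numpy as np

rng = np.random.default_rng(0)
a1 = 1.0                                # assumed fixed amplitude of each component
phi = rng.uniform(0.0, 2.0 * np.pi, size=100_000)   # random phase difference

# Resultant of two equal phasors a1*e^{i0} + a1*e^{i*phi}
# has amplitude 2*a1*|cos(phi/2)|, so it is capped at 2*a1.
a = 2.0 * a1 * np.abs(np.cos(phi / 2.0))

print(a.max() <= 2.0 * a1)              # constructive interference is the ceiling
```

Observing even a small percentage of amplitudes above 2a₁ in real data would therefore rule out this two-signal model, which is exactly the negative conclusion drawn in the text.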



The variability of fluctuation magnitudes, which 

 was touched on in Section 7.1.1, is reflected in the 

 variability of the observed amplitude distributions. 

 Even if large samples were processed consisting of 

 thousands of signals for each sample, there is every 

 reason to believe that their distribution functions 

 would differ appreciably. At the present stage of the 

 theory, the details of observed distribution functions 

 do not lend themselves readily to theoretical inter- 

 pretation. 



Additional plots of observed distributions can be 

 found in references 1 and 2, while additional theoreti- 

 cal distributions are discussed in a memorandum

 from HUSL.



7.1.3    Rapidity of Fluctuation



So far, we have discussed only the typical devia- 

 tions which individual signals show from the average. 

 In this section, we shall be concerned with the time 

 pattern of the fluctuation. Two sequences of signals 

 could have the same relative standard deviation of 

 amplitude, but could differ utterly in the nature of 

 their fluctuations. For example, in one sequence the 

 signal amplitudes might be distributed throughout 

 the sequence in random fashion, so that a small ampli- 

 tude signal is as likely to be followed by another small 



amplitude signal as by a large amplitude signal; while 

 in the other sequence, each signal amplitude might 

 be only slightly different from the amplitude of the 

 preceding or following signal. In the second sequence, 

 the total spread of amplitudes can be just as large 

 as in the first one, if a rising or falling tendency is 

 maintained through a number of consecutive signals. 

 The self-correlation coefficient is the mathematical 

 tool by means of which the time pattern of fluctua- 

 tion can be expressed in quantitative form. 
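The contrast between the two kinds of sequences can be made concrete with a short numerical sketch (the sequence length and step size are assumed values, not from the text): the very same set of amplitudes, taken once in slowly drifting order and once in randomly shuffled order, has identical spread but a very different mean squared difference between consecutive signals.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# A slowly drifting sequence: each signal differs only slightly
# from its predecessor (a normalized random walk).
smooth = np.cumsum(rng.normal(0.0, 0.1, size=n))
smooth = (smooth - smooth.mean()) / smooth.std()

# The same amplitudes in random order: a small signal is as likely
# to be followed by a large one as by another small one.
shuffled = rng.permutation(smooth)

print(smooth.std(), shuffled.std())          # identical spread

d_smooth = np.mean(np.diff(smooth) ** 2)     # consecutive differences: small
d_shuffled = np.mean(np.diff(shuffled) ** 2) # consecutive differences: large
print(d_smooth < d_shuffled)
```

The standard deviation alone cannot distinguish the two sequences; only a quantity built from consecutive pairs, such as the self-correlation coefficient introduced next, can.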



The Coefficient of Self-Correlation 



Let us consider a sequence of signals which are re- 

 ceived under apparently identical conditions. It is, of 

 course, conceivable that each signal is completely 

 unaffected by the strength of the preceding signal; 

 this would mean that the distribution function of all 

 those signals which follow immediately after signals 

 of intensity I₁ is identical with the distribution func-

 tion of all signals (without restriction). On the other 

 hand, it may be found that the signals immediately 

 following signals with the intensity I₁ have a distri-

 bution function which depends on the choice of I₁.

 Both of these situations seem to occur in practice. If

 the signals directly following those with intensity I₁

 tend to have intensities not too much different from

 I₁, it is said that, in the sequence considered, con-

 secutive signals have a positive correlation. 
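The conditional-distribution test just described can be sketched numerically. Everything in this example is a hypothetical assumption (the correlation coefficient 0.8, the sequence length, and the exponential transform used to produce positive intensities): in a positively correlated sequence, signals following a stronger-than-median signal are stronger on average than those following a weaker one.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
rho = 0.8                                  # assumed correlation of consecutive signals

# Generate consecutive values with built-in positive correlation (AR(1) scheme),
# then map to positive "intensities".
x = np.empty(n)
x[0] = 0.0
for k in range(1, n):
    x[k] = rho * x[k - 1] + rng.normal(0.0, np.sqrt(1.0 - rho**2))
intensity = np.exp(x)

# Split the successors according to the strength of the predecessor.
median = np.median(intensity)
strong_prev = intensity[:-1] > median
mean_after_strong = intensity[1:][strong_prev].mean()
mean_after_weak = intensity[1:][~strong_prev].mean()

print(mean_after_strong > mean_after_weak)  # positive correlation
```

For an uncorrelated sequence (rho = 0) the two conditional means would agree within sampling error, matching the first situation described above.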



In order to obtain some numerical measure for the 

 degree of correlation in a given sequence, we shall 

 compare the difference between two consecutive sig- 

 nals with the difference between two signals picked at 

 random. Focusing our attention on intensities, for 

 instance (we might as well consider amplitudes or 

 levels without changing the mathematics), we shall 

 compare the mean squared intensity difference be- 

 tween two signals chosen at random with the mean 

 squared intensity difference between a signal and its 

 immediate predecessor. We are then concerned with 

 the expression 



S_r = \overline{(I_n - I_m)^2} - \overline{(I_n - I_{n-1})^2} = 2\left(\overline{I_n I_{n-1}} - \bar{I}^2\right) ,   (26)



in which n and m are to be varied independently of 

 each other. The expression on the right-hand side can 

 be obtained as follows. We have 



\overline{(I_n - I_m)^2} = \overline{I_n^2} - 2\,\overline{I_n I_m} + \overline{I_m^2} ,

\overline{(I_n - I_{n-1})^2} = \overline{I_n^2} - 2\,\overline{I_n I_{n-1}} + \overline{I_{n-1}^2} .

In these expressions, all the squared-term averages, \overline{I_n^2}, \overline{I_m^2}, and \overline{I_{n-1}^2}, are equal and cancel each other. The



