If the distribution of ξ(t_1), ..., ξ(t_n) is identical to the distribution of
ξ(t_1 + τ), ..., ξ(t_n + τ) for all τ and all arbitrary choices of the
times t_1, ..., t_n, then the stochastic process ξ(t) is said to be stationary.
If only the first and second moments E[ξ] and E[ξ²] of the distributions
are equal, then the process is weakly stationary.
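The distinction can be checked numerically. The following sketch (an illustrative example not taken from the text; the AR(1) model, its coefficient a, and the sample sizes are assumptions) simulates a process started in its stationary distribution and verifies that the first moment E[ξ(t)] and the second moment E[ξ(t)ξ(t+τ)] do not depend on the starting time t:

```python
import random

random.seed(0)

def ar1_paths(n_paths, n_steps, a=0.5):
    """Simulate sample paths of xi(t+1) = a*xi(t) + noise, started in
    the stationary distribution so the process is (weakly) stationary."""
    sigma_stat = (1.0 / (1.0 - a * a)) ** 0.5  # stationary std deviation
    paths = []
    for _ in range(n_paths):
        x = random.gauss(0.0, sigma_stat)
        path = [x]
        for _ in range(n_steps - 1):
            x = a * x + random.gauss(0.0, 1.0)
            path.append(x)
        paths.append(path)
    return paths

paths = ar1_paths(20000, 30)

def mean_at(t):
    """Monte Carlo estimate of E[xi(t)]."""
    return sum(p[t] for p in paths) / len(paths)

def corr_at(t, tau):
    """Monte Carlo estimate of E[xi(t) xi(t+tau)]."""
    return sum(p[t] * p[t + tau] for p in paths) / len(paths)

# First and second moments at two different times agree up to
# Monte Carlo noise, as weak stationarity requires:
print(mean_at(5), mean_at(20))        # both near 0
print(corr_at(5, 3), corr_at(20, 3))  # both near a^3 * sigma_stat^2
```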
Our discussion of control systems has been limited to systems in 
which knowledge of the system at time t together with the governing 
equations suffices to describe its future evolution. Knowledge of the 
past when the present is given is superfluous relative to the future 
evolution of the system. The stochastic system analogy of this situation 
is the Markov property for random processes; these are stochastic processes
in which the past and future of the processes are conditionally
independent. In order to define a Markov process, the conditional 
probability and the transition probabilities have to be defined. The 
conditional probability P(A|B) is the probability that A will occur if B 
has occurred. Given a sequence of times t_1 < t_2 < ... < t_n < t, the
probability that ξ(t) ≤ x if the sample function ξ(·) has already taken
the values ξ(t_1), ξ(t_2), ..., ξ(t_n) is denoted by P(ξ(t) ≤ x | ξ(t_1), ...,
ξ(t_n)). A stochastic process is said to be a Markov process if

P(ξ(t) ≤ x | ξ(t_1), ..., ξ(t_n)) = P(ξ(t) ≤ x | ξ(t_n))
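The Markov property can be observed empirically. The sketch below (a hypothetical two-state chain; the transition probabilities 0.3 and 0.8 are assumptions for illustration) estimates the conditional distribution of the next state given both the past and the present, and shows it agrees with the distribution given the present alone:

```python
import random

random.seed(1)
P = {0: 0.3, 1: 0.8}  # assumed P[next state = 1 | current state]

def step(x):
    return 1 if random.random() < P[x] else 0

# Tally transitions conditioned on the pair (past, present).
n = 200000
x_prev, x_cur = 0, 0
count = {}   # (past, present) -> [visits, moves to state 1]
for _ in range(n):
    x_next = step(x_cur)
    c = count.setdefault((x_prev, x_cur), [0, 0])
    c[0] += 1
    c[1] += x_next
    x_prev, x_cur = x_cur, x_next

# P(next=1 | past=0, present=1) versus P(next=1 | past=1, present=1):
p01 = count[(0, 1)][1] / count[(0, 1)][0]
p11 = count[(1, 1)][1] / count[(1, 1)][0]
print(p01, p11)  # both near 0.8: the past adds nothing given the present
```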
The transition probability distribution F(x, t | y, s) is defined by
F(x, t | y, s) = P(ξ(t) ≤ x | ξ(s) = y). If a stochastic process is a Markov
process, its finite-dimensional distribution functions are given by

F(x_1, ..., x_n; t_1, ..., t_n)
    = F(x_1, t_1) F(x_2, t_2 | x_1, t_1) ··· F(x_n, t_n | x_{n-1}, t_{n-1})
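For a chain with finitely many states the same factorization holds for probability masses, p(x_1) p(x_2 | x_1) ··· p(x_n | x_{n-1}), and can be checked directly. This sketch (a hypothetical two-state chain; the initial distribution and transition matrix are assumptions) compares the factorized joint probability at three successive times against a direct simulation of the chain:

```python
import itertools
import random

random.seed(2)
p0 = [0.5, 0.5]                    # assumed initial distribution
T = [[0.7, 0.3], [0.4, 0.6]]       # assumed T[i][j] = P[next=j | current=i]

def joint(x1, x2, x3):
    """Factorized joint probability p(x1) p(x2|x1) p(x3|x2)."""
    return p0[x1] * T[x1][x2] * T[x2][x3]

# The factorized joint is a genuine probability distribution:
total = sum(joint(*xs) for xs in itertools.product([0, 1], repeat=3))
print(total)  # ≈ 1

# And it matches a direct Monte Carlo simulation of the chain:
n, hits = 100000, 0
for _ in range(n):
    x1 = 0 if random.random() < p0[0] else 1
    x2 = 0 if random.random() < T[x1][0] else 1
    x3 = 0 if random.random() < T[x2][0] else 1
    hits += (x1, x2, x3) == (0, 1, 0)
print(hits / n, joint(0, 1, 0))  # both near 0.5 * 0.3 * 0.4 = 0.06
```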