This results from an application of Bayes' rule. A Markov process is 
thus defined by two functions, the absolute probability distribution 
F(x, t) and the transition probabilities F(x, t|y, s). 
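The role of these two functions can be illustrated with a discrete-state sketch: the transition probabilities propagate the absolute distribution forward in time. The two-state chain and matrix P below are hypothetical choices for illustration only, not values from the text.

```python
import numpy as np

# Hypothetical two-state Markov chain.  P plays the role of the
# transition probabilities F(x, t | y, s); p is the absolute
# distribution F(x, t) at the current step.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])   # P[i, j] = Pr(next state = j | current state = i)

p = np.array([1.0, 0.0])     # absolute distribution at t = 0

# Propagate the marginal distribution: p_{t+1} = p_t P.
for t in range(50):
    p = p @ P

print(p)                     # approaches the stationary distribution [0.8, 0.2]
```

Knowing p at one instant together with P determines the distribution at every later instant, which is exactly the sense in which the two functions define the Markov process.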
Consider a system with the following dynamic equation: 
dx/dt = f(x, t) + ε w(t) (4.1) 
where € is a small parameter and w is a stochastic process. Since w is 
stochastic, the state of the system x will also be stochastic; thus, we 
are interested in solving stochastic differential equations. Further-
more, our interest is not with a particular sample function x(·), which 
is a particular description of the state of the system during one run 
through the process; our interest is with the statistical properties of 
the stochastic process x(t). 
Consider the linear stochastic differential equation 
dx = A x dt + dw (4.2) 
where w is a stochastic process. In order to make some progress in 
finding the statistical properties of x, assume that w is a Wiener 
process. 
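A sample path of equation (4.2) can be simulated with the Euler-Maruyama scheme, using independent Gaussian increments with variance dt for the Wiener process w. The scalar A, initial state, step size, and horizon below are illustrative assumptions, not values from the text.

```python
import numpy as np

# Sketch: one sample path of dx = A x dt + dw (eq. 4.2), scalar case,
# via Euler-Maruyama.  A, x0, dt, and n are illustrative choices.
rng = np.random.default_rng(0)

A, x0 = -1.0, 1.0
dt, n = 1e-3, 1000           # integrate over t in [0, 1]

x = x0
for _ in range(n):
    dw = rng.normal(0.0, np.sqrt(dt))   # Wiener increment: mean 0, variance dt
    x = x + A * x * dt + dw

print(x)   # one realization of x(1); other seeds give other sample paths
```

A single run produces one sample function x(·); the statistical properties of x(t) discussed above would be estimated by averaging over many such runs.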
A Wiener process is a Markov process which satisfies the following 
conditions: 
1. It is a second order process; that is, for all t 
E{w²(t)} < ∞ 
Hence, the mean m(t) exists as well as the covariance function 
r(s, t) = cov [w(t), w(s)] 
