I. Basic math.
 1 Conditional probability.
 2 Normal distribution.
 3 Brownian motion.
 4 Poisson process.
 5 Ito integral.
 6 Ito calculus.
 7 Change of measure.
 8 Girsanov's theorem.
 A. Change of measure-based verification of Girsanov's theorem statement.
 B. Direct proof of Girsanov's theorem.
 9 Forward Kolmogorov's equation.
 10 Backward Kolmogorov's equation.
 11 Optimal control, Bellman equation, Dynamic programming.
 II. Pricing and Hedging.
 III. Explicit techniques.
 IV. Data Analysis.
 V. Implementation tools.
 VI. Basic Math II.
 VII. Implementation tools II.
 VIII. Bibliography
 Notation. Index. Contents.

## Direct proof of Girsanov's theorem.

We follow the setup of the two previous sections ( Girsanov setup ) and ( Girsanov change of measure ) but give a direct derivation of the expression for the changed probability measure. We consider a "small time interval" version of the formula ( Change of measure ):

$$E^{\tilde{\mathbb{P}}}\left[\,f(\Delta\tilde W)\,\big|\,\mathcal{F}_t\,\right]=E^{\mathbb{P}}\left[\,\frac{\rho_{t+\delta t}}{\rho_t}\,f(\Delta\tilde W)\,\big|\,\mathcal{F}_t\,\right].$$

The last formula is the equation that we need to satisfy by selecting a proper process $\rho_t$. The $\rho_t$, $\theta_t$, $W_t$ are deterministic functions from the point of view of $\mathcal{F}_t$. The $\Delta W=W_{t+\delta t}-W_t$ and $\Delta\tilde W=\Delta W+\theta_t\,\delta t$ are random variables from the point of view of $\mathcal{F}_t$. These variables represent a small change over the time interval $[t,t+\delta t]$. The distribution of $\Delta W$ is known, $\Delta W\sim N(0,\delta t\cdot I)$, for $W_t$ a column of iid standard Brownian motions in the "original measure" $\mathbb{P}$. The distribution of $\Delta\tilde W$ is what we are trying to find from the above equation. We seek $\rho_{t+\delta t}$ as a variable adapted to the filtration generated by $W_t$. Hence, we express the last "original measure" expectation as follows

$$E^{\mathbb{P}}\left[\,\frac{\rho_{t+\delta t}}{\rho_t}\,f(\Delta\tilde W)\,\big|\,\mathcal{F}_t\,\right]=\int_{\mathbb{R}^n}\frac{\rho(x)}{\rho_t}\,f(x+\theta_t\,\delta t)\,(2\pi\,\delta t)^{-n/2}\,e^{-\frac{|x|^2}{2\,\delta t}}\,dx,$$

where $n$ is the dimensionality of $W_t$. In the last integral the $x$ is the integration parameter over all possible values of $\Delta W$, and $\rho(x)$ is the value of $\rho_{t+\delta t}$ as a function of $\Delta W=x$ because $\rho_{t+\delta t}$ is $\mathcal{F}_{t+\delta t}$-adapted.

The last integral is supposed to be equal to $E^{\tilde{\mathbb{P}}}[\,f(\Delta\tilde W)\,|\,\mathcal{F}_t\,]$ for any smooth and sharply decaying function $f$. We seek $\rho$ such that the $\tilde W_t$ would be a standard Brownian motion in the new (changed) measure:

$$E^{\tilde{\mathbb{P}}}\left[\,f(\Delta\tilde W)\,\big|\,\mathcal{F}_t\,\right]=\int_{\mathbb{R}^n}f(y)\,(2\pi\,\delta t)^{-n/2}\,e^{-\frac{|y|^2}{2\,\delta t}}\,dy.$$

Therefore we need the former and the latter integrals to be equal. This gives us a recipe for construction of the $\rho_t$.

We introduce the convenience notation

$$\phi(x)=(2\pi\,\delta t)^{-n/2}\,e^{-\frac{|x|^2}{2\,\delta t}},\qquad g(x)=\frac{\rho(x)}{\rho_t},$$

and state that we are seeking the $g$ satisfying

$$\int_{\mathbb{R}^n}g(x)\,f(x+\theta_t\,\delta t)\,\phi(x)\,dx=\int_{\mathbb{R}^n}f(y)\,\phi(y)\,dy$$

for any smooth $f$. To make conclusions we need to have the same expression as an argument of $f$ in both integrals. Hence, we make the change of variable $y=x+\theta_t\,\delta t$ in the right integral. It becomes

$$\int_{\mathbb{R}^n}f(x+\theta_t\,\delta t)\,\phi(x+\theta_t\,\delta t)\,dx.$$

Therefore, $g$ has to satisfy

$$\int_{\mathbb{R}^n}g(x)\,f(x+\theta_t\,\delta t)\,\phi(x)\,dx=\int_{\mathbb{R}^n}f(x+\theta_t\,\delta t)\,\phi(x+\theta_t\,\delta t)\,dx$$

for any smooth $f$. Hence, $g$ must satisfy

$$g(x)\,\phi(x)=\phi(x+\theta_t\,\delta t)$$

or

$$g(x)=\frac{\phi(x+\theta_t\,\delta t)}{\phi(x)}=\exp\!\left(-\theta_t^{T}x-\frac{1}{2}|\theta_t|^2\,\delta t\right).$$

Changing to the original variable $x=\Delta W$ we obtain

$$\frac{\rho_{t+\delta t}}{\rho_t}=\exp\!\left(-\theta_t^{T}\Delta W-\frac{1}{2}|\theta_t|^2\,\delta t\right)$$

or, in the original notation, expanding the exponential to the first order in $\delta t$ and using $\left(\theta_t^{T}\Delta W\right)^2\to|\theta_t|^2\,\delta t$,

$$d\rho_t=-\rho_t\,\theta_t^{T}\,dW_t.$$

Using the above SDE and the boundary condition $\rho_0=1$ we conclude

$$\rho_t=\exp\!\left(-\int_0^t\theta_s^{T}\,dW_s-\frac{1}{2}\int_0^t|\theta_s|^2\,ds\right).$$

 (Girsanov kernel)
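The one-step argument above can be sanity-checked numerically: weighting samples of $\Delta W\sim N(0,\delta t)$ by the one-step kernel $\exp(-\theta\,\Delta W-\frac{1}{2}\theta^2\,\delta t)$ should make the shifted increment $\Delta\tilde W=\Delta W+\theta\,\delta t$ behave like a $N(0,\delta t)$ variable. The following sketch does this in one dimension; the numerical values of $\theta$, $\delta t$ and the sample size are illustrative choices, not part of the text.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, delta, n = 0.7, 0.01, 2_000_000  # drift, time step, sample count (illustrative)

dW = rng.normal(0.0, np.sqrt(delta), n)             # increment under the original measure
rho = np.exp(-theta * dW - 0.5 * theta**2 * delta)  # one-step Girsanov kernel
dWt = dW + theta * delta                            # increment of the shifted process

# Weighted moments of dWt should match those of a N(0, delta) variable.
print(rho.mean())             # ~1: the kernel has expectation one (it is a density)
print((rho * dWt).mean())     # ~0: zero mean under the new measure
print((rho * dWt**2).mean())  # ~delta: variance delta under the new measure
```

The check that `rho.mean()` is close to one confirms that the kernel is a valid change-of-measure density; the other two lines confirm the first two moments of $\Delta\tilde W$ under the changed measure.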

We summarize with the following statement.

Theorem

(Girsanov's theorem) Let $\tilde W_t=W_t+\int_0^t\theta_s\,ds$, where the $W_t$ is a column of $\mathcal{F}_t$-adapted iid standard Brownian motions with respect to some $\mathcal{F}_t$-given probability measure $\mathbb{P}$ and $\theta_t$ is an adapted integrable process. Then $\tilde W_t$ is a standard Brownian motion with respect to the "changed" probability $\tilde{\mathbb{P}}$ given by the expectation $E^{\tilde{\mathbb{P}}}\left[X\right]=E^{\mathbb{P}}\left[\rho_T X\right]$, see ( Definition_of_change_of_measure ). The $\rho_t$ is given by the formula ( Girsanov kernel ).
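The theorem statement can be illustrated on discretized paths with a time-varying drift. In the sketch below we take $\theta_s=\sin(s)$ as an example choice (any deterministic integrable process would do), approximate the kernel's stochastic integral by a Riemann sum over the Brownian increments, and check that the reweighted terminal value $\tilde W_T$ has the moments of a standard Brownian motion at time $T$. Horizon, step count and path count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
T, steps, paths = 1.0, 200, 500_000  # horizon and discretization (illustrative)
dt = T / steps
t = np.arange(steps) * dt
theta = np.sin(t)                    # example time-varying drift process

dW = rng.normal(0.0, np.sqrt(dt), (paths, steps))  # Brownian increments under P
# Discretized Girsanov kernel: rho_T = exp(-int theta dW - 0.5 int theta^2 ds)
rho_T = np.exp(-(dW * theta).sum(axis=1) - 0.5 * (theta**2).sum() * dt)
# Terminal value of the shifted process: tilde W_T = W_T + int_0^T theta_s ds
Wt_T = dW.sum(axis=1) + theta.sum() * dt

print((rho_T * Wt_T).mean())     # ~0: tilde W_T has zero mean under the changed measure
print((rho_T * Wt_T**2).mean())  # ~T: and variance T, as a standard BM should
```

Per the theorem, expectations under the changed measure $\tilde{\mathbb{P}}$ are computed as $\mathbb{P}$-expectations weighted by $\rho_T$, which is exactly what the weighted means above implement.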
