Taylor expansions for the moments of functions of random variables
In probability theory, it is possible to approximate the moments of a function f of a random variable X using Taylor expansions, provided that f is sufficiently differentiable and that the moments of X are finite.
First moment
Given $\mu_X$ and $\sigma_X^2$, the mean and variance of X respectively, a second-order Taylor expansion of f(X) about $\mu_X$ gives

$$E[f(X)] \approx E\left[f(\mu_X) + f'(\mu_X)(X - \mu_X) + \frac{f''(\mu_X)}{2}(X - \mu_X)^2\right].$$

Since $E[X - \mu_X] = 0$, the second term vanishes. Also, $E[(X - \mu_X)^2]$ is $\sigma_X^2$. Therefore,

$$E[f(X)] \approx f(\mu_X) + \frac{f''(\mu_X)}{2}\sigma_X^2,$$

where $\mu_X$ and $\sigma_X^2$ are the mean and variance of X respectively.[1]
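As an illustrative numerical check (not part of the original derivation), the Python sketch below compares this second-order approximation of $E[f(X)]$ with a Monte Carlo estimate. The choice $f(x) = e^x$, the normal distribution for X, and its parameters are assumptions made for the example; for this particular choice the exact value $e^{\mu_X + \sigma_X^2/2}$ is also known and serves as a reference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example: f(x) = exp(x), X ~ Normal(mu, sigma)
mu, sigma = 0.5, 0.2
f = np.exp          # for f = exp, f'' = exp as well
f_second = np.exp

# Second-order Taylor approximation: E[f(X)] ~ f(mu) + f''(mu)/2 * sigma^2
approx = f(mu) + 0.5 * f_second(mu) * sigma**2

# Monte Carlo estimate for comparison
samples = rng.normal(mu, sigma, size=1_000_000)
mc = f(samples).mean()

print(f"Taylor approximation:   {approx:.6f}")
print(f"Monte Carlo estimate:   {mc:.6f}")
print(f"Exact (lognormal mean): {np.exp(mu + sigma**2 / 2):.6f}")
```

For this choice of f, the approximation agrees with the exact value to about three decimal places, since the neglected higher-order terms scale with $\sigma_X^4$.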
It is possible to generalize this to functions of more than one variable using multivariate Taylor expansions. For example,

$$E\left[\frac{X}{Y}\right] \approx \frac{E[X]}{E[Y]} - \frac{\operatorname{cov}(X,Y)}{E[Y]^2} + \frac{E[X]}{E[Y]^3}\operatorname{var}(Y).$$
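The following sketch illustrates the ratio approximation numerically. The joint normal distribution, its parameters, and the sample size are illustrative assumptions, with $E[Y]$ kept well away from zero so that the ratio is numerically well behaved in the sampled range.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed example: (X, Y) jointly normal, E[Y] far from zero
mean = np.array([2.0, 5.0])
cov = np.array([[1.0, 0.3],
                [0.3, 0.5]])
X, Y = rng.multivariate_normal(mean, cov, size=1_000_000).T

EX, EY = mean
var_Y = cov[1, 1]
cov_XY = cov[0, 1]

# Multivariate Taylor approximation of E[X/Y]
approx = EX / EY - cov_XY / EY**2 + EX * var_Y / EY**3

print(f"Taylor approximation: {approx:.6f}")
print(f"Monte Carlo estimate: {(X / Y).mean():.6f}")
```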
Second moment
Similarly,[1]

$$\operatorname{var}[f(X)] \approx \left(f'(E[X])\right)^2 \operatorname{var}[X] = \left(f'(\mu_X)\right)^2 \sigma_X^2.$$
The above uses a first-order approximation, unlike the second-order expansion used in estimating the first moment. It will be a poor approximation in cases where $f(X)$ is highly non-linear. This is a special case of the delta method. For example,

$$\operatorname{var}\left[\frac{X}{Y}\right] \approx \frac{\operatorname{var}[X]}{E[Y]^2} - \frac{2E[X]}{E[Y]^3}\operatorname{cov}(X,Y) + \frac{E[X]^2}{E[Y]^4}\operatorname{var}[Y].$$
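As with the first moment, the delta-method variance of the ratio can be checked numerically. The sketch below reuses the same illustrative joint normal setup as above; all parameters are assumptions for the example, not part of the source.

```python
import numpy as np

rng = np.random.default_rng(2)

# Same assumed setup as the first-moment ratio example
mean = np.array([2.0, 5.0])
cov = np.array([[1.0, 0.3],
                [0.3, 0.5]])
X, Y = rng.multivariate_normal(mean, cov, size=1_000_000).T

EX, EY = mean
var_X, var_Y = cov[0, 0], cov[1, 1]
cov_XY = cov[0, 1]

# First-order (delta method) approximation of var(X/Y)
approx = (var_X / EY**2
          - 2 * EX * cov_XY / EY**3
          + EX**2 * var_Y / EY**4)

print(f"Delta-method approximation: {approx:.6f}")
print(f"Monte Carlo estimate:       {(X / Y).var():.6f}")
```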
The second-order approximation, when X follows a normal distribution, is:[2]

$$\operatorname{var}[f(X)] \approx \left(f'(E[X])\right)^2 \operatorname{var}[X] + \frac{\left(f''(E[X])\right)^2}{2}\left(\operatorname{var}[X]\right)^2.$$
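A final sketch compares the first- and second-order variance approximations for a normal X, again with the assumed choice $f(x) = e^x$, for which the exact (lognormal) variance $(e^{\sigma_X^2} - 1)e^{2\mu_X + \sigma_X^2}$ is available as a reference.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed example: f(x) = exp(x), X ~ Normal(mu, sigma)
mu, sigma = 0.5, 0.2
var_X = sigma**2

# For f = exp, f'(mu) = f''(mu) = exp(mu)
fp = fpp = np.exp(mu)

first_order = fp**2 * var_X
second_order = fp**2 * var_X + 0.5 * fpp**2 * var_X**2

samples = rng.normal(mu, sigma, size=1_000_000)
print(f"First-order approximation:  {first_order:.6f}")
print(f"Second-order approximation: {second_order:.6f}")
print(f"Monte Carlo estimate:       {np.exp(samples).var():.6f}")
print(f"Exact lognormal variance:   {(np.exp(var_X) - 1) * np.exp(2*mu + var_X):.6f}")
```

The second-order term noticeably closes the gap to the exact value, consistent with the remark above that the first-order formula degrades as f becomes more non-linear.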
Notes
- Benaroya, Haym; Han, Seon Mi; Nagurka, Mark (2005). Probability Models in Engineering and Science. CRC Press.
- Hendeby, Gustaf; Gustafsson, Fredrik. "On Nonlinear Transformations of Gaussian Distributions" (PDF). Retrieved 5 October 2017.
Further reading
- Wolter, Kirk M. (1985). "Taylor Series Methods". Introduction to Variance Estimation. New York: Springer. pp. 221–247. ISBN 0-387-96119-4.