The n-th moment of a real-valued continuous function f(x) of a real variable about a value c is the integral:
$$m_{n}=\int _{-\infty }^{\infty }(x-c)^{n}\,f(x)\,\mathrm {d} x$$Sometimes this integral is difficult to evaluate, even for an analytic function. The question, then, is whether the moments can be obtained without computing the integral directly.
The solution was originally derived from the interpolation of a function with polynomials using difference methods.
Assume the function is sampled at equally spaced points $x_i = x_0 + ih$, with values $y_i = f(x_i)$ and forward differences $\Delta^k y_0$.
The Newton interpolation polynomial is a linear combination of Newton basis polynomials:
\begin{align} f(x) &= y_0 + \frac {x-x_0} {h} \left( \Delta^1y_0 + \frac {x-x_0-h} {2h}\left(\Delta^2y_0 + \cdots \right) \right) \\ &= y_0 + \sum_{k=1}^n \frac{\Delta^ky_0}{k!h^k} \prod_{i=0}^{k-1} (x-x_0-ih) \\ &= y_0 + \sum_{k=1}^n \frac{\Delta^ky_0}{k!} \prod_{i=0}^{k-1} \left(\frac{x-x_0}{h}-i\right) \\ &= \sum_{k=0}^n {\frac{x-x_0}{h} \choose k}~ \Delta^k y_0 \\ \end{align}Taylor proved that when the spacing $x_{i+1}-x_{i} = h = \Delta x$ approaches $0$, this yields the Taylor series:
\begin{aligned}f(x)&=f(a)+\lim _{h\to 0}\sum _{k=1}^{\infty }{\frac {\Delta _{h}^{k}[f](a)}{k!h^{k}}}\prod _{i=0}^{k-1}((x-a)-ih)\\&=f(a)+\sum _{k=1}^{\infty }{\frac {d^{k}}{dx^{k}}}f(a){\frac {(x-a)^{k}}{k!}}\\\end{aligned}The Taylor series of a real- or complex-valued function f(x) that is infinitely differentiable at a real or complex number a (when the series converges to the function, f is called analytic) is the power series:
$f(x)=f(a)+{\frac {f'(a)}{1!}}(x-a)+{\frac {f''(a)}{2!}}(x-a)^{2}+{\frac {f'''(a)}{3!}}(x-a)^{3}+\cdots =\sum _{n=0}^{\infty }{\frac {f^{(n)}(a)}{n!}}(x-a)^{n}$
With a = 0, the Maclaurin series takes the form:
$f(x)\vert_{a = 0} = f(0)+{\frac {f'(0)}{1!}}x+{\frac {f''(0)}{2!}}x^{2}+{\frac {f'''(0)}{3!}}x^{3}+\cdots +{\frac {f^{(n)}(0)}{n!}}x^{n}=\sum _{n=0}^{\infty }{\frac {f^{(n)}(0)}{n!}}x^{n}$
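The forward-difference construction behind this series can be checked numerically. A minimal sketch, assuming equally spaced samples and a cubic test function (the names `forward_diffs` and `newton_forward` are illustrative, not from the source):

```python
def forward_diffs(ys):
    """Return [y0, Δ^1 y0, Δ^2 y0, ...] for equally spaced samples ys."""
    diffs, row = [ys[0]], list(ys)
    for _ in range(len(ys) - 1):
        row = [b - a for a, b in zip(row, row[1:])]
        diffs.append(row[0])
    return diffs

def newton_forward(x, x0, h, ys):
    """Evaluate the Newton forward-difference polynomial at x."""
    s = (x - x0) / h
    total, coeff = 0.0, 1.0  # coeff accumulates s(s-1)...(s-k+1)/k!
    for k, d in enumerate(forward_diffs(ys)):
        total += coeff * d
        coeff *= (s - k) / (k + 1)
    return total

# Four equally spaced samples of f(x) = x**3: the degree-3 Newton
# polynomial reproduces a cubic exactly between the nodes.
ys = [x ** 3 for x in (0.0, 1.0, 2.0, 3.0)]
print(newton_forward(1.5, 0.0, 1.0, ys))  # exact for a cubic: 1.5**3 = 3.375
```

The `coeff` recurrence evaluates the binomial coefficient ${s \choose k}$ for non-integer $s$, matching the last line of the derivation above.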
Given:
Let $f(x)$ be the exponential function $e^{x}$; then the Taylor series of $f(x)$ about $a$ is:
$f(x)=e^{x}=\frac{e^a}{0!}(x-a)^0+\frac{e^a}{1!}(x-a)^1+\frac{e^a}{2!}(x-a)^2+\frac{e^a}{3!}(x-a)^3+ \cdots +\frac{e^a}{n!}(x-a)^n=\sum _{n=0}^{\infty }\frac {e^a}{n!}(x-a)^{n}$
With a = 0, the Maclaurin series takes the form:
$f(x)=e^{x}\vert_{a=0}=\frac{e^0}{0!}x^0+\frac{e^0}{1!}x^1+\frac{e^0}{2!}x^2+\frac{e^0}{3!}x^3+ \cdots +\frac{e^0}{n!}x^n=\frac{1}{0!}x^0+\frac{1}{1!}x^1+\frac{1}{2!}x^2+\frac{1}{3!}x^3+ \cdots +\frac{1}{n!}x^n=\sum _{n=0}^{\infty }\frac {1}{n!}x^{n}$
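A minimal numeric check of this Maclaurin series, truncated at a finite number of terms (`exp_maclaurin` is an illustrative name):

```python
from math import exp, factorial

def exp_maclaurin(x, terms=25):
    """Partial sum of the Maclaurin series for e^x: sum of x**n / n!."""
    return sum(x ** n / factorial(n) for n in range(terms))

print(exp_maclaurin(1.0))             # ≈ e ≈ 2.718281828...
print(exp_maclaurin(2.0) - exp(2.0))  # truncation error, essentially 0
```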
Given:
Let $f(x)=e^{tx}$, the exponential function with a fixed parameter $t$; then the Taylor series of $f(x)$ about $a$ is:
$f(x)=e^{tx}=\frac{t^0 e^{ta}}{0!}(x-a)^0+\frac{t^1 e^{ta}}{1!}(x-a)^1+\frac{t^2 e^{ta}}{2!}(x-a)^2+\frac{t^3 e^{ta}}{3!}(x-a)^3+ \cdots +\frac{t^n e^{ta}}{n!}(x-a)^n=\sum _{n=0}^{\infty }\frac {t^n e^{ta}}{n!}(x-a)^{n}$
With a = 0, the Maclaurin series takes the form:
$f(x)=e^{tx}\vert_{a=0}=\frac{t^0 e^0}{0!}x^0+\frac{t^1 e^0}{1!}x^1+\frac{t^2 e^0}{2!}x^2+\frac{t^3 e^0}{3!}x^3+ \cdots +\frac{t^n e^0}{n!}x^n=\frac{t^0}{0!}x^0+\frac{t^1}{1!}x^1+\frac{t^2}{2!}x^2+\frac{t^3}{3!}x^3+ \cdots +\frac{t^n}{n!}x^n=\sum _{n=0}^{\infty }\frac {t^n}{n!}x^{n}$
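The same expansion, now viewed as a power series in $t$ at a fixed $x$, can be checked with a running-term recurrence (a sketch; `exp_tx_series` is an illustrative name):

```python
from math import exp

def exp_tx_series(t, x, terms=30):
    """Evaluate the series sum of t**n * x**n / n! term by term."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= t * x / (n + 1)  # next term: multiply by t*x, divide by n+1
    return total

print(exp_tx_series(0.5, 3.0))  # ≈ exp(0.5 * 3.0) = exp(1.5)
```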
Given:
Replace $x$ with a random variable $X$ that has a probability density function; then, taking expectations of the Taylor series term by term, $E[f(X)]$ is:
$E[f(X)]=E[e^{tX}]=E[\frac{t^0 e^{ta}}{0!}(X-a)^0]+E[\frac{t^1 e^{ta}}{1!}(X-a)^1]+E[\frac{t^2 e^{ta}}{2!}(X-a)^2]+E[\frac{t^3 e^{ta}}{3!}(X-a)^3]+ \cdots +E[\frac{t^n e^{ta}}{n!}(X-a)^n]=\sum _{n=0}^{\infty }E[\frac {t^n e^{ta}}{n!}(X-a)^{n}]$
With a = 0, the Maclaurin series takes the form:
$E[f(X)]=E[e^{tX}]\vert_{a=0}=\frac{t^0}{0!}E[X^0]+\frac{t^1}{1!}E[X^1]+\frac{t^2}{2!}E[X^2]+\frac{t^3}{3!}E[X^3]+ \cdots +\frac{t^n}{n!}E[X^n]=\sum _{n=0}^{\infty }\frac {t^n}{n!}E[X^{n}]$
Let $E[X^{n}]=m_{n}$, the n-th moment about the origin, which yields:
$$E[f(X)]=E[e^{tX}]\vert_{a=0}=\frac{t^0}{0!}m_0+\frac{t^1}{1!}m_1+\frac{t^2}{2!}m_2+\frac{t^3}{3!}m_3+ \cdots +\frac{t^n}{n!}m_n=\sum _{n=0}^{\infty }\frac {t^n}{n!}m_n$$The above derivation shows that $E[f(X)]=\sum_{n=0}^{\infty } \frac{t^n}{n!} m_n$ is a function of the moments about the origin. Moreover, taking the k-th derivative of $E[f(X)]$ with respect to $t$ and evaluating at $t=0$ isolates the k-th moment: lower-order terms vanish under differentiation, and higher-order terms retain a factor of $t$. This motivates the following definition:
Let $X$ be a random variable with CDF $F_{X}$. The moment generating function (mgf) of $F_{X}$, denoted by $M_{X}(t)$, is
$M_{X}(t):= E[e^{tX}]$
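Both facts above — the mgf equals the moment series, and its derivatives at $t=0$ recover the moments — can be checked on a small discrete example. A minimal sketch, assuming $X$ is a fair six-sided die (a made-up example; `m` and `M` are illustrative names):

```python
from math import exp, factorial

# X is a fair six-sided die, so moments and the mgf are finite sums.
values, p = [1, 2, 3, 4, 5, 6], 1 / 6

def m(n):
    """n-th moment about the origin, E[X**n]."""
    return sum(p * x ** n for x in values)

def M(t):
    """Moment generating function, E[exp(t*X)]."""
    return sum(p * exp(t * x) for x in values)

# 1) M_X(t) agrees with the truncated moment series sum of t**n * m_n / n!.
t = 0.3
series = sum(t ** n / factorial(n) * m(n) for n in range(40))
print(M(t), series)  # the two values coincide

# 2) Numeric derivatives of M_X at t = 0 recover the moments.
h = 1e-4
print((M(h) - M(-h)) / (2 * h), m(1))            # ≈ E[X] = 3.5
print((M(h) - 2 * M(0) + M(-h)) / h ** 2, m(2))  # ≈ E[X**2] = 91/6
```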
Given:
Using $M_{X}(t)$ to calculate the k-th moment about the origin:
$$E[X^k]=\frac{d^{k}M_{X}(t)}{dt^k} \Big\vert_{t=0} = \frac{d^{k}}{dt^k}\int _{-\infty }^{\infty }e^{tx}f_{X}(x)\,dx \,\Big\vert_{t=0}$$Suppose that the problem is to estimate k unknown parameters $\theta _{1},\theta _{2},\dots ,\theta _{k}$ characterizing the distribution $f_{W}(w;\theta )$ of the random variable $W$.
Suppose the first k moments of the true distribution (the "population moments") can be expressed as functions of the $\theta$ s: ${\begin{aligned}\mu _{1}&\equiv \operatorname {E} [W]=g_{1}(\theta _{1},\theta _{2},\ldots ,\theta _{k}),\\[4pt]\mu _{2}&\equiv \operatorname {E} [W^{2}]=g_{2}(\theta _{1},\theta _{2},\ldots ,\theta _{k}),\\&\,\,\,\vdots \\\mu _{k}&\equiv \operatorname {E} [W^{k}]=g_{k}(\theta _{1},\theta _{2},\ldots ,\theta _{k}).\end{aligned}}$
Suppose a sample of size n is drawn, resulting in the values $w_1, \dots, w_n$. For $j=1, \dots,k$, let: ${\widehat {\mu }}_{j}={\frac {1}{n}}\sum _{i=1}^{n}w_{i}^{j}$ be the j-th sample moment, an estimate of $\mu_{j}$. The method of moments estimator for $\theta _{1},\theta _{2},\ldots ,\theta _{k}$ denoted by ${\widehat {\theta }}_{1},{\widehat {\theta }}_{2},\dots ,{\widehat {\theta }}_{k}$ is defined as the solution (if there is one) to the equations:
${\begin{aligned}{\widehat {\mu }}_{1}&=g_{1}({\widehat {\theta }}_{1},{\widehat {\theta }}_{2},\ldots ,{\widehat {\theta }}_{k}),\\[4pt]{\widehat {\mu }}_{2}&=g_{2}({\widehat {\theta }}_{1},{\widehat {\theta }}_{2},\ldots ,{\widehat {\theta }}_{k}),\\&\,\,\,\vdots \\{\widehat {\mu }}_{k}&=g_{k}({\widehat {\theta }}_{1},{\widehat {\theta }}_{2},\ldots ,{\widehat {\theta }}_{k}).\end{aligned}}$
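As a concrete instance of these equations, a minimal sketch for a Normal$(\mu, \sigma^2)$ model, where $g_1(\mu,\sigma^2)=\mu$ and $g_2(\mu,\sigma^2)=\mu^2+\sigma^2$, so the system solves in closed form (the sample values below are made up for illustration):

```python
# Hypothetical sample drawn from a Normal(mu, sigma^2) population.
sample = [4.9, 5.3, 4.7, 5.1, 5.0, 4.8, 5.2, 5.0]
n = len(sample)

# Sample moments: mu_hat_j = (1/n) * sum of w_i**j.
m1 = sum(w for w in sample) / n
m2 = sum(w ** 2 for w in sample) / n

# Solve g1 = m1 and g2 = m2 for the parameters:
# mu = m1, and sigma^2 = m2 - m1**2.
mu_hat = m1
sigma2_hat = m2 - m1 ** 2
print(mu_hat, sigma2_hat)
```

Note that the method-of-moments variance estimate is the biased (divide-by-n) sample variance, which is consistent but not unbiased.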