In [[probability theory]] and [[statistics]], the '''moment-generating function''' of a [[random variable]] is an alternative specification of its [[probability distribution]]. Thus, it provides the basis of an alternative route to analytical results compared with working directly with [[probability density function]]s or [[cumulative distribution function]]s. There are particularly simple results for the moment-generating functions of distributions defined by the weighted sums of random variables. Note, however, that not all random variables have moment-generating functions.


In addition to univariate distributions, moment-generating functions can be defined for vector- or matrix-valued random variables, and can even be extended to more general cases.
 
The moment-generating function does not always exist even for real-valued arguments, unlike the [[Characteristic function (probability theory)|characteristic function]]. There are relations between the behavior of the moment-generating function of a distribution and properties of the distribution, such as the existence of moments.
 
==Definition==
In [[probability theory]] and [[statistics]], the '''moment-generating function''' of a [[random variable]] ''X'' is
 
:<math> M_X(t) := \mathbb{E}\!\left[e^{tX}\right], \quad t \in \mathbb{R}, </math>
 
wherever this [[expected value|expectation]] exists.
 
<math> M_X(0) </math> always exists and is equal to&nbsp;1.
 
A key problem with moment-generating functions is that moments and the moment-generating function may not exist, as the integrals need not converge absolutely. By contrast, the [[Characteristic function (probability theory)|characteristic function]] always exists (because it is the integral of a bounded function on a space of finite [[measure (mathematics)|measure]]), and thus may be used instead.
 
More generally, where <math>\mathbf X = ( X_1, \ldots, X_n)^\mathrm{T}</math> is an ''n''-dimensional [[random vector]], one uses <math>\mathbf t \cdot \mathbf X = \mathbf t^\mathrm T\mathbf X</math> instead of&nbsp;''tX'':
 
:<math> M_{\mathbf X}(\mathbf t) := \mathbb{E}\!\left(e^{\mathbf t^\mathrm T\mathbf X}\right).</math>
 
The reason for defining this function is that it can be used to find all the moments of the distribution.<ref>Bulmer, M.G., Principles of Statistics, Dover, 1979, pp. 75&ndash;79</ref>  The series expansion of ''e''<sup>''tX''</sup> is:
 
: <math>
e^{t\,X} = 1 + t\,X + \frac{t^2\,X^2}{2!} + \frac{t^3\,X^3}{3!} + \cdots +\frac{t^n\,X^n}{n!} + \cdots.
</math>
 
Hence:
 
: <math>
\begin{align}
M_X(t) = \mathbb{E}(e^{t\,X}) &= 1 + t \,\mathbb{E}(X) + \frac{t^2 \,\mathbb{E}(X^2)}{2!} + \frac{t^3\,\mathbb{E}(X^3)}{3!}+\cdots + \frac{t^n\,\mathbb{E}(X^n)}{n!}+\cdots \\
& = 1 + tm_1 + \frac{t^2m_2}{2!} + \frac{t^3m_3}{3!}+\cdots + \frac{t^nm_n}{n!}+\cdots,
\end{align}
</math>
 
where ''m''<sub>''n''</sub> is the ''n''th [[moment (mathematics)|moment]].
 
Differentiating ''M''<sub>''X''</sub>(''t'') ''i'' times with respect to ''t'' and setting ''t''&nbsp;=&nbsp;0, we obtain the ''i''th moment about the origin, ''m''<sub>''i''</sub>; see [[Moment-generating_function#Calculations_of_moments|Calculations of moments]] below.
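
As a concrete illustration, the following Python sketch (an illustrative example, not part of the formal development; it assumes SymPy and uses the exponential distribution, whose moment-generating function <math>\lambda/(\lambda - t)</math> appears in the table below) expands a moment-generating function as a power series and reads the moments off the coefficients:

<syntaxhighlight lang="python">
# Illustrative sketch: recover the moments of Exp(lambda) from the series
# expansion of its MGF, M_X(t) = lambda / (lambda - t).
import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lambda', positive=True)

M = lam / (lam - t)                       # MGF of Exp(lambda), valid for t < lambda

# The coefficient of t^n in the expansion is m_n / n!.
series = sp.series(M, t, 0, 5).removeO()
for n in range(5):
    m_n = series.coeff(t, n) * sp.factorial(n)
    print(n, sp.simplify(m_n))            # prints n!/lambda**n: 1, 1/lambda, 2/lambda**2, ...
</syntaxhighlight>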
 
==Examples==
Here are some examples of the moment-generating function and the characteristic function for comparison. It can be seen that the characteristic function is a [[Wick rotation]] of the moment-generating function ''M''<sub>''X''</sub>(''t'') when the latter exists. A numerical spot-check of one of the entries follows the table.
{|class="wikitable"
|-
! Distribution
! Moment-generating function ''M''<sub>''X''</sub>(''t'')
! Characteristic function ''φ(t)''
|-
| [[Bernoulli distribution|Bernoulli]]  <math>\, P(X=1)=p</math>
| &nbsp; <math>\, 1-p+pe^t</math>
| &nbsp; <math>\, 1-p+pe^{it}</math>
|-
| [[Geometric distribution|Geometric]]  <math>(1 - p)^{k-1}\,p\!</math>
| &nbsp; <math>\frac{p e^t}{1-(1-p) e^t}\!</math>,    <br>  for &nbsp;<math>t<-\ln(1-p)\!</math>
| &nbsp; <math>\frac{p e^{it}}{1-(1-p)\,e^{it}}\!</math>
|-
| [[Binomial distribution|Binomial]] B(''n, p'')
| &nbsp; <math>\, (1-p+pe^t)^n</math>
| &nbsp; <math>\, (1-p+pe^{it})^n</math>
|-
| [[Poisson distribution|Poisson]] Pois(''λ'')
| &nbsp; <math>\, e^{\lambda(e^t-1)}</math>
| &nbsp; <math>\, e^{\lambda(e^{it}-1)}</math>
|-
| [[Uniform distribution (continuous)|Uniform (continuous)]] U(''a, b'')
| &nbsp; <math>\, \frac{e^{tb} - e^{ta}}{t(b-a)}</math>
| &nbsp; <math>\, \frac{e^{itb} - e^{ita}}{it(b-a)}</math>
|-
| [[Discrete uniform distribution|Uniform (discrete)]] U(''a, b'')
| &nbsp; <math>\, \frac{e^{at} - e^{(b+1)t}}{(b-a+1)(1-e^{t})}</math>
| &nbsp; <math>\, \frac{e^{ait} - e^{(b+1)it}}{(b-a+1)(1-e^{it})}</math>
|-
| [[Normal distribution|Normal]] ''N''(''μ, σ<sup>2</sup>'')
| &nbsp; <math>\, e^{t\mu + \frac{1}{2}\sigma^2t^2}</math>
| &nbsp; <math>\, e^{it\mu - \frac{1}{2}\sigma^2t^2}</math>
|-
| [[Chi-squared distribution|Chi-squared]] <math>\chi^2_k</math>
| &nbsp; <math>\, (1 - 2t)^{-k/2}</math>
| &nbsp; <math>\, (1 - 2it)^{-k/2}</math>
|-
| [[Gamma distribution|Gamma]] Γ(''k, θ'')
| &nbsp; <math>\, (1 - t\theta)^{-k}</math>, &nbsp; for <math>t<\tfrac{1}{\theta}\!</math>
| &nbsp; <math>\, (1 - it\theta)^{-k}</math>
|-
| [[Exponential distribution|Exponential]] Exp(''λ'')
| &nbsp; <math>\, (1-t\lambda^{-1})^{-1}</math>, &nbsp; for <math>t<\lambda\!</math>
| &nbsp; <math>\, (1 - it\lambda^{-1})^{-1}</math>
|-
| [[Multivariate normal distribution|Multivariate normal]] ''N''(''μ'', ''Σ'')
| &nbsp; <math>\, e^{t^\mathrm{T} \mu + \frac{1}{2} t^\mathrm{T} \Sigma t}</math>
| &nbsp; <math>\, e^{i t^\mathrm{T} \mu - \frac{1}{2} t^\mathrm{T} \Sigma t}</math>
|-
| [[Degenerate distribution|Degenerate]] ''δ<sub>a</sub>''
| &nbsp; <math>\, e^{ta}</math>
| &nbsp; <math>\, e^{ita}</math>
|-
| [[Laplace distribution|Laplace]] L(''μ, b'')
| &nbsp; <math>\, \frac{e^{t\mu}}{1 - b^2t^2}</math>, &nbsp; for <math>|t|<1/b\!</math>
| &nbsp; <math>\, \frac{e^{it\mu}}{1 + b^2t^2}</math>
|-
| [[Negative binomial distribution|Negative binomial]] NB(''r, p'')
| &nbsp; <math>\, \frac{(1-p)^r}{(1-pe^t)^r}</math>
| &nbsp; <math>\, \frac{(1-p)^r}{(1-pe^{it})^r}</math>
|-
| [[Cauchy distribution|Cauchy]] Cauchy(''μ, θ'')
| does not exist
| &nbsp; <math>\, e^{it\mu -\theta|t|}</math>
|-
|}
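
The entries above can be spot-checked numerically. The following sketch (an illustrative check, assuming NumPy and SciPy) compares the binomial formula <math>(1-p+pe^t)^n</math> with the expectation <math>\textstyle\sum_k e^{tk}P(X=k)</math> computed directly from the probability mass function:

<syntaxhighlight lang="python">
# Illustrative spot-check of the binomial MGF entry in the table above.
import numpy as np
from scipy.stats import binom

n, p, t = 10, 0.3, 0.5
k = np.arange(n + 1)

direct = np.sum(np.exp(t * k) * binom.pmf(k, n, p))  # E[e^{tX}] term by term
closed = (1 - p + p * np.exp(t)) ** n                # closed form from the table

print(direct, closed)  # the two values agree to floating-point accuracy
</syntaxhighlight>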
 
==Calculation==
The moment-generating function is given by the [[Riemann–Stieltjes integral]]
 
: <math>M_X(t) = \int_{-\infty}^\infty e^{tx}\,dF(x)</math>
 
where ''F'' is the [[cumulative distribution function]].
 
If ''X'' has a continuous [[probability density function]] ''ƒ''(''x''), then ''M''<sub>''X''</sub>(&minus;''t'') is the [[two-sided Laplace transform]] of ''ƒ''(''x'').
 
: <math>
\begin{align}
M_X(t) & = \int_{-\infty}^\infty e^{tx} f(x)\,dx \\
& = \int_{-\infty}^\infty \left( 1+ tx + \frac{t^2x^2}{2!} + \cdots + \frac{t^nx^n}{n!} + \cdots\right) f(x)\,dx \\
& = 1 + tm_1 + \frac{t^2m_2}{2!} +\cdots + \frac{t^nm_n}{n!} +\cdots,
\end{align}
</math>
 
where ''m''<sub>''n''</sub> is the ''n''th [[moment (mathematics)|moment]].
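
The integral form lends itself to direct numerical evaluation. The following sketch (illustrative; it assumes SciPy's <code>quad</code> routine) computes <math>M_X(t)</math> for a standard normal by integrating <math>e^{tx}f(x)</math> and compares the result with the closed form <math>e^{t^2/2}</math> from the table above:

<syntaxhighlight lang="python">
# Illustrative check: M_X(t) for N(0,1) via numerical integration of e^{tx} f(x).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

t = 0.7
integrand = lambda x: np.exp(t * x) * norm.pdf(x)

numeric, _ = quad(integrand, -np.inf, np.inf)  # integral of e^{tx} f(x) dx
closed = np.exp(t**2 / 2)                      # N(0,1) MGF: exp(t*mu + sigma^2 t^2/2)

print(numeric, closed)  # both approximately 1.2776
</syntaxhighlight>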
 
===Sum of independent random variables===
If ''X''<sub>1</sub>, ''X''<sub>2</sub>, ..., ''X''<sub>''n''</sub> is a sequence of independent (and not necessarily identically distributed) random variables, and
 
: <math>S_n = \sum_{i=1}^n a_i X_i,</math>
 
where the ''a''<sub>''i''</sub> are constants, then the probability density function for ''S''<sub>''n''</sub> is the [[convolution]] of the probability density functions of the ''a''<sub>''i''</sub>''X''<sub>''i''</sub>, and the moment-generating function for ''S''<sub>''n''</sub> is given by
 
: <math>
M_{S_n}(t)=M_{X_1}(a_1t)M_{X_2}(a_2t)\cdots M_{X_n}(a_nt) \, .
</math>
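
This product rule is easy to verify by simulation. The following Monte Carlo sketch (illustrative; the choice of <math>X_1 \sim \operatorname{Exp}(2)</math> and <math>X_2 \sim N(0,1)</math> is arbitrary) compares an empirical estimate of <math>M_{S}(t)</math> with the product of the individual moment-generating functions:

<syntaxhighlight lang="python">
# Illustrative Monte Carlo check: M_S(t) = M_{X1}(a1*t) * M_{X2}(a2*t)
# for S = a1*X1 + a2*X2 with independent X1 ~ Exp(2), X2 ~ N(0,1).
import numpy as np

rng = np.random.default_rng(0)
a1, a2, t = 0.5, 1.5, 0.4

x1 = rng.exponential(scale=1/2, size=1_000_000)  # Exp(lambda=2)
x2 = rng.standard_normal(1_000_000)
s = a1 * x1 + a2 * x2

mc = np.exp(t * s).mean()                            # Monte Carlo estimate of M_S(t)
closed = (1 - a1*t/2)**-1 * np.exp((a2*t)**2 / 2)    # M_{X1}(a1 t) * M_{X2}(a2 t)

print(mc, closed)  # agree to a few decimal places
</syntaxhighlight>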
 
===Vector-valued random variables===
For [[random vector|vector-valued random variables]] ''X'' with [[real number|real]] components, the moment-generating function is given by
 
:<math> M_X(t) = E\left( e^{\langle t, X \rangle}\right) </math>
 
where ''t'' is a vector and <math>\langle \cdot, \cdot \rangle</math> is the [[dot product]].
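
The following sketch (illustrative; it assumes NumPy) checks this definition against the multivariate normal entry in the table above by averaging <math>e^{\langle t, X \rangle}</math> over simulated draws:

<syntaxhighlight lang="python">
# Illustrative check of the vector MGF for X ~ N(mu, Sigma):
# E[exp(<t, X>)] should equal exp(t.mu + 0.5 * t' Sigma t).
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([0.5, -1.0])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
t = np.array([0.2, 0.1])

X = rng.multivariate_normal(mu, Sigma, size=1_000_000)
mc = np.exp(X @ t).mean()                       # Monte Carlo estimate of E[exp(<t, X>)]
closed = np.exp(t @ mu + 0.5 * t @ Sigma @ t)   # closed form from the table

print(mc, closed)  # agree to roughly three decimal places
</syntaxhighlight>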
 
==Important properties==
An important property of the moment-generating function is that if two distributions have the same moment-generating function, then they are identical at [[almost all]] points.<ref name="GrimmettWelsh">{{cite book |last1=Grimmett |first1=Geoffrey |last2=Welsh |first2=Dominic |title=Probability: An Introduction |publisher=[[Oxford University Press]] |year=1986 |isbn=978-0-19-853264-4 |pages=101 ff}}</ref> That is, if for all values of&nbsp;''t'',
 
:<math>M_X(t) = M_Y(t),\, </math>
 
then
 
:<math>F_X(x) = F_Y(x) \, </math>
 
for all values of ''x'' (or equivalently ''X'' and ''Y'' have the same distribution). This statement is not equivalent to "if two distributions have the same moments, then they are identical at all points", because in some cases the moments exist and yet the moment-generating function does not, since the limit
 
:<math>\lim_{n \rightarrow \infty} \sum_{i=0}^n \frac{t^im_i}{i!}</math>
 
does not exist. This happens for the [[lognormal distribution]].
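
For the standard lognormal the moments are <math>m_n = e^{n^2/2}</math>, which grow faster than <math>n!</math>, so the terms <math>t^n m_n/n!</math> eventually increase without bound for every <math>t > 0</math>. A short sketch (illustrative) makes this concrete:

<syntaxhighlight lang="python">
# Illustrative: for the standard lognormal, m_n = exp(n^2/2), and the series
# terms t^n * m_n / n! eventually blow up for any t > 0, so the MGF cannot exist.
from math import exp, factorial

t = 0.1
for n in range(0, 21, 2):
    term = t**n * exp(n*n / 2) / factorial(n)
    print(n, term)  # terms shrink at first, then grow without bound
</syntaxhighlight>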
 
===Calculations of moments===
The moment-generating function is so called because if it exists on an open interval around ''t''&nbsp;=&nbsp;0, then it is the [[exponential generating function]] of the [[moment (mathematics)|moments]] of the [[probability distribution]]:
 
:<math>m_n = E \left( X^n \right) = M_X^{(n)}(0) = \frac{d^n M_X}{dt^n}(0).</math>
 
Here ''n'' should be a nonnegative integer.
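
A SymPy sketch (illustrative; the normal moment-generating function is taken from the table above) computes the first four raw moments of <math>N(\mu,\sigma^2)</math> by repeated differentiation at <math>t = 0</math>:

<syntaxhighlight lang="python">
# Illustrative: raw moments of N(mu, sigma^2) as derivatives of its MGF at t = 0.
import sympy as sp

t, mu, sigma = sp.symbols('t mu sigma', real=True)
M = sp.exp(mu*t + sigma**2 * t**2 / 2)   # normal MGF

for n in range(1, 5):
    m_n = sp.diff(M, t, n).subs(t, 0)
    print(n, sp.expand(m_n))
# 1: mu
# 2: mu**2 + sigma**2
# 3: mu**3 + 3*mu*sigma**2
# 4: mu**4 + 6*mu**2*sigma**2 + 3*sigma**4
</syntaxhighlight>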
 
==Other properties==
[[Hoeffding's lemma]] provides a bound on the moment-generating function in the case of a zero-mean, bounded random variable.
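
As a numerical illustration (an assumed example: <math>X</math> uniform on <math>[-1,1]</math>, which is zero-mean and bounded, with moment-generating function <math>\sinh(t)/t</math>), the bound <math>\operatorname{E}[e^{tX}] \le e^{t^2(b-a)^2/8}</math> can be checked directly:

<syntaxhighlight lang="python">
# Illustrative check of Hoeffding's lemma for X ~ Uniform(-1, 1).
import numpy as np

for t in [0.5, 1.0, 2.0, 4.0]:
    mgf = np.sinh(t) / t                       # exact MGF of Uniform(-1, 1)
    bound = np.exp(t**2 * (1 - (-1))**2 / 8)   # Hoeffding bound with a=-1, b=1
    print(t, mgf, bound)                       # mgf <= bound in every row
</syntaxhighlight>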
 
==Relation to other functions==
Related to the moment-generating function are a number of other [[integral transform|transforms]] that are common in probability theory:
 
;[[characteristic function (probability theory)|characteristic function]]: The [[characteristic function (probability theory)|characteristic function]] <math>\varphi_X(t)</math> is related to the moment-generating function via <math>\varphi_X(t) = M_{iX}(t) = M_X(it):</math> the characteristic function is the moment-generating function of ''iX'', or the moment-generating function of ''X'' evaluated on the imaginary axis. This function can also be viewed as the [[Fourier transform]] of the [[probability density function]], from which the density can therefore be recovered by the inverse Fourier transform.
 
;[[cumulant-generating function]]: The [[cumulant-generating function]] is defined as the logarithm of the moment-generating function; some instead define the cumulant-generating function as the logarithm of the [[Characteristic function (probability theory)|characteristic function]], while others call this latter the ''second'' cumulant-generating function.
 
;[[probability-generating function]]: The [[probability-generating function]] is defined as <math>G(z) = E[z^X].\,</math> This immediately implies that <math>G(e^t)  = E[e^{tX}] = M_X(t).\,</math>
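
A quick numerical check (illustrative; it assumes SciPy) of this identity for a Poisson(''λ'') variable, whose probability-generating function is <math>G(z) = e^{\lambda(z-1)}</math>:

<syntaxhighlight lang="python">
# Illustrative: G(e^t) = M_X(t) for X ~ Poisson(lambda), G(z) = exp(lambda*(z-1)).
import numpy as np
from scipy.stats import poisson

lam, t = 2.0, 0.3
k = np.arange(200)                            # truncate the infinite sum

G_of_et = np.exp(lam * (np.exp(t) - 1))       # PGF evaluated at z = e^t
M_direct = np.sum(np.exp(t * k) * poisson.pmf(k, lam))

print(G_of_et, M_direct)  # both approximately 2.013
</syntaxhighlight>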
 
==See also==
* [[Factorial moment generating function]]
* [[Rate function]]
* [[Hamburger moment problem]]
 
{{More footnotes|date=February 2010}}
 
== References ==
{{reflist}}
* {{cite book |last1=Casella |first1=George |last2=Berger |first2=Roger |title=Statistical Inference |edition=2nd |isbn=978-0-534-24312-8 |pages=59–68 }}
 
* {{cite book |last1=Grimmett |first1=Geoffrey |last2=Welsh |first2=Dominic |title=Probability: An Introduction |edition=1st |isbn=978-0-19-853264-4 |pages=101 ff}}
 
{{Theory of probability distributions}}
 
{{DEFAULTSORT:Moment-Generating Function}}
[[Category:Theory of probability distributions]]
[[Category:Generating functions]]
