Suppose that the problem is to estimate $k$ unknown parameters $\theta_1, \theta_2, \ldots, \theta_k$ describing the distribution $f_W(w; \theta)$ of the random variable $W$.[1] Suppose the first $k$ moments of the true distribution (the "population moments") can be expressed as functions of the $\theta$s:

$$\begin{aligned}
\mu_1 &\equiv \operatorname{E}[W] = g_1(\theta_1, \theta_2, \ldots, \theta_k),\\
\mu_2 &\equiv \operatorname{E}[W^2] = g_2(\theta_1, \theta_2, \ldots, \theta_k),\\
&\;\;\vdots\\
\mu_k &\equiv \operatorname{E}[W^k] = g_k(\theta_1, \theta_2, \ldots, \theta_k).
\end{aligned}$$
Suppose a sample of size $n$ is drawn, resulting in the values $w_1, w_2, \ldots, w_n$. For $j = 1, \ldots, k$, let

$$\widehat{\mu}_j = \frac{1}{n} \sum_{i=1}^{n} w_i^{\,j}$$

be the $j$-th sample moment, an estimate of $\mu_j$. The method of moments estimator for $\theta_1, \theta_2, \ldots, \theta_k$, denoted by $\widehat{\theta}_1, \widehat{\theta}_2, \ldots, \widehat{\theta}_k$, is defined as the solution (if there is one) to the equations:[source?]

$$\begin{aligned}
\widehat{\mu}_1 &= g_1(\widehat{\theta}_1, \widehat{\theta}_2, \ldots, \widehat{\theta}_k),\\
\widehat{\mu}_2 &= g_2(\widehat{\theta}_1, \widehat{\theta}_2, \ldots, \widehat{\theta}_k),\\
&\;\;\vdots\\
\widehat{\mu}_k &= g_k(\widehat{\theta}_1, \widehat{\theta}_2, \ldots, \widehat{\theta}_k).
\end{aligned}$$
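The procedure can be sketched for a concrete case. The following is a minimal illustration, not part of the source: for a normal distribution $N(\mu, \sigma^2)$ with $k = 2$, the population moments are $g_1(\mu, \sigma^2) = \mu$ and $g_2(\mu, \sigma^2) = \sigma^2 + \mu^2$, and inverting the two moment equations gives $\widehat{\mu} = \widehat{\mu}_1$ and $\widehat{\sigma}^2 = \widehat{\mu}_2 - \widehat{\mu}_1^2$. The function name and the simulated sample are illustrative choices.

```python
import numpy as np

def method_of_moments_normal(w):
    """Method-of-moments estimates for N(mu, sigma^2).

    Solves mu_hat_1 = g_1(theta) and mu_hat_2 = g_2(theta), where
    g_1 = mu and g_2 = sigma^2 + mu^2 for the normal distribution.
    """
    w = np.asarray(w, dtype=float)
    m1 = np.mean(w)       # first sample moment, (1/n) * sum(w_i)
    m2 = np.mean(w**2)    # second sample moment, (1/n) * sum(w_i^2)
    mu_hat = m1           # from m1 = mu
    sigma2_hat = m2 - m1**2  # from m2 = sigma^2 + mu^2
    return mu_hat, sigma2_hat

# Illustrative check on simulated data with known parameters.
rng = np.random.default_rng(0)
sample = rng.normal(loc=2.0, scale=3.0, size=100_000)
mu_hat, sigma2_hat = method_of_moments_normal(sample)
```

Here `sigma2_hat` coincides with the biased sample variance; on the simulated sample above, both estimates land close to the true values $\mu = 2$ and $\sigma^2 = 9$.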