In statistics, the Lehmann–Scheffé theorem is a prominent result tying together the ideas of completeness, sufficiency, uniqueness, and best unbiased estimation.[1] The theorem states that any estimator that is unbiased for a given unknown quantity and that depends on the data only through a complete, sufficient statistic is the unique best unbiased estimator of that quantity. The theorem is named after Erich Leo Lehmann and Henry Scheffé, who established it in two early joint papers.[2][3]
If $T$ is a complete sufficient statistic for $\theta$ and $\operatorname{E}[g(T)] = \tau(\theta)$, then $g(T)$ is the uniformly minimum-variance unbiased estimator (UMVUE) of $\tau(\theta)$.
Let $\vec{X} = (X_1, X_2, \dots, X_n)$ be a random sample from a distribution that has p.d.f. (or p.m.f. in the discrete case) $f(x; \theta)$, where $\theta \in \Omega$ is a parameter in the parameter space. Suppose $Y = u(\vec{X})$ is a sufficient statistic for $\theta$, and let $\{ f_Y(y; \theta) : \theta \in \Omega \}$ be a complete family. If $\operatorname{E}[\varphi(Y)] = \theta$, then $\varphi(Y)$ is the unique MVUE of $\theta$.
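For a concrete instance of the statement, consider the standard textbook case of a Bernoulli sample (an illustration added here, not part of the original sources): if $X_1, \dots, X_n \overset{\text{i.i.d.}}{\sim} \operatorname{Bernoulli}(\theta)$, then

$$Y = \sum_{i=1}^n X_i$$

is sufficient for $\theta$, and its family of distributions $\{\operatorname{Bin}(n, \theta) : \theta \in (0,1)\}$ is complete. Since $\operatorname{E}[Y/n] = \theta$, the theorem yields that the sample mean $\bar{X} = Y/n$ is the unique MVUE of $\theta$.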
By the Rao–Blackwell theorem, if $Z$ is an unbiased estimator of $\theta$, then $\varphi(y) := \operatorname{E}[Z \mid Y = y]$ defines an unbiased estimator of $\theta$ with the property that its variance is not greater than that of $Z$.

Now we show that this function is unique. Suppose $W$ is another candidate MVUE of $\theta$. Then $\psi(y) := \operatorname{E}[W \mid Y = y]$ again defines an unbiased estimator of $\theta$ with the property that its variance is not greater than that of $W$. Then

$$\operatorname{E}[\varphi(Y) - \psi(Y)] = 0, \qquad \theta \in \Omega.$$

Since $\{ f_Y(y; \theta) : \theta \in \Omega \}$ is a complete family, this implies

$$\varphi(y) - \psi(y) = 0 \text{ for almost all } y, \qquad \theta \in \Omega,$$

and therefore $\varphi$ is the unique function of $Y$ with variance not greater than that of any other unbiased estimator. We conclude that $\varphi(Y)$ is the MVUE.
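The completeness step can be seen concretely in the Bernoulli illustration above (again a textbook example rather than material from the cited sources): if $g(Y)$ is any unbiased estimator of $\theta$ depending on the data only through $Y \sim \operatorname{Bin}(n, \theta)$, then $\operatorname{E}[g(Y) - Y/n] = 0$ for every $\theta$, and completeness of the binomial family forces $g(Y) = Y/n$ almost surely, which is exactly the uniqueness asserted by the theorem.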
An example of an improvable Rao–Blackwell improvement, when using a minimal sufficient statistic that is not complete, was provided by Galili and Meilijson in 2016.[4] Let $X_1, \dots, X_n$ be a random sample from a scale-uniform distribution $X \sim U((1-k)\theta, (1+k)\theta)$ with unknown mean $\operatorname{E}[X] = \theta$ and known design parameter $k \in (0,1)$. In the search for "best" possible unbiased estimators for $\theta$, it is natural to consider $X_1$ as an initial (crude) unbiased estimator for $\theta$ and then try to improve it. Since $X_1$ is not a function of $T = (X_{(1)}, X_{(n)})$, the minimal sufficient statistic for $\theta$ (where $X_{(1)} = \min_i X_i$ and $X_{(n)} = \max_i X_i$), it may be improved using the Rao–Blackwell theorem as follows:

$$\hat\theta_{RB} = \operatorname{E}_\theta[X_1 \mid X_{(1)}, X_{(n)}] = \frac{X_{(1)} + X_{(n)}}{2}.$$

However, the following unbiased estimator can be shown to have lower variance:

$$\hat\theta_{LV} = \frac{1}{k^2 \tfrac{n-1}{n+1} + 1} \cdot \frac{(1-k) X_{(1)} + (1+k) X_{(n)}}{2}.$$

And in fact, it can be improved even further when using the following estimator:

$$\hat\theta_{\text{BAYES}} = \frac{n+1}{n} \left[ 1 - \frac{\dfrac{X_{(1)}(1+k)}{X_{(n)}(1-k)} - 1}{\left( \dfrac{X_{(1)}(1+k)}{X_{(n)}(1-k)} \right)^{n+1} - 1} \right] \frac{X_{(n)}}{1+k}.$$
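The variance ranking claimed above can be checked numerically. The following Monte Carlo sketch (added here for illustration, not from the cited paper; the values of $\theta$, $k$, $n$, and the seed are arbitrary choices) estimates the mean and variance of the three estimators under $\theta = 1$:

```python
# Monte Carlo check of the three unbiased estimators above (illustrative
# sketch; theta, k, n and the seed are arbitrary choices, not from [4]).
import numpy as np

rng = np.random.default_rng(0)
theta, k, n, reps = 1.0, 0.5, 10, 200_000

# reps independent samples of size n from U((1-k)*theta, (1+k)*theta)
x = rng.uniform((1 - k) * theta, (1 + k) * theta, size=(reps, n))
x1, xn = x.min(axis=1), x.max(axis=1)  # order statistics X_(1), X_(n)

# Rao-Blackwell improvement of X_1 given (X_(1), X_(n))
theta_rb = (x1 + xn) / 2

# lower-variance unbiased estimator
theta_lv = ((1 - k) * x1 + (1 + k) * xn) / 2 / (k**2 * (n - 1) / (n + 1) + 1)

# generalized Bayes estimator; r >= 1 holds, with equality of probability zero
r = (x1 * (1 + k)) / (xn * (1 - k))
theta_bayes = (n + 1) / n * (1 - (r - 1) / (r ** (n + 1) - 1)) * xn / (1 + k)

for name, est in [("RB", theta_rb), ("LV", theta_lv), ("BAYES", theta_bayes)]:
    print(f"{name:5s}  mean = {est.mean():.4f}  var = {est.var():.6f}")
```

All three empirical means should sit near $\theta = 1$ (reflecting unbiasedness), while the printed variances should decrease from $\hat\theta_{RB}$ to $\hat\theta_{LV}$ to $\hat\theta_{\text{BAYES}}$.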
Since the model is a scale model, optimal equivariant estimators can also be derived for loss functions that are invariant.[5]