# Hodges' estimator

In statistics, Hodges’ estimator[1] (or the Hodges–Le Cam estimator[2]), named for Joseph Hodges, is a famous[3] counterexample: an estimator that is "superefficient", i.e. one that attains a smaller asymptotic variance than regular efficient estimators. The existence of such a counterexample is the reason for the introduction of the notion of regular estimators.

Hodges’ estimator improves upon a regular estimator at a single point. In general, any superefficient estimator may surpass a regular estimator at most on a set of Lebesgue measure zero.[4]

## Construction

Suppose ${\displaystyle \scriptstyle {\hat {\theta }}_{n}}$ is a "common" estimator for some parameter θ: it is consistent, and converges to some asymptotic distribution Lθ (usually this is a normal distribution with mean zero and variance which may depend on θ) at the ${\displaystyle \scriptstyle {\sqrt {n}}}$-rate:

${\displaystyle {\sqrt {n}}({\hat {\theta }}_{n}-\theta )\ {\xrightarrow {d}}\ L_{\theta }\ .}$

Then the Hodges’ estimator ${\displaystyle \scriptstyle {\hat {\theta }}_{n}^{H}}$ is defined as [5]

${\displaystyle {\hat {\theta }}_{n}^{H}={\begin{cases}{\hat {\theta }}_{n},&{\text{if }}|{\hat {\theta }}_{n}|\geq n^{-1/4},{\text{ and}}\\0,&{\text{if }}|{\hat {\theta }}_{n}|<n^{-1/4}.\end{cases}}}$

This estimator is equal to ${\displaystyle \scriptstyle {\hat {\theta }}_{n}}$ everywhere except on the small interval ${\displaystyle \scriptstyle [-n^{-1/4},\,n^{-1/4}]}$, where it is equal to zero. It is not difficult to see that this estimator is consistent for θ, and its asymptotic distribution is [6]

${\displaystyle {\begin{aligned}&n^{\alpha }({\hat {\theta }}_{n}^{H}-\theta )\ {\xrightarrow {d}}\ 0,\qquad {\text{when }}\theta =0,\\&{\sqrt {n}}({\hat {\theta }}_{n}^{H}-\theta )\ {\xrightarrow {d}}\ L_{\theta },\quad {\text{when }}\theta \neq 0,\end{aligned}}}$

for any α ∈ ℝ. Thus this estimator has the same asymptotic distribution as ${\displaystyle \scriptstyle {\hat {\theta }}_{n}}$ for all θ ≠ 0, whereas for θ = 0 the rate of convergence becomes arbitrarily fast. This estimator is superefficient, as it surpasses the asymptotic behavior of the efficient estimator ${\displaystyle \scriptstyle {\hat {\theta }}_{n}}$ at least at one point θ = 0. In general, superefficiency may only be attained on a subset of measure zero of the parameter space Θ.
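The construction above can be sketched numerically. The snippet below is a minimal illustration, under the assumption that the data are i.i.d. N(θ, 1) and the "common" estimator is the sample mean; the function name and sample size are illustrative, not part of the original construction.

```python
# A minimal sketch of Hodges' estimator, assuming i.i.d. N(theta, 1) data
# with the sample mean as the "common" estimator theta_hat_n.
import random
import statistics

def hodges_estimator(sample):
    """Keep the sample mean if it clears the n**(-1/4) threshold, else return 0."""
    n = len(sample)
    theta_hat = statistics.fmean(sample)
    return theta_hat if abs(theta_hat) >= n ** (-0.25) else 0.0

random.seed(1)
n = 10_000
# At theta = 0 the sample mean has order n**(-1/2), well below the n**(-1/4)
# threshold, so the estimator collapses to exactly 0 with high probability.
print(hodges_estimator([random.gauss(0.0, 1.0) for _ in range(n)]))
# Away from zero the threshold is irrelevant and the estimator is just the mean.
print(hodges_estimator([random.gauss(5.0, 1.0) for _ in range(n)]))
```

Because the sample mean concentrates at the n⁻¹ᐟ² scale while the threshold shrinks only at the n⁻¹ᐟ⁴ scale, the truncation fires with probability tending to one when θ = 0 and with probability tending to zero otherwise, which is exactly the dichotomy in the displayed limits.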

## Example

*Figure: the mean square error (times n) of Hodges’ estimator. The blue curve corresponds to n = 5, purple to n = 50, and olive to n = 500.[7]*

Suppose x1, …, xn is an iid sample from a normal distribution N(θ, 1) with unknown mean and known variance. Then the common estimator for the population mean θ is the arithmetic mean of all observations: ${\displaystyle \scriptstyle {\bar {x}}}$. The corresponding Hodges’ estimator will be ${\displaystyle \scriptstyle {\hat {\theta }}_{n}^{H}\;=\;{\bar {x}}\cdot {\mathbf {1} }\{|{\bar {x}}|\,\geq \,n^{-1/4}\}}$, where 1{…} denotes the indicator function.

The mean square error (scaled by n) of the regular estimator ${\displaystyle \scriptstyle {\bar {x}}}$ is constant and equal to 1 for all θ. At the same time the mean square error of Hodges’ estimator ${\displaystyle \scriptstyle {\hat {\theta }}_{n}^{H}}$ behaves erratically in the vicinity of zero, and even becomes unbounded as n → ∞. This demonstrates that Hodges’ estimator is not regular, and its asymptotic properties are not adequately described by pointwise limits taken with θ fixed as n → ∞.
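The erratic risk near zero can be reproduced with a small Monte Carlo experiment. The sketch below estimates n·MSE for both estimators at a few values of θ; the sample size, replication count, and seed are illustrative assumptions, not values from the figure.

```python
# A Monte Carlo sketch of the scaled risk n * E[(estimate - theta)^2],
# comparing the sample mean with Hodges' estimator under the N(theta, 1)
# model described above. Sample size, reps, and seed are arbitrary choices.
import random
import statistics

def hodges(sample):
    # Sample mean thresholded at n**(-1/4), per the definition above.
    theta_hat = statistics.fmean(sample)
    return theta_hat if abs(theta_hat) >= len(sample) ** (-0.25) else 0.0

def scaled_mse(estimator, theta, n, reps=2000, seed=42):
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    sq_errors = []
    for _ in range(reps):
        sample = [rng.gauss(theta, 1.0) for _ in range(n)]
        sq_errors.append((estimator(sample) - theta) ** 2)
    return n * statistics.fmean(sq_errors)

# n * MSE of the sample mean stays near 1 for every theta, while Hodges'
# estimator dips below 1 at theta = 0 but spikes well above 1 nearby.
for theta in (0.0, 0.2, 1.0):
    print(theta,
          scaled_mse(statistics.fmean, theta, 50),
          scaled_mse(hodges, theta, 50))
```

Running this shows the trade-off the figure depicts: the superefficiency at θ = 0 is paid for by inflated risk at nearby values of θ of order n⁻¹ᐟ⁴, and that spike grows with n.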
