19 Jan, 2016

Admissibility

An estimator $\tilde\theta_1(x)$ is inadmissible under a loss function $L(\theta,\tilde\theta)$ if there exists $\tilde\theta_2$ s.t. \(E_{x|\theta}[L(\theta,\tilde\theta_1(x))] \ge E_{x|\theta}[L(\theta,\tilde\theta_2(x))]\) for all $\theta$, with strict inequality for some $\theta$; i.e. $R(\theta,\tilde\theta_1) \ge R(\theta,\tilde\theta_2)$ everywhere and $R(\theta,\tilde\theta_1) > R(\theta,\tilde\theta_2)$ somewhere. If no such $\tilde\theta_2$ exists, then $\tilde\theta_1$ is admissible.

Consider $x|\theta \sim N_p(\theta,I_p)$. The MLE is $\hat\theta = x$. This estimator is inadmissible for $p \ge 3$ because

\[\tilde\theta = \left(1 - \frac{p-2}{\norm{x}_2^2}\right) x\]

dominates $\hat\theta$ for $p\ge 3$ under squared-error loss $L(\theta,\tilde\theta) = \sum_i (\theta_i-\tilde\theta_i)^2$. (James-Stein estimator)
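A quick Monte Carlo sketch of this dominance (the dimension $p=10$, the true mean, and the simulation size are arbitrary illustrative choices, not part of the result):

```python
import numpy as np

rng = np.random.default_rng(0)

def js_estimate(x):
    """James-Stein: shrink x toward 0 by a factor 1 - (p-2)/||x||^2."""
    p = x.shape[-1]
    return (1 - (p - 2) / np.sum(x ** 2, axis=-1, keepdims=True)) * x

p, n_sim = 10, 200_000
theta = np.full(p, 0.5)                              # arbitrary true mean
x = rng.normal(theta, 1.0, size=(n_sim, p))          # x | theta ~ N_p(theta, I)

risk_mle = np.mean(np.sum((x - theta) ** 2, axis=1))              # ~ p
risk_js = np.mean(np.sum((js_estimate(x) - theta) ** 2, axis=1))  # < p
print(risk_mle, risk_js)
```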

Proof

$\hat\theta = x$
$R(\theta, \hat\theta) = E_{x|\theta}[L(\theta,\hat\theta)] = E_{x|\theta}[(\theta-x)'(\theta-x)] = E[\chi^2_{(p)}] = p$

$R(\theta,\tilde\theta) = E_{x|\theta}\left[\left(\theta-\left(1 - \frac{p-2}{\norm{x}_2^2}\right)x\right)'\left(\theta-\left(1 - \frac{p-2}{\norm{x}_2^2}\right)x\right)\right]$
$= p - E_{y|\theta} \left( \frac{(p-2)^2}{p-2+2y} \right) < p = R(\theta,\hat\theta)$
with $y \sim \text{Poisson}(\sum_{i=1}^p\theta_i^2/2)$
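The Poisson representation can be evaluated numerically. A sketch, truncating the Poisson sum (which converges quickly) and using scipy for the pmf:

```python
import numpy as np
from scipy.stats import poisson

def js_risk(theta, n_terms=2_000):
    """R(theta, JS) = p - (p-2)^2 * E[1/(p-2+2Y)], Y ~ Poisson(||theta||^2 / 2)."""
    theta = np.asarray(theta)
    p = theta.size
    lam = np.sum(theta ** 2) / 2
    y = np.arange(n_terms)
    return p - (p - 2) ** 2 * np.sum(poisson.pmf(y, lam) / (p - 2 + 2 * y))

print(js_risk(np.full(10, 0.5)))  # strictly below p = 10; matches the simulation above
```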

Therefore, $\tilde\theta$ (James-Stein) dominates $\hat\theta$ (MLE) for $p\ge 3$. Both are in turn dominated by the positive-part estimator $\tilde\theta^{+} = x\p{1-\frac{p-2}{\sum x_i^2}}_{+}$.
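A sketch of the positive-part variant, which simply clips the shrinkage factor at zero so the estimate never flips sign when $\norm{x}_2^2 < p-2$:

```python
import numpy as np

def js_plus_estimate(x):
    """Positive-part James-Stein: x * max(1 - (p-2)/||x||^2, 0)."""
    p = x.shape[-1]
    shrink = 1 - (p - 2) / np.sum(x ** 2, axis=-1, keepdims=True)
    return np.maximum(shrink, 0.0) * x
```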

Empirical Bayes Justification for $\tilde\theta$ (J-S)

$x|\theta \sim N_p(\theta,I)$
$\theta \sim N_p(0,\tau^2 I)$
$x = \theta + \epsilon, \qquad \epsilon \sim N_p(0,I)$
$x \sim N_p(0,(1+\tau^2)I)$
$\theta|x \sim N_p\p{\p{1+1/\tau^2}^{-1}x, \p{1+1/\tau^2}^{-1}I}$
Under squared-error loss, the Bayes estimator is the posterior mean.
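Spelling out the conjugacy step, completing the square in $\theta$ gives

\[
p(\theta|x) \propto \exp\left\{-\tfrac{1}{2}\norm{x-\theta}_2^2 - \tfrac{1}{2\tau^2}\norm{\theta}_2^2\right\}
\propto \exp\left\{-\tfrac{1}{2}\p{1+1/\tau^2}\norm{\theta - \p{1+1/\tau^2}^{-1}x}_2^2\right\},
\]

which is the stated $N_p\p{\p{1+1/\tau^2}^{-1}x,\ \p{1+1/\tau^2}^{-1}I}$ posterior.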

How do we choose $\tau^2$?

One possible way is to treat the marginal $x \sim N_p(0,(1+\tau^2)I)$ as a likelihood for $\tau^2$ and either place a prior on $\tau^2$ or plug in an estimate of $\tau^2$.

Let $k^2 = 1+\tau^2$.
$p(x|k^2) = (2\pi k^2)^{-p/2} \exp\left\{-\frac{x'x}{2k^2}\right\}$
$\hat k^2_{\text{MLE}} = \frac{\norm{x}_2^2}{p}$
Therefore, by the invariance property of the MLE, $\hat\tau^2 = \frac{\norm{x}_2^2}{p} - 1$.
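The MLE follows from the first-order condition:

\[
\log p(x|k^2) = -\frac{p}{2}\log\p{2\pi k^2} - \frac{x'x}{2k^2},
\qquad
\frac{\partial \log p}{\partial k^2} = -\frac{p}{2k^2} + \frac{x'x}{2k^4} = 0
\;\Longrightarrow\;
\hat k^2_{\text{MLE}} = \frac{x'x}{p}.
\]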

If instead of the MLE for $k^2$ we use the unbiased estimator $\frac{p-2}{x'x}$ of $1/k^2$, the plug-in posterior mean $\theta^* = \p{1 - \frac{p-2}{x'x}}x$ is exactly the J-S estimator.
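To see the unbiasedness: marginally $x'x/k^2 \sim \chi^2_{(p)}$, and $E[1/\chi^2_{(p)}] = 1/(p-2)$ for $p \ge 3$, so

\[
E\left[\frac{p-2}{x'x}\right] = \frac{p-2}{k^2}\,E\left[\frac{1}{\chi^2_{(p)}}\right] = \frac{1}{k^2}.
\]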

It is possible to show that, under proper priors, Bayes estimators with finite Bayes risk are admissible.

Properties of Estimators

  • Invariance: a procedure for constructing estimators is invariant if $\eta = g(\theta)$ is estimated by $\tilde\eta = g(\tilde\theta)$, where $\tilde\theta$ is the estimator of $\theta$.
    • MLEs are invariant (see the sketch after this list)
    • Bayes point estimators are not necessarily invariant
    • Medians are invariant under absolute-error loss
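A minimal sketch of MLE invariance for a normal sample (the distribution and the numbers are illustrative choices, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=3.0, scale=2.0, size=10_000)

# MLE of sigma^2 for a normal sample with unknown mean (divisor n, not n-1)
sigma2_mle = np.mean((x - x.mean()) ** 2)

# Invariance: the MLE of g(theta) is g(theta_mle),
# so the MLE of sigma is the square root of the MLE of sigma^2.
sigma_mle = np.sqrt(sigma2_mle)
print(sigma2_mle, sigma_mle)  # roughly 4 and 2
```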