# Mean squared error

The mean squared error (MSE) of an estimator T for an unobservable parameter θ is the average of the squared deviations of the estimates from the true parameter value; that is, it is the expected value of the squared deviation of the estimate from the true value.

${\displaystyle \operatorname {MSE} (T)=\operatorname {E} ((T-\theta )^{2}),}$


## Definition and basic properties

In statistics, the mean squared error is used in two distinct senses: for estimators and for residuals.

### Estimation

The MSE of an estimator ${\displaystyle {\hat {\theta }}}$ with respect to the estimated parameter ${\displaystyle \theta }$ is defined as

${\displaystyle \operatorname {MSE} ({\hat {\theta }})=\operatorname {E} (({\hat {\theta }}-\theta )^{2}).}$

The MSE can be written as the sum of the variance and the squared bias of the estimator:

${\displaystyle \operatorname {MSE} ({\hat {\theta }})=\operatorname {Var} ({\hat {\theta }})+\left(\operatorname {Bias} ({\hat {\theta }},\theta )\right)^{2}.}$
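One way to see this decomposition: add and subtract ${\displaystyle \operatorname {E} ({\hat {\theta }})}$ inside the square and expand; the cross term has expectation zero because ${\displaystyle \operatorname {E} ({\hat {\theta }}-\operatorname {E} ({\hat {\theta }}))=0}$, leaving

${\displaystyle \operatorname {E} (({\hat {\theta }}-\theta )^{2})=\operatorname {E} \left(({\hat {\theta }}-\operatorname {E} ({\hat {\theta }})+\operatorname {E} ({\hat {\theta }})-\theta )^{2}\right)=\operatorname {Var} ({\hat {\theta }})+\left(\operatorname {E} ({\hat {\theta }})-\theta \right)^{2}.}$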

The MSE thus assesses the quality of an estimator in terms of both its variance and its bias. Note that the MSE is not equivalent to the expected value of the absolute error.

Since MSE is an expectation, it is a number, and not a random variable. It may be a function of the unknown parameter ${\displaystyle \theta }$, but it does not depend on any random quantities.
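The definition and the bias–variance decomposition above can be checked numerically. The following is an illustrative Monte Carlo sketch (not from the article; the parameter values are arbitrary) that estimates the MSE of the sample mean as an estimator of a normal population's mean, and verifies that it equals variance plus squared bias:

```python
import random

random.seed(0)
true_mean, true_sd, n, trials = 5.0, 2.0, 10, 20000

# Draw many samples and record the estimator (the sample mean) each time.
estimates = []
for _ in range(trials):
    sample = [random.gauss(true_mean, true_sd) for _ in range(n)]
    estimates.append(sum(sample) / n)

# Empirical MSE: average squared deviation of the estimates from the true mean.
mse = sum((t - true_mean) ** 2 for t in estimates) / trials

# Decomposition: variance of the estimator plus its squared bias.
mean_est = sum(estimates) / trials
var = sum((t - mean_est) ** 2 for t in estimates) / trials
bias = mean_est - true_mean

# mse equals var + bias**2 (an algebraic identity), and both should be
# close to sigma^2 / n = 0.4 for the sample mean.
print(mse, var + bias ** 2)
```

For the sample mean the bias term is essentially zero, so the MSE here is almost entirely variance, matching the table of estimators below.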

### Residuals

In linear models and other regression models, the residuals, or estimated errors, are the differences between the observed values and the values fitted by the model, ${\displaystyle e_{i}=Y_{i}-{\hat {Y}}_{i}}$. The mean squared error is

${\displaystyle {\frac {1}{n}}\sum _{i=1}^{n}e_{i}^{2}}$

(the n in the denominator is often modified by a correction for degrees of freedom). In this case the MSE depends on data, and is a random variable.

If the true errors have mean 0 and variance ${\displaystyle \sigma ^{2}}$, then the MSE is an estimate of ${\displaystyle \sigma ^{2}}$.
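As a sketch of this second sense (illustrative only; the model and parameter values are assumptions, not from the article), the following fits a least-squares line and computes the residual mean squared error with a degrees-of-freedom correction, which estimates the error variance ${\displaystyle \sigma ^{2}}$:

```python
import random

random.seed(1)
n, sigma = 200, 1.5

# Simulated data from a hypothetical line y = 2 + 0.5 x plus N(0, sigma^2) noise.
xs = [i / 10 for i in range(n)]
ys = [2.0 + 0.5 * x + random.gauss(0.0, sigma) for x in xs]

# Ordinary least-squares fit of slope b and intercept a.
x_bar = sum(xs) / n
y_bar = sum(ys) / n
b = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
     / sum((x - x_bar) ** 2 for x in xs))
a = y_bar - b * x_bar

# Residuals: observed minus fitted values.
residuals = [y - (a + b * x) for x, y in zip(xs, ys)]

# Divide by n - 2 rather than n: two parameters were fitted, so two
# degrees of freedom are lost. The result estimates sigma^2 = 2.25.
mse = sum(e ** 2 for e in residuals) / (n - 2)
print(mse)
```

Unlike the estimation case, this MSE is computed from data, so it is a random variable, as the text notes.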

## Examples

Suppose we have a random sample of size n from a population, ${\displaystyle X_{1},\dots ,X_{n}}$.

Some commonly used estimators of the true population parameters, μ and σ², are shown in the following table[1] (see the notes for the distributional assumptions behind the MSEs of the variance estimators).

| True value | Estimator | Mean squared error |
|------------|-----------|--------------------|
| θ = μ | ${\displaystyle {\hat {\theta }}}$ = the unbiased estimator of the population mean, the sample mean ${\displaystyle {\overline {X}}={\frac {1}{n}}\sum _{i=1}^{n}X_{i}}$ | ${\displaystyle \operatorname {MSE} ({\overline {X}})=\operatorname {E} (({\overline {X}}-\mu )^{2})=\left({\frac {\sigma }{\sqrt {n}}}\right)^{2}}$ |
| θ = σ² | ${\displaystyle {\hat {\theta }}}$ = the unbiased estimator of the population variance, ${\displaystyle S_{n-1}^{2}={\frac {1}{n-1}}\sum _{i=1}^{n}\left(X_{i}-{\overline {X}}\,\right)^{2}}$ | ${\displaystyle \operatorname {MSE} (S_{n-1}^{2})=\operatorname {E} ((S_{n-1}^{2}-\sigma ^{2})^{2})={\frac {2}{n-1}}\sigma ^{4}}$ |
| θ = σ² | ${\displaystyle {\hat {\theta }}}$ = a biased estimator of the population variance, ${\displaystyle S_{n}^{2}={\frac {1}{n}}\sum _{i=1}^{n}\left(X_{i}-{\overline {X}}\,\right)^{2}}$ | ${\displaystyle \operatorname {MSE} (S_{n}^{2})=\operatorname {E} ((S_{n}^{2}-\sigma ^{2})^{2})={\frac {2n-1}{n^{2}}}\sigma ^{4}}$ |
| θ = σ² | ${\displaystyle {\hat {\theta }}}$ = a biased estimator of the population variance, ${\displaystyle S_{n+1}^{2}={\frac {1}{n+1}}\sum _{i=1}^{n}\left(X_{i}-{\overline {X}}\,\right)^{2}}$ | ${\displaystyle \operatorname {MSE} (S_{n+1}^{2})=\operatorname {E} ((S_{n+1}^{2}-\sigma ^{2})^{2})={\frac {2}{n+1}}\sigma ^{4}}$ |

Note that:

1. The MSEs shown for the variance estimators assume ${\displaystyle X_{i}\sim \operatorname {N} (\mu ,\sigma ^{2})}$ i.i.d., so that ${\displaystyle {\frac {(n-1)S_{n-1}^{2}}{\sigma ^{2}}}\sim \chi _{n-1}^{2}}$. The result for ${\displaystyle S_{n-1}^{2}}$ follows directly from the ${\displaystyle \chi _{n-1}^{2}}$ variance, which is ${\displaystyle 2n-2}$.
2. The general MSE expression for the unbiased variance estimator, without distribution assumptions, is ${\displaystyle \operatorname {MSE} (S_{n-1}^{2})={\frac {1}{n}}[\mu _{4}-{\frac {n-3}{n-1}}\sigma ^{4}]}$, where ${\displaystyle \mu _{4}}$ is the fourth central moment.[2]
3. Unbiased estimators may not produce estimates with the smallest total variation (as measured by MSE): ${\displaystyle S_{n-1}^{2}}$'s MSE is larger than ${\displaystyle S_{n+1}^{2}}$'s MSE.
4. Estimators with the smallest total variation may produce biased estimates: ${\displaystyle S_{n+1}^{2}}$ underestimates σ² by ${\displaystyle {\frac {2}{n+1}}\sigma ^{2}}$ on average.
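Notes 3 and 4 can be illustrated numerically. The following Monte Carlo sketch (illustrative; it assumes ${\displaystyle X_{i}\sim \operatorname {N} (\mu ,\sigma ^{2})}$ as in note 1, with arbitrary parameter values) compares the empirical MSEs of the three variance estimators from the table:

```python
import random

random.seed(2)
mu, sigma, n, trials = 0.0, 1.0, 8, 50000

# Accumulate squared errors for divisors n-1 (unbiased), n, and n+1 (both biased).
sq_err = {n - 1: 0.0, n: 0.0, n + 1: 0.0}

for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    x_bar = sum(xs) / n
    ss = sum((x - x_bar) ** 2 for x in xs)  # sum of squared deviations
    for d in sq_err:
        sq_err[d] += (ss / d - sigma ** 2) ** 2

mse = {d: s / trials for d, s in sq_err.items()}

# Theory, for n = 8 and sigma = 1:
#   MSE(S^2_{n-1}) = 2/(n-1)     sigma^4 ~= 0.286
#   MSE(S^2_n)     = (2n-1)/n^2  sigma^4 ~= 0.234
#   MSE(S^2_{n+1}) = 2/(n+1)     sigma^4 ~= 0.222
print(mse)
```

Despite its bias, the n + 1 divisor yields the smallest MSE of the three, while the unbiased n − 1 divisor yields the largest, exactly as notes 3 and 4 state.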

## References

1. ^ DeGroot, Morris (1980). Probability and Statistics (2nd ed.). Addison-Wesley.
2. ^ Mood, A.; Graybill, F.; Boes, D. (1974). Introduction to the Theory of Statistics (3rd ed.). McGraw-Hill. p. 229.
