Summary
$\begin{aligned} E(\bar{X})&=\mu \end{aligned}$
$\begin{aligned} Var(\bar{X})&=\frac{\sigma^{2}}{n} \end{aligned}$
$\begin{aligned} sd(\bar{X})&=\frac{\sigma}{\sqrt{n}} \end{aligned}$
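Here $\bar{X}$ denotes the sample mean of the random sample $X_{1},X_{2},...,X_{n}$ drawn on $X$; this is the definition the proofs below rely on:
$\begin{aligned} \bar{X}&=\frac{X_{1}+X_{2}+...+X_{n}}{n}=\frac{1}{n}\sum_{i=1}^{n}X_{i} \end{aligned}$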
Let's prove the results in the summary above.
1. For a random sample of size $n$ from $X$, where $E(X)=\mu$, we have $E(\bar{X})=\mu$.
Proof)
Each random variable $X_{i}$ in the random sample has the same distribution as $X$, and so $E(X_{i})=\mu$.
The $X_{1},X_{2},...,X_{n}$ are also mutually independent, although only linearity of expectation is needed for this result.
Hence,
$\begin{aligned} E(\bar{X}) &=E(\frac{X_{1}+X_{2}+...+X_{n}}{n})\\ &=\frac{1}{n}(E(X_{1})+E(X_{2})+...+E(X_{n}))\\ &=\frac{1}{n}(\mu + \mu +...+ \mu) \\ &=\frac{n\mu}{n} \\ &=\mu \end{aligned}$
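As a quick numerical sanity check (not part of the proof), here is a minimal NumPy simulation sketch: it draws many random samples of size $n$ and compares the average of the sample means with $\mu$. The normal distribution and the values $\mu=5$, $\sigma=2$, $n=30$ are arbitrary choices for illustration.

```python
import numpy as np

# Draw many samples of size n and check that the sample means average out to mu.
# mu = 5, sigma = 2, n = 30, reps = 100_000 are arbitrary illustrative choices.
rng = np.random.default_rng(0)
mu, sigma, n, reps = 5.0, 2.0, 30, 100_000

samples = rng.normal(mu, sigma, size=(reps, n))  # reps samples, each of size n
sample_means = samples.mean(axis=1)              # one value of X-bar per sample

print(sample_means.mean())  # average of the sample means; close to mu = 5.0
```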
2. For a random sample of size $n$ from $X$, where $Var(X)=\sigma^{2}$, we have $Var(\bar{X})=\frac{\sigma^{2}}{n}$.
Proof)
Each random variable $X_{i}$ in the random sample has the same distribution as $X$, and so $Var(X_{i})=\sigma^{2}$.
In addition, $X_{1},X_{2},...,X_{n}$ are mutually independent, which is what lets the variances add in the second step below.
Hence,
$\begin{aligned} Var(\bar{X}) &=Var(\frac{X_{1}+X_{2}+...+X_{n}}{n})\\ &=\frac{1}{n^{2}}(Var(X_{1})+Var(X_{2})+...+Var(X_{n}))\\ &=\frac{1}{n^{2}}(\sigma^{2}+\sigma^{2}+...+\sigma^{2}) \\ &=\frac{n\sigma^{2}}{n^{2}} \\ &=\frac{\sigma^{2}}{n} \end{aligned}$
Taking the square root gives the third result in the summary: $sd(\bar{X})=\sqrt{Var(\bar{X})}=\frac{\sigma}{\sqrt{n}}$.
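The same simulation setup also illustrates the variance and standard deviation results; again, the normal distribution and $\mu=5$, $\sigma=2$, $n=30$ are arbitrary choices.

```python
import numpy as np

# Check Var(X-bar) ~ sigma^2 / n and sd(X-bar) ~ sigma / sqrt(n) by simulation.
# mu = 5, sigma = 2, n = 30, reps = 100_000 are arbitrary illustrative choices.
rng = np.random.default_rng(0)
mu, sigma, n, reps = 5.0, 2.0, 30, 100_000

sample_means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)

print(sample_means.var())                # close to sigma**2 / n = 4 / 30 = 0.1333...
print(sample_means.std())                # close to sigma / sqrt(n) = 0.3651...
print(sigma**2 / n, sigma / np.sqrt(n))  # theoretical values for comparison
```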
Theorem Used
1. If $X$ and $Y$ are independent random variables and $a, b \in \mathbb{R}$, then
$E(aX + bY)=aE(X)+bE(Y)$
$V(aX + bY)=a^{2}V(X)+b^{2}V(Y)$
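For completeness, here is a short numerical check of both identities using independent $X$ and $Y$; the distributions $X \sim N(1, 2^{2})$ and $Y$ exponential with mean $3$, together with the coefficients $a=2$, $b=-1$, are arbitrary choices for illustration.

```python
import numpy as np

# Numerical check of E(aX + bY) = aE(X) + bE(Y) and V(aX + bY) = a^2 V(X) + b^2 V(Y)
# for independent X and Y. Distributions and coefficients are arbitrary choices.
rng = np.random.default_rng(1)
N = 1_000_000
a, b = 2.0, -1.0

X = rng.normal(1.0, 2.0, N)    # E(X) = 1, Var(X) = 4
Y = rng.exponential(3.0, N)    # E(Y) = 3, Var(Y) = 9, independent of X

Z = a * X + b * Y
print(Z.mean(), a * 1.0 + b * 3.0)        # both close to 2*1 - 1*3 = -1
print(Z.var(), a**2 * 4.0 + b**2 * 9.0)   # both close to 4*4 + 1*9 = 25
```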
Attached LaTeX Source
Summary
\begin{aligned}
E(\bar{X})&=\mu
\end{aligned}
\begin{aligned}
Var(\bar{X})&=\frac{\sigma^{2}}{n}
\end{aligned}
\begin{aligned}
sd(\bar{X})&=\frac{\sigma}{\sqrt{n}}
\end{aligned}
Proof 1.
\begin{aligned}
E(\bar{X})
&=E(\frac{X_{1}+X_{2}+...+X_{n}}{n})\\
&=\frac{1}{n}(E(X_{1})+E(X_{2})+...+E(X_{n}))\\
&=\frac{1}{n}(\mu + \mu +...+ \mu) \\
&=\frac{n\mu}{n} \\
&=\mu
\end{aligned}
Proof 2.
\begin{aligned}
Var(\bar{X})
&=Var(\frac{X_{1}+X_{2}+...+X_{n}}{n})\\
&=\frac{1}{n^{2}}(Var(X_{1})+Var(X_{2})+...+Var(X_{n}))\\
&=\frac{1}{n^{2}}(\sigma^{2}+\sigma^{2}+...+\sigma^{2}) \\
&=\frac{n\sigma^{2}}{n^{2}} \\
&=\frac{\sigma^{2}}{n}
\end{aligned}
Other
$E(aX + bY)=aE(X)+bE(Y)$
$V(aX + bY)=a^{2}V(X)+b^{2}V(Y)$