Question 1.1.

Assuming \(X_{1:N}\) is covariance stationary with autocovariance function \(\gamma_h\), we have

\[\begin{eqnarray} {\mathrm{Var}}\left(\hat{\mu}\left(X_{1:N}\right)\right)&=&{\mathrm{Var}}\left(\frac{1}{N}\sum_{n=1}^{N}X_{n}\right) \\ &=&\frac{1}{N^{2}}{\mathrm{Cov}}\left(\sum_{m=1}^{N}X_{m},\sum_{n=1}^{N}X_{n}\right) \\ &=&\frac{1}{N^{2}}\sum_{m=1}^{N}\sum_{n=1}^{N}{\mathrm{Cov}}\left(X_{m},X_{n}\right) \\ &=&\frac{1}{N^{2}}\left(N\gamma_{0}+2\left(N-1\right)\gamma_{1}+\ldots+2\gamma_{N-1}\right) \\ &=&\frac{1}{N}\gamma_{0}+\frac{2}{N^{2}}\sum_{h=1}^{N-1}\left(N-h\right)\gamma_{h} \end{eqnarray}\]
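
As a quick numerical check of the final expression (not part of the derivation itself), the following Python sketch simulates an AR(1) process, for which \(\gamma_h=\sigma_\epsilon^2\phi^h/(1-\phi^2)\) is known in closed form, and compares the Monte Carlo variance of \(\hat\mu\) with the formula above. The model and all parameter values (`phi`, `N`, `reps`) are illustrative assumptions, not part of the original solution.

```python
import numpy as np

# Monte Carlo check of Var(mu_hat) = gamma_0/N + (2/N^2) * sum_{h>=1} (N-h)*gamma_h,
# using an AR(1) model X_n = phi*X_{n-1} + eps_n as an illustrative example.
rng = np.random.default_rng(1)
phi, sigma_eps, N, reps = 0.6, 1.0, 200, 20000

# Autocovariances gamma_0, ..., gamma_{N-1} of the stationary AR(1).
gamma = sigma_eps**2 * phi**np.arange(N) / (1 - phi**2)
theory = gamma[0] / N + 2 / N**2 * np.sum((N - np.arange(1, N)) * gamma[1:])

x = np.empty((reps, N))
x[:, 0] = rng.normal(scale=np.sqrt(gamma[0]), size=reps)  # stationary start
for n in range(1, N):
    x[:, n] = phi * x[:, n - 1] + rng.normal(scale=sigma_eps, size=reps)

print(f"theory: {theory:.5f}   simulation: {x.mean(axis=1).var():.5f}")
```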


Question 1.2.
By definition, \[ \hat{\gamma}_{h}\left(x_{1:N}\right)=\frac{1}{N}\sum_{n=1}^{N-h}\left(x_{n}-\hat{\mu}_{n}\right)\left(x_{n+h}-\hat{\mu}_{n+h}\right). \] Here, we consider the null hypothesis that \(X_{1:N}\) is IID with mean \(0\) and standard deviation \(\sigma\). We therefore use the estimator \(\hat\mu_n=0\), and the autocovariance function estimator becomes \[ \hat{\gamma}_{h}\left(x_{1:N}\right)=\frac{1}{N}\sum_{n=1}^{N-h}x_{n}x_{n+h}. \]

Since \(\hat\rho_h = \hat\gamma_h/\hat\gamma_0\), writing \(U=\sum_{n=1}^{N-h}X_{n}X_{n+h}\) and \(V=\sum_{n=1}^{N}X_{n}^{2}\) gives \[\hat\rho_h(X_{1:N}) = \frac{U}{V}.\] We carry out a first-order Taylor expansion of \(U/V\) about \(({\mathbb{E}}[U],{\mathbb{E}}[V])\). This gives \[ \hat{\rho}_{h}(X_{1:N}) \approx\frac{{\mathbb{E}}\left(U\right)}{{\mathbb{E}}\left(V\right)}+\left(U-{\mathbb{E}}\left(U\right)\right)\left.\frac{\partial}{\partial U}\left(\frac{U}{V}\right)\right|_{\left({\mathbb{E}}\left(U\right),{\mathbb{E}}\left(V\right)\right)}+\left(V-{\mathbb{E}}\left(V\right)\right)\left.\frac{\partial}{\partial V}\left(\frac{U}{V}\right)\right|_{\left({\mathbb{E}}\left(U\right),{\mathbb{E}}\left(V\right)\right)}. \]

We have \[ {\mathbb{E}}\left(U\right)=\sum_{n=1}^{N-h}{\mathbb{E}}\left(X_{n}\, X_{n+h}\right)=0, \] \[ {\mathbb{E}}\left(V\right)=\sum_{n=1}^{N}{\mathbb{E}}\left(X_{n}^{2}\right)=N\sigma^{2}, \] \[ \frac{\partial}{\partial U}\left(\frac{U}{V}\right)=\frac{1}{V}, \] \[ \frac{\partial}{\partial V}\left(\frac{U}{V}\right)=\frac{-U}{V^{2}}. \] Putting this together, we have \[\begin{eqnarray} \hat{\rho}_{h}(X_{1:N})&\approx&\frac{{\mathbb{E}}\left(U\right)}{{\mathbb{E}}\left(V\right)}+\frac{U}{{\mathbb{E}}\left(V\right)}-\frac{\left(V-{\mathbb{E}}\left(V\right)\right){\mathbb{E}}(U)}{{\mathbb{E}}(V)^{2}} \\ &=&\frac{U}{N\sigma^{2}}. \end{eqnarray}\] This gives us the approximation \[ {\mathrm{Var}}\left(\hat{\rho}_{h}(X_{1:N})\right)\approx\frac{{\mathrm{Var}}\left(U\right)}{N^{2}\sigma^{4}}. \]

Since \({\mathbb{E}}(U)=0\), we have \({\mathrm{Var}}(U)={\mathbb{E}}(U^2)\), and we can compute \[\begin{eqnarray} {\mathrm{Var}}\left(U\right)&=&{\mathbb{E}}\left[\left(\sum_{n=1}^{N-h}X_{n}X_{n+h}\right)^{2}\right] \\ &=&{\mathbb{E}}\left[\sum_{n=1}^{N-h}X_{n}^{2}X_{n+h}^{2}\right]+2{\mathbb{E}}\left[\sum_{i=1}^{N-h-1}\sum_{j=i+1}^{N-h}X_{i}X_{i+h}X_{j}X_{j+h}\right] \end{eqnarray}\] Since \(X_{1:N}\) is IID with mean zero, each term of the second sum contains at least one factor \(X_k\) appearing exactly once, so the second sum has zero expectation and \[\begin{eqnarray} {\mathrm{Var}}\left(U\right)&=&{\mathbb{E}}\left[\sum_{n=1}^{N-h}X_{n}^{2}X_{n+h}^{2}\right] \\ &=&\sum_{n=1}^{N-h}{\mathbb{E}}\left(X_{n}^{2}\right){\mathbb{E}}\left(X_{n+h}^{2}\right) \\ &=&\left(N-h\right)\sigma^{4} \end{eqnarray}\] Therefore, \[ {\mathrm{Var}}\left(\hat{\rho}_{h}(X_{1:N})\right)\approx\frac{N-h}{N^{2}}. \] For \(N\) much larger than \(h\), \((N-h)/N^{2}\approx 1/N\), justifying a standard deviation under the null hypothesis of \(1/\sqrt{N}\).
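
The \(1/\sqrt{N}\) standard deviation can also be checked by simulation. The following Python sketch (an illustration added to this write-up, with assumed values for `N`, `h`, and `reps`) generates IID Gaussian noise, computes the lag-\(h\) sample autocorrelation \(U/V\) across many replications, and compares its standard deviation with \(\sqrt{N-h}/N\) and \(1/\sqrt{N}\).

```python
import numpy as np

# Under the IID null, sd(rho_hat_h) should be close to sqrt(N-h)/N ~ 1/sqrt(N).
rng = np.random.default_rng(2)
N, h, reps = 100, 3, 50000

x = rng.normal(size=(reps, N))  # IID N(0,1) draws under the null hypothesis
# rho_hat = U / V, with U = sum_n x_n x_{n+h} and V = sum_n x_n^2
rho_hat = (x[:, :-h] * x[:, h:]).sum(axis=1) / (x**2).sum(axis=1)

print(f"sd(rho_hat):  {rho_hat.std():.4f}")
print(f"sqrt(N-h)/N:  {np.sqrt(N - h) / N:.4f}   1/sqrt(N): {1 / np.sqrt(N):.4f}")
```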

B. A 95% confidence interval is a function of the data that constructs a set which, under a specified model, covers the true parameter with probability 0.95.
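
This coverage property can be illustrated by simulation. The sketch below (an added illustration; the normal model and all parameter values are assumptions) repeatedly simulates data from a specified model, builds the usual interval \(\bar{x}\pm 1.96\, s/\sqrt{N}\) for the mean, and checks how often it covers the true parameter.

```python
import numpy as np

# Coverage check: across many datasets drawn from a specified model (IID normal),
# the interval xbar +/- 1.96*s/sqrt(N) should contain the true mean ~95% of the time.
rng = np.random.default_rng(3)
mu, sigma, N, reps = 5.0, 2.0, 100, 20000

x = rng.normal(mu, sigma, size=(reps, N))
xbar = x.mean(axis=1)
halfwidth = 1.96 * x.std(axis=1, ddof=1) / np.sqrt(N)
coverage = np.mean((xbar - halfwidth <= mu) & (mu <= xbar + halfwidth))

print(f"empirical coverage: {coverage:.3f}")  # should be close to 0.95
```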