# integration – Using properties of expectation to calculate $E(\hat{\theta}_n)$ and $Var(\hat{\theta}_n)$

Suppose you want to calculate

$$\theta = \int_{0}^{1} h(x)\,dx$$

where $h$ is some complicated but continuous function (though evaluating $h$ at a particular $x$ is not hard). For example, one such function might be

$$h(x) = \exp(-\sqrt{x})\,\left|\sin(x^4 \log(x))\right|$$

(i) Suppose you can draw random observations from $g$, where $g$ is a positive density on $(0, 1)$. Let $X_1, \ldots, X_n$ be i.i.d. from $g$. Consider the estimator of $\theta$ given by

$$\hat{\theta}_n = \frac{1}{n} \sum_{i=1}^{n} \frac{h(X_i)}{g(X_i)}$$

(For example, if $g$ is the uniform density, you are just averaging the function over the data points.) Compute $E(\hat{\theta}_n)$.
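As a quick sanity check of the setup, here is a minimal sketch (my own, not part of the problem) of the estimator with the example $h$ above and $g$ uniform on $(0, 1)$:

```python
import math
import random

# Sketch of the Monte Carlo estimator theta_hat_n with g = Uniform(0, 1),
# using the example integrand h from the post.
def h(x):
    return math.exp(-math.sqrt(x)) * abs(math.sin(x**4 * math.log(x)))

def theta_hat(n, rng):
    # X_1, ..., X_n i.i.d. Uniform(0, 1); there g(x) = 1, so h(X_i)/g(X_i)
    # reduces to h(X_i) and the estimator is just the sample mean of h.
    return sum(h(rng.random()) for _ in range(n)) / n

rng = random.Random(0)
print(theta_hat(100_000, rng))
```

Since $0 \leq h(x) \leq 1$ here, any such estimate must land in $[0, 1]$.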

We calculate $E(\hat{\theta}_n)$ as follows:

$$\begin{aligned} E(\hat{\theta}_n) &= E\left(\frac{1}{n} \sum_{i=1}^{n} \frac{h(X_i)}{g(X_i)}\right) \\ &= \frac{1}{n} \sum_{i=1}^{n} E\left(\frac{h(X_i)}{g(X_i)}\right) \\ &= \frac{1}{n} \cdot n \int_{0}^{1} \frac{h(x)}{g(x)}\, g(x)\, dx \\ &= \int_{0}^{1} h(x)\, dx = \theta \end{aligned}$$

(Each of the $n$ i.i.d. terms has the same expectation, so the sum contributes a factor of $n$ that cancels the $\frac{1}{n}$.)
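A hedged numerical check of part (i), with a toy $h$ and a non-uniform $g$ of my own choosing (not from the problem): $h(x) = x^2$, so $\theta = 1/3$, and $g(x) = 2x$ on $(0, 1)$, sampled by inverse CDF as $X = \sqrt{U}$.

```python
import random

# Numerical check: the importance-sampling average of h(X)/g(X)
# should be close to theta = ∫ h, here 1/3.
# Assumed toy example: h(x) = x^2, g(x) = 2x, X = sqrt(U) ~ g.
def estimate(n, rng):
    total = 0.0
    for _ in range(n):
        u = rng.random()
        x = u ** 0.5                  # X ~ g(x) = 2x
        total += (x * x) / (2 * x)    # h(X)/g(X)
    return total / n

rng = random.Random(1)
print(estimate(200_000, rng))  # close to theta = 1/3
```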

(ii) Assume $g$ is the uniform density. Compute $Var(\hat{\theta}_n)$ and give a good upper bound to this variance if you know that $0 \leq h(x) \leq 1$ for all $x$.

Using properties of expectation, we break the variance apart as follows:

$$\begin{aligned} Var(\hat{\theta}_n) &= E(\hat{\theta}_n^2) - \left(E(\hat{\theta}_n)\right)^2 \\ &= E\left(\left(\frac{1}{n} \sum_{i=1}^{n} \frac{h(X_i)}{g(X_i)}\right)^2\right) - \left(E\left(\frac{1}{n} \sum_{i=1}^{n} \frac{h(X_i)}{g(X_i)}\right)\right)^2 \\ &= E\left(\frac{1}{n^2} \left(\sum_{i=1}^{n} \frac{h(X_i)}{g(X_i)}\right)^2\right) - \theta^2 \\ &= \frac{1}{n^2} \left(\int_{0}^{1} \left(\frac{h(x)}{g(x)}\right)^2 g(x)\, dx\right) - \theta^2 \\ &= \ldots \end{aligned}$$

Is this last step correct? Is my next step to multiply out the $g(x)$ terms so one cancels? When I do that, I get stuck. Would appreciate a push if anyone has one.
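To check myself numerically, here is a toy experiment (my own choices, not from the problem): for i.i.d. draws I would expect the identity $Var(\hat{\theta}_n) = Var(h(X_1))/n$ to hold when $g$ is uniform, so the empirical variance of many replications of $\hat{\theta}_n$ should match that value.

```python
import random

# Sanity check of the assumed i.i.d. identity Var(theta_hat_n) = Var(h(X_1))/n
# for uniform g. Toy example: h(x) = x^2, theta = 1/3,
# Var(h(X)) = ∫ x^4 dx - theta^2 = 1/5 - 1/9 = 4/45.
def theta_hat(n, rng):
    return sum(rng.random() ** 2 for _ in range(n)) / n

rng = random.Random(2)
n, reps = 50, 20_000
estimates = [theta_hat(n, rng) for _ in range(reps)]
mean = sum(estimates) / reps
emp_var = sum((e - mean) ** 2 for e in estimates) / (reps - 1)
print(emp_var, (4 / 45) / n)  # the two values should roughly agree
```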

(iii) Assume $g$ is the uniform density. Using Chebyshev's inequality, how large must $n$ be to guarantee that the absolute difference $|\hat{\theta}_n - \theta|$ is no bigger than 0.05 with probability at least 95%?
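For reference, the form of Chebyshev's inequality I plan to apply here is

$$P\left(|\hat{\theta}_n - \theta| \geq \varepsilon\right) \leq \frac{Var(\hat{\theta}_n)}{\varepsilon^2}$$

so with $\varepsilon = 0.05$ and the variance bound from (ii), requiring the right-hand side to be at most $0.05$ should pin down $n$.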

(iv) If you could choose $g$, what choice would minimize $Var(\hat{\theta}_n)$?
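For intuition on (iv), here is a toy illustration (my assumption, not a derivation): if $g$ is chosen proportional to $|h|$, the ratio $h(X)/g(X)$ becomes constant, so the estimator's variance should collapse. With $h(x) = x^2$ and $g(x) = 3x^2$ (inverse CDF $X = U^{1/3}$), the ratio is $1/3$ for every draw.

```python
import random

# Illustration: sampling X from g proportional to h makes h(X)/g(X) constant.
# Toy example: h(x) = x^2, theta = 1/3, g(x) = 3x^2, X = U^(1/3) ~ g.
def estimate(n, rng):
    total = 0.0
    for _ in range(n):
        x = rng.random() ** (1 / 3)       # X ~ g(x) = 3x^2
        total += (x * x) / (3 * x * x)    # h(X)/g(X) = 1/3 for every draw
    return total / n

rng = random.Random(3)
print(estimate(1_000, rng))  # 1/3 up to floating-point rounding
```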