
Suppose that $Y_1,...,Y_n$ is a set of random variables that satisfy the model $Y_i=\theta x_i+e_i$, $i=1,...,n$ where $x_1,...,x_n$ are fixed (non-random) known constants and $e_1,...,e_n$ are independent and identically distributed $N(0,\sigma^2)$ random variables where $\sigma^2$ is unknown.

a. Find the maximum likelihood estimator of $\theta$ and find its variance.

b. Find the method of moments estimator of $\theta$ and calculate its variance.

What I have so far.

a. First, given the pdf of $e_i$, which is $N(0,\sigma^2)$, I used the change-of-variables technique to get $f_{Y_i}(y_i)=\frac{1}{\sqrt{2\pi}\,\sigma}e^{-\frac{(y_i-\theta x_i)^2}{2\sigma^2}}\cdot|1|$. From the maximum likelihood function I got $L(\theta)=0$, so I inferred that $\theta=y_{\max}$, since that value made $L(\theta)$ smallest. How do I proceed to find the variance?

b. I found $E(X)$ and $E(X^2)$; however, they are both zero. Does the MOM estimator exist?


1 Answer


For a:

Hint: Assuming $\theta_e = y_{\max}$ (a number), we have

$\hat{\theta} = Y_{\max}$ (random)


[Image: excerpt from Larsen & Marx]


However (lol), I'm not sure that $\theta_e = y_{\max}$ is right. How did you get that? I got the same pdf for the $Y_i$'s as you, but...

...the likelihood I got is

$$L(\theta) = \prod_{i=1}^{n} \left[\frac{1}{\sigma \sqrt{2 \pi}} \exp\left(\frac{-(y_i-\theta x_i)^2}{2\sigma^2}\right)\right]$$

Maximising $L(\theta)$ is equivalent to maximising $\ln L(\theta)$, so when I did the latter, I got

$$\theta_e = \frac{\sum y_i x_i}{\sum x_i^2}$$

$$\to \hat{\theta} = \frac{\sum Y_i x_i}{\sum x_i^2}$$
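Filling in the intermediate step (the constant $-n\ln(\sigma\sqrt{2\pi})$ doesn't involve $\theta$, so only the quadratic term matters when differentiating):

$$\ln L(\theta) = -n\ln\left(\sigma\sqrt{2\pi}\right) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(y_i - \theta x_i)^2$$

$$\frac{d}{d\theta}\ln L(\theta) = \frac{1}{\sigma^2}\sum_{i=1}^{n} x_i(y_i - \theta x_i) \stackrel{set}{=} 0 \quad\to\quad \sum y_i x_i = \theta \sum x_i^2$$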

Finally, I noted that (Borel-measurable) functions of independent RVs are independent:

  1. https://stats.stackexchange.com/questions/96206/is-it-true-that-if-epsilon-t-simiid-0-1-then-e-epsilon-t2-epsi

  2. How do you prove that if $ X_t \sim^{iid} (0,1) $, then $ E(X_t^{2}X_{t-j}^{2}) = E(X_t^{2})E(X_{t-j}^{2})$?

$$\to Var(\hat{\theta}) = \frac{\sigma^2 \sum x_i^2}{\left(\sum_i x_i^2 \right)^2} = \frac{\sigma^2}{\sum x_i^2}$$
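As a sanity check (not part of the derivation), here's a quick Monte Carlo simulation with made-up values for $x_i$, $\theta$, and $\sigma$; the simulated variance of $\hat{\theta}$ should agree with $\sigma^2/\sum x_i^2$:

```python
import numpy as np

# Monte Carlo check of the MLE theta_hat = sum(Y_i x_i) / sum(x_i^2).
# The x's, theta, and sigma below are arbitrary illustrative choices.
rng = np.random.default_rng(0)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
theta, sigma = 2.0, 1.5
n_sims = 200_000

# Simulate Y_i = theta*x_i + e_i for many replications at once
Y = theta * x + rng.normal(0.0, sigma, size=(n_sims, x.size))
theta_hat = (Y * x).sum(axis=1) / (x**2).sum()

print(theta_hat.mean())         # should be close to theta = 2.0 (unbiased)
print(theta_hat.var())          # should be close to sigma^2 / sum(x_i^2)
print(sigma**2 / (x**2).sum())  # theoretical variance
```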


For b:

What's $X$?

What I got was that

$$E[Y_i] = \theta x_i, \text{ so averaging over } i: \quad \theta\,\frac{x_1+\cdots+x_n}{n} \stackrel{set}{=} \frac{y_1 + \cdots + y_n}{n}$$

$$\to \theta_e = \frac{\sum y_i}{\sum x_i}$$

$$\to \hat{\theta} = \frac{\sum Y_i}{\sum x_i}$$

$$\to Var(\hat{\theta}) = \frac{Var\left(\sum Y_i\right)}{\left(\sum x_i\right)^2} = \frac{n\sigma^2}{(\sum x_i)^2}$$

Here my $y$'s are numbers and $Y$'s are random. Note that $E[Y_i] = \theta x_i$, not $0$; in your notation, the $e$'s are the random errors with mean $0$.

I don't think we need to compute $E[Y_i^2]$ (which is $\theta^2 x_i^2 + \sigma^2$, not $0$), because there's only one parameter to estimate.

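The same kind of sanity check works for part b (again with made-up numbers): the observable MOM estimator $\hat{\theta} = \sum Y_i / \sum x_i$ should have variance $n\sigma^2/(\sum x_i)^2$:

```python
import numpy as np

# Monte Carlo check of the MOM estimator theta_hat = sum(Y_i) / sum(x_i).
# The x's, theta, and sigma below are arbitrary illustrative choices.
rng = np.random.default_rng(1)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
theta, sigma = 2.0, 1.5
n, n_sims = x.size, 200_000

Y = theta * x + rng.normal(0.0, sigma, size=(n_sims, n))
theta_hat = Y.sum(axis=1) / x.sum()

print(theta_hat.mean())        # should be close to theta = 2.0 (unbiased)
print(theta_hat.var())         # should be close to n*sigma^2 / (sum x_i)^2
print(n * sigma**2 / x.sum()**2)  # theoretical variance
```

Comparing the two theoretical variances also shows the MLE is the more efficient estimator here, since $\sigma^2/\sum x_i^2 \le n\sigma^2/(\sum x_i)^2$ by the Cauchy–Schwarz inequality.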
