S.O.S. Mathematics CyberBoard

PostPosted: Wed, 16 Nov 2016 19:49:21 UTC
Pan Miroslav (Member; joined Sat, 1 Nov 2014; Posts: 47)
I am having some trouble proving the following theorems:

1) In a linear regression model with DEPENDENT normally distributed errors (with covariance matrix \Sigma), the maximum likelihood estimate of \beta is
\hat{\beta}_{ML} = (X'\Sigma^{-1}X)^{-1}X'\Sigma^{-1}y

2) Let \hat{\beta}_1 = \frac{\sum_{i = 1}^n(x_i - \overline{x})y_i}{\sum_{i = 1}^n(x_i - \overline{x})} be the least squares estimate, and let \sigma_{x \delta} = cov(x_i^A, \delta_i), \sigma_x^2 = \sum_{i = 1}^n(x_i^A - \overline{x}^A)^2/n, \sigma_{\delta}^2 = Var(\delta_i).

Show that E(\hat{\beta}_1) = \beta_1\frac{\sigma_x^2 + \sigma_{x \delta}}{\sigma_x^2 + \sigma_{\delta}^2 + 2\sigma_{x \delta}}

My progress: for the second one, simply substituting everything I know into the result and hoping it would give the expectation of \hat{\beta}_1 did not work. For the first one I don't know what to try, because I'm not even sure what the likelihood function looks like when the errors are dependent.


PostPosted: Thu, 17 Nov 2016 03:56:56 UTC
outermeasure (Moderator; joined Mon, 29 Dec 2008; Posts: 7846; Location: NCTS/TPE, Taiwan)

(1) Recall the probability density function of the multivariate normal distribution N(\mu,\Sigma):
f(\mathbf{y})=\dfrac{1}{\sqrt{\det(2\pi\Sigma)}}\exp\left(-\dfrac{1}{2}(\mathbf{y}-\mu)^T\Sigma^{-1}(\mathbf{y}-\mu)\right)
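A sketch of the maximization step (assuming \Sigma is known and positive definite): with \mu=X\beta the log-likelihood is
\ell(\beta)=-\dfrac{1}{2}\log\det(2\pi\Sigma)-\dfrac{1}{2}(y-X\beta)'\Sigma^{-1}(y-X\beta),
and setting \partial\ell/\partial\beta=X'\Sigma^{-1}(y-X\beta)=0 gives \hat{\beta}_{ML}=(X'\Sigma^{-1}X)^{-1}X'\Sigma^{-1}y.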
(2) What are x_i^A and x_i? What is \delta?
Anyway, split X=(X_1,X_2), where X_2 contains all the other independent variables that we ignore, and write \beta=(\beta_1,\beta_2). Expand out Y=X\beta+\epsilon in the formula for ordinary least squares and take the (conditional) expectation (so the \epsilon disappears). The result is the so-called omitted variable (bias) formula
\mathbb{E}(b_1\mid X)=\beta_1+(X'_1X_1)^{-1}X'_1X_2\beta_2
which has an intuitive description: the bias (X'_1X_1)^{-1}X'_1X_2\beta_2 is precisely \beta_2 weighted by the coefficients from regressing the omitted variables X_2 on the included variables X_1.
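Written out (a sketch, assuming \mathbb{E}(\epsilon\mid X)=0): with Y=X_1\beta_1+X_2\beta_2+\epsilon and b_1=(X'_1X_1)^{-1}X'_1Y,
b_1=\beta_1+(X'_1X_1)^{-1}X'_1X_2\beta_2+(X'_1X_1)^{-1}X'_1\epsilon,
and taking \mathbb{E}(\,\cdot\mid X) removes the last term, leaving the bias formula above.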



PostPosted: Thu, 17 Nov 2016 09:40:29 UTC
Pan Miroslav (Member; joined Sat, 1 Nov 2014; Posts: 47)

So, for the first one it's now basically about using \log f(y) to make things simpler, then differentiating and solving the first-order conditions for the maximum, right?

For the second one, I'll write down some more information.
Let (x_i^O, y_i^O) be the observed values for i = 1, 2, \ldots, n and let (x_i^A, y_i^A) be the nearby actual values, where "close" means y_i^O = y_i^A + \epsilon_i and x_i^O = x_i^A + \delta_i, with \epsilon and \delta independent, E(\epsilon_i) = E(\delta_i) = 0, Var(\epsilon_i) = \sigma_{\epsilon}^2, and Var(\delta_i) = \sigma_{\delta}^2. We want to model y_i^A = \beta_0 + \beta_1 x_i^A, but we observe the values (x_i^O, y_i^O), so, substituting x_i^A = x_i^O - \delta_i, our model is y_i^O = \beta_0 + \beta_1 x_i^O + (\epsilon_i - \beta_1 \delta_i).


PostPosted: Fri, 18 Nov 2016 05:49:54 UTC
outermeasure (Moderator; joined Mon, 29 Dec 2008; Posts: 7846; Location: NCTS/TPE, Taiwan)

(1) Yes, though note that with dependent errors the likelihood of all observed values is the joint multivariate normal density above with \mu = X\beta (it does not factor into a product over observations); take its logarithm, differentiate, and solve the first-order conditions.
(2) I think you had better check the denominator of \hat{\beta}_1; other than that, you have X_1=(1,x^O), X_2=(\delta), X=(1,x^O,\delta) and \beta=(\beta_0,\beta_1,\beta_2)=(\beta_0,\beta_1,-\beta_1), so plugging into the omitted variable formula gives \mathbb{E}\hat{\beta}_1.
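Spelling that plug-in out (a sketch, writing x_i=x_i^O, taking the denominator of \hat{\beta}_1 to be \sum_{i=1}^n(x_i-\overline{x})^2, and treating the sample moments as their population counterparts): the omitted variable formula with \beta_2=-\beta_1 gives
\mathbb{E}(\hat{\beta}_1)=\beta_1-\beta_1\dfrac{cov(x_i^O,\delta_i)}{Var(x_i^O)},
and since x_i^O=x_i^A+\delta_i we have Var(x_i^O)=\sigma_x^2+\sigma_{\delta}^2+2\sigma_{x\delta} and cov(x_i^O,\delta_i)=\sigma_{x\delta}+\sigma_{\delta}^2, so
\mathbb{E}(\hat{\beta}_1)=\beta_1\dfrac{\sigma_x^2+\sigma_{x\delta}}{\sigma_x^2+\sigma_{\delta}^2+2\sigma_{x\delta}}.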


