S.O.S. Mathematics CyberBoard
Maximum likelihood and SIMEX
http://417773.os285nnd.asia/CBB/viewtopic.php?f=6&t=69725
Author: Pan Miroslav [ Wed, 16 Nov 2016 19:49:21 UTC ]
Post subject: Maximum likelihood and SIMEX
I've got some trouble proving the following theorems:

1) In the linear regression model $Y = X\beta + \varepsilon$ with DEPENDENT normally distributed errors (with covariance matrix $\Sigma$), the maximum likelihood estimation of $\beta$ gives $\hat\beta = (X^\top \Sigma^{-1} X)^{-1} X^\top \Sigma^{-1} Y$.

2) Let $\hat\beta_1$ be estimated by the least squares method, let $W = X + U$, $U \sim N(0, \sigma_u^2)$, $X \sim N(\mu_x, \sigma_x^2)$. Show that $E[\hat\beta_1] \approx \beta_1 \dfrac{\sigma_x^2}{\sigma_x^2 + \sigma_u^2}$.

My progress: in the second one, simply plugging everything I know into the result and hoping it would give the expectation of $\hat\beta_1$ simply did not work. For the first one I don't know what to try, because I'm not even sure what the likelihood function looks like when we have dependent errors.
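Since no numbers appear in the thread, here is a minimal numerical sketch of theorem 1 (not from the original posts; the dimensions, seed, and the particular $\Sigma$ are arbitrary choices): it checks that numerically maximising the normal likelihood with dependent errors lands on the closed form $(X^\top\Sigma^{-1}X)^{-1}X^\top\Sigma^{-1}Y$.

Code:
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 50, 2
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0])

# An arbitrary positive-definite covariance matrix for the dependent errors.
A = rng.normal(size=(n, n))
Sigma = A @ A.T + n * np.eye(n)

# One draw of correlated normal errors, and the response.
y = X @ beta_true + np.linalg.cholesky(Sigma) @ rng.normal(size=n)

Sigma_inv = np.linalg.inv(Sigma)

# Closed-form GLS estimator: (X' Sigma^{-1} X)^{-1} X' Sigma^{-1} y.
beta_gls = np.linalg.solve(X.T @ Sigma_inv @ X, X.T @ Sigma_inv @ y)

# Maximise the multivariate normal log-likelihood over beta (Sigma fixed);
# only the quadratic form depends on beta, so we minimise that.
def neg_loglik(beta):
    r = y - X @ beta
    return 0.5 * r @ Sigma_inv @ r

beta_mle = minimize(neg_loglik, np.zeros(p)).x
print(np.allclose(beta_gls, beta_mle, atol=1e-4))  # expect: True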
Author: outermeasure [ Thu, 17 Nov 2016 03:56:56 UTC ]
Post subject: Re: Maximum likelihood and SIMEX
Pan Miroslav wrote: I've got some trouble proving the following theorems: [...]

(1) Recall the probability density function of the multivariate normal distribution,
$f(y) = \dfrac{1}{(2\pi)^{n/2}\,|\Sigma|^{1/2}} \exp\!\left(-\tfrac{1}{2}(y - X\beta)^\top \Sigma^{-1} (y - X\beta)\right).$

(2) What are $W$ and $X$? What is $\hat\beta_1$? Anyway, split $X = \begin{pmatrix} X_1 & X_2 \end{pmatrix}$ where $X_2$ contains all the other independent variables that we ignore, and write $Y = X_1\beta_1 + X_2\beta_2 + \varepsilon$. Expand out $\hat\beta_1$ in the formula for ordinary least squares and take the (conditional) expectation (so the $\varepsilon$ disappears). The result is the so-called omitted variable (bias) formula
$E[\hat\beta_1 \mid X] = \beta_1 + (X_1^\top X_1)^{-1} X_1^\top X_2\,\beta_2,$
which has an intuitive description --- the bias is precisely the weighted proportion of the omitted variables that is "explained" by the variables we included.
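Spelling out hint (1) a little further (a sketch under the setup above, not part of the original reply): the log-likelihood of the sample, dropping terms that do not involve $\beta$, is
$\ell(\beta) = \text{const} - \tfrac{1}{2}(Y - X\beta)^\top \Sigma^{-1} (Y - X\beta),$
and setting its gradient to zero gives
$\nabla_\beta\, \ell = X^\top \Sigma^{-1}(Y - X\beta) = 0 \;\Longrightarrow\; \hat\beta = (X^\top \Sigma^{-1} X)^{-1} X^\top \Sigma^{-1} Y,$
i.e. the generalised least squares estimator. The Hessian $-X^\top \Sigma^{-1} X$ is negative definite, so this stationary point is indeed the maximum.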
Author: Pan Miroslav [ Thu, 17 Nov 2016 09:40:29 UTC ]
Post subject: Re: Maximum likelihood and SIMEX
outermeasure wrote: (1) Recall the probability density function of the multivariate normal distribution [...]

So, for the first one it's now basically about using the log-likelihood $\log f(y)$ so it will be simpler, then differentiating and solving the first-order conditions for the maximum, right?

For the second one I'll write down some more information. Let $W$ be the observed values for $X$ and let $X$ be the close actual values; close means $W = X + U$, where $X$ and $U$ are independent with $U \sim N(0, \sigma_u^2)$ and $X \sim N(\mu_x, \sigma_x^2)$. We want to model $Y = \beta_0 + \beta_1 X + \varepsilon$, but we observed the values $W$, so our model is $Y = \beta_0^* + \beta_1^* W + \varepsilon^*$.
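A quick simulation of this measurement-error setup makes the claimed attenuation concrete (a sketch only; all parameter values below are made up):

Code:
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
beta0, beta1 = 1.0, 2.0
mu_x, sigma_x, sigma_u = 0.5, 1.0, 0.7   # made-up parameter values

X = rng.normal(mu_x, sigma_x, size=n)    # true covariate (unobserved)
U = rng.normal(0.0, sigma_u, size=n)     # measurement error
W = X + U                                # what we actually observe
Y = beta0 + beta1 * X + rng.normal(0.0, 0.3, size=n)

# Naive least-squares slope of Y on the mismeasured W.
naive_slope = np.cov(W, Y)[0, 1] / np.var(W, ddof=1)
print(naive_slope)                                        # noticeably below beta1
print(beta1 * sigma_x**2 / (sigma_x**2 + sigma_u**2))     # ~ 1.342, matches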
Author: outermeasure [ Fri, 18 Nov 2016 05:49:54 UTC ]
Post subject: Re: Maximum likelihood and SIMEX
Pan Miroslav wrote: So, for the first one it's now basically about using the log-likelihood [...]

(1) After you take the product of the likelihoods of each observation to get the likelihood of all observed values, yes.

(2) I think you had better check the denominator of $\hat\beta_1^*$. Other than that, you have $\operatorname{Var}(W) = \sigma_x^2 + \sigma_u^2$, $\operatorname{Cov}(W, X) = \sigma_x^2$, and $\operatorname{Cov}(W, U) = \sigma_u^2$, so plugging into the omitted variable formula gives $E[\hat\beta_1^*] \approx \beta_1 \dfrac{\sigma_x^2}{\sigma_x^2 + \sigma_u^2}$.
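For reference, the same factor drops out of a one-line population calculation (an equivalent route to the omitted variable formula, not part of the original reply): since $X$, $U$, and $\varepsilon$ are mutually independent,
$\hat\beta_1^* \;\xrightarrow{p}\; \frac{\operatorname{Cov}(W, Y)}{\operatorname{Var}(W)} = \frac{\operatorname{Cov}(X + U,\ \beta_0 + \beta_1 X + \varepsilon)}{\sigma_x^2 + \sigma_u^2} = \frac{\beta_1\, \sigma_x^2}{\sigma_x^2 + \sigma_u^2}.$
The ratio $\sigma_x^2/(\sigma_x^2 + \sigma_u^2)$ is the attenuation (reliability) factor, which is exactly what SIMEX is designed to extrapolate away.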