Hi Michael,
I'm not sure this will be helpful to you, but perhaps one way to get at this
is to draw a likelihood that you consider quite informative (because it is a
tight curve around its maximum) and then test out some likelihood ratios,
especially comparing the ratio of the likelihood at its maximum to the
likelihood at some other possible (but less likely) parameter value. This
might clarify the link between curvature and likelihood ratios. Then, draw
a second likelihood curve which is flatter (but might be for estimating the
same parameter) and check out some likelihood ratios for that curve. How do
the likelihood ratios change for that curve as you try different parameter
values compared to the very narrow likelihood you initially drew?
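To make the exercise above concrete, here is a small sketch (my own illustration, not from the problem set) using a normal model with known sigma, where the log-likelihood in mu is exactly quadratic. In that case the log likelihood ratio between the MLE and another value mu equals (1/2) * I * (mu - mu_hat)^2, where I = n / sigma^2 is the information; a larger n gives a tighter (more curved) likelihood and much bigger likelihood ratios at the same distance from the maximum:

```python
import math

def log_lik(mu, xbar, n, sigma):
    # Log-likelihood of mu for a normal model with known sigma,
    # dropping additive constants (they cancel in any ratio).
    return -n * (mu - xbar) ** 2 / (2 * sigma ** 2)

def log_lr(mu, xbar, n, sigma):
    # Log likelihood ratio comparing the MLE (mu_hat = xbar) to mu.
    return log_lik(xbar, xbar, n, sigma) - log_lik(mu, xbar, n, sigma)

xbar, sigma, delta = 0.0, 1.0, 0.5
for n in (10, 1000):              # flat likelihood vs. tight likelihood
    info = n / sigma ** 2         # curvature (information) at the maximum
    lr = log_lr(xbar + delta, xbar, n, sigma)
    # The quadratic prediction 0.5 * info * delta^2 matches lr exactly here.
    print(n, info, lr, 0.5 * info * delta ** 2)
```

Running this, the same half-unit step away from the MLE yields a log LR of 1.25 when n = 10 but 125 when n = 1000: the curvature at the maximum is exactly what drives how fast the likelihood ratio punishes other parameter values.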
Iain
On Wed, Feb 17, 2010 at 3:54 PM, Michael Barnett <mlbarnett at gmail.com> wrote:
Hi everyone -
I have a basic question on the problem set - on question 1.3,
describing the importance of the "information" value. Even after
reading UPM Ch. 4, I'm still a little confused about how to bring the
concept of the likelihood ratio into the answer. I think I intuitively
understand why the second derivative of the log-likelihood is a good
estimate of the precision of the MLE, but the connection to the LR has
me scratching my head a bit. What am I missing?
-Michael