Hi folks,
We made this announcement in section, but in case you weren't around --
there will NOT be a problem set going out this evening or anything due next
Thursday. Instead, we're adding a small assignment onto your replication by
asking you to create a replication data set in the IQSS dataverse (and then
to include a citation to this replication data set in your paper).
We'll be discussing this in lecture on Monday.
best,
Maya
Dear all,
Does anyone have experience downloading R on Ubuntu? I'm having
trouble doing this, even after working through the instructions here:
http://cran.r-project.org/bin/linux/ubuntu/
Many thanks,
Shelby
Hi all,
Can someone help me with problem 2.1 in our PS7?
This is my function in R. I can't get a percentage, but that probably means I got it wrong altogether.
counterfactual.one <- cbind(median(mar), median(femm), median(kid5), mean(doctor), median(ment), median(mar))
counterfactual.two <- cbind(median(mar) - 1, median(femm), median(kid5), mean(doctor), median(ment), median(mar) - 1)
counterfactuals <- rbind(counterfactual.one, counterfactual.two)
counterfactuals[1, 4]
coefficients <- as.data.frame(cbind(optimizing$par[2], optimizing$par[3], optimizing$par[4], optimizing$par[5], optimizing$par[6], optimizing$par[8]))
verdict <- whatif(data = coefficients, cfact = counterfactuals)
summary(verdict)
verdict$in.hull
verdict$sum.stat
Any help is appreciated.
Nino Malekovic
MPA Candidate, Class 2011
Harvard Kennedy School
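For anyone checking their whatif() call: as I understand the WhatIf package (King & Zeng), data= expects the observed covariate data used to fit the model, not the estimated coefficients, and cfact= takes counterfactual rows with matching columns. A minimal sketch with made-up data and variable names (everything here is a placeholder, not the problem set's actual model):

```r
library(WhatIf)  # King & Zeng's counterfactual-evaluation package

set.seed(02139)
## Observed covariates (stand-ins for mar, ment, etc.)
X <- data.frame(mar = rbinom(100, 1, 0.6), ment = rpois(100, 3))

## Two counterfactual profiles with the same columns as X
cf <- data.frame(mar  = c(1, 0),
                 ment = c(median(X$ment), median(X$ment)))

out <- whatif(data = X, cfact = cf)
summary(out)
out$in.hull  # is each counterfactual inside the convex hull of the data?
```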
________________________________________
From: gov2001-l-bounces at lists.fas.harvard.edu [gov2001-l-bounces at lists.fas.harvard.edu] On Behalf Of gov2001-l-request at lists.fas.harvard.edu [gov2001-l-request at lists.fas.harvard.edu]
Sent: Thursday, April 15, 2010 7:46 AM
To: gov2001-l at lists.fas.harvard.edu
Subject: gov2001-l Digest, Vol 57, Issue 19
Send gov2001-l mailing list submissions to
gov2001-l at lists.fas.harvard.edu
To subscribe or unsubscribe via the World Wide Web, visit
http://lists.fas.harvard.edu/mailman/listinfo/gov2001-l
or, via email, send a message with subject or body 'help' to
gov2001-l-request at lists.fas.harvard.edu
You can reach the person managing the list at
gov2001-l-owner at lists.fas.harvard.edu
When replying, please edit your Subject line so it is more specific
than "Re: Contents of gov2001-l digest..."
Today's Topics:
1. Drop box (Akihiro Nishi)
2. Standard Error for 1.5 (Han He)
3. Re: Standard Error for 1.5 (Michael Barnett)
4. Re: Standard Error for 1.5 (Han He)
5. Re: Standard Error for 1.5 (Iain Osgood)
----------------------------------------------------------------------
Message: 1
Date: Wed, 14 Apr 2010 15:42:04 -0400
From: "Akihiro Nishi" <anishi at hsph.harvard.edu>
Subject: [gov2001] Drop box
To: <gov2001-l at lists.fas.harvard.edu>
Message-ID: <4BC5E24C0200004600041E39 at hsph.harvard.edu>
Content-Type: text/plain; charset="us-ascii"
Hi, Maya and Iain,
I cannot find the course drop-box for Problem Set 7. Thanks!
Akihiro
>>> Jason Ketola 04/13/10 11:22 PM >>>
Thanks for pointing that out, Joseph!
I was definitely making it much harder on myself.
Jason
Gavinlertvatana, Poj wrote:
> Jason,
>
> For the homework, you don't need to take derivatives. 1.2 just asks for a function that calculates the ll, with inputs of parameters, x, and y. Just take the log of the (product of) likelihood function. Later, in 1.4, you use optim, which calls the ll function as in previous hw's.
>
> Outside of the assignment, you're right though that in general, you would take two partial derivs to find the analytic answer -- i.e., find the maximum of the log likelihood function.
>
> Best regards,
> Joseph
>
> Joseph Poj Gavinlertvatana
> Doctoral student, Marketing
> Harvard Business School
> Wyss Hall, Soldiers Field, Boston, MA 02163
> Ph 617.230.5907
> Fx 617.496.4397
> Txt/Vm 617.910.0563
> Em pgavinlertvatana at hbs.edu
>
> -----Original Message-----
> From: gov2001-l-bounces at lists.fas.harvard.edu [mailto:gov2001-l-bounces at lists.fas.harvard.edu] On Behalf Of Jason Ketola
> Sent: Tuesday, April 13, 2010 11:03 PM
> To: Class List for Gov 2001/E-2001
> Subject: [gov2001] 1.2 question
>
> This is likely due to my uncertainty about 1.1, but does finding the LL
> in 1.2 require us to take two partial derivatives?
>
> Thanks,
> Jason
> _______________________________________________
> gov2001-l mailing list
> gov2001-l at lists.fas.harvard.edu
> http://lists.fas.harvard.edu/mailman/listinfo/gov2001-l
>
> _______________________________________________
> gov2001-l mailing list
> gov2001-l at lists.fas.harvard.edu
> http://lists.fas.harvard.edu/mailman/listinfo/gov2001-l
>
_______________________________________________
gov2001-l mailing list
gov2001-l at lists.fas.harvard.edu
http://lists.fas.harvard.edu/mailman/listinfo/gov2001-l
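Joseph's point above can be sketched generically: write a function that returns the log-likelihood given the parameters and data, then hand it to optim() -- no analytic derivatives needed. A self-contained example using a Poisson log-likelihood as a stand-in for whatever model the problem set uses (data and names are made up):

```r
set.seed(4)
x <- rnorm(200)
y <- rpois(200, exp(0.5 + 0.7 * x))

## Log-likelihood: just the log of the product of the densities
ll.pois <- function(par, y, x) {
  lambda <- exp(par[1] + par[2] * x)   # inverse link
  sum(dpois(y, lambda, log = TRUE))
}

## optim() minimizes by default, so flip the sign with fnscale = -1
fit <- optim(par = c(0, 0), fn = ll.pois, y = y, x = x,
             control = list(fnscale = -1), hessian = TRUE)
fit$par  # should land near the true c(0.5, 0.7)
```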
Hey,
I'm not sure I did the first difference right, but I took the median values, changed the married variable, ran both profiles through the expected value equation, and took the difference. My question is: how do we then get the standard error? Anyone have any ideas/hints?
Best,
Han
Harvard College Class of 2013
614-329-1324
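One standard route is to simulate: draw coefficient vectors from their estimated sampling distribution, compute the first difference for each draw, and take the standard deviation of the draws. A self-contained sketch with made-up data -- the model, variable names, and numbers are placeholders, not the problem set's:

```r
library(MASS)  # for mvrnorm()

set.seed(02138)
n <- 500
married <- rbinom(n, 1, 0.5)
x2 <- rnorm(n)
y <- rbinom(n, 1, plogis(-0.5 + 0.8 * married + 0.3 * x2))
fit <- glm(y ~ married + x2, family = binomial)

## Draw simulated betas from the estimated sampling distribution
sims <- mvrnorm(1000, mu = coef(fit), Sigma = vcov(fit))

## Two profiles: married vs. not, other covariates at their medians
x.married   <- c(1, 1, median(x2))
x.unmarried <- c(1, 0, median(x2))

## First difference in expected values, once per simulated beta
fd <- drop(plogis(sims %*% x.married) - plogis(sims %*% x.unmarried))

mean(fd)                        # point estimate of the first difference
sd(fd)                          # simulation-based standard error
quantile(fd, c(0.025, 0.975))   # 95% interval
```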
I am trying to run a CEM with user-defined coarsening. Essentially there are a ton of revenue covariates, and I would like them all to be in the same 'bins'. Can someone confirm that this is the right way to pass a number of cutpoints into the function? I have a cutpoint vector for each covariate, combined with list(). Thanks in advance.
Anil
salescut.point05 <- seq(0, 13, .05)  # set up bins
mat.point05 <- cem(treatment = "lrsales", data = winlow, drop = c("X", "lrsales2"),
                   cutpoints = list(lgsales2_lag5 = salescut.point05, lrsales_lag5 = salescut.point05,
                                    lgsales2_lag4 = salescut.point05, lrsales_lag4 = salescut.point05,
                                    lgsales2_lag3 = salescut.point05, lrsales_lag3 = salescut.point05,
                                    lgsales2_lag2 = salescut.point05, lrsales_lag2 = salescut.point05,
                                    lgsales2_lag1 = salescut.point05, lrsales_lag1 = salescut.point05))
Anil Doshi
Doctoral Student | Technology and Operations Management
Harvard Business School
302 Wyss Hall
Boston, MA 02163
tel 646-244-5396
email adoshi at hbs.edu<mailto:adoshi at hbs.edu>
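Since every revenue covariate shares the same bins, the cutpoints list can also be built programmatically instead of typed out. A sketch using the variable names from the message above:

```r
salescut.point05 <- seq(0, 13, .05)  # the shared bins

## The ten lagged revenue covariates named in the cem() call
rev.vars <- c(paste0("lgsales2_lag", 5:1), paste0("lrsales_lag", 5:1))

## One named list entry per covariate, all pointing at the same bins
cuts <- setNames(rep(list(salescut.point05), length(rev.vars)), rev.vars)

## then: cem(treatment = "lrsales", data = winlow, cutpoints = cuts, ...)
```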
Hi all,
We're working on our replication paper, which requires including fixed
effects in a probit regression. When we run the regression we get reported
coefficients for all of our independent variables and for each of the
possible fixed-effect values except one.
Before the regression we run as.factor to create industry.factor, the fixed
effects for industries and then include industry.factor in the probit
regression specification. After running summary on the output of the probit,
we get estimated coefficients for every fixed effect except for "Textiles".
Before the regression we can run summary(industry.factor) and we can see
that there are over 1200 observations for "Textiles" in industry.factor.
"Textiles" is the first industry listed in industry.factor, so I suspect
that R is somehow dropping it for being first in industry.factor. Any help
would be appreciated. We tried this with both the glm and zelig
implementations and found the exact same problem.
Thanks!
-Gabe & Tara
--
gc
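This is expected behavior rather than a bug: with an intercept in the model, R uses treatment contrasts, so the first factor level becomes the reference category and is absorbed into the intercept; the other levels' coefficients are differences from it. A minimal sketch with made-up data (the variable names are hypothetical):

```r
set.seed(1)
## Force "Textiles" to be the first level, as in the message above
industry <- factor(rep(c("Textiles", "Steel", "Retail"), each = 50),
                   levels = c("Textiles", "Steel", "Retail"))
y <- rbinom(150, 1, 0.5)

fit <- glm(y ~ industry, family = binomial(link = "probit"))
coef(fit)  # no "industryTextiles" term: it is the baseline

## To make a different level the baseline, relevel the factor:
industry2 <- relevel(industry, ref = "Steel")
fit2 <- glm(y ~ industry2, family = binomial(link = "probit"))
coef(fit2)  # now "Steel" is absorbed into the intercept
```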
Hello class:
I'm trying to obtain the first difference for Problem 1.5, and I'm running into
trouble. Can we use setx() even if the model was not fit with zelig?
Any hints/pointers would be greatly appreciated =)
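For a plain glm fit you don't need setx(): predict() with a newdata frame does the same job of fixing a covariate profile. A sketch with made-up data and variable names:

```r
set.seed(2)
d <- data.frame(y = rbinom(200, 1, 0.5),
                x1 = rnorm(200),
                treat = rbinom(200, 1, 0.5))
fit <- glm(y ~ x1 + treat, data = d, family = binomial)

## Build the two covariate profiles by hand (what setx() would do)
profile.hi <- data.frame(x1 = median(d$x1), treat = 1)
profile.lo <- data.frame(x1 = median(d$x1), treat = 0)

## First difference in predicted probabilities
fd <- predict(fit, newdata = profile.hi, type = "response") -
      predict(fit, newdata = profile.lo, type = "response")
fd
```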
Harvard's International Coalition of College Philanthropists and MIT's Sigma
Phi Epsilon fraternity present...
**CONCERT FOR A CURE**
Featuring Emmy-winning electric violinist Mark Wood from Trans-Siberian
Orchestra!!
At Club OBERON, 2 Arrow Street, Harvard Sq.
April 18th, 8 pm
All benefits of the concert go towards:
- National Multiple Sclerosis Society (http://www.nationalmssociety.org)
- International Help of Missionaries (http://ihmonline.org)
- One Laptop Per Child (http://laptop.org/en/)
- World Water Relief Organization for Haiti (
http://www.worldwaterrelief-haiti.org)
Special guest performers include Gigantic Ant (featuring MIT's own Balaji
Mani) and MIT's Gillian Grogan.
Come soak in all of the funk, rock, and violins you can for a worthy cause.
***TICKETS: $12 for students with SPEICCP10 discount code! $18 at the door.
$25 VIP tickets. Only 250 tickets left!***
*** Buy tickets at https://www.ovationtix.com/trs/pe/8091995 or call
866-811-4111. ***
Facebook event page: http://www.facebook.com/event.php?eid=115545471793053