The HMDC servers have a fix for this. There are two separate clusters
in the RCE. One is a batch cluster with something like 400 moderately
sized nodes; the other is a small number of very large individual
machines you can take over. Talk to the HMDC people and they'll get
you set up.
Gary
---
http://gking.harvard.edu
On Sun, Apr 25, 2010 at 1:47 PM, Gabriel Chan
<gabe_chan at hksphd.harvard.edu> wrote:
Hi All,
We're running into some memory limitations in R that we're hoping to work
around. Our data has 15,000 observations, and we are trying to run the
probit.gee model to find clustered standard errors. Everything works fine in
our models with just a few covariates. However, when we try to run the
model with country fixed effects (there are 43 countries in the dataset), we
get an error message: "Error: cannot allocate vector of size 34 Kb". We
played around with the model: with fixed effects for only 32 of the 43
countries there is no error, but with any more than 32 countries the error
comes back -- so we are only barely exceeding R's memory capacity. The
model also works fine with vanilla probit; only probit.gee gives this
problem. Note that we have already used the command
memory.limit(size = 4000), which I believe is the maximum.
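For reference, a minimal sketch of the memory.limit() calls mentioned above (Windows-only; 4000 MB is roughly the ceiling for 32-bit R). The model call is a hypothetical Zelig-style invocation -- the formula, variable names, and data frame are placeholders for illustration, not from our actual script:

```r
# Windows-only: query and raise R's memory ceiling (values in MB).
memory.limit()             # current limit
memory.limit(size = 4000)  # ~4 GB, about the maximum addressable by 32-bit R

# Hypothetical Zelig call with country fixed effects; 'dat' and the
# covariate names x1, x2 are placeholders, not our real variables.
# library(Zelig)
# z.out <- zelig(y ~ x1 + x2 + factor(country),
#                model = "probit.gee", id = "country", data = dat)
```

On 64-bit R the 4 GB ceiling largely disappears, which is often the simplest fix for "cannot allocate vector" errors of this kind.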
If anyone has an idea for a workaround, we'd love to hear it. Otherwise, I
think we're going to be stuck calculating our coefficients in R and our
standard errors in Stata.
Thanks!
-Gabe and Tara
--
gc
_______________________________________________
gov2001-l mailing list
gov2001-l at lists.fas.harvard.edu
http://lists.fas.harvard.edu/mailman/listinfo/gov2001-l