I used the command-line options to increase R's RAM use (--max-mem-size=500 and --max-vsize=500M). It still gives me the same error message, "Error: cannot allocate vector of size 109853 Kb" (except with a different number of Kb). My other question is about how to change the time-series options in the Windows version so that it does not use the cross-sectional variable and create dummy variables for all 8500 respondents. This option is the global "_AMusecs=0;" in the online manual. How do I implement this option in the Windows version?
Kristin Burnett
On Tue, 03 Oct 2006 23:27:44 -0400 "Kristin Burnett" wrote:
-----Original Message-----
From: Levi Littvay (UNL) [mailto:https://webmail.psu.edu/webmail/main.cgi#]
Sent: Tuesday, October 03, 2006 4:46 PM
To: KRISTIN D BURNETT
Subject: Re: [amelia] Fwd: amelia problem

Kristin,

You need to increase the amount of RAM R uses. I know of two ways of doing this. One is done when you run R; read about this at: http://gking.harvard.edu/zelig/docs/How_do_I2.html The other is by command: memory.limit(1576). (At least this is what I used. I have 2GB of RAM, so you might want to go lower.) Try these and let me know how it worked.

Levi
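The two approaches Levi describes can be sketched as follows. This is a minimal illustration, assuming R on Windows (memory.limit() is Windows-only); the sizes shown are illustrative, not recommendations:

```r
## Approach 1: set the limit at startup with a command-line flag, e.g.
##   Rgui.exe --max-mem-size=1024M
## (note the unit suffix "M" on the value)

## Approach 2: from inside a running R session
memory.limit()             # report the current limit, in MB
memory.limit(size = 1024)  # raise the limit to 1024 MB, if the OS allows it
```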
Hello,
I have been trying to use the most recent Windows version of Amelia II for multiple imputation of my dataset (and I have the most recent Windows version of R installed: 2.3.1). My data have a maximum of 5 time points for about 8500 respondents (which ends up as 33559 observation rows). In the options, I identified the timepoint variable (ranging from 1 to 5) as the time-series index and the person id # as the cross-sectional index. Then I identified which variables were nominal (which adds about 20 dummies altogether). I left the TSCS options at the default. For priors, I set boundaries of 0 to 1 on the dichotomous variables (about 5 of those), and set a boundary of 0 to 6 for one variable that I know should not extend beyond this range. I am loading the data from Stata and requesting that the 5 output datasets also be in Stata format.
I keep getting an error message similar to the following:

<<<<<
There was an unexpected error in the execution of Amelia.
Double check all inputs for errors and take note of the error message:
Error: cannot allocate vector of size 219444 Kb
>>>>>
I started by including all possible interactions, which involved a dataset of 92 variables. But even after I took out the interaction variables and tried to run it with only 18 variables (more like 38 once you account for the dummies created for the nominal variables), it still gave me the above error message. This was also after increasing the workspace to 256MB (on a computer with 512MB of RAM). Is the dataset just too large for the program? Or am I doing something else wrong? Thank you for whatever help you can give me.
Sincerely,
Kristin Burnett