That makes sense, of course.
But how much is too much? I may have misunderstood the instructional
comments in the configuration file, but I take them to mean that one can
avoid the disk-swapping performance penalty by setting the max_workspace
parameter to (available physical memory) - 2 MB. I have instead set the
parameter to (available physical memory) - ~200 MB. Have I misinterpreted
the instructions?
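(For concreteness, here is the arithmetic I am applying, sketched in Python. The 500 MB figure is the free RAM reported later in this thread; the function name and the 200 MB margin are just my own illustration, not anything from the GAUSS docs.)

```python
def suggested_max_workspace(avail_mb, margin_mb=2.0):
    """max_workspace (in MB) per my reading of the GSRUN.CFG comments:
    free physical memory minus the ~2 MB needed to load GSRUN itself."""
    if avail_mb <= margin_mb:
        raise ValueError("not enough free memory to load GSRUN")
    return avail_mb - margin_mb

# With the ~500 MB reported free during importance sampling:
per_config = suggested_max_workspace(500.0)           # config-file reading: 498.0
conservative = suggested_max_workspace(500.0, 200.0)  # the setting I used: 300.0
```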
----- Original Message -----
From: Gary King <king(a)harvard.edu>
Date: Monday, May 26, 2003 5:00 pm
Subject: Re: [amelia] Absolute limit to RAM used by Amelia for Windows?
>
> If you ask for too much memory, then you will be requiring Gauss to swap
> memory to disk. So although I wish Gauss computed all this automatically,
> it doesn't. So you're in the position of having to get it right. Too
> little and it can bomb out, and too much and it will get slower. (If a
> run works in a given amount of memory, then adding memory will not
> usually make it faster.)
> Gary
>
> On Mon, 26 May 2003 joew(a)georgetown.edu wrote:
>
> > It's possible that the Windows 2000 Task Manager's "Processes"
> window
> > does not report the memory that the Gauss run-time module uses
> as work
> > space. Perhaps the window reports only the memory allocated to
> the
> > module's executing program code.
> >
> > Hence it may be useful to mention that increasing the work space
> > configuration values in the config file and, later, adding 512MB
> of
> > physical RAM did not reduce the amount of time required to
> generate
> > a "50 draws" report during the importance sampling stage.
> >
> > Also, I added a "workspace=250.0" line to the config file, in
> addition
> > to the "max_workspace=300.0" The amount of time to get 50 draws
> > actually _increased_ from ~32 minutes to ~39 minutes. (Wonder
> whether
> > I hit some virtual memory-triggering wall?)
> >
> > Thanks again in advance for any help with this.
> >
It's possible that the Windows 2000 Task Manager's "Processes" window
does not report the memory that the Gauss run-time module uses as work
space. Perhaps the window reports only the memory allocated to the
module's executing program code.
Hence it may be useful to mention that increasing the work space
configuration values in the config file and, later, adding 512MB of
physical RAM did not reduce the amount of time required to generate
a "50 draws" report during the importance sampling stage.
Also, I added a "workspace=250.0" line to the config file, in addition
to the "max_workspace=300.0" line. The amount of time to get 50 draws
actually _increased_ from ~32 minutes to ~39 minutes. (I wonder whether
I hit some wall that triggered virtual-memory swapping?)
Thanks again in advance for any help with this.
----- Original Message -----
From: Gary King <king(a)harvard.edu>
Date: Sunday, May 25, 2003 4:02 pm
Subject: Re: [amelia] Absolute limit to RAM used by Amelia for Windows?
>
> There is no limit so far as I know, but I haven't done tests. We've heard
> that people have run Amelia with as many as a million observations and
> (too!) large numbers of variables. I suspect that they'd need more RAM
> than 50 MB to do that (although Gauss will start swapping to disk if
> necessary, it becomes much slower when it does). Has anyone else tried
> this?
>
> Gary
>
> : Gary King, King(a)Harvard.Edu http://GKing.Harvard.Edu :
> : Center for Basic Research Direct (617) 495-2027 :
> : in the Social Sciences Assistant (617) 495-9271 :
> : 34 Kirkland Street, Rm. 2 HU-MIT DC (617) 495-4734 :
> : Harvard U, Cambridge, MA 02138 eFax (928) 832-7022 :
>
> On Sun, 25 May 2003 joew(a)georgetown.edu wrote:
>
> > Is there an absolute limit to the amount of physical memory that
> one
> > can allocate to Amelia for Windows?
> >
> > I have run Amelia for several days on a PC with 256MB of RAM,
> Windows
> > 2000 Pro, and a 1.2 ghz AMD Athlon (better floating point) CPU.
> > Neither increasing the value of the configuration file's
> max_workspace
> > parameter nor increasing the amount of physical RAM in the PC
> seems to
> > increase the RAM allocated to the GSRUN process.
> >
> > 1. I set the max_workspace parameter value to 50 MB, and the
> Windows
> > Task Manager's Processes window indicated that the GSRUN process
> was
> > using about 50MB of RAM.
> >
> > 2. I reset the parameter to 65MB, reran the same job, and found
> that
> > the GSRUN process still ran using only 50MB of RAM.
> >
> > 3. Today, I upgraded the RAM in the PC from 256MB to 768MB and
> set the
> > max_workspace parameter value to 300MB. The GSRUN process still
> runs
> > using only 50MB of RAM.
> >
> > Note that in each instance, the RAM usage seemed to increase
> from ~27MB
> > during the variance computation stage to ~50MB during the
> importance
> > sampling stage.
> >
> > Do I need to change something other than the value of the
> max_workspace
> > parameter? (Yes, I remembered to "uncomment" that line of
> > configuration code.) Also, my RAM is not cluttered up with a
> bunch of
> > memory-resident crud in the background, so I doubt there's a RAM
> > issue. The Task Manager indicates that, at any given moment
> during the
> > importance sampling stage, there is ~500MB of RAM available.
> >
> > Thanks in advance for any assistance you can offer.
> >
> > The text of the configuration file I use appears below. Thanks
> in
> > advance.
> >
> > --------------------------
> >
> >
>
########################################################################
> ######
> > # GAUSS - Copyright (C) 1984-1998 Aptech Systems, Inc. All
> rights
> > reserved. #
> >
>
########################################################################
> ######
> > # GSRUN.CFG configuration file for GAUSS Run-Time
> > Module #
> >
>
########################################################################
> ######
> > # You may modify this file upon
> > installation. #
> >
>
########################################################################
> ######
> > #
>
> > #
> > # Upon startup, GSRUN looks for this configuration file in
> the
> > same #
> > # directory as GSRUN.EXE. This can be changed by defining an
> > environment #
> > # variable called GSRUN_CFG which contains the full drive and
> > path #
> > # specification of an alternate configuration file. The file
> name
> > will #
> > # remain the
> > same. #
> > #
>
> > #
> >
>
########################################################################
> ######
> > #
>
> > #
> > # The GAUSSDIR variable below defaults to the location of the
> .EXE
> > file #
> > # when commented out, so most users will not need to change
> it. If
> > you #
> > # want to change it, remove the comment character '#' and set
> it to
> > the #
> > # desired path. GAUSSDIR can be referenced using $(GAUSSDIR)
> for
> > any #
> > # of the other paths or filenames, or they can be hardcoded
> with
> > no #
> > # reference to
> > GAUSSDIR. #
> > #
>
> > #
> > # DO NOT put a trailing backslash on GAUSSDIR. GSRUN will
> remove
> > it #
> > # anyway before replacing GAUSSDIR in configuration
> > variables. #
> > #
>
> > #
> >
>
########################################################################
> ######
> > #
>
> > #
> > # You may also use the $() syntax to reference OS environment
> > variables. #
> > # For
> > example:
> #
> > #
>
> > #
> > # tmp_path =
> > $(TMPDIR) #
> > #
>
> > #
> >
>
########################################################################
> ######
> >
> > # if commented out, default is .EXE location
> > #GAUSSDIR = c:\gauss
> >
> > # multiple paths for program files
> > src_path = $(GAUSSDIR)\src;$(GAUSSDIR)\examples
> >
> > # one path for library files
> > lib_path = $(GAUSSDIR)\lib
> >
> > # one path for the error log file
> > err_path = $(GAUSSDIR)
> >
> > # one path and filename for the command log
> > log_file = $(GAUSSDIR)\command.log
> >
> > # one path for DLIBRARY command
> > dlib_path = $(GAUSSDIR)\dlib
> >
> > # one path for temporary files (should be on RAM disk)
> > # if commented out use TMP environment variable
> > #tmp_path =
> >
> > # one path for SAVE command
> > #save_path =
> >
> > # one path for LOADM command
> > #loadm_path =
> >
> > # one path for LOADP, LOADF, LOADK command
> > #loadp_path =
> >
> > # one path for LOADS command
> > #loads_path =
> >
> > # alternate editor name and flags
> > #alt_editor =
> >
> > # Set max_workspace to limit the use of extended or expanded
> memory.
> > When
> > # commented out, it defaults to 128.0 (Mbytes). If
> max_workspace is set
> > # significantly higher than the amount of physical memory
> available,
> > GSRUN
> > # may take an abnormally long time to load, depending on the
> memory
> > manager
> > # your system uses; that is why we are currently shipping GSRUN with
> > # max_workspace set to 4.0. You can try commenting it out to see
> if your
> > # system has this problem. If it does, set max_workspace to the
> amount
> > of
> > # free memory (in Mbytes) minus 2.0 (the amount needed to load
> GSRUN).> #
> > # You will also need to set max_workspace if you are running
> under MS
> > Windows
> > # 3.1, to reserve some memory for the system.
> > #max_workspace = 8.0 # maximum workspace in megabytes
> >
> > max_workspace = 300.0
> >
> > line_numbers = on # on, off Alt-X
> menu> autoload = on # on, off
> Ctrl-A
> > autodelete = on # on, off ''
> > translate = off # on, off Ctrl-T
> > transtrack = on # on, off Alt-X
> menu> declare_warn = off # on, off
> Ctrl-W
> > compiler_trace = off # off, file, line, symbol Ctrl-V
> > user_lib = on # on, off Ctrl-L
> > gauss_lib = on # on, off ''
> > enter_execute = on # on, off Alt-F4
> > tab_size = 4 # 1-8
> > escape_char = on # on, off, editor recognizes escape
> character> case_sensitivity = on # on, off, default case
> sensitivity for
> > editor
> > insert_mode = on # on, off, default insert state for
> editor> ctrl_z = off # on, off, editor saves file
> with
> > terminating ^Z
> >
> > complex_numbers = on # on, off
> SYSSTATE( 8
> > complex_char = i # single ascii character
> SYSSTATE( 9
> > screen = on # on, off
> SYSSTATE( 15
> > print = off # on, off
> SYSSTATE( 16
> > lprint = off # on, off
> SYSSTATE( 17
> > precision = 80 # 64, 80
> SYSSTATE( 12
> > lpwidth = 80 # 1-256
> SYSSTATE( 10
> > outwidth = 80 # 1-256
> SYSSTATE( 11
> > crout_tol = 1.0e-14 # >= 0.0
> SYSSTATE( 13
> > chol_tol = 1.0e-14 # >= 0.0
> SYSSTATE( 14
> >
> > cache_size = 64 # size of CPU data cache in Kbytes
> >
> > pqg_dpen_width = 0 # line width for displayed graphs; 0
> is
> > fastest
> > pqg_ppen_width = 0 # line width for printed graphs; 0 is
> fastest>
> > dat_fmt_version = v96 # data set / matrix file version
> > # v89 original DOS format
> > # v96 universal format
> >
> > fastio = 50 # 0-100; set higher for faster
> output, lower
> > for
> > # smoother output
> >
> > matwidth = 500 # right-hand margin (1-500) for
> displaying
> > matrices
> >
> > batchmode = 1 # 0 = interactive mode; 1 = batch
> mode,
> > append to
> > # gauss.log; 2 = batch mode,
> overwrite
> > gauss.log
> >
> > iconized = true # true, false, start GAUSS with
> Command
> > window
> > # iconized; for use when running
> programs in
> > DOS
> > # compatibility window (see DOSWinOpen)
> >
> >
>
########################################################################
> ######
> > # End of
> > GSRUN.CFG #
> >
>
########################################################################
> ######
> >
> >
>
>
Is there an absolute limit to the amount of physical memory that one
can allocate to Amelia for Windows?
I have run Amelia for several days on a PC with 256MB of RAM, Windows
2000 Pro, and a 1.2 GHz AMD Athlon (better floating point) CPU.
Neither increasing the value of the configuration file's max_workspace
parameter nor increasing the amount of physical RAM in the PC seems to
increase the RAM allocated to the GSRUN process.
1. I set the max_workspace parameter value to 50 MB, and the Windows
Task Manager's Processes window indicated that the GSRUN process was
using about 50MB of RAM.
2. I reset the parameter to 65MB, reran the same job, and found that
the GSRUN process still ran using only 50MB of RAM.
3. Today, I upgraded the RAM in the PC from 256MB to 768MB and set the
max_workspace parameter value to 300MB. The GSRUN process still runs
using only 50MB of RAM.
Note that in each instance, the RAM usage seemed to increase from ~27MB
during the variance computation stage to ~50MB during the importance
sampling stage.
Do I need to change something other than the value of the max_workspace
parameter? (Yes, I remembered to "uncomment" that line of
configuration code.) Also, my RAM is not cluttered up with a bunch of
memory-resident crud in the background, so I doubt there's a RAM
issue. The Task Manager indicates that, at any given moment during the
importance sampling stage, there is ~500MB of RAM available.
Thanks in advance for any assistance you can offer.
The text of the configuration file I use appears below.
--------------------------
##############################################################################
# GAUSS - Copyright (C) 1984-1998 Aptech Systems, Inc. All rights reserved.  #
##############################################################################
# GSRUN.CFG configuration file for GAUSS Run-Time Module                     #
##############################################################################
# You may modify this file upon installation.                                #
##############################################################################
#                                                                            #
# Upon startup, GSRUN looks for this configuration file in the same          #
# directory as GSRUN.EXE. This can be changed by defining an environment     #
# variable called GSRUN_CFG which contains the full drive and path           #
# specification of an alternate configuration file. The file name will      #
# remain the same.                                                           #
#                                                                            #
##############################################################################
#                                                                            #
# The GAUSSDIR variable below defaults to the location of the .EXE file      #
# when commented out, so most users will not need to change it. If you       #
# want to change it, remove the comment character '#' and set it to the      #
# desired path. GAUSSDIR can be referenced using $(GAUSSDIR) for any         #
# of the other paths or filenames, or they can be hardcoded with no          #
# reference to GAUSSDIR.                                                     #
#                                                                            #
# DO NOT put a trailing backslash on GAUSSDIR. GSRUN will remove it          #
# anyway before replacing GAUSSDIR in configuration variables.               #
#                                                                            #
##############################################################################
#                                                                            #
# You may also use the $() syntax to reference OS environment variables.     #
# For example:                                                               #
#                                                                            #
#     tmp_path = $(TMPDIR)                                                   #
#                                                                            #
##############################################################################
# if commented out, default is .EXE location
#GAUSSDIR = c:\gauss
# multiple paths for program files
src_path = $(GAUSSDIR)\src;$(GAUSSDIR)\examples
# one path for library files
lib_path = $(GAUSSDIR)\lib
# one path for the error log file
err_path = $(GAUSSDIR)
# one path and filename for the command log
log_file = $(GAUSSDIR)\command.log
# one path for DLIBRARY command
dlib_path = $(GAUSSDIR)\dlib
# one path for temporary files (should be on RAM disk)
# if commented out use TMP environment variable
#tmp_path =
# one path for SAVE command
#save_path =
# one path for LOADM command
#loadm_path =
# one path for LOADP, LOADF, LOADK command
#loadp_path =
# one path for LOADS command
#loads_path =
# alternate editor name and flags
#alt_editor =
# Set max_workspace to limit the use of extended or expanded memory. When
# commented out, it defaults to 128.0 (Mbytes). If max_workspace is set
# significantly higher than the amount of physical memory available, GSRUN
# may take an abnormally long time to load, depending on the memory manager
# your system uses; that is why we are currently shipping GSRUN with
# max_workspace set to 4.0. You can try commenting it out to see if your
# system has this problem. If it does, set max_workspace to the amount of
# free memory (in Mbytes) minus 2.0 (the amount needed to load GSRUN).
#
# You will also need to set max_workspace if you are running under MS Windows
# 3.1, to reserve some memory for the system.
#max_workspace = 8.0 # maximum workspace in megabytes
max_workspace = 300.0
line_numbers = on # on, off Alt-X menu
autoload = on # on, off Ctrl-A
autodelete = on # on, off ''
translate = off # on, off Ctrl-T
transtrack = on # on, off Alt-X menu
declare_warn = off # on, off Ctrl-W
compiler_trace = off # off, file, line, symbol Ctrl-V
user_lib = on # on, off Ctrl-L
gauss_lib = on # on, off ''
enter_execute = on # on, off Alt-F4
tab_size = 4 # 1-8
escape_char = on # on, off, editor recognizes escape character
case_sensitivity = on # on, off, default case sensitivity for editor
insert_mode = on # on, off, default insert state for editor
ctrl_z = off # on, off, editor saves file with terminating ^Z
complex_numbers = on # on, off SYSSTATE( 8
complex_char = i # single ascii character SYSSTATE( 9
screen = on # on, off SYSSTATE( 15
print = off # on, off SYSSTATE( 16
lprint = off # on, off SYSSTATE( 17
precision = 80 # 64, 80 SYSSTATE( 12
lpwidth = 80 # 1-256 SYSSTATE( 10
outwidth = 80 # 1-256 SYSSTATE( 11
crout_tol = 1.0e-14 # >= 0.0 SYSSTATE( 13
chol_tol = 1.0e-14 # >= 0.0 SYSSTATE( 14
cache_size = 64 # size of CPU data cache in Kbytes
pqg_dpen_width = 0 # line width for displayed graphs; 0 is fastest
pqg_ppen_width = 0 # line width for printed graphs; 0 is fastest
dat_fmt_version = v96 # data set / matrix file version
# v89 original DOS format
# v96 universal format
fastio = 50 # 0-100; set higher for faster output, lower for smoother output
matwidth = 500 # right-hand margin (1-500) for displaying matrices
batchmode = 1 # 0 = interactive mode; 1 = batch mode, append to gauss.log;
              # 2 = batch mode, overwrite gauss.log
iconized = true # true, false, start GAUSS with Command window iconized;
                # for use when running programs in DOS compatibility
                # window (see DOSWinOpen)
##############################################################################
# End of GSRUN.CFG                                                           #
##############################################################################
I was wondering how far the recombination of test statistics can be
taken. Specifically, if I perform individual Ramsey RESET tests on each
of my estimations from the imputed datasets, can I average the F values
produced by the tests and compare that average to the critical value at
the proper degrees of freedom (which I presume would be the same for
each test)?
If I do this, should I also compute the variance across the individual
estimates?
And if I do, and the resulting 95% confidence interval ranges from below
to above the critical value of F at the proper degrees of freedom, what
do I conclude about rejecting the null of proper specification?
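To make the procedure I am asking about concrete, here is a minimal Python sketch of it: average the per-imputation F statistics and form a rough interval from their across-imputation spread. The F values are hypothetical, and to be clear, this mirrors my naive proposal, not a combining rule I know to be valid for multiply imputed data.

```python
from statistics import mean, stdev

def naive_combine_F(f_values, conf_mult=1.96):
    """Average per-imputation F statistics and report a rough 95% interval
    based on their across-imputation spread (NOT a validated combining rule)."""
    m = len(f_values)
    fbar = mean(f_values)                 # average F across imputations
    s = stdev(f_values)                   # spread across the m estimates
    half = conf_mult * s / m ** 0.5       # rough half-width for the mean F
    return fbar, (fbar - half, fbar + half)

# Hypothetical F statistics from RESET tests on 5 imputed datasets:
fbar, (lo, hi) = naive_combine_F([2.1, 3.4, 2.8, 4.0, 2.5])
# If a critical value (say 2.9) falls inside (lo, hi), this naive rule
# gives no clear accept/reject decision, which is exactly the ambiguity
# asked about above.
```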
Matthew Vile