Hi all,
I would like to check my answers using the negative binomial distribution
with Zelig, but when I try to fit the negative binomial, I end up with the
following error:
z.2b <- zelig(conflu ~ ppid + sumil, model = "negbin", data = ussu)
Error in poisson(link = log) :
unused argument(s) (link = .Primitive("log"))
Strangely, I got it to work once before, but then it stopped working
again. If anyone knows why I might be getting this error, I would appreciate
any suggestions.
Thanks,
Laurence
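One workaround while Zelig misbehaves is to fit the negative binomial directly with MASS::glm.nb() and compare coefficients. A minimal sketch with simulated stand-in data, since I don't have the ussu data frame (ppid, sumil, and conflu below are placeholders with made-up effects):

```r
library(MASS)  # glm.nb() fits a negative binomial GLM directly

## Simulated stand-in for the ussu data
set.seed(1)
ussu <- data.frame(ppid = rbinom(200, 1, 0.5), sumil = rnorm(200))
ussu$conflu <- rnbinom(200,
                       mu = exp(0.5 + 0.8 * ussu$ppid + 0.3 * ussu$sumil),
                       size = 2)

fit <- glm.nb(conflu ~ ppid + sumil, data = ussu)
coef(fit)  # compare these against whatever Zelig eventually returns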
Does anyone know whether we're supposed to use the rpois and rnbinom
functions when simulating the expected values for the Poisson and negative
binomial distributions?
Also, does the phrase "all quantities of interest" include standard errors
or confidence intervals? That is, when we simulate the expected annual
number of events under ppid = 1 and ppid = 0 while holding sumil at its
mean, do we need to report standard errors or confidence intervals for the
expected values?
Thanks!
Sean
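One way to see the distinction, as a sketch: for a Poisson model the expected value is exp(x'beta), so simulating the betas alone gives you expected values, and the spread of those simulations gives you the confidence interval directly; rpois()/rnbinom() only enter when you want predicted values, which add the fundamental variability on top. The beta.hat and V.hat below are made-up placeholders for what your fitted model would return:

```r
library(MASS)  # for mvrnorm()
set.seed(1)

## Placeholders for the point estimates and variance matrix from a fit
beta.hat <- c(0.2, 0.5)
V.hat    <- diag(0.01, 2)
x.hi     <- c(1, 1)   # e.g. intercept and ppid = 1, with sumil at its mean

beta.sim <- mvrnorm(1000, beta.hat, V.hat)
lambda   <- exp(beta.sim %*% x.hi)  # expected values: no rpois() needed
y.pred   <- rpois(1000, lambda)     # predicted values: Poisson noise added

mean(lambda)                        # point estimate of the expected value
quantile(lambda, c(.025, .975))     # its 95% confidence interval
```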
Hi Everyone:
I am installing WhatIf and running into a problem installing the dependency
lpSolve. Here is the error; any thoughts? Thanks
* Installing *source* package 'lpSolve' ...
** libs
gcc-4.2 -std=gnu99 -I/usr/share/R/include -I/usr/share/R/include -I .
-DINTEGERTIME -DPARSER_LP -DBUILDING_FOR_R -DYY_NEVER_INTERACTIVE -DUSRDLL
-DCLOCKTIME -DRoleIsExternalInvEngine -DINVERSE_ACTIVE=INVERSE_LUSOL
-DINLINE=static -DParanoia -fpic -g -O2 -c colamd.c -o colamd.o
In file included from colamd.c:676:
colamd.h:66:20: error: stdlib.h: No such file or directory
In file included from colamd.c:676:
colamd.h:262: warning: parameter names (without types) in function
declaration
In file included from
/usr/lib/gcc/i486-linux-gnu/4.2.1/include/syslimits.h:7,
from /usr/lib/gcc/i486-linux-gnu/4.2.1/include/limits.h:11,
from colamd.c:677:
/usr/lib/gcc/i486-linux-gnu/4.2.1/include/limits.h:122:61: error: limits.h:
No such file or directory
colamd.c:683:19: error: stdio.h: No such file or directory
colamd.c:684:20: error: assert.h: No such file or directory
colamd.c:1026: warning: parameter names (without types) in function
declaration
colamd.c: In function 'print_report':
colamd.c:3052: warning: implicit declaration of function 'printf'
colamd.c:3052: warning: incompatible implicit declaration of built-in
function 'printf'
colamd.c:3062: warning: incompatible implicit declaration of built-in
function 'printf'
colamd.c:3066: warning: incompatible implicit declaration of built-in
function 'printf'
colamd.c:3074: warning: incompatible implicit declaration of built-in
function 'printf'
make: *** [colamd.o] Error 1
ERROR: compilation failed for package 'lpSolve'
** Removing '/usr/local/lib/R/site-library/lpSolve'
* Installing *source* package 'WhatIf' ...
** R
** data
** demo
** inst
** help
>>> Building/Updating help pages for package 'WhatIf'
Formats: text html latex example
peacecf text html latex
peacef text html latex
plot.whatif text html latex example
print.summary.whatif text html latex example
print.whatif text html latex example
summary.whatif text html latex example
whatif text html latex example
** building package indices ...
* DONE (WhatIf)
The downloaded packages are in
/tmp/Rtmp1Bdq7d/downloaded_packages
Warning message:
installation of package 'lpSolve' had non-zero exit status in:
install.packages()
>
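This doesn't look like an lpSolve problem: the `stdlib.h: No such file or directory` errors mean the C library development headers themselves are missing, so nothing will compile from source. On Debian/Ubuntu (which the /usr/lib/gcc/i486-linux-gnu paths suggest), something like the following should fix it, though the exact package names are a guess for your system:

```shell
# Install the C toolchain and libc development headers,
# plus the headers needed to compile R packages from source
sudo apt-get install build-essential r-base-dev
# then retry inside R:
#   install.packages(c("lpSolve", "WhatIf"))
```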
Hi,
Would anyone know what the theta in the gamma function refers to? I am
writing my negative binomial function and am not exactly sure what theta
represents.
Thank you
Aaka
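For what it's worth: Γ(·) itself is the gamma function, which generalizes the factorial to non-integers, and the theta inside it is a model parameter, not something you choose. In the common parameterization (the one MASS uses, for example), theta is the dispersion/shape parameter, with Var(Y) = μ + μ²/θ. A quick numerical check of the gamma/factorial link and of the lgamma() form you'd use in a log-likelihood:

```r
## Gamma generalizes the factorial: Gamma(n + 1) = n!
gamma(5)      # 24, i.e. 4!
factorial(4)  # 24

## The NB pdf contains Gamma(y + theta) / (Gamma(theta) * y!);
## working with lgamma() keeps this ratio numerically stable
y <- 3; theta <- 2.5
exp(lgamma(y + theta) - lgamma(theta) - lgamma(y + 1))
```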
In the lecture notes (page 53 of the single-equation models slides), lambda
is subscripted in the systematic component, the PDF component of the
likelihood function, and the log-likelihood function. If one of the
assumptions of the model is that the same rate parameter underlies the
data-generating process, why does lambda have a subscript? Is this a typo?
This might be a stupid question--I've been staring at this stuff for too
long.
Thanks,
John
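One possible reading, assuming this is the standard Poisson regression setup: the "same rate" assumption applies within each observation's own event-count process, while the rate still varies across observations through the covariates, which is exactly what the subscript records. So it is probably not a typo:

```latex
\lambda_i = \exp(x_i \beta), \qquad
\ln L(\beta \mid y) = \sum_{i=1}^{n}
  \left[ y_i \ln \lambda_i - \lambda_i - \ln y_i! \right]
```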
Quick question... is Zelig OK for the simulations in Problem 2, C2 (the last
problem on the pset), or should we simulate the expected values and first
differences for the two models manually?
thanks
Jane
Hi all,
I am trying to generate a confidence envelope for a lowess regression, but
cannot find the R code that does this. Fox's book ("Applied
Regression...") did this on page 424. lowess() runs the regression, but does
not give the confidence intervals. Does anyone know what command he used?
Best,
Viridiana Ríos
617-997-2471
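One option, sketched with made-up data: lowess() returns no standard errors, but loess() plus predict(..., se = TRUE) gives pointwise standard errors you can turn into an approximate confidence envelope. I don't know whether this is exactly what Fox used (he may have bootstrapped instead):

```r
set.seed(1)
x <- sort(runif(100, 0, 10))
y <- sin(x) + rnorm(100, sd = 0.3)

fit  <- loess(y ~ x)
pred <- predict(fit, se = TRUE)   # pointwise fit and standard errors

plot(x, y)
lines(x, pred$fit)
lines(x, pred$fit + 1.96 * pred$se.fit, lty = 2)  # upper envelope
lines(x, pred$fit - 1.96 * pred$se.fit, lty = 2)  # lower envelope
```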
Hello classmates,
Despite my best efforts, this morning I still have the same two questions
I had last night. I am listing them again and demonstrating what I tried,
to see if anyone has any other ideas.
First, for the ordered probit model, Zelig is giving me this error
message:
> snctmodel <- zelig(RES ~ IMPORT + COST + TARGET + COOP +
TARGET*COOP, model="oprobit", data=snct)
Warning message:
In function (formula, data, weights, start, ..., subset, na.action, :
design appears to be rank-deficient, so dropping some coefs
I don't know if it's giving me wrong coefficients because of this,
but it is a problem because it leads to these two errors as well:
#SIMULATIONS
> import.low <- setx(snctmodel, fn = list(numeric = mean, ordered =
median,others = mode), IMPORT=0)
> import.high <- setx(snctmodel, fn = list(numeric = mean, ordered =
median,others = mode), IMPORT=1)
> coop.low <- setx(snctmodel, fn = list(numeric = mean, ordered =
median,others = mode), COOP=1)
> coop.high <- setx(snctmodel, fn = list(numeric = mean, ordered =
median,others = mode), COOP=2)
>
> coopsim <- sim(snctmodel, coop.low, coop.high, 1000)
Error in x[, -1] %*% t(sim.coef) : non-conformable arguments
> importsim <- sim(snctmodel, import.low, import.high, 1000)
Error in x[, -1] %*% t(sim.coef) : non-conformable arguments
>
> summary(whatif(data = snctmodel, cfact = import.high))
[1] "Preprocessing data ..."
Error in whatif(data = snctmodel, cfact = import.high) :
number of columns of 'cfact' and 'data' are not equal
>
I thought it didn't know the variables were ordered, so I tried
applying as.ordered() to the variables, but this didn't help very
much. I'm out of ideas on this one because I'm really not sure what
it means for the design to be rank-deficient.
Second, I am having trouble finding the bug in my negative binomial
log likelihood function:
ll.negbin <- function(par, X, Y){
  theta <- X %*% par[1:ncol(X)]
  gamma <- par[(ncol(X) + 1)]
  sigma2 <- exp(gamma) + 1
  out <- sum(lgamma((theta/(sigma2 - 1)) + 1) - lgamma(Y + 1) -
             lgamma(theta/(sigma2 - 1)) + Y*log((sigma2 - 1)/sigma2) -
             (theta/(sigma2 - 1))*log(sigma2))
}
That is leading to this:
> opt.nb <- optim(c(0,0,0,0,0), ll.negbin, X = X, Y = Y, method =
"BFGS", control = list(fnscale = -1), hessian = TRUE)
Error in optim(c(0, 0, 0, 0, 0), ll.negbin, X = X, Y = Y, method =
"BFGS", :
initial value in 'vmmin' is not finite
In addition: There were 28 warnings (use warnings() to see them)
#(the warnings are all "1: In fn(par, ...) : value out of range in
'lgamma'")
Apparently I've written my function such that it evaluates to -Inf or
NaN at certain points. I can get numbers sometimes depending on the
par values I give it, but changing the starting values in optim is not
much help.
I would appreciate any advice you could offer.
Thanks,
Keith
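A possible repair, assuming the parameterization where E[Y] = theta and Var(Y) = sigma2 * theta: two things jump out. The linear predictor needs to be exponentiated so theta stays positive (lgamma() of a negative argument is exactly what produces the "value out of range in 'lgamma'" warnings), and the first lgamma term should contain Y rather than 1. A sketch:

```r
## Negative binomial log-likelihood, E[Y] = theta, Var[Y] = sigma2 * theta
ll.negbin <- function(par, X, Y) {
  theta  <- exp(X %*% par[1:ncol(X)])  # exponentiate: keeps theta > 0
  sigma2 <- exp(par[ncol(X) + 1]) + 1
  r      <- theta / (sigma2 - 1)
  sum(lgamma(Y + r) -                  # Y + r, not r + 1
      lgamma(Y + 1) - lgamma(r) +
      Y * log((sigma2 - 1) / sigma2) - r * log(sigma2))
}

## usage would be unchanged, e.g. with a 2-column X:
## optim(c(0, 0, 0), ll.negbin, X = X, Y = Y, method = "BFGS",
##       control = list(fnscale = -1), hessian = TRUE)
```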
It turns out lm() was just giving me answers for the first dataset,
so never mind. I thought that if it wasn't going to take mi(), it just
wouldn't give me an answer.