Peter Ralph
26 November 2019 – Advanced Biological Statistics
To simulate from: \[\begin{aligned} \mu_i &= b_0 + b_1 X_{i1} + \cdots + b_k X_{ik} \\ Y_i &\sim \Normal(\mu_i, \sigma) . \end{aligned}\]
you can either write out the sums explicitly, or use matrix multiplication.
In R, %*% is matrix multiplication: if X is an \(n \times k\) matrix and b is a vector of length \(k\), then X %*% b
(or, \(X b\) in math notation) is shorthand for the \(n\)-vector \[ (Xb)_i = \sum_{j=1}^k X_{ij} b_j . \]
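For example, the two ways of writing the linear predictor agree (a small sketch with made-up dimensions, and the intercept omitted for brevity):

set.seed(1)
n <- 10; k <- 3
X <- matrix(rnorm(n * k), nrow=n, ncol=k)    # n x k matrix of predictors
b <- c(1, -2, 0.5)                           # k-vector of coefficients
mu_sum <- b[1] * X[,1] + b[2] * X[,2] + b[3] * X[,3]   # writing out the sum
mu_mat <- X %*% b                            # matrix multiplication shorthand
all.equal(as.vector(mu_mat), mu_sum)         # TRUE
Y <- rnorm(n, mean=mu_mat, sd=0.5)           # then simulate Y ~ Normal(mu, sigma)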
In Stan, matrix multiplication is * (and elementwise multiplication is .*).
A generalized linear model has three ingredients:
the probability distribution (the “family”; describes the output),
the linear predictor (connects input to output), and
the link function (connects linear predictor to probability distribution).
Linear regression: \[ Y \sim \Normal(X \beta, \sigma) .\]
Logistic regression: \[ Y \sim \Binom(n, \logistic(X\beta)) .\]
Gamma regression: \[ Y \sim \Gam(\text{scale}=\exp(X\beta)/k, \text{shape}=k) .\]
glm(y ~ x, family=binomial)
family {stats}
Family objects provide a convenient way to specify the details of the models
used by functions such as glm.
binomial(link = "logit")
gaussian(link = "identity")
Gamma(link = "inverse")
inverse.gaussian(link = "1/mu^2")
poisson(link = "log")
Arguments
link: a specification for the model link function. This can be a
name/expression, a literal character string, a length-one character vector, or
an object of class "link-glm" (such as generated by make.link) provided it is
not specified via one of the standard names given next.
The gaussian family accepts the links (as names) identity, log and inverse; the
binomial family the links logit, probit, cauchit, (corresponding to logistic,
normal and Cauchy CDFs respectively) log and cloglog (complementary log-log);
the Gamma family the links inverse, identity and log; the poisson family the
links log, identity, and sqrt; and the inverse.gaussian family the links
1/mu^2, inverse, identity and log.
Note that the link function is the inverse of what we’d write down in the model: for instance, if the mean of the response is the exponential of the linear predictor, i.e., if \[ \E[Y] = \exp(X\beta), \] then we have
link = "log"
Other inverses:
logistic has inverse logit,
identity has inverse identity,
and x^2 (“squared”) has inverse sqrt.
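In R, make.link() bundles a link function together with its inverse, so you can check this directly (a quick illustration, not from the lecture):

lg <- make.link("log")
lg$linkfun(10)        # log(10): from the mean to the linear predictor
lg$linkinv(log(10))   # exp(log(10)) = 10: from the linear predictor back to the mean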
100 trials, where probability of success depends on \(x\):
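The simulation code isn't shown above; here is a sketch of how such data could be generated (the "true" intercept and slope, and the range of x, are made up for illustration; the names zz, x, alpha, beta, and glm_fit match the code below):

set.seed(23)
n <- 100
x <- runif(n, 0, 10)                      # hypothetical predictor values
alpha <- -10; beta <- 1.8                 # hypothetical true parameters
zz <- rbinom(n, size=1, prob=plogis(alpha + beta * x))  # plogis() is the logistic function
glm_fit <- glm(zz ~ x, family="binomial")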
glm()
##
## Call:
## glm(formula = zz ~ x, family = "binomial")
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -2.21167 -0.07659 -0.00864 0.16365 2.72105
##
## Coefficients:
## Estimate Std. Error z value Pr(>|z|)
## (Intercept) -11.3818 3.0318 -3.754 0.000174 ***
## x 1.9278 0.4963 3.884 0.000103 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for binomial family taken to be 1)
##
## Null deviance: 133.750 on 99 degrees of freedom
## Residual deviance: 37.267 on 98 degrees of freedom
## AIC: 41.267
##
## Number of Fisher Scoring iterations: 8
\[\begin{aligned} Z &\sim \Binom(1, P) \\ P &= \logistic(\alpha + \beta X) \end{aligned}\]
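The Stan program behind the fit summarized below isn't reproduced here; a minimal sketch of what it might look like, using bernoulli_logit() (which combines the Binomial-with-one-trial distribution and the logistic link):

logistic_model <- stan_model(model_code="
data {
    int N;
    vector[N] x;
    int<lower=0, upper=1> z[N];
}
parameters {
    real alpha;
    real beta;
}
model {
    z ~ bernoulli_logit(alpha + beta * x);  // i.e., Binom(1, logistic(alpha + beta * x))
}")
fit <- sampling(logistic_model, data=list(N=length(x), x=x, z=zz))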
## mean se_mean sd 2.5% 25% 50% 75% 97.5% n_eff Rhat
## alpha -12.579330 0.12448956 3.2416087 -20.08699 -14.562301 -12.263318 -10.229706 -7.121727 678.0401 1.001617
## beta 2.123924 0.02026774 0.5305953 1.22808 1.736981 2.071771 2.442721 3.340106 685.3559 1.001303
## lp__ -19.641846 0.03384329 1.0367456 -22.47400 -20.010380 -19.311850 -18.917676 -18.658669 938.4245 1.003825
samples <- extract(fit)
layout(t(1:2))
hist(samples$alpha, main=expression(alpha))
abline(v=alpha, col='red'); abline(v=coef(glm_fit)[1], col='green')
hist(samples$beta, main=expression(beta))
abline(v=beta, col='red'); abline(v=coef(glm_fit)[2], col='green')
legend("topright", lty=1, col=c('red', 'green'), legend=c('truth', 'glm'))
data {
int N;
vector[N] X;
vector[N] Y;
vector<lower=0>[N] Z;
}
parameters {
real beta[2];
}
model {
Z ~ gamma(1, exp(- beta[1] * X - beta[2] * Y));
}
What is
the probability distribution? (describes the output)
the linear predictor? (connects input to output)
the link function? (connects linear predictor to probability distribution)
Then, simulate from it.
N <- 100
# Matrix of X, Y
XY <- cbind(X=rnorm(N),
Y=rnorm(N))
beta <- c(2, -3)
# Xbeta <- beta[1] * X + beta[2] * Y
Xbeta <- XY %*% beta
Z <- rgamma(N, shape=1, scale=exp(-Xbeta))
gZ <- glm(Z ~ XY[,"X"] + XY[,"Y"], family=Gamma(link='log'))
summary(gZ)
##
## Call:
## glm(formula = Z ~ XY[, "X"] + XY[, "Y"], family = Gamma(link = "log"))
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -3.3130 -0.9182 -0.3895 0.3900 1.7129
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 0.10181 0.09629 1.057 0.293
## XY[, "X"] -2.03873 0.10828 -18.829 <2e-16 ***
## XY[, "Y"] 3.00499 0.09137 32.888 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for Gamma family taken to be 0.9192264)
##
## Null deviance: 1151.41 on 99 degrees of freedom
## Residual deviance: 126.36 on 97 degrees of freedom
## AIC: 233.01
##
## Number of Fisher Scoring iterations: 8
If \(N \sim \Poisson(\mu)\) then \(N \ge 0\) and \[\begin{aligned} \P\{N = k\} = \frac{\mu^k}{k!} e^{-\mu} \end{aligned}\]
\(N\) is a nonnegative integer (i.e., a count)
\(\E[N] = \var[N] = \mu\)
If a machine makes widgets very fast, producing on average one broken widget per minute (and many good ones), each breaking independent of the others, then the number of broken widgets in \(\mu\) minutes is \(\Poisson(\mu)\).
If busses arrive randomly every \(\Exp(1)\) minutes, then the number of busses to arrive in \(\mu\) minutes is \(\Poisson(\mu)\).
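A quick simulation check of \(\E[N] = \var[N] = \mu\) (the value of \(\mu\) is arbitrary):

nn <- rpois(1e5, lambda=3.7)
c(mean=mean(nn), variance=var(nn))   # both should be close to 3.7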
We have counts of transcript numbers,
from some individuals of different ages and past exposures to solar irradiation,
of two genotypes.
Model:
Counts are Poisson,
with mean that depends on age and exposure,
but effect of exposure depends on genotype.
\[\begin{aligned} Z_i &\sim \Poisson(\mu_i) \end{aligned}\]
A first pass at the mean:
\[\begin{aligned} \mu_i &= a + b \times \text{age}_i + c \times \text{exposure}_i \end{aligned}\]
but to keep the mean positive and let the intercept and exposure effect depend on genotype:
\[\begin{aligned} \mu_i &= \exp\left( a_{g_i} + b \times \text{age}_i + c_{g_i} \times \text{exposure}_i \right) \end{aligned}\]
How would you fit this model with glm()?
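One way (a sketch, not the code used to produce the fits below; the column names match the data shown next): genotype-specific intercepts, a common age slope, and a genotype-by-exposure interaction, with the Poisson family's default log link.

pois_glm <- glm(counts ~ 0 + factor(genotype) + age + factor(genotype):exposure,
                family=poisson, data=data)
summary(pois_glm)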
Here are the data:
## genotype age exposure counts
## 1 1 32.002416 0.08573067 12
## 2 1 31.946713 3.34821235 0
## 3 2 9.937669 1.14303355 8
## 4 2 26.423782 27.50630521 0
## 5 1 41.199082 2.97643962 2
## 6 1 6.613676 0.62972565 0
Here is the model again, with \(y_i\) for the mean, and its Stan implementation:
\[\begin{aligned} Z_i &\sim \Poisson(y_i) \\ y_i &= \exp(a_{g_i} + b \times \text{age}_i \\ &\qquad {} + c_{g_i} \times \text{exposure}_i ) \end{aligned}\]
data {
int N;
int counts[N];
vector[N] age;
vector[N] exposure;
int genotype[N];
int ngeno;
}
parameters {
vector[ngeno] a; // intercepts
real b; // slope for age
vector[ngeno] c; // slopes for exposure
}
model {
vector[N] y; // means
y = exp(a[genotype] + b * age + c[genotype] .* exposure);
counts ~ poisson(y);
// implicitly flat priors
}
Note: scaling the data helps Stan find the right scale to move around on.
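For concreteness, here is a sketch of how the data might be scaled, bundled, and passed to Stan. The object names (scaled_data, fit1) and the short variable names geno and expo follow the later code snippets; the list names must match whatever the Stan data block declares (the program above spells them out as genotype and exposure), and poisson_model stands in for the compiled model above.

scaled_data <- list(N=nrow(data),
                    counts=data$counts,
                    age=data$age / sd(data$age),             # put age ...
                    expo=data$exposure / sd(data$exposure),  # ... and exposure on unit scale
                    geno=data$genotype,
                    ngeno=length(unique(data$genotype)))
fit1 <- sampling(poisson_model, data=scaled_data, chains=3, iter=1000)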
## Inference for Stan model: simple_poisson.
## 3 chains, each with iter=1000; warmup=500; thin=1;
## post-warmup draws per chain=500, total post-warmup draws=1500.
##
## mean se_mean sd 2.5% 25% 50% 75% 97.5% n_eff Rhat
## a[1] 2.13 0.00 0.02 2.09 2.12 2.13 2.15 2.17 1324 1.00
## a[2] 1.73 0.00 0.03 1.67 1.71 1.73 1.75 1.78 1009 1.00
## b 0.82 0.00 0.01 0.80 0.82 0.82 0.83 0.84 1304 1.00
## c[1] 0.10 0.00 0.01 0.08 0.09 0.10 0.11 0.12 1492 1.00
## c[2] -0.70 0.00 0.03 -0.76 -0.72 -0.70 -0.67 -0.63 1217 1.00
## lp__ 12164.36 0.06 1.62 12160.26 12163.55 12164.70 12165.58 12166.45 748 1.01
##
## Samples were drawn using NUTS(diag_e) at Mon Nov 25 21:24:14 2019.
## For each parameter, n_eff is a crude measure of effective sample size,
## and Rhat is the potential scale reduction factor on split chains (at
## convergence, Rhat=1).
Here are posterior distributions of the parameters, with the true values in red.
What happened?
Let’s simulate up data under this model to check for goodness of fit.
We expect to not see a good fit. (Why?)
model {
vector[N] mu;
mu = exp(a[geno]
+ b * age
+ c[geno] .* expo);
counts ~ poisson(mu);
}
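The plotting code below refers to mu1 (the fitted means) and sim1 (simulated datasets), which would be computed along these lines (a sketch, mirroring the sim2 code further down; rowMins() and rowMaxs() come from the matrixStats package):

library(matrixStats)
post1 <- extract(fit1)
params1 <- list(a=colMeans(post1$a), b=mean(post1$b), c=colMeans(post1$c))
mu1 <- with(list2env(scaled_data),
            exp(params1$a[geno] + params1$b * age + params1$c[geno] * expo))
sim1 <- replicate(100, rpois(length(mu1), mu1))   # 100 datasets from the fitted model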
plot(data$counts[order(mu1)], ylab="counts", ylim=range(sim1), type='n')
segments(x0=seq_len(nrow(data)),
y0=rowMins(sim1)[order(mu1)],
y1=rowMaxs(sim1)[order(mu1)])
points(data$counts[order(mu1)], pch=20, col='red')
legend("topleft", pch=c(20,NA), lty=c(NA,1), legend=c("observed", "simulated range"), col=c('red', 'black'))
data {
// what we know: the input
// declarations only
}
parameters {
// what we want to know about:
// defines the space Stan random walks in
// declarations only
}
model {
// stuff to calculate the priors and the likelihoods
// happens every step
}
functions {
// user-defined functions
}
data {
// what we know: the input
// declarations only
}
transformed data {
// calculations to do once, at the start
}
parameters {
// what we want to know about:
// defines the space Stan random walks in
// declarations only
}
transformed parameters {
// other things that we want the posterior distribution of
// happens every step
}
model {
// stuff to calculate the priors and the likelihoods
// happens every step
}
generated quantities {
// more things we want the posterior distribution of
// but that don't affect the random walk
}
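A minimal sketch (not from the lecture) that exercises the optional blocks: transformed data standardizes the predictor once, transformed parameters tracks the fitted means, and generated quantities draws posterior-predictive samples.

block_demo <- stan_model(model_code="
data {
    int N;
    vector[N] x;
    vector[N] y;
}
transformed data {
    vector[N] x_std = (x - mean(x)) / sd(x);  // computed once, at the start
}
parameters {
    real a;
    real b;
    real<lower=0> sigma;
}
transformed parameters {
    vector[N] mu = a + b * x_std;             // recomputed every step; saved in the output
}
model {
    y ~ normal(mu, sigma);
    a ~ normal(0, 10);
    b ~ normal(0, 10);
    sigma ~ normal(0, 10);
}
generated quantities {
    vector[N] y_rep;                          // posterior predictive draws
    for (i in 1:N)
        y_rep[i] = normal_rng(mu[i], sigma);
}")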
Under the hood,
z ~ poisson(mu);
is equivalent to
target += poisson_lpmf(z | mu);
(lpmf = log probability mass function; continuous distributions use _lpdf, the log probability density function).
So, if you don’t put a prior on something, it implicitly has a uniform prior (i.e., a flat prior).
Warning messages are important. Pay attention to them, and fix the problems they point to.
Run code in the console: RStudio often hides useful information.
Counts are Poisson,
with mean that depends on age and exposure,
but effect of exposure depends on genotype.
Actually, counts are overdispersed, so make the means random, and lognormally distributed:
\[\begin{aligned} Z_i &\sim \Poisson(\mu_i) \\ \mu_i &= \exp(W_i) \\ W_i &\sim \Normal(y_i, \sigma) \\ y_i &= a_{g_i} + b \times \text{age}_i + c_{g_i} \times \text{exposure}_i \end{aligned}\]
or, equivalently,
\[\begin{aligned} Z_i &\sim \Poisson(\mu_i) \\ \mu_i &\sim \log\Normal(y_i, \sigma) \\ y_i &= a_{g_i} + b \times \text{age}_i + c_{g_i} \times \text{exposure}_i \end{aligned}\]
overdispersed_model <- stan_model(model_code="
data {
int N; // number of data points
vector[N] age;
vector[N] exposure;
int counts[N];
int genotype[N];
int ngenotypes;
}
parameters {
vector[ngenotypes] a; // intercepts
real b; // slope for age
vector[ngenotypes] c; // slopes for exposure
real<lower=0> sigma; // SD on lognormal
vector<lower=0>[N] mu; // means of the Poissons (must be positive)
}
model {
vector[N] y; // log-scale means of the lognormals
y = a[genotype] + b * age + c[genotype] .* exposure;
mu ~ lognormal(y, sigma);
counts ~ poisson(mu);
a ~ normal(0, 100);
b ~ normal(0, 10);
c ~ normal(0, 20);
sigma ~ normal(0, 10);
}")
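The model might then be run along these lines (a sketch: the sampler settings are taken from the output header below, fit2 and post2 are the names the later code uses, and scaled_data must carry the names declared in the data block above). Printing the fit gives a row for every one of the 500 mu[i] as well as for a, b, c, and sigma:

fit2 <- sampling(overdispersed_model, data=scaled_data, chains=3, iter=1000)
print(fit2)
post2 <- extract(fit2)   # used by the goodness-of-fit code below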
## Inference for Stan model: lognormal_poisson.
## 3 chains, each with iter=1000; warmup=500; thin=1;
## post-warmup draws per chain=500, total post-warmup draws=1500.
##
## mean se_mean sd 2.5% 25% 50% 75% 97.5% n_eff Rhat
## mu[1] 11.28 0.05 3.20 5.92 8.99 10.91 13.28 18.41 3883 1.00
## mu[2] 1.37 0.02 0.92 0.26 0.73 1.18 1.78 3.73 2732 1.00
## mu[3] 6.94 0.04 2.45 3.03 5.15 6.64 8.46 12.27 3678 1.00
## mu[4] 0.25 0.01 0.30 0.02 0.09 0.16 0.31 1.01 1413 1.00
## mu[5] 3.01 0.03 1.55 0.85 1.89 2.72 3.84 6.82 2707 1.00
## mu[6] 0.81 0.02 0.66 0.11 0.35 0.60 1.11 2.51 1659 1.00
## mu[7] 9.86 0.06 3.09 4.87 7.68 9.47 11.63 17.04 2644 1.00
## mu[8] 5.52 0.04 2.22 2.22 3.87 5.14 6.93 10.55 3025 1.00
## mu[9] 0.73 0.01 0.60 0.09 0.32 0.56 0.95 2.19 1946 1.00
## mu[10] 5.54 0.04 2.13 2.25 4.06 5.24 6.68 10.48 2567 1.00
##   [ ... rows for mu[11] through mu[500] omitted ... ]
## a[1] 1.54 0.00 0.07 1.40 1.49 1.54 1.59 1.69 1487 1.00
## a[2] 1.28 0.00 0.08 1.12 1.23 1.28 1.33 1.43 1459 1.00
## b 0.81 0.00 0.05 0.71 0.78 0.81 0.85 0.92 2683 1.00
## c[1] 0.15 0.00 0.07 0.02 0.11 0.15 0.20 0.28 2168 1.00
## c[2] -0.62 0.00 0.09 -0.80 -0.68 -0.62 -0.55 -0.43 1256 1.00
## sigma 1.04 0.00 0.05 0.95 1.01 1.04 1.07 1.14 1136 1.00
## lp__ 15065.69 0.92 18.24 15028.83 15054.00 15066.51 15078.23 15100.35 395 1.01
##
## Samples were drawn using NUTS(diag_e) at Mon Nov 25 21:34:21 2019.
## For each parameter, n_eff is a crude measure of effective sample size,
## and Rhat is the potential scale reduction factor on split chains (at
## convergence, Rhat=1).
Here are posterior distributions from the full model, with the true values in red.
# 100 datasets:
params2 <- list(a=colMeans(post2$a),
b=mean(post2$b),
c=colMeans(post2$c),
sigma=mean(post2$sigma))
y2 <- with(list2env(scaled_data),
params2$a[geno]
+ params2$b * age
+ params2$c[geno] * expo)
sim2 <- replicate(100, {
mu = exp(rnorm(length(y2),
mean=y2,
sd=params2$sigma))
rpois(length(mu), mu)
})
model {
vector[N] y;
y = a[geno] + b * age + c[geno] .* expo;
mu ~ lognormal(y, sigma);
counts ~ poisson(mu);
}
plot(data$counts[order(mu1)], ylab="counts", ylim=range(sim2), type='n')
segments(x0=seq_len(nrow(data)),
y0=rowMins(sim2)[order(mu1)],
y1=rowMaxs(sim2)[order(mu1)])
points(data$counts[order(mu1)], pch=20, col='red')
legend("topleft", pch=c(20,NA), lty=c(NA,1), legend=c("observed", "simulated range"), col=c('red', 'black'))
plot(data$counts[order(mu1)], ylab="counts", ylim=c(0,500), type='n')
segments(x0=seq_len(nrow(data)),
y0=rowMins(sim2)[order(mu1)],
y1=rowMaxs(sim2)[order(mu1)])
points(data$counts[order(mu1)], pch=20, col='red')
legend("topleft", pch=c(20,NA), lty=c(NA,1), legend=c("observed", "simulated range"), col=c('red', 'black'))