R version 4.2.2 Patched (2022-11-10 r83330) -- "Innocent and Trusting"
Copyright (C) 2022 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> pdf("g13.pdf");options(width=64)
> #setwd("C:\\Users\\kolassa\\Class542")
> setwd("~/Taught1/960-542/Data")
> #***********************************************************
> #Bayesian survival analysis.                               *
> #The program stan applies a Bayesian analysis to a variety *
> #of statistical models.  The package rstan is an R front   *
> #end.  The package rstanarm includes survival models, but  *
> #only if you download from the stan website.  There's a    *
> #version on CRAN, but it doesn't support survival analysis.*
> #This is suggested:                                        *
> #install.packages("rstanarm",                              *
> #  repos="https://mc-stan.org/r-packages/")                *
> #If this attempted install leads to error messages about   *
> #an install failure because rstanarm requires some un-     *
> #installed packages, go back and install them from the de- *
> #fault repository.  In my case, running                    *
> #install.packages("rstan")                                 *
> #install.packages(c("lme4","rstantools","shinystan",       *
> #   "splines2"))                                           *
> # before the load from mc-stan.org did the trick.          *
> #***********************************************************
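> #A condensed sketch of the install sequence described above
> #(not run here; assumes the dependency list above is still
> #current):
> #deps<-c("rstan","lme4","rstantools","shinystan","splines2")
> #install.packages(setdiff(deps,rownames(installed.packages())))
> #install.packages("rstanarm",
> #   repos="https://mc-stan.org/r-packages/")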
> #*******************************************************/
> # Larynx cancer data from Klein and Moeschberger.  See */
> # class web page for link.  Cancers are graded on an   */
> # integer scale, with increasing numbers indicating    */
> # more severe disease.  Variables are stage, time to   */
> # death, year of diagnosis, and status.                */
> #*******************************************************/
> larynx<-read.table("larynx.txt",header=TRUE)
> library(rstanarm)#For stan_surv and support functions
> # Fit a proportional hazards regression.  Accelerated
> # failure time models can be fit instead by adding
> # basehaz="weibull-aft".  Between R 4.1.3 and R 4.2.1 there
> # was a change in the handling of logicals, which causes a
> # warning in the code below.
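> # For example (not run), the accelerated failure time
> # variant of the fit below would look like this (the object
> # name so1aft is just for illustration):
> # so1aft<-stan_surv(Surv(time,delta)~stage,data=larynx,
> #    basehaz="weibull-aft")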
> so1<-stan_surv(Surv(time,delta)~stage,data=larynx)

SAMPLING FOR MODEL 'surv' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 4.3e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.43 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 1: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 1: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 1: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 1: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 1: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 1: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 1: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 1: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 1: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 1: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 1: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.322231 seconds (Warm-up)
Chain 1:                0.262608 seconds (Sampling)
Chain 1:                0.584839 seconds (Total)
Chain 1: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 2).
Chain 2: 
Chain 2: Gradient evaluation took 2.1e-05 seconds
Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 0.21 seconds.
Chain 2: Adjust your expectations accordingly!
Chain 2: 
Chain 2: 
Chain 2: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 2: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 2: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 2: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 2: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 2: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 2: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 2: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 2: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 2: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 2: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 2: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 2: 
Chain 2:  Elapsed Time: 0.301597 seconds (Warm-up)
Chain 2:                0.204341 seconds (Sampling)
Chain 2:                0.505938 seconds (Total)
Chain 2: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 3).
Chain 3: 
Chain 3: Gradient evaluation took 1.8e-05 seconds
Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 0.18 seconds.
Chain 3: Adjust your expectations accordingly!
Chain 3: 
Chain 3: 
Chain 3: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 3: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 3: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 3: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 3: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 3: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 3: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 3: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 3: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 3: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 3: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 3: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 3: 
Chain 3:  Elapsed Time: 0.278357 seconds (Warm-up)
Chain 3:                0.189942 seconds (Sampling)
Chain 3:                0.468299 seconds (Total)
Chain 3: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 4).
Chain 4: 
Chain 4: Gradient evaluation took 1.9e-05 seconds
Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 0.19 seconds.
Chain 4: Adjust your expectations accordingly!
Chain 4: 
Chain 4: 
Chain 4: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 4: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 4: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 4: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 4: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 4: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 4: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 4: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 4: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 4: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 4: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 4: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 4: 
Chain 4:  Elapsed Time: 0.277307 seconds (Warm-up)
Chain 4:                0.177606 seconds (Sampling)
Chain 4:                0.454913 seconds (Total)
Chain 4: 
> # The default priors for stan_surv are normal on the
> # parameters.  The prior on the intercept is very broad.
> # The prior on the slope parameters is not as broad, but,
> # since slope parameters generally have smaller standard
> # errors than intercepts do, in context this slope prior is
> # also quite broad.  These priors are close to flat; some
> # would call them non-informative.
> prior_summary(so1)
Priors for model 'so1' 
------
Intercept
 ~ normal(location = 0, scale = 20)

Coefficients
 ~ normal(location = 0, scale = 2.5)

Auxiliary (M-spline-coefficients)
 ~ dirichlet(concentration = [1,1,1,...])
------
See help('prior_summary.stanreg') for more details
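> # The defaults above could also be requested explicitly; a
> # sketch (not run), with the scales copied from the summary
> # just printed:
> # stan_surv(Surv(time,delta)~stage,data=larynx,
> #    prior_intercept=normal(location=0,scale=20),
> #    prior=normal(location=0,scale=2.5))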
> # The baseline distribution is fit using a spline.  fixef
> # extracts posterior medians, but inference requires more.
> fixef(so1)
    (Intercept)           stage m-splines-coef1 m-splines-coef2 
    -0.76409616      0.50964080      0.02857221      0.10723104 
m-splines-coef3 m-splines-coef4 m-splines-coef5 m-splines-coef6 
     0.14413170      0.33117665      0.22262559      0.10323562 
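> # Full posterior draws can be extracted for other summaries;
> # a sketch (not run):
> # draws<-as.matrix(so1)
> # quantile(draws[,"stage"],c(.05,.5,.95))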
> # Plotting the fit shows the estimated baseline hazard.
> plot(so1)
> print(so1,digits=4)
stan_surv
 baseline hazard: M-splines on hazard scale
 formula:         Surv(time, delta) ~ stage
 observations:    90
 events:          50 (55.6%)
 right censored:  40 (44.4%)
 delayed entry:   no
------
                Median  MAD_SD  exp(Median)
(Intercept)     -0.7641  0.3854      NA    
stage            0.5096  0.1441  1.6647    
m-splines-coef1  0.0286  0.0221      NA    
m-splines-coef2  0.1072  0.0516      NA    
m-splines-coef3  0.1441  0.0980      NA    
m-splines-coef4  0.3312  0.1802      NA    
m-splines-coef5  0.2226  0.1712      NA    
m-splines-coef6  0.1032  0.0989      NA    

------
* For help interpreting the printed output see ?print.stanreg
* For info on the priors used see ?prior_summary.stanreg
> posterior_interval(so1)
                          5%         95%
(Intercept)     -1.460697093 -0.12859932
stage            0.272464386  0.75657467
m-splines-coef1  0.003840048  0.07792207
m-splines-coef2  0.030682878  0.20381906
m-splines-coef3  0.017676637  0.33691655
m-splines-coef4  0.070078556  0.61251997
m-splines-coef5  0.022047775  0.49730199
m-splines-coef6  0.008994999  0.34172966
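> # posterior_interval gives a 90% interval by default; other
> # levels can be requested (not run), e.g.
> # posterior_interval(so1,prob=0.95)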
> # Above, stage was used as a continuous regressor.  Now
> # treat it as categorical:
> so2<-stan_surv(Surv(time,delta)~factor(stage),data=larynx)

SAMPLING FOR MODEL 'surv' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 3.2e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.32 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 1: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 1: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 1: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 1: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 1: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 1: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 1: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 1: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 1: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 1: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 1: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.241085 seconds (Warm-up)
Chain 1:                0.193904 seconds (Sampling)
Chain 1:                0.434989 seconds (Total)
Chain 1: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 2).
Chain 2: 
Chain 2: Gradient evaluation took 2.1e-05 seconds
Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 0.21 seconds.
Chain 2: Adjust your expectations accordingly!
Chain 2: 
Chain 2: 
Chain 2: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 2: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 2: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 2: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 2: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 2: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 2: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 2: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 2: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 2: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 2: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 2: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 2: 
Chain 2:  Elapsed Time: 0.269747 seconds (Warm-up)
Chain 2:                0.183298 seconds (Sampling)
Chain 2:                0.453045 seconds (Total)
Chain 2: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 3).
Chain 3: 
Chain 3: Gradient evaluation took 2e-05 seconds
Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 0.2 seconds.
Chain 3: Adjust your expectations accordingly!
Chain 3: 
Chain 3: 
Chain 3: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 3: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 3: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 3: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 3: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 3: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 3: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 3: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 3: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 3: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 3: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 3: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 3: 
Chain 3:  Elapsed Time: 0.263524 seconds (Warm-up)
Chain 3:                0.205674 seconds (Sampling)
Chain 3:                0.469198 seconds (Total)
Chain 3: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 4).
Chain 4: 
Chain 4: Gradient evaluation took 2.1e-05 seconds
Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 0.21 seconds.
Chain 4: Adjust your expectations accordingly!
Chain 4: 
Chain 4: 
Chain 4: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 4: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 4: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 4: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 4: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 4: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 4: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 4: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 4: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 4: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 4: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 4: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 4: 
Chain 4:  Elapsed Time: 0.277125 seconds (Warm-up)
Chain 4:                0.213253 seconds (Sampling)
Chain 4:                0.490378 seconds (Total)
Chain 4: 
> # Again examine priors.
> prior_summary(so2)
Priors for model 'so2' 
------
Intercept
 ~ normal(location = 0, scale = 20)

Coefficients
 ~ normal(location = [0,0,0], scale = [2.5,2.5,2.5])

Auxiliary (M-spline-coefficients)
 ~ dirichlet(concentration = [1,1,1,...])
------
See help('prior_summary.stanreg') for more details
> # Do inference on levels of stage.
> plot(so2)
> print(so2,digits=4)
stan_surv
 baseline hazard: M-splines on hazard scale
 formula:         Surv(time, delta) ~ factor(stage)
 observations:    90
 events:          50 (55.6%)
 right censored:  40 (44.4%)
 delayed entry:   no
------
                Median  MAD_SD  exp(Median)
(Intercept)     -0.0216  0.2922      NA    
factor(stage)2   0.0329  0.4681  1.0334    
factor(stage)3   0.5886  0.3528  1.8014    
factor(stage)4   1.6765  0.4219  5.3468    
m-splines-coef1  0.0249  0.0201      NA    
m-splines-coef2  0.0987  0.0478      NA    
m-splines-coef3  0.1305  0.0918      NA    
m-splines-coef4  0.3392  0.1832      NA    
m-splines-coef5  0.2339  0.1751      NA    
m-splines-coef6  0.1045  0.1004      NA    

------
* For help interpreting the printed output see ?print.stanreg
* For info on the priors used see ?prior_summary.stanreg
> (in2<-posterior_interval(so2))
                          5%        95%
(Intercept)     -0.525042522 0.44212920
factor(stage)2  -0.743526949 0.75418094
factor(stage)3  -0.011148536 1.16717473
factor(stage)4   0.992480246 2.35950934
m-splines-coef1  0.003264344 0.07192327
m-splines-coef2  0.032115756 0.19167670
m-splines-coef3  0.016805164 0.31932466
m-splines-coef4  0.074260343 0.61428779
m-splines-coef5  0.025019544 0.51971387
m-splines-coef6  0.009570915 0.36311387
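> # Contrasts between stage levels can be formed from the
> # posterior draws; a sketch (not run) comparing stage 4 with
> # stage 3:
> # dr2<-as.matrix(so2)
> # quantile(dr2[,"factor(stage)4"]-dr2[,"factor(stage)3"],
> #    c(.05,.95))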
> # Fit with a Cauchy prior on the coefficients.
> so3<-stan_surv(Surv(time,delta)~factor(stage),data=larynx,
+    prior=cauchy(0,.2))

SAMPLING FOR MODEL 'surv' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 3.5e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.35 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 1: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 1: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 1: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 1: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 1: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 1: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 1: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 1: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 1: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 1: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 1: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.383037 seconds (Warm-up)
Chain 1:                0.488869 seconds (Sampling)
Chain 1:                0.871906 seconds (Total)
Chain 1: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 2).
Chain 2: 
Chain 2: Gradient evaluation took 2.4e-05 seconds
Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 0.24 seconds.
Chain 2: Adjust your expectations accordingly!
Chain 2: 
Chain 2: 
Chain 2: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 2: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 2: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 2: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 2: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 2: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 2: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 2: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 2: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 2: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 2: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 2: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 2: 
Chain 2:  Elapsed Time: 0.366309 seconds (Warm-up)
Chain 2:                0.378959 seconds (Sampling)
Chain 2:                0.745268 seconds (Total)
Chain 2: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 3).
Chain 3: 
Chain 3: Gradient evaluation took 2.1e-05 seconds
Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 0.21 seconds.
Chain 3: Adjust your expectations accordingly!
Chain 3: 
Chain 3: 
Chain 3: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 3: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 3: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 3: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 3: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 3: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 3: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 3: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 3: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 3: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 3: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 3: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 3: 
Chain 3:  Elapsed Time: 0.422258 seconds (Warm-up)
Chain 3:                0.323136 seconds (Sampling)
Chain 3:                0.745394 seconds (Total)
Chain 3: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 4).
Chain 4: 
Chain 4: Gradient evaluation took 2.2e-05 seconds
Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 0.22 seconds.
Chain 4: Adjust your expectations accordingly!
Chain 4: 
Chain 4: 
Chain 4: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 4: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 4: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 4: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 4: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 4: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 4: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 4: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 4: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 4: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 4: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 4: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 4: 
Chain 4:  Elapsed Time: 0.439953 seconds (Warm-up)
Chain 4:                0.356464 seconds (Sampling)
Chain 4:                0.796417 seconds (Total)
Chain 4: 
> prior_summary(so3)
Priors for model 'so3' 
------
Intercept
 ~ normal(location = 0, scale = 20)

Coefficients
 ~ cauchy(location = [0,0,0], scale = [0.2,0.2,0.2])

Auxiliary (M-spline-coefficients)
 ~ dirichlet(concentration = [1,1,1,...])
------
See help('prior_summary.stanreg') for more details
> plot(so3)
> print(so3,digits=4)
stan_surv
 baseline hazard: M-splines on hazard scale
 formula:         Surv(time, delta) ~ factor(stage)
 observations:    90
 events:          50 (55.6%)
 right censored:  40 (44.4%)
 delayed entry:   no
------
                Median  MAD_SD  exp(Median)
(Intercept)      0.1608  0.2518      NA    
factor(stage)2  -0.0413  0.1913  0.9595    
factor(stage)3   0.1850  0.2428  1.2032    
factor(stage)4   1.3186  0.4489  3.7383    
m-splines-coef1  0.0272  0.0213      NA    
m-splines-coef2  0.1024  0.0521      NA    
m-splines-coef3  0.1380  0.0967      NA    
m-splines-coef4  0.3153  0.1761      NA    
m-splines-coef5  0.2377  0.1747      NA    
m-splines-coef6  0.1010  0.1004      NA    

------
* For help interpreting the printed output see ?print.stanreg
* For info on the priors used see ?prior_summary.stanreg
> (in3<-posterior_interval(so3))
                          5%        95%
(Intercept)     -0.265664568 0.61044159
factor(stage)2  -0.491879978 0.31163658
factor(stage)3  -0.132278612 0.72102374
factor(stage)4   0.464693754 2.00271327
m-splines-coef1  0.003592583 0.07575442
m-splines-coef2  0.032211548 0.19890131
m-splines-coef3  0.026058672 0.32343700
m-splines-coef4  0.062225410 0.60136464
m-splines-coef5  0.027238874 0.52949733
m-splines-coef6  0.007528231 0.37498799
> # prior=NULL gives a truly flat prior on the coefficients.
> so4<-stan_surv(Surv(time,delta)~factor(stage),data=larynx,
+    prior=NULL)

SAMPLING FOR MODEL 'surv' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 3.8e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.38 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 1: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 1: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 1: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 1: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 1: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 1: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 1: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 1: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 1: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 1: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 1: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.251673 seconds (Warm-up)
Chain 1:                0.212849 seconds (Sampling)
Chain 1:                0.464522 seconds (Total)
Chain 1: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 2).
Chain 2: 
Chain 2: Gradient evaluation took 2.1e-05 seconds
Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 0.21 seconds.
Chain 2: Adjust your expectations accordingly!
Chain 2: 
Chain 2: 
Chain 2: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 2: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 2: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 2: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 2: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 2: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 2: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 2: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 2: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 2: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 2: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 2: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 2: 
Chain 2:  Elapsed Time: 0.25085 seconds (Warm-up)
Chain 2:                0.197035 seconds (Sampling)
Chain 2:                0.447885 seconds (Total)
Chain 2: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 3).
Chain 3: 
Chain 3: Gradient evaluation took 2e-05 seconds
Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 0.2 seconds.
Chain 3: Adjust your expectations accordingly!
Chain 3: 
Chain 3: 
Chain 3: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 3: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 3: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 3: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 3: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 3: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 3: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 3: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 3: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 3: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 3: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 3: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 3: 
Chain 3:  Elapsed Time: 0.272434 seconds (Warm-up)
Chain 3:                0.224843 seconds (Sampling)
Chain 3:                0.497277 seconds (Total)
Chain 3: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 4).
Chain 4: 
Chain 4: Gradient evaluation took 2e-05 seconds
Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 0.2 seconds.
Chain 4: Adjust your expectations accordingly!
Chain 4: 
Chain 4: 
Chain 4: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 4: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 4: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 4: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 4: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 4: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 4: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 4: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 4: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 4: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 4: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 4: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 4: 
Chain 4:  Elapsed Time: 0.254908 seconds (Warm-up)
Chain 4:                0.196727 seconds (Sampling)
Chain 4:                0.451635 seconds (Total)
Chain 4: 
> prior_summary(so4)
Priors for model 'so4' 
------
Intercept
 ~ normal(location = 0, scale = 20)

Coefficients
 ~ flat

Auxiliary (M-spline-coefficients)
 ~ dirichlet(concentration = [1,1,1,...])
------
See help('prior_summary.stanreg') for more details
> plot(so4)
> print(so4,digits=4)
stan_surv
 baseline hazard: M-splines on hazard scale
 formula:         Surv(time, delta) ~ factor(stage)
 observations:    90
 events:          50 (55.6%)
 right censored:  40 (44.4%)
 delayed entry:   no
------
                Median  MAD_SD  exp(Median)
(Intercept)     -0.0621  0.2990      NA    
factor(stage)2   0.0902  0.4653  1.0944    
factor(stage)3   0.6268  0.3476  1.8717    
factor(stage)4   1.7480  0.4088  5.7431    
m-splines-coef1  0.0244  0.0192      NA    
m-splines-coef2  0.0975  0.0487      NA    
m-splines-coef3  0.1315  0.0918      NA    
m-splines-coef4  0.3281  0.1826      NA    
m-splines-coef5  0.2428  0.1753      NA    
m-splines-coef6  0.1030  0.1005      NA    

------
* For help interpreting the printed output see ?print.stanreg
* For info on the priors used see ?prior_summary.stanreg
> (in4<-posterior_interval(so4))
                         5%        95%
(Intercept)     -0.56504662 0.44621931
factor(stage)2  -0.75574371 0.81100866
factor(stage)3   0.05708219 1.21073650
factor(stage)4   1.02657498 2.43812912
m-splines-coef1  0.00318015 0.06990692
m-splines-coef2  0.03048702 0.18976262
m-splines-coef3  0.02066326 0.32326186
m-splines-coef4  0.06807140 0.61372900
m-splines-coef5  0.02893657 0.53645004
m-splines-coef6  0.00802626 0.37292453
> # Compare the posterior intervals for three model fits:
> # 1. Default prior, normal with relatively broad sd.
> # 2. Cauchy prior, with relatively small dispersion.
> # 3. Flat prior. 
> # Ignore baseline hazard coefficients.  These three
> # prior specifications apply only to slope parameters.
> # See documentation for how to specify priors on other
> # parameters.
> options(digits=4)
> cbind(posterior_interval(so2),posterior_interval(so3),
+    posterior_interval(so4))[1:4,]
                     5%    95%      5%    95%       5%    95%
(Intercept)    -0.52504 0.4421 -0.2657 0.6104 -0.56505 0.4462
factor(stage)2 -0.74353 0.7542 -0.4919 0.3116 -0.75574 0.8110
factor(stage)3 -0.01115 1.1672 -0.1323 0.7210  0.05708 1.2107
factor(stage)4  0.99248 2.3595  0.4647 2.0027  1.02657 2.4381
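> # (Not run) the repeated column names could be labeled for
> # readability, e.g.
> # cmp<-cbind(posterior_interval(so2),posterior_interval(so3),
> #    posterior_interval(so4))[1:4,]
> # colnames(cmp)<-paste(rep(c("normal","cauchy","flat"),each=2),
> #    colnames(cmp))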
> #***********************************************************
> # Explore priors and posteriors.  First, draw contours of  *
> # the log likelihood.                                      *
> #***********************************************************
> nn<-100
> out<-array(NA,c(nn,nn))
> larynx$cstage<-larynx$stage-mean(larynx$stage)
> # Fit the exponential regression as before.
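> # With exponential errors, U=log(T)-x'beta has the standard
> # extreme value distribution: density fU(u)=exp(u-exp(u))
> # and survivor function SU(u)=exp(-exp(u)).  The function ll
> # below sums log(fU) over observed deaths and log(SU) over
> # censored observations.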
> fU<-function(u) exp(-exp(u))*exp(u)
> SU<-function(u) exp(-exp(u))
> ll<-function(beta,times,deltas,xmat){
+     uv<-log(times)-xmat%*%beta
+     return(sum((1-deltas)*log(SU(uv))+deltas*log(fU(uv))))
+ }
> betas<-list(NULL,NULL)
> library(survival)#For survreg
> (larynxfit<-survreg(Surv(time,delta)~cstage,data=larynx,
+    dist="exp",x=TRUE))
Call:
survreg(formula = Surv(time, delta) ~ cstage, data = larynx, 
    dist = "exp", x = TRUE)

Coefficients:
(Intercept)      cstage 
     2.0167     -0.5048 

Scale fixed at 1 

Loglik(model)= -144.3   Loglik(intercept only)= -151.1
	Chisq= 13.58 on 1 degrees of freedom, p= 2e-04 
n= 90 
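> # Build a grid of nn points per parameter spanning roughly
> # +/- rad (here 4) standard errors around each MLE, and
> # evaluate the log likelihood at every grid point.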
> rad<-4
> for(i in 1:2) betas[[i]]<-coef(larynxfit)[i]+
+    2*rad*(seq(nn)-(nn+1)/2)*sqrt(larynxfit$var[i,i])/nn
> for(j in 1:nn) for(k in 1:nn){
+    beta<-c(betas[[1]][j],betas[[2]][k])
+    out[j,k]<-ll(beta,larynx$time,larynx$delta,larynxfit$x)
+ }
> contour(betas[[1]],betas[[2]],out,xlab="Intercept",
+    ylab="Centered Stage",
+    main="Likelihood contours for larynx data",
+    sub=paste("Exponential accelerated failure errors,",
+       "stage as continuous"),
+    levels=max(out)-qchisq(c(.95,.99,.999),2)/2)
> points(coef(larynxfit),pch=1)
> coef(larynxfit)
(Intercept)      cstage 
     2.0167     -0.5048 
> #The code above and below fits the accelerated failure time
> #model.  The stan_surv default prior on the slope is normal
> #with expectation 0 and standard deviation 2.5.  Given the
> #(frequentist) standard errors of the estimates, this is
> #only minimally informative.
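> #(Not run) the frequentist standard errors, for comparison
> #with the prior scale:
> #sqrt(diag(larynxfit$var))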
> bayeslarynxfit<-stan_surv(Surv(time,delta)~cstage,data=larynx,
+    basehaz="exp-aft")

SAMPLING FOR MODEL 'surv' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 3.2e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.32 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 1: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 1: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 1: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 1: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 1: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 1: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 1: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 1: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 1: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 1: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 1: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.069503 seconds (Warm-up)
Chain 1:                0.070753 seconds (Sampling)
Chain 1:                0.140256 seconds (Total)
Chain 1: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 2).
Chain 2: 
Chain 2: Gradient evaluation took 1.6e-05 seconds
Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 0.16 seconds.
Chain 2: Adjust your expectations accordingly!
Chain 2: 
Chain 2: 
Chain 2: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 2: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 2: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 2: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 2: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 2: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 2: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 2: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 2: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 2: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 2: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 2: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 2: 
Chain 2:  Elapsed Time: 0.071436 seconds (Warm-up)
Chain 2:                0.068266 seconds (Sampling)
Chain 2:                0.139702 seconds (Total)
Chain 2: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 3).
Chain 3: 
Chain 3: Gradient evaluation took 1.5e-05 seconds
Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 0.15 seconds.
Chain 3: Adjust your expectations accordingly!
Chain 3: 
Chain 3: 
Chain 3: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 3: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 3: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 3: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 3: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 3: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 3: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 3: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 3: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 3: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 3: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 3: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 3: 
Chain 3:  Elapsed Time: 0.065944 seconds (Warm-up)
Chain 3:                0.072028 seconds (Sampling)
Chain 3:                0.137972 seconds (Total)
Chain 3: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 4).
Chain 4: 
Chain 4: Gradient evaluation took 1.7e-05 seconds
Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 0.17 seconds.
Chain 4: Adjust your expectations accordingly!
Chain 4: 
Chain 4: 
Chain 4: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 4: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 4: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 4: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 4: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 4: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 4: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 4: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 4: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 4: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 4: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 4: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 4: 
Chain 4:  Elapsed Time: 0.066859 seconds (Warm-up)
Chain 4:                0.062874 seconds (Sampling)
Chain 4:                0.129733 seconds (Total)
Chain 4: 
> fixef(bayeslarynxfit)
(Intercept)      cstage 
      2.037      -0.508 
> points(fixef(bayeslarynxfit),pch=2)
> prior_summary(bayeslarynxfit)
Priors for model 'bayeslarynxfit' 
------
Intercept
 ~ normal(location = 0, scale = 20)

Coefficients
 ~ normal(location = 0, scale = 2.5)

Auxiliary (NA)
 ~ flat
------
See help('prior_summary.stanreg') for more details
> legend(betas[[1]][1],betas[[2]][1],pch=1:3,yjust=0,
+    legend=c("Frequentist MLE","Bayesian Flat","Bayesian Peaked"))
> #Now try something more informative:
> bb2<-stan_surv(Surv(time,delta)~cstage,
+    data=larynx,basehaz="exp-aft",
+    prior_intercept=normal(location=2),
+    prior=normal(location=0,scale=.1))

SAMPLING FOR MODEL 'surv' NOW (CHAIN 1).
Chain 1: 
Chain 1: Gradient evaluation took 2.8e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.28 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: 
Chain 1: 
Chain 1: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 1: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 1: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 1: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 1: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 1: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 1: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 1: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 1: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 1: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 1: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 1: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 1: 
Chain 1:  Elapsed Time: 0.073352 seconds (Warm-up)
Chain 1:                0.057834 seconds (Sampling)
Chain 1:                0.131186 seconds (Total)
Chain 1: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 2).
Chain 2: 
Chain 2: Gradient evaluation took 1.6e-05 seconds
Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 0.16 seconds.
Chain 2: Adjust your expectations accordingly!
Chain 2: 
Chain 2: 
Chain 2: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 2: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 2: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 2: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 2: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 2: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 2: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 2: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 2: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 2: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 2: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 2: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 2: 
Chain 2:  Elapsed Time: 0.072743 seconds (Warm-up)
Chain 2:                0.0654 seconds (Sampling)
Chain 2:                0.138143 seconds (Total)
Chain 2: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 3).
Chain 3: 
Chain 3: Gradient evaluation took 1.6e-05 seconds
Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 0.16 seconds.
Chain 3: Adjust your expectations accordingly!
Chain 3: 
Chain 3: 
Chain 3: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 3: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 3: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 3: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 3: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 3: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 3: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 3: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 3: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 3: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 3: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 3: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 3: 
Chain 3:  Elapsed Time: 0.072581 seconds (Warm-up)
Chain 3:                0.069263 seconds (Sampling)
Chain 3:                0.141844 seconds (Total)
Chain 3: 

SAMPLING FOR MODEL 'surv' NOW (CHAIN 4).
Chain 4: 
Chain 4: Gradient evaluation took 1.6e-05 seconds
Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 0.16 seconds.
Chain 4: Adjust your expectations accordingly!
Chain 4: 
Chain 4: 
Chain 4: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 4: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 4: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 4: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 4: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 4: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 4: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 4: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 4: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 4: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 4: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 4: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 4: 
Chain 4:  Elapsed Time: 0.070209 seconds (Warm-up)
Chain 4:                0.065016 seconds (Sampling)
Chain 4:                0.135225 seconds (Total)
Chain 4: 
> fixef(bb2)
(Intercept)      cstage 
     1.9956     -0.1777 
> points(fixef(bb2),pch=3)
>