Extract the posterior draws of the linear predictor, possibly transformed by the inverse-link function. This function is occasionally useful, but it should be used sparingly: inference and model checking should generally be carried out using the posterior predictive distribution (i.e., using posterior_predict).

# S3 method for stanreg
posterior_linpred(
object,
transform = FALSE,
newdata = NULL,
draws = NULL,
re.form = NULL,
offset = NULL,
XZ = FALSE,
...
)

# S3 method for stanreg
posterior_epred(
object,
newdata = NULL,
draws = NULL,
re.form = NULL,
offset = NULL,
XZ = FALSE,
...
)

## Arguments

object A fitted model object returned by one of the rstanarm modeling functions. See stanreg-objects.

transform Should the linear predictor be transformed using the inverse-link function? The default is FALSE. This argument is still allowed but not recommended because the posterior_epred function now provides the equivalent of posterior_linpred(..., transform = TRUE). See Examples.

newdata, draws, re.form, offset Same as for posterior_predict.

XZ If TRUE then instead of computing the linear predictor the design matrix X (or cbind(X, Z) for models with group-specific terms) constructed from newdata is returned. The default is FALSE.

... Currently ignored.
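As a sketch of the XZ argument (assuming the package's example_model has been built, as in the Examples below), setting XZ = TRUE returns the design matrix instead of posterior draws:

```r
# Sketch, assuming the rstanarm example model is available.
# With XZ = TRUE no linear predictor is computed; the design matrix
# cbind(X, Z) constructed from the (new) data is returned instead.
library(rstanarm)
if (!exists("example_model")) example(example_model)
XZ <- posterior_linpred(example_model, XZ = TRUE)
dim(XZ)  # one row per observation, one column per fixed- or group-level term
```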

## Value

The default is to return a draws by nrow(newdata) matrix of simulations from the posterior distribution of the (possibly transformed) linear predictor. The exception is if the argument XZ is set to TRUE (see the XZ argument description above).
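To illustrate the shape of the return value (again assuming example_model is available), each row is one posterior draw and each column is one observation:

```r
# Sketch, assuming the rstanarm example model is available.
library(rstanarm)
if (!exists("example_model")) example(example_model)
linpred <- posterior_linpred(example_model)
dim(linpred)  # draws by observations (56 columns for example_model)
```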

## Details

The posterior_linpred function returns the posterior distribution of the linear predictor, while the posterior_epred function returns the posterior distribution of the conditional expectation of the outcome. In the special case of a Gaussian likelihood with an identity link function, these two distributions are the same. Computing this expectation via posterior_epred is less noisy than averaging over draws from posterior_predict.
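The Gaussian identity-link equivalence can be checked directly. This is only a sketch: the formula, data, and sampler settings below are illustrative, not part of the package examples.

```r
# Sketch: fit a small Gaussian model with an identity link and check that
# posterior_linpred() and posterior_epred() agree (they should, since the
# conditional expectation equals the linear predictor here).
library(rstanarm)
fit <- stan_glm(mpg ~ wt, data = mtcars, family = gaussian(),
                chains = 1, iter = 500, refresh = 0)
# unname() drops dimnames so only the draws themselves are compared
all.equal(unname(posterior_linpred(fit)), unname(posterior_epred(fit)))
```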

## Note

For models estimated with stan_clogit, the number of successes per stratum is ostensibly fixed by the research design. Thus, when calling posterior_linpred with new data and transform = TRUE, the data.frame passed to the newdata argument must contain an outcome variable and a stratifying factor, both with the same name as in the original data.frame. Then, the probabilities will condition on this outcome in the new data.
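A hedged sketch of this requirement, using the classic infert conditional-logit example (the formula and sampler settings are illustrative only):

```r
# Sketch: a conditional logit fit on the infert data; 'case' is the
# outcome and 'stratum' the stratifying factor.
library(rstanarm)
fit <- stan_clogit(case ~ spontaneous + induced, strata = stratum,
                   data = infert, chains = 1, iter = 500, refresh = 0)
# newdata must contain both 'case' and 'stratum', named exactly as in
# the original data.frame, so the probabilities can condition on the
# number of successes per stratum.
nd <- infert[infert$stratum %in% 1:2, ]
probs <- posterior_linpred(fit, newdata = nd, transform = TRUE)
```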

## See also

posterior_predict to draw from the posterior predictive distribution of the outcome, which is typically preferable.

## Examples

if (!exists("example_model")) example(example_model)
print(family(example_model))
#>
#> Family: binomial
#>
# linear predictor on log-odds scale
linpred <- posterior_linpred(example_model)
colMeans(linpred)
#>          1          2          3          4          5          6          7
#> -0.8152845 -1.8116712 -1.9709559 -2.4348252 -1.7497529 -2.7594669 -2.8787698
#>          8          9         10         11         12         13         14
#> -0.9964918 -2.0195330 -2.1588268 -2.5693873 -1.4414774 -2.4245369 -2.5704944
#>         15         16         17         18         19         20         21
#> -3.0277001 -1.6833250 -2.6197392 -2.7656967 -3.3361838 -1.9274010 -2.9104606
#>         22         23         24         25         26         27         28
#> -3.0430908 -3.4669784 -0.5042808 -1.5273220 -1.6732794 -2.1371487 -0.7737788
#>         29         30         31         32         33         34         35
#> -1.7486796 -2.7517300 -2.8776966 -3.3282387 -2.0344536 -3.0175131 -3.1834614
#>         36         37         38         39         40         41         42
#> -3.5940218 -1.5258870 -2.4956193 -2.6682312 -3.1054461 -1.5412175 -2.5376043
#>         43         44         45         46         47         48         49
#> -2.6902253 -3.1341038 -2.2197226 -3.1827913 -3.3554031 -3.7659636 -0.3867961
#>         50         51         52         53         54         55         56
#> -1.4831370 -1.6157672 -2.0596457 -2.0323548 -3.0420688 -3.1813626 -3.6185775
# probabilities
# same as posterior_linpred(example_model, transform = TRUE)
probs <- posterior_epred(example_model)
colMeans(probs)
#>          1          2          3          4          5          6          7
#> 0.31245693 0.14846041 0.13049734 0.09004107 0.15494397 0.06446564 0.05786010
#>          8          9         10         11         12         13         14
#> 0.27432841 0.12250403 0.10884892 0.07806808 0.20191414 0.08918999 0.07851123
#>         15         16         17         18         19         20         21
#> 0.05352155 0.16431822 0.07440405 0.06456711 0.04192484 0.13552889 0.05644167
#>         22         23         24         25         26         27         28
#> 0.05044326 0.03489637 0.38017984 0.18722851 0.16638486 0.11754685 0.32105377
#>         29         30         31         32         33         34         35
#> 0.16055695 0.06827046 0.06009342 0.04104611 0.12361467 0.05126156 0.04421960
#>         36         37         38         39         40         41         42
#> 0.03082442 0.18451711 0.08135044 0.06981631 0.04784589 0.18888894 0.08259130
#>         43         44         45         46         47         48         49
#> 0.07228890 0.04980291 0.10599290 0.04435613 0.03810799 0.02665810 0.40784966
#>         50         51         52         53         54         55         56
#> 0.19999285 0.17971539 0.12854926 0.12490419 0.05102307 0.04528235 0.03049865
# not conditioning on any group-level parameters
probs2 <- posterior_epred(example_model, re.form = NA)
apply(probs2, 2, median)
#>          1          2          3          4          5          6          7
#> 0.19518575 0.08246163 0.07080653 0.04701302 0.20556206 0.08432065 0.07673299
#>          8          9         10         11         12         13         14
#> 0.20556206 0.08414459 0.07369313 0.05124092 0.19363180 0.08114487 0.07080653
#>         15         16         17         18         19         20         21
#> 0.04669380 0.19925389 0.08835134 0.07812142 0.04686144 0.19857247 0.08382122
#>         22         23         24         25         26         27         28
#> 0.07456111 0.05124092 0.19765705 0.08114487 0.07080653 0.04701302 0.21394070
#>         29         30         31         32         33         34         35
#> 0.19276949 0.08019697 0.07027879 0.04669380 0.20556206 0.08711956 0.07456111
#>         36         37         38         39         40         41         42
#> 0.05176011 0.20800334 0.08817986 0.07711859 0.05176011 0.19363180 0.08069798
#>         43         44         45         46         47         48         49
#> 0.06918648 0.04701302 0.20485308 0.08780747 0.07533712 0.05239358 0.20156132
#>         50         51         52         53         54         55         56
#> 0.07888034 0.06892321 0.04684458 0.20156132 0.08384163 0.07312601 0.04901285