The crps() and scrps() functions and their loo_*() counterparts can be used to compute the continuous ranked probability score (CRPS) and the scaled CRPS (SCRPS; see Bolin and Wallin, 2022). The CRPS is a proper scoring rule, and it is strictly proper when the first moment of the predictive distribution is finite. Both scores can be expressed in terms of samples from the predictive distribution. See, e.g., Gneiting and Raftery (2007) for a comprehensive discussion of the CRPS.

crps(x, ...)

scrps(x, ...)

loo_crps(x, ...)

loo_scrps(x, ...)

# S3 method for matrix
crps(x, x2, y, ..., permutations = 1)

# S3 method for numeric
crps(x, x2, y, ..., permutations = 1)

# S3 method for matrix
loo_crps(
  x,
  x2,
  y,
  log_lik,
  ...,
  permutations = 1,
  r_eff = 1,
  cores = getOption("mc.cores", 1)
)

# S3 method for matrix
scrps(x, x2, y, ..., permutations = 1)

# S3 method for numeric
scrps(x, x2, y, ..., permutations = 1)

# S3 method for matrix
loo_scrps(
  x,
  x2,
  y,
  log_lik,
  ...,
  permutations = 1,
  r_eff = 1,
  cores = getOption("mc.cores", 1)
)

Arguments

x

An S by N matrix (draws by observations), or a vector of length S when only a single observation is provided in y.

...

Passed on to E_loo() in the loo_*() versions of these functions.

x2

Independent draws from the same distribution as the draws in x. Must have identical dimensions.

y

A vector of observations or a single value.

permutations

An integer, with default value of 1, specifying how many times the expected value of |X - X'| (|x - x2|) is computed. The row order of x2 is shuffled, because the elements of x and x2 are typically drawn given the same parameter values; this happens, e.g., when posterior_predict() is called twice for a fitted rstanarm or brms model. Generating more permutations is expected to decrease the variance of the computed expected value. A minimal illustration is sketched after this argument list.

log_lik

A log-likelihood matrix the same size as x.

r_eff

An optional vector of relative effective sample size estimates containing one element per observation. See psis() for details.

cores

The number of cores to use for parallelization of psis(). See psis() for details.
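
To illustrate the permutations argument, here is a minimal sketch added for this page (not part of the original documentation). It uses simulated draws, where shuffling has little effect since x and x2 are already independent; with predictive draws generated from the same parameter values, more permutations stabilize the estimate of E|X - X'|:

library(loo)
set.seed(1)
S <- 1000  # number of draws
N <- 5     # number of observations
x  <- matrix(rnorm(S * N), S, N)  # draws from the predictive distribution
x2 <- matrix(rnorm(S * N), S, N)  # independent draws from the same distribution
y  <- rnorm(N)                    # observations
crps(x, x2, y)$estimates                     # default: a single permutation
crps(x, x2, y, permutations = 20)$estimates  # average over 20 row shuffles of x2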

Value

A list containing two elements: estimates and pointwise. The former reports the estimate and its standard error, and the latter contains the pointwise values.

Details

To compute the (S)CRPS, the user needs to provide two independent sets of draws, x and x2, from the predictive distribution. This is because the formulas used to compute these scores involve an expectation of the absolute difference between x and x2, both having the same distribution. See the permutations argument, as well as Gneiting and Raftery (2007), for details.
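
Concretely, the score is based on the kernel representation CRPS(F, y) = E|X - y| - E|X - X'| / 2 (Gneiting and Raftery, 2007), with both expectations approximated by Monte Carlo averages over the draws. Note that the pointwise values in the example below are negative: the functions report scores in a larger-is-better orientation, i.e., the negative of this quantity. A minimal sketch using simulated draws (an added illustration, not part of the original documentation):

library(loo)
set.seed(123)
S <- 4000
x  <- rnorm(S)   # draws from the predictive distribution
x2 <- rnorm(S)   # independent draws from the same distribution
y  <- 0.5        # a single observation
# Monte Carlo estimate of E|X - y| - E|X - X'| / 2 (the classic CRPS):
manual <- mean(abs(x - y)) - mean(abs(x - x2)) / 2
# crps() reports the negative, up to Monte Carlo error from shuffling x2:
crps(x, x2, y)$pointwise
-manual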

References

Bolin, D., & Wallin, J. (2022). Local scale invariance and robustness of proper scoring rules. arXiv. doi:10.48550/arXiv.1912.05642

Gneiting, T., & Raftery, A. E. (2007). Strictly Proper Scoring Rules, Prediction, and Estimation. Journal of the American Statistical Association, 102(477), 359–378.

Examples

# \dontrun{
# An example using rstanarm
library(rstanarm)
data("kidiq")
fit <- stan_glm(kid_score ~ mom_hs + mom_iq, data = kidiq)
#> 
#> SAMPLING FOR MODEL 'continuous' NOW (CHAIN 1).
#> ... (Stan sampling progress output for chains 1-4 omitted)
ypred1 <- posterior_predict(fit)
ypred2 <- posterior_predict(fit)
crps(ypred1, ypred2, y = fit$y)
#> $estimates
#>    Estimate          SE 
#> -10.2289025   0.3452416 
#> 
#> $pointwise
#>          1          2          3          4          5          6          7 
#> -24.990099  -9.657260  -7.446334  -4.754076 -21.781540  -6.939491 -31.090066 
#> ... (remaining pointwise values for the 434 observations omitted)
#> 
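# scrps() and loo_scrps() share the same interface; an illustrative call
# (added sketch; output not rendered here):
scrps(ypred1, ypred2, y = fit$y)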
loo_crps(ypred1, ypred2, y = fit$y, log_lik = log_lik(fit))
# }