Bayesian Markov-Chain-Monte-Carlo and Claims Data

JuliaActuary is an ecosystem of packages that makes Julia the easiest language to get started with for actuarial workflows.
Tags: mortalitytables, exposures, experience-analysis, dataframes, tutorial, statistics, bayesian
using MortalityTables
using Turing
using UUIDs
using DataFramesMeta
using MCMCChains, Plots, StatsPlots
using LinearAlgebra
using Pipe
using StatsFuns

Generating fake data

The problem of interest is to estimate mortality rates from exposure-level data: records of whether or not a life experienced a death in each observed year.

We’ll grab some example rates from an insurance table that has a “selection” component: when someone enters observation, say at age 50, their mortality is path dependent. Someone first observed at age 50 will have a different risk/mortality rate at age 55 than someone first observed at age 45.

Additionally, there may be other groups of interest, such as:

- high/medium/low risk classification
- sex
- group (e.g. company, data source, etc.)
- type of insurance product offered

The example data will use only the risk classification above.

src = MortalityTables.table("2001 VBT Residual Standard Select and Ultimate - Male Nonsmoker, ANB")
MortalityTable (Insured Lives Mortality):
   Name:
       2001 VBT Residual Standard Select and Ultimate - Male Nonsmoker, ANB
   Fields: 
       (:select, :ultimate, :metadata)
   Provider:
       Society of Actuaries
   mort.SOA.org ID:
       1118
   mort.SOA.org link:
       https://mort.soa.org/ViewTable.aspx?&TableIdentity=1118
   Description:
       2001 Valuation Basic Table (VBT) Residual Standard Select and Ultimate Table -  Male Nonsmoker. Basis: Age Nearest Birthday. Minimum Select Age: 0. Maximum Select Age: 99. Minimum Ultimate Age: 25. Maximum Ultimate Age: 120
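To see the selection effect directly, we can index the select table first by issue age and then by attained age, the same convention used in the data-generating function below. A quick sketch (re-loading the same table as above so the snippet stands alone):

```julia
using MortalityTables

# Same table as loaded above
src = MortalityTables.table("2001 VBT Residual Standard Select and Ultimate - Male Nonsmoker, ANB")

# Two lives at the same attained age (55), but first observed at different ages:
q_issue_50 = src.select[50][55]   # entered observation at age 50
q_issue_45 = src.select[45][55]   # entered observation at age 45

# The rates differ because underwriting selection wears off over time
@show q_issue_50 q_issue_45
```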
n = 10_000

function generate_data_individual(tbl, issue_age=rand(50:55), inforce_years=rand(1:30), risklevel=rand(1:3))
    # risk_factors will scale the "true" parameter up or down
    # we observe the assigned risklevel, but not risk_factor
    risk_factors = [0.7, 1.0, 1.5]
    rf = risk_factors[risklevel]
    deaths = rand(inforce_years) .< (tbl.select[issue_age][issue_age .+ (1:inforce_years) .- 1] .* rf)

    endpoint = if sum(deaths) == 0
        last(inforce_years)
    else
        findfirst(deaths)
    end
    id = uuid1()
    map(1:endpoint) do i
        (
            issue_age=issue_age,
            risklevel=risklevel,
            att_age=issue_age + i - 1,
            death=deaths[i],
            id=id,
        )
    end

end

exposures = vcat([generate_data_individual(src) for _ in 1:n]...) |> DataFrame
108913×5 DataFrame (108888 rows omitted)

    Row  issue_age  risklevel  att_age  death  id
         Int64      Int64      Int64    Bool   UUID
      1  54         2          54       false  19d9bd90-2c56-11ef-221d-ab3d14a6f167
      2  54         2          55       false  19d9bd90-2c56-11ef-221d-ab3d14a6f167
      3  55         1          55       false  19dd329a-2c56-11ef-246b-334b05232974
      4  55         1          56       false  19dd329a-2c56-11ef-246b-334b05232974
      5  55         1          57       true   19dd329a-2c56-11ef-246b-334b05232974
      6  55         2          55       false  19dd32f4-2c56-11ef-1da4-c926e1b3d176
      7  55         2          56       false  19dd32f4-2c56-11ef-1da4-c926e1b3d176
      8  55         2          57       false  19dd32f4-2c56-11ef-1da4-c926e1b3d176
      9  55         2          58       false  19dd32f4-2c56-11ef-1da4-c926e1b3d176
     10  55         2          59       false  19dd32f4-2c56-11ef-1da4-c926e1b3d176
     11  55         2          60       false  19dd32f4-2c56-11ef-1da4-c926e1b3d176
     12  55         2          61       false  19dd32f4-2c56-11ef-1da4-c926e1b3d176
     13  55         2          62       false  19dd32f4-2c56-11ef-1da4-c926e1b3d176
      ⋮  ⋮          ⋮          ⋮        ⋮      ⋮
 108902  52         3          56       false  19deb6c4-2c56-11ef-2c57-9b6e03cb9296
 108903  52         3          57       false  19deb6c4-2c56-11ef-2c57-9b6e03cb9296
 108904  52         3          58       false  19deb6c4-2c56-11ef-2c57-9b6e03cb9296
 108905  52         3          59       false  19deb6c4-2c56-11ef-2c57-9b6e03cb9296
 108906  52         3          60       false  19deb6c4-2c56-11ef-2c57-9b6e03cb9296
 108907  52         3          61       false  19deb6c4-2c56-11ef-2c57-9b6e03cb9296
 108908  52         3          62       false  19deb6c4-2c56-11ef-2c57-9b6e03cb9296
 108909  52         3          63       false  19deb6c4-2c56-11ef-2c57-9b6e03cb9296
 108910  52         3          64       false  19deb6c4-2c56-11ef-2c57-9b6e03cb9296
 108911  52         3          65       false  19deb6c4-2c56-11ef-2c57-9b6e03cb9296
 108912  52         3          66       false  19deb6c4-2c56-11ef-2c57-9b6e03cb9296
 108913  52         3          67       true   19deb6c4-2c56-11ef-2c57-9b6e03cb9296

Two groupings: one without the risk level and one with it:

data = combine(groupby(exposures, [:issue_age, :att_age])) do subdf
    (exposures=nrow(subdf),
        deaths=sum(subdf.death),
        fraction=sum(subdf.death) / nrow(subdf))
end
180×5 DataFrame (155 rows omitted)

  Row  issue_age  att_age  exposures  deaths  fraction
       Int64      Int64    Int64      Int64   Float64
    1  50         50       1672       35      0.020933
    2  50         51       1576       22      0.0139594
    3  50         52       1509       28      0.0185553
    4  50         53       1412       19      0.0134561
    5  50         54       1334       32      0.023988
    6  50         55       1257       38      0.0302307
    7  50         56       1159       23      0.0198447
    8  50         57       1080       22      0.0203704
    9  50         58       996        27      0.0271084
   10  50         59       921        21      0.0228013
   11  50         60       861        27      0.0313589
   12  50         61       781        24      0.0307298
   13  50         62       709        11      0.0155148
    ⋮  ⋮          ⋮        ⋮          ⋮       ⋮
  169  55         73       220        8       0.0363636
  170  55         74       182        12      0.0659341
  171  55         75       147        7       0.047619
  172  55         76       122        3       0.0245902
  173  55         77       102        6       0.0588235
  174  55         78       76         5       0.0657895
  175  55         79       56         7       0.125
  176  55         80       37         5       0.135135
  177  55         81       26         1       0.0384615
  178  55         82       21         3       0.142857
  179  55         83       12         0       0.0
  180  55         84       5          0       0.0
data2 = combine(groupby(exposures, [:issue_age, :att_age, :risklevel])) do subdf
    (exposures=nrow(subdf),
        deaths=sum(subdf.death),
        fraction=sum(subdf.death) / nrow(subdf))
end
531×6 DataFrame (506 rows omitted)

  Row  issue_age  att_age  risklevel  exposures  deaths  fraction
       Int64      Int64    Int64      Int64      Int64   Float64
    1  50         50       1          550        16      0.0290909
    2  50         50       2          601        9       0.014975
    3  50         50       3          521        10      0.0191939
    4  50         51       1          518        6       0.011583
    5  50         51       2          568        7       0.0123239
    6  50         51       3          490        9       0.0183673
    7  50         52       1          498        6       0.0120482
    8  50         52       2          542        12      0.0221402
    9  50         52       3          469        10      0.021322
   10  50         53       1          474        1       0.0021097
   11  50         53       2          507        6       0.0118343
   12  50         53       3          431        12      0.0278422
   13  50         54       1          450        7       0.0155556
    ⋮  ⋮          ⋮        ⋮          ⋮          ⋮       ⋮
  520  55         80       2          8          2       0.25
  521  55         80       3          4          0       0.0
  522  55         81       1          19         0       0.0
  523  55         81       2          3          1       0.333333
  524  55         81       3          4          0       0.0
  525  55         82       1          15         2       0.133333
  526  55         82       2          2          0       0.0
  527  55         82       3          4          1       0.25
  528  55         83       1          9          0       0.0
  529  55         83       2          1          0       0.0
  530  55         83       3          2          0       0.0
  531  55         84       1          5          0       0.0

1. A single binomial parameter model

Estimate \(p\), the average mortality rate, without accounting for any variation within the population/sample:

@model function mortality(data, deaths)
    p ~ Beta(1, 1)
    for i = 1:nrow(data)
        deaths[i] ~ Binomial(data.exposures[i], p)
    end
end


m1 = mortality(data, data.deaths)

num_chains = 4
4

Sampling from the posterior

We use the No-U-Turn Sampler (NUTS) to sample from the posterior:

chain = sample(m1, NUTS(), 1000)

plot(chain)
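The call above draws a single chain. To make use of the `num_chains = 4` defined earlier, Turing can also sample several chains in parallel, which makes cross-chain diagnostics such as rhat meaningful. A sketch, assuming Julia was launched with multiple threads (e.g. `julia --threads 4`):

```julia
# Sample num_chains chains in parallel across threads
chains = sample(m1, NUTS(), MCMCThreads(), 1000, num_chains)

# Diagnostics are now computed across the 4 independent chains
summarize(chains)
```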
┌ Info: Found initial step size
└   ϵ = 0.0125

Plotting samples from the posterior

We can see that the sampled posterior parameters don’t fit the data very well, since our model is so simplified. Each line represents one posterior sample of the binomial probability.

This says that, if there really were a single probability p governing the true process that generated the data, then given the observed data there is only a fairly narrow range of values it could plausibly take:

let
    data_weight = data.exposures ./ sum(data.exposures)
    data_weight = .√(data_weight ./ maximum(data_weight) .* 20)

    p = scatter(
        data.att_age,
        data.fraction,
        markersize=data_weight,
        alpha=0.5,
        label="Experience data point (size indicates relative exposure quantity)",
        xlabel="age",
        ylim=(0.0, 0.25),
        ylabel="mortality rate",
        title="Parametric Bayesian Mortality"
    )

    # show n samples from the posterior plotted on the graph
    n = 300
    ages = sort!(unique(data.att_age))

    for i in 1:n
        p_posterior = sample(chain, 1)[:p][1]
        hline!([p_posterior], label="", alpha=0.1)
    end
    p

end

The posterior mean of p is of course very close to the simple proportion of claims to exposures:

let
    a = mean(chain, :p)
    b = sum(data.deaths) / sum(data.exposures)
    a, b
end
(0.03054276093985464, 0.030538135943367642)
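Because the Beta(1, 1) prior is conjugate to the binomial likelihood, this single-parameter model also has a closed-form posterior we can sanity-check the sampler against: Beta(1 + deaths, 1 + exposures − deaths). A quick sketch in plain Julia, using the aggregate totals implied by the crude rate above (roughly 3,326 deaths over 108,913 exposure-years; treat these as illustrative):

```julia
# Conjugate update: a Beta(α₀, β₀) prior with binomial data (d deaths out of
# E exposure-years) yields a Beta(α₀ + d, β₀ + E - d) posterior.
d, E = 3_326, 108_913          # totals implied by the crude rate above
α, β = 1 + d, 1 + (E - d)      # posterior parameters under a Beta(1, 1) prior

posterior_mean = α / (α + β)   # ≈ 0.03055, in line with the sampled chain
posterior_sd = sqrt(α * β / ((α + β)^2 * (α + β + 1)))

@show posterior_mean posterior_sd
```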

2. Parametric model

In this example, we utilize a MakehamBeard parameterization because it’s already very similar in form to a logistic function. This is important because our desired output is a probability (i.e. the probability of a death at a given age), so the value must be constrained to the interval between zero and one.

The priors for a, b, c, and k are chosen to constrain the hazard (mortality) rate to be between zero and one.

This isn’t an ideal parameterization (e.g. we aren’t including information about the select underwriting period), but it is an example of utilizing Bayesian techniques on life experience data.
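To see why these priors keep the output a valid probability: the Makeham–Beard hazard has the form μ(x) = a·exp(bx) / (1 + k·a·exp(bx)) + c, which with c = 0 increases toward an asymptote of 1/k, so constraining k ≥ 1 caps the rate at 1. A quick sketch in plain Julia (this is the standard Makeham–Beard law, assumed here to match MortalityTables.jl’s implementation):

```julia
# Standard Makeham–Beard hazard (assumed to match MortalityTables.MakehamBeard)
beard(age; a, b, c, k) = a * exp(b * age) / (1 + k * a * exp(b * age)) + c

# Illustrative parameter values; with c = 0 the hazard rises toward 1/k,
# so k ≥ 1 guarantees the rate stays in [0, 1]:
a, b, c, k = 0.0036, 0.0366, 0.0, 2.0
extrema(beard(age; a, b, c, k) for age in 0:200)  # bounded above by 1/k = 0.5
```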

@model function mortality2(data, deaths)
    a ~ Exponential(0.1)
    b ~ Exponential(0.1)
    c = 0.0
    k ~ truncated(Exponential(1), 1, Inf)

    # use the variables to create a parametric mortality model
    m = MortalityTables.MakehamBeard(; a, b, c, k)

    # loop through the rows of the dataframe to let Turing observe the data 
    # and how consistent the parameters are with the data
    for i = 1:nrow(data)
        age = data.att_age[i]
        q = MortalityTables.hazard(m, age)
        deaths[i] ~ Binomial(data.exposures[i], q)
    end
end
mortality2 (generic function with 2 methods)

Sampling from the posterior

We combine the model with the data and use the No-U-Turn Sampler (NUTS) to sample:

m2 = mortality2(data, data.deaths)
chain2 = sample(m2, NUTS(), 1000)
summarize(chain2)
┌ Info: Found initial step size
└   ϵ = 0.0125
  parameters      mean       std      mcse   ess_bulk   ess_tail      rhat
      Symbol   Float64   Float64   Float64    Float64    Float64   Float64

           a    0.0036    0.0006    0.0000   240.9349   198.5353    0.9998
           b    0.0366    0.0031    0.0002   215.0610   171.8594    1.0001
           k    2.0919    1.1672    0.0756   269.2057   214.9856    1.0071

                                                               1 column omitted
plot(chain2)

Plotting samples from the posterior

We can see that the sampling of possible posterior parameters fits the data well:

let
    data_weight = data.exposures ./ sum(data.exposures)
    data_weight = .√(data_weight ./ maximum(data_weight) .* 20)

    p = scatter(
        data.att_age,
        data.fraction,
        markersize=data_weight,
        alpha=0.5,
        label="Experience data point (size indicates relative exposure quantity)",
        xlabel="age",
        ylim=(0.0, 0.25),
        ylabel="mortality rate",
        title="Parametric Bayesian Mortality"
    )


    # show n samples from the posterior plotted on the graph
    n = 300
    ages = sort!(unique(data.att_age))

    for i in 1:n
        s = sample(chain2, 1)
        a = only(s[:a])
        b = only(s[:b])
        k = only(s[:k])
        c = 0
        m = MortalityTables.MakehamBeard(; a, b, c, k)
        plot!(ages, age -> MortalityTables.hazard(m, age), alpha=0.1, label="")
    end
    p
end

3. Multi-level parametric model

This model extends the previous one into a multi-level model. Each risk class (risklevel) gets its own \(a\) parameter in the MakehamBeard model. The prior for each \(a_i\) is determined by the hyperparameter \(\bar{a}\).

@model function mortality3(data, deaths)
    risk_levels = length(levels(data.risklevel))
    b ~ Exponential(0.1)
    ā ~ Exponential(0.1)
    a ~ filldist(Exponential(ā), risk_levels)
    c = 0
    k ~ truncated(Exponential(1), 1, Inf)

    # use the variables to create a parametric mortality model

    # loop through the rows of the dataframe to let Turing observe the data 
    # and how consistent the parameters are with the data
    for i = 1:nrow(data)
        risk = data.risklevel[i]

        m = MortalityTables.MakehamBeard(; a=a[risk], b, c, k)
        age = data.att_age[i]
        q = MortalityTables.hazard(m, age)
        deaths[i] ~ Binomial(data.exposures[i], q)
    end
end
mortality3 (generic function with 2 methods)

Instantiate model with the data, sample, and summarize:

m3 = mortality3(data2, data2.deaths)
chain3 = sample(m3, NUTS(), 1000)
summarize(chain3)
┌ Info: Found initial step size
└   ϵ = 0.0125
  parameters      mean       std      mcse   ess_bulk   ess_tail      rhat
      Symbol   Float64   Float64   Float64    Float64    Float64   Float64

           b    0.0447    0.0053    0.0012    33.2602    21.5493    1.0398
           ā    0.0065    0.0071    0.0005    54.2101    85.6676    1.0458
        a[1]    0.0016    0.0004    0.0001    34.7822    21.6119    1.0297
        a[2]    0.0022    0.0005    0.0001    34.5119    18.3744    1.0348
        a[3]    0.0035    0.0008    0.0001    37.7816    42.2177    1.0315
           k    2.8153    1.9264    0.4387    38.2738    20.6857    1.0268

                                                               1 column omitted
let data = data2

    data_weight = data.exposures ./ sum(data.exposures)
    data_weight = .√(data_weight ./ maximum(data_weight) .* 20)
    color_i = data.risklevel

    p = scatter(
        data.att_age,
        data.fraction,
        markersize=data_weight,
        alpha=0.5,
        color=color_i,
        label="Experience data point (size indicates relative exposure quantity)",
        xlabel="age",
        ylim=(0.0, 0.25),
        ylabel="mortality rate",
        title="Parametric Bayesian Mortality"
    )


    # show n samples from the posterior plotted on the graph
    n = 100

    ages = sort!(unique(data.att_age))
    for r in 1:3
        for i in 1:n
            s = sample(chain3, 1)
            a = only(s[Symbol("a[$r]")])
            b = only(s[:b])
            k = only(s[:k])
            c = 0
            m = MortalityTables.MakehamBeard(; a, b, c, k)
            if i == 1
                plot!(ages, age -> MortalityTables.hazard(m, age), label="risk level $r", alpha=0.2, color=r)
            else
                plot!(ages, age -> MortalityTables.hazard(m, age), label="", alpha=0.2, color=r)
            end
        end
    end
    p
end

Handling non-unit exposures

Real experience data often has fractional exposures (e.g. a policy observed for only part of a year), which a Binomial likelihood cannot accommodate. The key is to use the Poisson distribution:
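Binomial(n, q) requires an integer number of trials n, while Poisson(E · q) accepts any non-negative exposure E and closely approximates the binomial when q is small. A quick sketch of the substitution (assuming the Distributions.jl API that Turing builds on; the values are illustrative):

```julia
using Distributions

q = 0.03    # assumed mortality rate at some attained age
E = 152.7   # fractional exposure-years (non-integer)

# Binomial(E, q) would fail for non-integer E, but a Poisson with
# expected claim count E * q works:
claims_dist = Poisson(E * q)

# Log-likelihood of observing 5 deaths against this exposure
@show logpdf(claims_dist, 5)
```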

@model function mortality4(data, deaths)
    risk_levels = length(levels(data.risklevel))
    b ~ Exponential(0.1)
    ā ~ Exponential(0.1)
    a ~ filldist(Exponential(ā), risk_levels)
    c ~ Beta(4, 18)
    k ~ truncated(Exponential(1), 1, Inf)

    # use the variables to create a parametric mortality model

    # loop through the rows of the dataframe to let Turing observe the data 
    # and how consistent the parameters are with the data
    for i = 1:nrow(data)
        risk = data.risklevel[i]

        m = MortalityTables.MakehamBeard(; a=a[risk], b, c, k)
        age = data.att_age[i]
        q = MortalityTables.hazard(m, age)
        deaths[i] ~ Poisson(data.exposures[i] * q)
    end
end
mortality4 (generic function with 2 methods)

Instantiate model with the data, sample, and summarize:

m4 = mortality4(data2, data2.deaths)
chain4 = sample(m4, NUTS(), 1000)
summarize(chain4)
┌ Info: Found initial step size
└   ϵ = 0.0125
57%|███████████████████████▌                 |  ETA: 0:00:16Sampling:  58%|███████████████████████▊                 |  ETA: 0:00:16Sampling:  58%|███████████████████████▉                 |  ETA: 0:00:16Sampling:  59%|████████████████████████▏                |  ETA: 0:00:16Sampling:  59%|████████████████████████▎                |  ETA: 0:00:16Sampling:  60%|████████████████████████▌                |  ETA: 0:00:15Sampling:  60%|████████████████████████▋                |  ETA: 0:00:15Sampling:  61%|████████████████████████▉                |  ETA: 0:00:15Sampling:  61%|█████████████████████████                |  ETA: 0:00:15Sampling:  62%|█████████████████████████▍               |  ETA: 0:00:14Sampling:  62%|█████████████████████████▋               |  ETA: 0:00:14Sampling:  63%|█████████████████████████▉               |  ETA: 0:00:14Sampling:  63%|██████████████████████████               |  ETA: 0:00:14Sampling:  64%|██████████████████████████▎              |  ETA: 0:00:14Sampling:  64%|██████████████████████████▍              |  ETA: 0:00:13Sampling:  65%|██████████████████████████▋              |  ETA: 0:00:13Sampling:  65%|██████████████████████████▊              |  ETA: 0:00:13Sampling:  66%|███████████████████████████              |  ETA: 0:00:13Sampling:  66%|███████████████████████████▏             |  ETA: 0:00:13Sampling:  67%|███████████████████████████▍             |  ETA: 0:00:12Sampling:  67%|███████████████████████████▌             |  ETA: 0:00:12Sampling:  68%|███████████████████████████▊             |  ETA: 0:00:12Sampling:  68%|███████████████████████████▉             |  ETA: 0:00:12Sampling:  69%|████████████████████████████▎            |  ETA: 0:00:11Sampling:  70%|████████████████████████████▌            |  ETA: 0:00:11Sampling:  70%|████████████████████████████▊            |  ETA: 0:00:11Sampling:  70%|████████████████████████████▉            |  ETA: 0:00:11Sampling:  71%|█████████████████████████████▏           |  ETA: 0:00:11Sampling:  
71%|█████████████████████████████▎           |  ETA: 0:00:10Sampling:  72%|█████████████████████████████▌           |  ETA: 0:00:10Sampling:  72%|█████████████████████████████▋           |  ETA: 0:00:10Sampling:  73%|█████████████████████████████▉           |  ETA: 0:00:10Sampling:  73%|██████████████████████████████           |  ETA: 0:00:10Sampling:  74%|██████████████████████████████▎          |  ETA: 0:00:10Sampling:  74%|██████████████████████████████▍          |  ETA: 0:00:09Sampling:  75%|██████████████████████████████▋          |  ETA: 0:00:09Sampling:  75%|██████████████████████████████▊          |  ETA: 0:00:09Sampling:  76%|███████████████████████████████          |  ETA: 0:00:09Sampling:  76%|███████████████████████████████▏         |  ETA: 0:00:09Sampling:  76%|███████████████████████████████▍         |  ETA: 0:00:09Sampling:  77%|███████████████████████████████▊         |  ETA: 0:00:08Sampling:  78%|████████████████████████████████         |  ETA: 0:00:08Sampling:  78%|████████████████████████████████▏        |  ETA: 0:00:08Sampling:  79%|████████████████████████████████▎        |  ETA: 0:00:08Sampling:  79%|████████████████████████████████▌        |  ETA: 0:00:08Sampling:  80%|████████████████████████████████▊        |  ETA: 0:00:07Sampling:  80%|████████████████████████████████▉        |  ETA: 0:00:07Sampling:  81%|█████████████████████████████████▏       |  ETA: 0:00:07Sampling:  81%|█████████████████████████████████▎       |  ETA: 0:00:07Sampling:  82%|█████████████████████████████████▌       |  ETA: 0:00:07Sampling:  82%|█████████████████████████████████▋       |  ETA: 0:00:06Sampling:  83%|█████████████████████████████████▉       |  ETA: 0:00:06Sampling:  83%|██████████████████████████████████       |  ETA: 0:00:06Sampling:  84%|██████████████████████████████████▎      |  ETA: 0:00:06Sampling:  84%|██████████████████████████████████▌      |  ETA: 0:00:06Sampling:  84%|██████████████████████████████████▋      |  ETA: 0:00:06Sampling:  
85%|██████████████████████████████████▊      |  ETA: 0:00:05Sampling:  86%|███████████████████████████████████▏     |  ETA: 0:00:05Sampling:  86%|███████████████████████████████████▍     |  ETA: 0:00:05Sampling:  87%|███████████████████████████████████▋     |  ETA: 0:00:05Sampling:  87%|███████████████████████████████████▊     |  ETA: 0:00:05Sampling:  88%|████████████████████████████████████     |  ETA: 0:00:04Sampling:  88%|████████████████████████████████████▏    |  ETA: 0:00:04Sampling:  89%|████████████████████████████████████▍    |  ETA: 0:00:04Sampling:  90%|████████████████████████████████████▊    |  ETA: 0:00:04Sampling:  90%|████████████████████████████████████▉    |  ETA: 0:00:04Sampling:  90%|█████████████████████████████████████▏   |  ETA: 0:00:03Sampling:  91%|█████████████████████████████████████▌   |  ETA: 0:00:03Sampling:  92%|█████████████████████████████████████▋   |  ETA: 0:00:03Sampling:  92%|█████████████████████████████████████▉   |  ETA: 0:00:03Sampling:  93%|██████████████████████████████████████   |  ETA: 0:00:03Sampling:  93%|██████████████████████████████████████▎  |  ETA: 0:00:02Sampling:  94%|██████████████████████████████████████▌  |  ETA: 0:00:02Sampling:  94%|██████████████████████████████████████▋  |  ETA: 0:00:02Sampling:  95%|██████████████████████████████████████▉  |  ETA: 0:00:02Sampling:  95%|███████████████████████████████████████  |  ETA: 0:00:02Sampling:  96%|███████████████████████████████████████▎ |  ETA: 0:00:02Sampling:  97%|███████████████████████████████████████▋ |  ETA: 0:00:01Sampling:  97%|███████████████████████████████████████▊ |  ETA: 0:00:01Sampling:  98%|████████████████████████████████████████ |  ETA: 0:00:01Sampling:  98%|████████████████████████████████████████▏|  ETA: 0:00:01Sampling:  98%|████████████████████████████████████████▍|  ETA: 0:00:01Sampling:  99%|████████████████████████████████████████▌|  ETA: 0:00:00Sampling:  99%|████████████████████████████████████████▊|  ETA: 0:00:00Sampling:  
99%|████████████████████████████████████████▉|  ETA: 0:00:00Sampling: 100%|█████████████████████████████████████████| Time: 0:00:35
  parameters      mean       std      mcse   ess_bulk   ess_tail      rhat   e ⋯
      Symbol   Float64   Float64   Float64    Float64    Float64   Float64     ⋯
           b    0.0546    0.0070    0.0006   142.3672   246.8651    1.0063     ⋯
           ā    0.0032    0.0055    0.0003   244.7927   360.4686    0.9995     ⋯
        a[1]    0.0006    0.0003    0.0000   138.6061   241.4863    1.0037     ⋯
        a[2]    0.0010    0.0005    0.0000   137.3028   231.4561    1.0049     ⋯
        a[3]    0.0017    0.0007    0.0001   142.5674   228.3792    1.0042     ⋯
           c    0.0068    0.0023    0.0002   169.0722   366.3223    0.9991     ⋯
           k    2.8094    1.6466    0.1453   114.4244   208.0089    1.0187     ⋯
                                                                1 column omitted

PRECIS(DataFrame(chain4))

risk_factors4 = [mean(chain4[Symbol("a[$f]")]) for f in 1:3]
3-element Vector{Float64}:
 0.0006251049456225291
 0.000990412265130522
 0.0016772235717042031
let data = data2

    data_weight = data.exposures ./ sum(data.exposures)
    data_weight = .√(data_weight ./ maximum(data_weight) .* 20)
    color_i = data.risklevel

    p = scatter(
        data.att_age,
        data.fraction,
        markersize=data_weight,
        alpha=0.5,
        color=color_i,
        label="Experience data point (size indicates relative exposure quantity)",
        xlabel="age",
        ylim=(0.0, 0.25),
        ylabel="mortality rate",
        title="Parametric Bayesian Mortality"
    )


    # show n samples from the posterior plotted on the graph
    n = 100

    ages = sort!(unique(data.att_age))
    for r in 1:3
        for i in 1:n
            s = sample(chain4, 1)
            a = only(s[Symbol("a[$r]")])
            b = only(s[:b])
            k = only(s[:k])
            c = only(s[:c])
            m = MortalityTables.MakehamBeard(; a, b, c, k)
            if i == 1
                plot!(ages, age -> MortalityTables.hazard(m, age), label="risk level $r", alpha=0.2, color=r)
            else
                plot!(ages, age -> MortalityTables.hazard(m, age), label="", alpha=0.2, color=r)
            end
        end
    end
    p
end

Predictions

We can generate predictive estimates by passing a vector of missing in place of the outcome variables and then calling predict.

We get a table of values where each row is the prediction implied by the corresponding chain sample, and the columns are the predicted values for each of the outcomes in our original dataset.
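The Chains object returned by predict can be summarized like any other chain. As a minimal sketch (assuming, as MCMCChains provides, that Array on a chain yields a samples-by-outcomes matrix of draws), a posterior-predictive mean and 95% interval for each outcome can be computed like this:

```julia
using Statistics

# `preds` is the Chains object returned by `predict` above.
# Array(preds) flattens it to a (samples × outcomes) matrix of draws.
pred_matrix = Array(preds)

# posterior predictive mean for each outcome (one column per data row)
pred_mean = vec(mean(pred_matrix; dims=1))

# 95% posterior predictive interval per outcome
pred_lo = [quantile(col, 0.025) for col in eachcol(pred_matrix)]
pred_hi = [quantile(col, 0.975) for col in eachcol(pred_matrix)]
```

These point estimates and intervals can then be compared against the observed data2.deaths as a simple posterior-predictive check of model fit.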

preds = predict(mortality4(data2, fill(missing, length(data2.deaths))), chain4)
Chains MCMC chain (1000×531×1 Array{Float64, 3}):
Iterations        = 1:1:1000
Number of chains  = 1
Samples per chain = 1000
parameters        = deaths[1], deaths[2], deaths[3], …, deaths[531]
internals         = 
Summary Statistics
  parameters      mean       std      mcse    ess_bulk    ess_tail      rhat  ⋯
      Symbol   Float64   Float64   Float64     Float64     Float64   Float64   ⋯
   deaths[1]    8.5150    2.9069    0.0939    958.4763   1008.7639    0.9994   ⋯
   deaths[2]   11.6930    3.4512    0.1055   1072.8987    944.2409    1.0000   ⋯
   deaths[3]   14.9880    3.9562    0.1225   1048.3469    977.2964    0.9991   ⋯
   deaths[4]    8.0010    2.8958    0.0951    926.2622    797.4456    1.0013   ⋯
   deaths[5]   11.7660    3.3729    0.1069    975.5156    810.6114    0.9997   ⋯
   deaths[6]   14.7840    3.8184    0.1248    932.4117    903.1870    1.0006   ⋯
   deaths[7]    7.9830    2.9277    0.0927    985.8603    922.8455    1.0025   ⋯
   deaths[8]   11.6140    3.4174    0.1094    973.3146    921.1675    1.0029   ⋯
   deaths[9]   14.6820    3.9085    0.1275    930.6639    907.1682    0.9990   ⋯
  deaths[10]    7.8740    2.9139    0.0891   1059.0651    958.9856    0.9999   ⋯
  deaths[11]   11.1530    3.3655    0.1084    960.5418    894.2850    1.0015   ⋯
  deaths[12]   13.8960    3.7610    0.1250    892.1812    914.3382    0.9990   ⋯
  deaths[13]    7.6650    2.8829    0.0904   1022.3612    883.8790    1.0012   ⋯
  deaths[14]   11.1400    3.3325    0.1078    945.1535    935.6795    1.0004   ⋯
  deaths[15]   13.7080    3.6903    0.1108   1104.3195   1009.4330    0.9998   ⋯
  deaths[16]    7.6550    2.7676    0.0897    946.8631   1036.7581    1.0032   ⋯
  deaths[17]   11.0460    3.3288    0.1039   1033.3326    929.5685    1.0026   ⋯
  deaths[18]   13.2610    3.6838    0.1276    812.4458    911.9378    1.0011   ⋯
  deaths[19]    7.3600    2.6825    0.0829   1036.1341    961.6809    1.0000   ⋯
  deaths[20]   10.7340    3.2714    0.1187    761.5744    924.5419    1.0014   ⋯
  deaths[21]   12.3820    3.5616    0.1335    779.6126    668.9713    0.9994   ⋯
  deaths[22]    7.0350    2.7399    0.0866    993.2216    778.8220    0.9993   ⋯
  deaths[23]   10.2330    3.1621    0.1049    913.8614    901.2260    1.0002   ⋯
      ⋮           ⋮         ⋮         ⋮          ⋮           ⋮          ⋮      ⋱
                                                   1 column and 508 rows omitted
Quantiles
  parameters      2.5%     25.0%     50.0%     75.0%     97.5% 
      Symbol   Float64   Float64   Float64   Float64   Float64 
   deaths[1]    3.0000    7.0000    8.0000   10.0000   15.0000
   deaths[2]    6.0000    9.0000   12.0000   14.0000   19.0000
   deaths[3]    8.0000   12.0000   15.0000   18.0000   23.0000
   deaths[4]    3.0000    6.0000    8.0000   10.0000   14.0000
   deaths[5]    6.0000    9.0000   12.0000   14.0000   19.0000
   deaths[6]    8.0000   12.0000   15.0000   17.0000   22.0250
   deaths[7]    3.0000    6.0000    8.0000   10.0000   14.0000
   deaths[8]    5.0000    9.0000   11.0000   14.0000   18.0000
   deaths[9]    7.0000   12.0000   15.0000   17.0000   22.0000
  deaths[10]    3.0000    6.0000    8.0000   10.0000   14.0000
  deaths[11]    5.0000    9.0000   11.0000   13.0000   18.0000
  deaths[12]    7.0000   11.0000   14.0000   16.0000   22.0000
  deaths[13]    3.0000    6.0000    7.5000    9.0000   14.0000
  deaths[14]    5.0000    9.0000   11.0000   13.0000   18.0000
  deaths[15]    7.0000   11.0000   13.0000   16.0000   21.0000
  deaths[16]    3.0000    6.0000    8.0000    9.0000   13.0250
  deaths[17]    5.0000    9.0000   11.0000   13.0000   18.0000
  deaths[18]    7.0000   11.0000   13.0000   16.0000   21.0000
  deaths[19]    3.0000    6.0000    7.0000    9.0000   13.0000
  deaths[20]    5.0000    8.0000   11.0000   13.0000   17.0250
  deaths[21]    6.0000   10.0000   12.0000   15.0000   20.0000
  deaths[22]    2.0000    5.0000    7.0000    9.0000   13.0000
  deaths[23]    4.0000    8.0000   10.0000   12.0000   17.0000
      ⋮           ⋮         ⋮         ⋮         ⋮         ⋮
                                                508 rows omitted