Covariate Models

Authors

Jose Storopoli

Joel Owen

using Dates
using Pumas
using PumasUtilities
using CairoMakie
using DataFramesMeta
using CSV
using PharmaDatasets
Caution

Some functions in this tutorial are only available after you load the PumasUtilities package.

1 Covariate Model Building

In this tutorial we’ll cover a workflow around covariate model building. You’ll learn how to:

  1. include covariates in your model
  2. parse data with covariates
  3. understand the difference between constant and time-varying covariates
  4. handle continuous and categorical covariates
  5. deal with missing data in your covariates
  6. deal with categorical covariates

1.1 nlme_sample Dataset

For this tutorial we’ll use the nlme_sample dataset from PharmaDatasets.jl:

pkfile = dataset("nlme_sample", String)
pkdata = CSV.read(pkfile, DataFrame; missingstring = ["NA", ""])
first(pkdata, 5)
5×15 DataFrame
Row ID TIME DV AMT EVID CMT RATE WT AGE SEX CRCL GROUP ROUTE DURATION OCC
Int64 Float64 Float64? Int64? Int64 Int64? Int64 Int64 Int64 String1 Int64 String7 Float64 Int64? Int64
1 1 0.0 missing 1000 1 1 500 90 47 M 75 1000 mg Inf 2 1
2 1 0.001 0.0667231 missing 0 missing 0 90 47 M 75 1000 mg Inf missing 1
3 1 1.0 112.817 missing 0 missing 0 90 47 M 75 1000 mg Inf missing 1
4 1 2.0 224.087 missing 0 missing 0 90 47 M 75 1000 mg Inf missing 1
5 1 4.0 220.047 missing 0 missing 0 90 47 M 75 1000 mg Inf missing 1
Note

The nlme_sample dataset uses a different representation of missing values ("NA" and empty strings) than the standard datasets in PharmaDatasets.jl. That’s why we first retrieve the dataset as a CSV file, stored in the pkfile variable. Then, we pass the missingstring keyword argument to CSV.read in order to obtain a working DataFrame for the nlme_sample dataset.

If you want to know more about data wrangling and how to read and write data in different formats, please check out the Data Wrangling Tutorials at tutorials.pumas.ai.

As you can see, the nlme_sample dataset has the standard PK dataset columns such as :ID, :TIME, :DV, :AMT, :EVID and :CMT. The dataset also contains the following list of covariates:

  • :WT: subject weight in kilograms
  • :SEX: subject sex, either "F" or "M"
  • :CRCL: subject creatinine clearance
  • :GROUP: subject dosing group, either "500 mg", "750 mg", or "1000 mg"

And we’ll learn how to include them in our Pumas modeling workflows.

describe(pkdata, :mean, :std, :nunique, :first, :last, :eltype)
15×7 DataFrame
Row variable mean std nunique first last eltype
Symbol Union… Union… Union… Any Any Type
1 ID 15.5 8.661 1 30 Int64
2 TIME 82.6527 63.2187 0.0 168.0 Float64
3 DV 157.315 110.393 missing missing Union{Missing, Float64}
4 AMT 750.0 204.551 1000 500 Union{Missing, Int64}
5 EVID 0.307692 0.461835 1 1 Int64
6 CMT 1.0 0.0 1 1 Union{Missing, Int64}
7 RATE 115.385 182.218 500 250 Int64
8 WT 81.6 11.6051 90 96 Int64
9 AGE 40.0333 11.6479 47 56 Int64
10 SEX 2 M F String1
11 CRCL 72.5667 26.6212 75 90 Int64
12 GROUP 3 1000 mg 500 mg String7
13 ROUTE Inf NaN Inf Inf Float64
14 DURATION 2.0 0.0 2 2 Union{Missing, Int64}
15 OCC 4.15385 2.62836 1 8 Int64
Tip

As you can see, all these covariates are constant; that is, they do not vary over time. We’ll also cover time-varying covariates later in this tutorial.

1.2 Step 1 - Parse Data into a Population

The first step in our covariate model building workflow is to parse data into a Population. This is accomplished with the read_pumas function. Here we use the covariates keyword argument to pass a vector of column names to be parsed as covariates:

pop = read_pumas(
    pkdata;
    id = :ID,
    time = :TIME,
    amt = :AMT,
    covariates = [:WT, :AGE, :SEX, :CRCL, :GROUP],
    observations = [:DV],
    cmt = :CMT,
    evid = :EVID,
    rate = :RATE,
)
Population
  Subjects: 30
  Covariates: WT, AGE, SEX, CRCL, GROUP
  Observations: DV

Due to Pumas’ dynamic workflow capabilities, we only need to define our Population once. That is, we tell read_pumas to parse all the covariates available, even if we will be using none or a subset of those in our models.

This feature greatly increases workflow efficiency when developing and fitting models.

1.3 Step 2 - Base Model

The second step of our covariate model building workflow is to develop a base model, i.e., a model without any covariate effects on its parameters. This represents the null model against which covariate models can be tested to check whether covariate inclusion improves the model.

Let’s create a simple 2-compartment base model with a combined (additive plus proportional) error model:

base_model = @model begin
    @param begin
        tvcl ∈ RealDomain(; lower = 0)
        tvvc ∈ RealDomain(; lower = 0)
        tvq ∈ RealDomain(; lower = 0)
        tvvp ∈ RealDomain(; lower = 0)
        Ω ∈ PDiagDomain(2)
        σ_add ∈ RealDomain(; lower = 0)
        σ_prop ∈ RealDomain(; lower = 0)
    end

    @random begin
        η ~ MvNormal(Ω)
    end

    @pre begin
        CL = tvcl * exp(η[1])
        Vc = tvvc * exp(η[2])
        Q = tvq
        Vp = tvvp
    end

    @dynamics Central1Periph1

    @derived begin
        cp := @. Central / Vc
        DV ~ @. Normal(cp, sqrt(cp^2 * σ_prop^2 + σ_add^2))
    end
end
PumasModel
  Parameters: tvcl, tvvc, tvq, tvvp, Ω, σ_add, σ_prop
  Random effects: η
  Covariates: 
  Dynamical variables: Central, Peripheral
  Derived: DV
  Observed: DV

And fit it accordingly:

iparams_base_model = (;
    tvvc = 5,
    tvcl = 0.02,
    tvq = 0.01,
    tvvp = 10,
    Ω = Diagonal([0.01, 0.01]),
    σ_add = 0.1,
    σ_prop = 0.1,
)
(tvvc = 5,
 tvcl = 0.02,
 tvq = 0.01,
 tvvp = 10,
 Ω = [0.01 0.0; 0.0 0.01],
 σ_add = 0.1,
 σ_prop = 0.1,)
fit_base_model = fit(base_model, pop, iparams_base_model, FOCE())
[ Info: Checking the initial parameter values.
[ Info: The initial negative log likelihood and its gradient are finite. Check passed.
Iter     Function value   Gradient norm 
     0     8.300164e+03     4.360770e+03
 * time: 0.01734614372253418
     1     3.110315e+03     9.706222e+02
 * time: 0.39599108695983887
     2     2.831659e+03     7.817006e+02
 * time: 0.41597914695739746
     3     2.405281e+03     2.923696e+02
 * time: 0.464245080947876
     4     2.370406e+03     3.032286e+02
 * time: 0.47380900382995605
     5     2.313631e+03     3.126188e+02
 * time: 0.4848940372467041
     6     2.263986e+03     2.946697e+02
 * time: 0.513314962387085
     7     2.160182e+03     1.917599e+02
 * time: 0.5257940292358398
     8     2.112467e+03     1.588288e+02
 * time: 0.5383100509643555
     9     2.090339e+03     1.108334e+02
 * time: 0.5568311214447021
    10     2.078171e+03     8.108027e+01
 * time: 0.5655779838562012
    11     2.074517e+03     7.813928e+01
 * time: 0.5736629962921143
    12     2.066270e+03     7.044632e+01
 * time: 0.5821371078491211
    13     2.049660e+03     1.062584e+02
 * time: 0.5986220836639404
    14     2.021965e+03     1.130570e+02
 * time: 0.6069321632385254
    15     1.994936e+03     7.825801e+01
 * time: 0.6152660846710205
    16     1.979337e+03     5.318263e+01
 * time: 0.6239011287689209
    17     1.972141e+03     6.807046e+01
 * time: 0.6404860019683838
    18     1.967973e+03     7.896361e+01
 * time: 0.6488540172576904
    19     1.962237e+03     8.343757e+01
 * time: 0.6572561264038086
    20     1.952791e+03     5.565304e+01
 * time: 0.6661460399627686
    21     1.935857e+03     3.923284e+01
 * time: 0.683690071105957
    22     1.926254e+03     5.749643e+01
 * time: 0.6924350261688232
    23     1.922144e+03     4.306225e+01
 * time: 0.7006130218505859
    24     1.911566e+03     4.810496e+01
 * time: 0.709571123123169
    25     1.906464e+03     4.324267e+01
 * time: 0.7266130447387695
    26     1.905339e+03     1.207954e+01
 * time: 0.7346000671386719
    27     1.905092e+03     1.093479e+01
 * time: 0.742408037185669
    28     1.904957e+03     1.057034e+01
 * time: 0.7503530979156494
    29     1.904875e+03     1.060882e+01
 * time: 0.7658190727233887
    30     1.904459e+03     1.031525e+01
 * time: 0.773751974105835
    31     1.903886e+03     1.041179e+01
 * time: 0.7817940711975098
    32     1.903313e+03     1.135672e+01
 * time: 0.7898919582366943
    33     1.903057e+03     1.075683e+01
 * time: 0.7982831001281738
    34     1.902950e+03     1.091295e+01
 * time: 0.813615083694458
    35     1.902887e+03     1.042409e+01
 * time: 0.8214211463928223
    36     1.902640e+03     9.213043e+00
 * time: 0.8292880058288574
    37     1.902364e+03     9.519321e+00
 * time: 0.8376989364624023
    38     1.902156e+03     5.590984e+00
 * time: 0.8537740707397461
    39     1.902094e+03     1.811898e+00
 * time: 0.861778974533081
    40     1.902086e+03     1.644722e+00
 * time: 0.869473934173584
    41     1.902084e+03     1.653520e+00
 * time: 0.8770959377288818
    42     1.902074e+03     1.720184e+00
 * time: 0.8924469947814941
    43     1.902055e+03     2.619061e+00
 * time: 0.900162935256958
    44     1.902015e+03     3.519729e+00
 * time: 0.9078421592712402
    45     1.901962e+03     3.403372e+00
 * time: 0.9155271053314209
    46     1.901924e+03     1.945644e+00
 * time: 0.9309999942779541
    47     1.901914e+03     6.273342e-01
 * time: 0.9387121200561523
    48     1.901913e+03     5.374557e-01
 * time: 0.9463241100311279
    49     1.901913e+03     5.710007e-01
 * time: 0.9536020755767822
    50     1.901913e+03     6.091390e-01
 * time: 0.9613320827484131
    51     1.901912e+03     6.692417e-01
 * time: 0.9767630100250244
    52     1.901909e+03     7.579620e-01
 * time: 0.9845049381256104
    53     1.901903e+03     8.798211e-01
 * time: 0.9921350479125977
    54     1.901889e+03     1.002981e+00
 * time: 0.99983811378479
    55     1.901864e+03     1.495512e+00
 * time: 1.015505075454712
    56     1.901837e+03     1.664621e+00
 * time: 1.0230541229248047
    57     1.901819e+03     8.601119e-01
 * time: 1.0307109355926514
    58     1.901815e+03     4.525470e-02
 * time: 1.038323163986206
    59     1.901815e+03     1.294280e-02
 * time: 1.0460700988769531
    60     1.901815e+03     2.876567e-03
 * time: 1.0608041286468506
    61     1.901815e+03     2.876567e-03
 * time: 1.0749881267547607
    62     1.901815e+03     2.876567e-03
 * time: 1.0891680717468262
FittedPumasModel

Successful minimization:                      true

Likelihood approximation:                     FOCE
Log-likelihood value:                    -1901.815
Number of subjects:                             30
Number of parameters:         Fixed      Optimized
                                  0              8
Observation records:         Active        Missing
    DV:                         540              0
    Total:                      540              0

---------------------
           Estimate
---------------------
tvcl        0.1542
tvvc        4.5856
tvq         0.042341
tvvp        3.7082
Ω₁,₁        0.26467
Ω₂,₂        0.10627
σ_add       0.032183
σ_prop      0.061005
---------------------

1.4 Step 3 - Covariate Model

The third step of our covariate model building workflow is to actually develop one or more covariate models. Let’s develop three covariate models:

  1. allometric scaling based on weight
  2. clearance effect based on creatinine clearance
  3. separated error model based on sex

To include covariates in a Pumas model we first need to declare them in the @covariates block. Then, we are free to use them inside the @pre block.

Here’s our covariate model with allometric scaling based on weight:

Tip

When building covariate models, unlike in NONMEM, it is highly recommended to derive covariates or perform any required transformations or scaling as a data wrangling step and pass the derived/transformed columns into read_pumas as covariates. In this particular case, it is easy to create two columns in the original data as:

@rtransform! pkdata :ASWT_CL = (:WT / 70)^0.75
@rtransform! pkdata :ASWT_Vc = (:WT / 70)

Then, you bring these covariates into read_pumas and use them directly on CL and Vc, which avoids recomputing the transformations during the model fit.
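To make the tip concrete, here is a minimal sketch (not run in this tutorial) of how the derived ASWT_CL and ASWT_Vc columns could be parsed and then used inside the model without recomputing the power functions during the fit:

pop_derived = read_pumas(
    pkdata;
    id = :ID,
    time = :TIME,
    amt = :AMT,
    covariates = [:ASWT_CL, :ASWT_Vc],
    observations = [:DV],
    cmt = :CMT,
    evid = :EVID,
    rate = :RATE,
)

# Inside the model, the derived columns replace the inline transformations:
# @covariates ASWT_CL ASWT_Vc
# @pre begin
#     CL = tvcl * ASWT_CL * exp(η[1])
#     Vc = tvvc * ASWT_Vc * exp(η[2])
#     ...
# end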

covariate_model_wt = @model begin
    @param begin
        tvcl ∈ RealDomain(; lower = 0)
        tvvc ∈ RealDomain(; lower = 0)
        tvq ∈ RealDomain(; lower = 0)
        tvvp ∈ RealDomain(; lower = 0)
        Ω ∈ PDiagDomain(2)
        σ_add ∈ RealDomain(; lower = 0)
        σ_prop ∈ RealDomain(; lower = 0)
    end

    @random begin
        η ~ MvNormal(Ω)
    end

    @covariates begin
        WT
    end

    @pre begin
        CL = tvcl * (WT / 70)^0.75 * exp(η[1])
        Vc = tvvc * (WT / 70) * exp(η[2])
        Q = tvq
        Vp = tvvp
    end

    @dynamics Central1Periph1

    @derived begin
        cp := @. Central / Vc
        DV ~ @. Normal(cp, sqrt(cp^2 * σ_prop^2 + σ_add^2))
    end
end
PumasModel
  Parameters: tvcl, tvvc, tvq, tvvp, Ω, σ_add, σ_prop
  Random effects: η
  Covariates: WT
  Dynamical variables: Central, Peripheral
  Derived: DV
  Observed: DV

Once we include the WT covariate in the @covariates block we can use the WT values inside the @pre block. For both clearance (CL) and volume of the central compartment (Vc), we allometrically scale by the ratio of WT to the reference weight of 70 kg, and, in the case of CL, we use an allometric exponent of 0.75.
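For intuition, here is a quick back-of-the-envelope calculation (not part of the model) of the scaling factors for a 90 kg subject, such as subject 1 in the first rows of the dataset:

wt = 90
(wt / 70)^0.75  # ≈ 1.21, the multiplier applied to CL
wt / 70         # ≈ 1.29, the multiplier applied to Vc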

Let’s fit our covariate_model_wt. Notice that we have not added any new parameters inside the @param block, thus we can reuse the same iparams_base_model initial parameter values:

fit_covariate_model_wt = fit(covariate_model_wt, pop, iparams_base_model, FOCE())
[ Info: Checking the initial parameter values.
[ Info: The initial negative log likelihood and its gradient are finite. Check passed.
Iter     Function value   Gradient norm 
     0     7.695401e+03     4.898919e+03
 * time: 6.508827209472656e-5
     1     2.846050e+03     1.128657e+03
 * time: 0.03991103172302246
     2     2.472982e+03     7.008264e+02
 * time: 0.04963207244873047
     3     2.180690e+03     2.312704e+02
 * time: 0.05923914909362793
     4     2.125801e+03     1.862929e+02
 * time: 0.08546996116638184
     5     2.103173e+03     1.320946e+02
 * time: 0.0943911075592041
     6     2.091136e+03     1.103035e+02
 * time: 0.10233402252197266
     7     2.081443e+03     1.091133e+02
 * time: 0.11033415794372559
     8     2.071793e+03     7.197675e+01
 * time: 0.1358330249786377
     9     2.062706e+03     7.623310e+01
 * time: 0.14438104629516602
    10     2.057515e+03     6.885476e+01
 * time: 0.15215802192687988
    11     2.051133e+03     6.368504e+01
 * time: 0.16022205352783203
    12     2.038626e+03     7.730243e+01
 * time: 0.17508316040039062
    13     2.019352e+03     1.136864e+02
 * time: 0.1835031509399414
    14     1.997136e+03     1.005748e+02
 * time: 0.19136810302734375
    15     1.983023e+03     6.831478e+01
 * time: 0.19980216026306152
    16     1.977700e+03     5.649783e+01
 * time: 0.21490001678466797
    17     1.974583e+03     6.357642e+01
 * time: 0.22316312789916992
    18     1.967292e+03     7.658974e+01
 * time: 0.23128414154052734
    19     1.951045e+03     6.130573e+01
 * time: 0.2406010627746582
    20     1.935868e+03     4.845839e+01
 * time: 0.25591111183166504
    21     1.929356e+03     6.325782e+01
 * time: 0.2647690773010254
    22     1.925187e+03     3.142245e+01
 * time: 0.2726719379425049
    23     1.923733e+03     4.623400e+01
 * time: 0.2807650566101074
    24     1.918498e+03     5.347738e+01
 * time: 0.2965281009674072
    25     1.912383e+03     5.849125e+01
 * time: 0.30602216720581055
    26     1.905510e+03     3.254038e+01
 * time: 0.3145270347595215
    27     1.903629e+03     2.905618e+01
 * time: 0.3226051330566406
    28     1.902833e+03     2.907696e+01
 * time: 0.3375849723815918
    29     1.902447e+03     2.746037e+01
 * time: 0.3453330993652344
    30     1.899399e+03     1.930948e+01
 * time: 0.353193998336792
    31     1.898705e+03     1.186361e+01
 * time: 0.36125898361206055
    32     1.898505e+03     1.050402e+01
 * time: 0.3758699893951416
    33     1.898474e+03     1.042186e+01
 * time: 0.3833329677581787
    34     1.897992e+03     1.238729e+01
 * time: 0.39054203033447266
    35     1.897498e+03     1.729368e+01
 * time: 0.39818501472473145
    36     1.896954e+03     1.472553e+01
 * time: 0.40582895278930664
    37     1.896744e+03     5.852822e+00
 * time: 0.42047715187072754
    38     1.896705e+03     1.171353e+00
 * time: 0.4275059700012207
    39     1.896704e+03     1.216117e+00
 * time: 0.4347341060638428
    40     1.896703e+03     1.230336e+00
 * time: 0.4421570301055908
    41     1.896698e+03     1.250716e+00
 * time: 0.4565610885620117
    42     1.896688e+03     1.449549e+00
 * time: 0.4643101692199707
    43     1.896666e+03     2.533293e+00
 * time: 0.4716501235961914
    44     1.896631e+03     3.075524e+00
 * time: 0.4793381690979004
    45     1.896599e+03     2.139782e+00
 * time: 0.494020938873291
    46     1.896587e+03     6.636036e-01
 * time: 0.5018041133880615
    47     1.896585e+03     6.303516e-01
 * time: 0.5089089870452881
    48     1.896585e+03     5.995263e-01
 * time: 0.516106128692627
    49     1.896584e+03     5.844157e-01
 * time: 0.5233910083770752
    50     1.896583e+03     6.083855e-01
 * time: 0.537330150604248
    51     1.896579e+03     8.145310e-01
 * time: 0.5445671081542969
    52     1.896570e+03     1.293487e+00
 * time: 0.5517909526824951
    53     1.896549e+03     1.877699e+00
 * time: 0.55926513671875
    54     1.896513e+03     2.217382e+00
 * time: 0.5734500885009766
    55     1.896482e+03     1.658135e+00
 * time: 0.5808899402618408
    56     1.896466e+03     5.207141e-01
 * time: 0.5879790782928467
    57     1.896463e+03     1.177464e-01
 * time: 0.5953669548034668
    58     1.896463e+03     1.678937e-02
 * time: 0.6024320125579834
    59     1.896463e+03     2.666614e-03
 * time: 0.6158890724182129
    60     1.896463e+03     2.666614e-03
 * time: 0.6293139457702637
    61     1.896463e+03     2.666614e-03
 * time: 0.6450860500335693
    62     1.896463e+03     2.666188e-03
 * time: 0.6640751361846924
    63     1.896463e+03     2.666188e-03
 * time: 0.6775059700012207
    64     1.896463e+03     2.666188e-03
 * time: 0.7006170749664307
FittedPumasModel

Successful minimization:                      true

Likelihood approximation:                     FOCE
Log-likelihood value:                   -1896.4632
Number of subjects:                             30
Number of parameters:         Fixed      Optimized
                                  0              8
Observation records:         Active        Missing
    DV:                         540              0
    Total:                      540              0

---------------------
           Estimate
---------------------
tvcl        0.13915
tvvc        3.9754
tvq         0.041988
tvvp        3.5722
Ω₁,₁        0.23874
Ω₂,₂        0.081371
σ_add       0.032174
σ_prop      0.061012
---------------------

We can definitely see that, despite not increasing the number of parameters in the model, our loglikelihood is higher (implying a lower objective function value). Furthermore, our additive and proportional error estimates remain almost the same while our Ωs for CL and Vc decreased. This implies that the WT covariate is assisting in a better model fit by capturing part of the variability in CL and Vc. We’ll compare models formally in the next step.
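As a quick check of that claim, we can compute the difference in log-likelihood between the two fits directly (based on the outputs above, this should be roughly 5.4 in favor of the covariate model):

loglikelihood(fit_covariate_model_wt) - loglikelihood(fit_base_model)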

Let’s now try to incorporate a creatinine clearance (CRCL) effect on clearance (CL) into this model:

Tip

As in the tip above, you can derive covariates or perform any required transformations or scaling as a data wrangling step and pass the derived/transformed columns into read_pumas as covariates. In this particular case, it is easy to create three more columns in the original data as:

@rtransform! pkdata :CRCL_CL = :CRCL / 100
@rtransform! pkdata :ASWT_CL = (:WT / 70)^0.75
@rtransform! pkdata :ASWT_Vc = (:WT / 70)

Then, you bring these covariates into read_pumas and use them directly on renCL, CL, and Vc, which avoids recomputing the transformations during the model fit.

covariate_model_wt_crcl = @model begin
    @param begin
        tvvc ∈ RealDomain(; lower = 0)
        tvq ∈ RealDomain(; lower = 0)
        tvvp ∈ RealDomain(; lower = 0)
        tvcl_hep ∈ RealDomain(; lower = 0)
        tvcl_ren ∈ RealDomain(; lower = 0)
        Ω ∈ PDiagDomain(2)
        σ_add ∈ RealDomain(; lower = 0)
        σ_prop ∈ RealDomain(; lower = 0)
        dCRCL ∈ RealDomain()
    end

    @random begin
        η ~ MvNormal(Ω)
    end

    @covariates begin
        WT
        CRCL
    end

    @pre begin
        hepCL = tvcl_hep * (WT / 70)^0.75
        renCL = tvcl_ren * (CRCL / 100)^dCRCL
        CL = (hepCL + renCL) * exp(η[1])
        Vc = tvvc * (WT / 70) * exp(η[2])
        Q = tvq
        Vp = tvvp
    end

    @dynamics Central1Periph1

    @derived begin
        cp := @. Central / Vc
        DV ~ @. Normal(cp, sqrt(cp^2 * σ_prop^2 + σ_add^2))
    end
end
PumasModel
  Parameters: tvvc, tvq, tvvp, tvcl_hep, tvcl_ren, Ω, σ_add, σ_prop, dCRCL
  Random effects: η
  Covariates: WT, CRCL
  Dynamical variables: Central, Peripheral
  Derived: DV
  Observed: DV

In the covariate_model_wt_crcl model we keep our allometric scaling on WT from before, but we also add a new covariate, creatinine clearance (CRCL), dividing clearance (CL) into hepatic (hepCL) and renal (renCL) components, along with a new parameter dCRCL.

dCRCL is the exponent of the power function for the effect of creatinine clearance on renal clearance. In some models this parameter is fixed; here, however, we’ll allow the model to estimate it.

This is a good example of how to add covariate coefficients such as dCRCL to any Pumas covariate model. Now, let’s fit this model. Note that we need a new set of initial parameter values since the previous one doesn’t include dCRCL, tvcl_hep, or tvcl_ren:

iparams_covariate_model_wt_crcl = (;
    tvvc = 5,
    tvcl_hep = 0.01,
    tvcl_ren = 0.01,
    tvq = 0.01,
    tvvp = 10,
    Ω = Diagonal([0.01, 0.01]),
    σ_add = 0.1,
    σ_prop = 0.1,
    dCRCL = 0.9,
)
(tvvc = 5,
 tvcl_hep = 0.01,
 tvcl_ren = 0.01,
 tvq = 0.01,
 tvvp = 10,
 Ω = [0.01 0.0; 0.0 0.01],
 σ_add = 0.1,
 σ_prop = 0.1,
 dCRCL = 0.9,)
fit_covariate_model_wt_crcl =
    fit(covariate_model_wt_crcl, pop, iparams_covariate_model_wt_crcl, FOCE())
[ Info: Checking the initial parameter values.
[ Info: The initial negative log likelihood and its gradient are finite. Check passed.
Iter     Function value   Gradient norm 
     0     8.554152e+03     5.650279e+03
 * time: 5.3882598876953125e-5
     1     3.572050e+03     1.302046e+03
 * time: 0.04424285888671875
     2     3.266947e+03     5.384244e+02
 * time: 0.057199954986572266
     3     3.150635e+03     1.918079e+02
 * time: 0.08918094635009766
     4     3.108122e+03     1.277799e+02
 * time: 0.09990191459655762
     5     3.091358e+03     8.838080e+01
 * time: 0.11057281494140625
     6     3.082997e+03     8.321689e+01
 * time: 0.1394948959350586
     7     3.076379e+03     8.167702e+01
 * time: 0.15010404586791992
     8     3.067422e+03     1.177822e+02
 * time: 0.1604900360107422
     9     3.048580e+03     2.526969e+02
 * time: 0.1792440414428711
    10     2.993096e+03     6.325396e+02
 * time: 0.19536399841308594
    11     2.965744e+03     7.634718e+02
 * time: 0.23865985870361328
    12     2.921212e+03     9.704020e+02
 * time: 0.257382869720459
    13     2.553649e+03     6.642510e+02
 * time: 0.28296995162963867
    14     2.319495e+03     3.204711e+02
 * time: 0.30412983894348145
    15     2.183040e+03     2.174905e+02
 * time: 0.3298039436340332
    16     2.157621e+03     3.150983e+02
 * time: 0.3402409553527832
    17     2.132395e+03     2.847614e+02
 * time: 0.35831284523010254
    18     2.084799e+03     1.563370e+02
 * time: 0.3688318729400635
    19     2.071497e+03     1.006429e+02
 * time: 0.37856292724609375
    20     2.064983e+03     9.753313e+01
 * time: 0.39582395553588867
    21     2.048289e+03     9.230405e+01
 * time: 0.40662598609924316
    22     2.020938e+03     1.292359e+02
 * time: 0.41663289070129395
    23     1.983888e+03     2.284990e+02
 * time: 0.426832914352417
    24     1.962132e+03     1.220188e+02
 * time: 0.4445159435272217
    25     1.945947e+03     1.035894e+02
 * time: 0.45409584045410156
    26     1.917782e+03     8.246698e+01
 * time: 0.46387195587158203
    27     1.905967e+03     5.408054e+01
 * time: 0.48122096061706543
    28     1.898569e+03     2.172222e+01
 * time: 0.4914579391479492
    29     1.897473e+03     1.689350e+01
 * time: 0.5008969306945801
    30     1.897019e+03     1.666689e+01
 * time: 0.5177159309387207
    31     1.896796e+03     1.699751e+01
 * time: 0.5276639461517334
    32     1.896559e+03     1.645900e+01
 * time: 0.5370738506317139
    33     1.896223e+03     1.415504e+01
 * time: 0.54640793800354
    34     1.895773e+03     1.630081e+01
 * time: 0.5634779930114746
    35     1.895309e+03     1.723930e+01
 * time: 0.5732178688049316
    36     1.895004e+03     1.229983e+01
 * time: 0.5827629566192627
    37     1.894871e+03     5.385102e+00
 * time: 0.5996758937835693
    38     1.894827e+03     3.465463e+00
 * time: 0.6095459461212158
    39     1.894816e+03     3.387474e+00
 * time: 0.6187410354614258
    40     1.894807e+03     3.295388e+00
 * time: 0.6350648403167725
    41     1.894786e+03     3.089194e+00
 * time: 0.6447999477386475
    42     1.894737e+03     2.928080e+00
 * time: 0.6541118621826172
    43     1.894624e+03     3.088723e+00
 * time: 0.6633288860321045
    44     1.894413e+03     3.493791e+00
 * time: 0.6801049709320068
    45     1.894127e+03     3.142865e+00
 * time: 0.6897599697113037
    46     1.893933e+03     2.145253e+00
 * time: 0.699131965637207
    47     1.893888e+03     2.172800e+00
 * time: 0.7153928279876709
    48     1.893880e+03     2.180509e+00
 * time: 0.725031852722168
    49     1.893876e+03     2.134257e+00
 * time: 0.734004020690918
    50     1.893868e+03     2.032354e+00
 * time: 0.7429180145263672
    51     1.893846e+03     1.760874e+00
 * time: 0.7594048976898193
    52     1.893796e+03     1.779016e+00
 * time: 0.7688260078430176
    53     1.893694e+03     2.018956e+00
 * time: 0.7780039310455322
    54     1.893559e+03     2.366854e+00
 * time: 0.794003963470459
    55     1.893474e+03     3.690142e+00
 * time: 0.8037128448486328
    56     1.893446e+03     3.675109e+00
 * time: 0.8128349781036377
    57     1.893439e+03     3.426419e+00
 * time: 0.8217089176177979
    58     1.893429e+03     3.183164e+00
 * time: 0.8382680416107178
    59     1.893398e+03     2.695171e+00
 * time: 0.8474729061126709
    60     1.893328e+03     2.753548e+00
 * time: 0.8565938472747803
    61     1.893169e+03     3.589748e+00
 * time: 0.8731098175048828
    62     1.892920e+03     3.680718e+00
 * time: 0.8829300403594971
    63     1.892667e+03     2.568107e+00
 * time: 0.8921558856964111
    64     1.892514e+03     1.087910e+00
 * time: 0.9080288410186768
    65     1.892493e+03     3.287296e-01
 * time: 0.917680025100708
    66     1.892492e+03     2.967465e-01
 * time: 0.9267499446868896
    67     1.892492e+03     3.020682e-01
 * time: 0.9354488849639893
    68     1.892491e+03     3.034704e-01
 * time: 0.9508588314056396
    69     1.892491e+03     3.091846e-01
 * time: 0.9599578380584717
    70     1.892491e+03     3.224170e-01
 * time: 0.9687178134918213
    71     1.892490e+03     6.494197e-01
 * time: 0.9775269031524658
    72     1.892488e+03     1.115188e+00
 * time: 0.9943618774414062
    73     1.892483e+03     1.838833e+00
 * time: 1.0034539699554443
    74     1.892472e+03     2.765371e+00
 * time: 1.012441873550415
    75     1.892452e+03     3.463807e+00
 * time: 1.028479814529419
    76     1.892431e+03     2.805270e+00
 * time: 1.0378758907318115
    77     1.892411e+03     5.758916e-01
 * time: 1.0470178127288818
    78     1.892410e+03     1.434041e-01
 * time: 1.055860996246338
    79     1.892409e+03     1.639246e-01
 * time: 1.0719878673553467
    80     1.892409e+03     1.145856e-01
 * time: 1.080777883529663
    81     1.892409e+03     3.966861e-02
 * time: 1.089404821395874
    82     1.892409e+03     3.550808e-02
 * time: 1.1046829223632812
    83     1.892409e+03     3.456241e-02
 * time: 1.1134018898010254
    84     1.892409e+03     3.114018e-02
 * time: 1.1217598915100098
    85     1.892409e+03     4.080806e-02
 * time: 1.1301050186157227
    86     1.892409e+03     6.722726e-02
 * time: 1.1456139087677002
    87     1.892409e+03     1.006791e-01
 * time: 1.154498815536499
    88     1.892409e+03     1.303988e-01
 * time: 1.1632428169250488
    89     1.892409e+03     1.228919e-01
 * time: 1.1789798736572266
    90     1.892409e+03     6.433813e-02
 * time: 1.188208818435669
    91     1.892409e+03     1.314164e-02
 * time: 1.1969008445739746
    92     1.892409e+03     4.929931e-04
 * time: 1.2052619457244873
FittedPumasModel

Successful minimization:                      true

Likelihood approximation:                     FOCE
Log-likelihood value:                    -1892.409
Number of subjects:                             30
Number of parameters:         Fixed      Optimized
                                  0             10
Observation records:         Active        Missing
    DV:                         540              0
    Total:                      540              0

-----------------------
             Estimate
-----------------------
tvvc          3.9757
tvq           0.042177
tvvp          3.6434
tvcl_hep      0.058572
tvcl_ren      0.1337
Ω₁,₁          0.18299
Ω₂,₂          0.081353
σ_add         0.032174
σ_prop        0.06101
dCRCL         1.1331
-----------------------

As before, our loglikelihood is higher (implying a lower objective function value). Furthermore, our additive and proportional error estimates remain almost the same while our Ω on CL, Ω₁,₁, decreased. This implies that the CRCL covariate with an estimated exponent dCRCL is assisting in a better model fit.
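As a side note on the earlier remark about fixing dCRCL: if you prefer to fix the exponent (say, at 1.0) instead of estimating it, one option is a sketch along these lines, assuming the constantcoef keyword argument of fit (check the Pumas documentation for details):

fit(
    covariate_model_wt_crcl,
    pop,
    iparams_covariate_model_wt_crcl,
    FOCE();
    # hold dCRCL fixed at 1.0 during estimation (assumed keyword; not run here)
    constantcoef = (; dCRCL = 1.0),
)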

Finally, let’s include sex-separated clearance parameters as a third covariate model:

covariate_model_wt_crcl_sex = @model begin
    @param begin
        tvvc ∈ RealDomain(; lower = 0)
        tvq ∈ RealDomain(; lower = 0)
        tvvp ∈ RealDomain(; lower = 0)
        tvcl_hep_M ∈ RealDomain(; lower = 0)
        tvcl_hep_F ∈ RealDomain(; lower = 0)
        tvcl_ren_M ∈ RealDomain(; lower = 0)
        tvcl_ren_F ∈ RealDomain(; lower = 0)
        Ω ∈ PDiagDomain(2)
        σ_add ∈ RealDomain(; lower = 0)
        σ_prop ∈ RealDomain(; lower = 0)
        dCRCL_M ∈ RealDomain()
        dCRCL_F ∈ RealDomain()
    end

    @random begin
        η ~ MvNormal(Ω)
    end

    @covariates begin
        WT
        CRCL
        SEX
    end

    @pre begin
        tvcl_hep = ifelse(SEX == "M", tvcl_hep_M, tvcl_hep_F)
        tvcl_ren = ifelse(SEX == "M", tvcl_ren_M, tvcl_ren_F)
        dCRCL = ifelse(SEX == "M", dCRCL_M, dCRCL_F)
        hepCL = tvcl_hep * (WT / 70)^0.75
        renCL = tvcl_ren * (CRCL / 100)^dCRCL
        CL = (hepCL + renCL) * exp(η[1])
        Vc = tvvc * (WT / 70) * exp(η[2])
        Q = tvq
        Vp = tvvp
    end

    @dynamics Central1Periph1

    @derived begin
        cp := @. Central / Vc
        DV ~ @. Normal(cp, sqrt(cp^2 * σ_prop^2 + σ_add^2))
    end
end
PumasModel
  Parameters: tvvc, tvq, tvvp, tvcl_hep_M, tvcl_hep_F, tvcl_ren_M, tvcl_ren_F, Ω, σ_add, σ_prop, dCRCL_M, dCRCL_F
  Random effects: η
  Covariates: WT, CRCL, SEX
  Dynamical variables: Central, Peripheral
  Derived: DV
  Observed: DV

In the covariate_model_wt_crcl_sex model we keep our allometric scaling on WT, the CRCL effect on renCL, and the breakdown of CL into hepCL and renCL from before. However, we now allow different values by sex. Hence, we have a new covariate SEX and six sex-specific parameters in the @param block, obtained by expanding tvcl_hep, tvcl_ren, and dCRCL into male (suffix M) and female (suffix F) versions; this is a net increase of three estimated parameters.

This is a good example of how to create sex-specific parameter values from a categorical covariate such as SEX inside the @pre block with the ifelse function. Now, let’s fit this model. Note that we need a new set of initial parameter values since the previous one had a single tvcl_hep, tvcl_ren, and dCRCL:

iparams_covariate_model_wt_crcl_sex = (;
    tvvc = 5,
    tvcl_hep_M = 0.01,
    tvcl_hep_F = 0.01,
    tvcl_ren_M = 0.01,
    tvcl_ren_F = 0.01,
    tvq = 0.01,
    tvvp = 10,
    Ω = Diagonal([0.01, 0.01]),
    σ_add = 0.1,
    σ_prop = 0.1,
    dCRCL_M = 0.9,
    dCRCL_F = 0.9,
)
(tvvc = 5,
 tvcl_hep_M = 0.01,
 tvcl_hep_F = 0.01,
 tvcl_ren_M = 0.01,
 tvcl_ren_F = 0.01,
 tvq = 0.01,
 tvvp = 10,
 Ω = [0.01 0.0; 0.0 0.01],
 σ_add = 0.1,
 σ_prop = 0.1,
 dCRCL_M = 0.9,
 dCRCL_F = 0.9,)
fit_covariate_model_wt_crcl_sex =
    fit(covariate_model_wt_crcl_sex, pop, iparams_covariate_model_wt_crcl_sex, FOCE())
[ Info: Checking the initial parameter values.
[ Info: The initial negative log likelihood and its gradient are finite. Check passed.
Iter     Function value   Gradient norm 
     0     8.554152e+03     5.650279e+03
 * time: 6.794929504394531e-5
     1     3.641387e+03     1.432080e+03
 * time: 0.018646955490112305
     2     3.290450e+03     5.274782e+02
 * time: 0.05135989189147949
     3     3.185512e+03     2.173676e+02
 * time: 0.0641939640045166
     4     3.143009e+03     1.479653e+02
 * time: 0.07625794410705566
     5     3.128511e+03     8.980031e+01
 * time: 0.10706400871276855
     6     3.123188e+03     5.033125e+01
 * time: 0.11873388290405273
     7     3.120794e+03     4.279722e+01
 * time: 0.1383068561553955
     8     3.118627e+03     3.971051e+01
 * time: 0.15030407905578613
     9     3.115300e+03     8.456587e+01
 * time: 0.16186094284057617
    10     3.109353e+03     1.350354e+02
 * time: 0.18221306800842285
    11     3.095894e+03     1.998258e+02
 * time: 0.1948239803314209
    12     2.988214e+03     4.366433e+02
 * time: 0.22010087966918945
    13     2.896081e+03     5.505943e+02
 * time: 0.27250003814697266
    14     2.652467e+03     7.300323e+02
 * time: 0.5493929386138916
    15     2.560937e+03     6.973661e+02
 * time: 0.6270759105682373
    16     2.254941e+03     2.740033e+02
 * time: 0.6416800022125244
    17     2.222509e+03     2.034303e+02
 * time: 0.6629970073699951
    18     2.171255e+03     2.449580e+02
 * time: 0.6763420104980469
    19     2.024532e+03     1.121511e+02
 * time: 0.6888439655303955
    20     1.993723e+03     1.042814e+02
 * time: 0.7095348834991455
    21     1.985113e+03     8.079014e+01
 * time: 0.7212610244750977
    22     1.976757e+03     7.054196e+01
 * time: 0.7406718730926514
    23     1.969970e+03     6.070322e+01
 * time: 0.7525689601898193
    24     1.961095e+03     6.810782e+01
 * time: 0.7638030052185059
    25     1.947983e+03     8.116920e+01
 * time: 0.7835938930511475
    26     1.930371e+03     8.530051e+01
 * time: 0.7954220771789551
    27     1.910209e+03     6.993170e+01
 * time: 0.8070290088653564
    28     1.899107e+03     3.362640e+01
 * time: 0.8270480632781982
    29     1.898022e+03     2.642220e+01
 * time: 0.838015079498291
    30     1.897055e+03     1.213144e+01
 * time: 0.8572008609771729
    31     1.896596e+03     7.773239e+00
 * time: 0.8694009780883789
    32     1.896538e+03     7.997039e+00
 * time: 0.8804640769958496
    33     1.896451e+03     8.160909e+00
 * time: 0.8999829292297363
    34     1.896283e+03     8.237721e+00
 * time: 0.9114809036254883
    35     1.895903e+03     1.520219e+01
 * time: 0.922631025314331
    36     1.895272e+03     2.358916e+01
 * time: 0.9422280788421631
    37     1.894536e+03     2.461296e+01
 * time: 0.9535760879516602
    38     1.893995e+03     1.546128e+01
 * time: 0.9728949069976807
    39     1.893858e+03     6.976137e+00
 * time: 0.9845478534698486
    40     1.893833e+03     6.019466e+00
 * time: 0.9954409599304199
    41     1.893786e+03     3.827201e+00
 * time: 1.0145840644836426
    42     1.893714e+03     3.323412e+00
 * time: 1.0259079933166504
    43     1.893592e+03     3.215150e+00
 * time: 1.0371198654174805
    44     1.893435e+03     6.534965e+00
 * time: 1.0576720237731934
    45     1.893286e+03     7.424154e+00
 * time: 1.0689339637756348
    46     1.893190e+03     5.552627e+00
 * time: 1.0881998538970947
    47     1.893139e+03     3.222316e+00
 * time: 1.099674940109253
    48     1.893120e+03     3.015339e+00
 * time: 1.110529899597168
    49     1.893107e+03     3.244809e+00
 * time: 1.1293799877166748
    50     1.893080e+03     6.163100e+00
 * time: 1.1404399871826172
    51     1.893027e+03     9.824713e+00
 * time: 1.1513080596923828
    52     1.892912e+03     1.390100e+01
 * time: 1.1704468727111816
    53     1.892734e+03     1.510937e+01
 * time: 1.1814279556274414
    54     1.892561e+03     1.008563e+01
 * time: 1.2002370357513428
    55     1.892485e+03     3.730668e+00
 * time: 1.2116498947143555
    56     1.892471e+03     3.380261e+00
 * time: 1.2223200798034668
    57     1.892463e+03     3.167904e+00
 * time: 1.2418508529663086
    58     1.892441e+03     4.152065e+00
 * time: 1.253260850906372
    59     1.892391e+03     7.355996e+00
 * time: 1.264190912246704
    60     1.892268e+03     1.195397e+01
 * time: 1.2838869094848633
    61     1.892026e+03     1.640783e+01
 * time: 1.2952289581298828
    62     1.891735e+03     1.593576e+01
 * time: 1.3065259456634521
    63     1.891569e+03     8.316423e+00
 * time: 1.3261198997497559
    64     1.891494e+03     3.948212e+00
 * time: 1.3369159698486328
    65     1.891481e+03     3.911593e+00
 * time: 1.3557238578796387
    66     1.891457e+03     3.875559e+00
 * time: 1.3670849800109863
    67     1.891405e+03     3.811247e+00
 * time: 1.3778250217437744
    68     1.891262e+03     3.657045e+00
 * time: 1.397075891494751
    69     1.890930e+03     4.957405e+00
 * time: 1.4087309837341309
    70     1.890317e+03     6.657726e+00
 * time: 1.4202229976654053
    71     1.889660e+03     6.086302e+00
 * time: 1.4403319358825684
    72     1.889303e+03     2.270929e+00
 * time: 1.4513239860534668
    73     1.889253e+03     7.695301e-01
 * time: 1.4702520370483398
    74     1.889252e+03     7.382144e-01
 * time: 1.4815239906311035
    75     1.889251e+03     7.187898e-01
 * time: 1.4920718669891357
    76     1.889251e+03     7.215047e-01
 * time: 1.5106449127197266
    77     1.889250e+03     7.235155e-01
 * time: 1.5215709209442139
    78     1.889249e+03     7.246818e-01
 * time: 1.5320558547973633
    79     1.889244e+03     7.257796e-01
 * time: 1.550739049911499
    80     1.889233e+03     7.198190e-01
 * time: 1.5617210865020752
    81     1.889204e+03     1.089029e+00
 * time: 1.572516918182373
    82     1.889142e+03     1.801601e+00
 * time: 1.5919289588928223
    83     1.889043e+03     2.967917e+00
 * time: 1.6031160354614258
    84     1.888889e+03     2.965856e+00
 * time: 1.6222100257873535
    85     1.888705e+03     5.933555e-01
 * time: 1.6338279247283936
    86     1.888655e+03     9.577698e-01
 * time: 1.6445488929748535
    87     1.888582e+03     1.498494e+00
 * time: 1.6636528968811035
    88     1.888533e+03     1.502751e+00
 * time: 1.6751160621643066
    89     1.888490e+03     1.184664e+00
 * time: 1.6860339641571045
    90     1.888480e+03     6.684515e-01
 * time: 1.7051968574523926
    91     1.888476e+03     3.680032e-01
 * time: 1.7161459922790527
    92     1.888476e+03     4.720040e-01
 * time: 1.7268319129943848
    93     1.888476e+03     4.768646e-01
 * time: 1.7457330226898193
    94     1.888475e+03     4.736674e-01
 * time: 1.7560999393463135
    95     1.888475e+03     4.552765e-01
 * time: 1.7749519348144531
    96     1.888474e+03     5.193725e-01
 * time: 1.7868120670318604
    97     1.888473e+03     8.850098e-01
 * time: 1.797631025314331
    98     1.888468e+03     1.461598e+00
 * time: 1.8165209293365479
    99     1.888458e+03     2.209125e+00
 * time: 1.827561855316162
   100     1.888437e+03     2.961236e+00
 * time: 1.8382458686828613
   101     1.888407e+03     2.978462e+00
 * time: 1.857100009918213
   102     1.888384e+03     1.707199e+00
 * time: 1.8681769371032715
   103     1.888381e+03     6.198999e-01
 * time: 1.8789920806884766
   104     1.888380e+03     5.171075e-01
 * time: 1.8980839252471924
   105     1.888378e+03     1.037325e-01
 * time: 1.908829927444458
   106     1.888378e+03     8.473252e-02
 * time: 1.9271259307861328
   107     1.888378e+03     8.364965e-02
 * time: 1.9379639625549316
   108     1.888378e+03     8.080442e-02
 * time: 1.9482848644256592
   109     1.888378e+03     7.873900e-02
 * time: 1.967391014099121
   110     1.888378e+03     7.798398e-02
 * time: 1.9783639907836914
   111     1.888378e+03     7.788170e-02
 * time: 1.988481044769287
   112     1.888378e+03     7.776461e-02
 * time: 2.0070228576660156
   113     1.888378e+03     9.023589e-02
 * time: 2.0179529190063477
   114     1.888378e+03     1.631371e-01
 * time: 2.0283260345458984
   115     1.888378e+03     2.768693e-01
 * time: 2.04708194732666
   116     1.888377e+03     4.462316e-01
 * time: 2.058215856552124
   117     1.888377e+03     6.643173e-01
 * time: 2.068938970565796
   118     1.888375e+03     8.433185e-01
 * time: 2.0885560512542725
   119     1.888374e+03     7.596487e-01
 * time: 2.0996739864349365
   120     1.888373e+03     3.637862e-01
 * time: 2.1106278896331787
   121     1.888372e+03     8.305257e-02
 * time: 2.1305758953094482
   122     1.888372e+03     2.084516e-02
 * time: 2.141195058822632
   123     1.888372e+03     2.056416e-02
 * time: 2.1601550579071045
   124     1.888372e+03     2.044079e-02
 * time: 2.171351909637451
   125     1.888372e+03     2.035197e-02
 * time: 2.1813080310821533
   126     1.888372e+03     2.021266e-02
 * time: 2.2000229358673096
   127     1.888372e+03     1.998168e-02
 * time: 2.21111798286438
   128     1.888372e+03     3.162075e-02
 * time: 2.2213380336761475
   129     1.888372e+03     5.509975e-02
 * time: 2.2403650283813477
   130     1.888372e+03     9.277121e-02
 * time: 2.251849889755249
   131     1.888372e+03     1.528958e-01
 * time: 2.2625200748443604
   132     1.888372e+03     2.462098e-01
 * time: 2.281765937805176
   133     1.888372e+03     3.799859e-01
 * time: 2.2930569648742676
   134     1.888371e+03     5.312328e-01
 * time: 2.3037049770355225
   135     1.888369e+03     6.019736e-01
 * time: 2.3230230808258057
   136     1.888367e+03     4.665329e-01
 * time: 2.3340890407562256
   137     1.888366e+03     1.404918e-01
 * time: 2.3448219299316406
   138     1.888365e+03     8.513282e-02
 * time: 2.3641488552093506
   139     1.888364e+03     1.244276e-01
 * time: 2.374650001525879
   140     1.888364e+03     1.028225e-01
 * time: 2.3932089805603027
   141     1.888364e+03     5.163280e-02
 * time: 2.4044060707092285
   142     1.888364e+03     5.148658e-02
 * time: 2.4148108959198
   143     1.888364e+03     3.147249e-02
 * time: 2.433522939682007
   144     1.888364e+03     2.104604e-02
 * time: 2.4445550441741943
   145     1.888364e+03     6.541495e-03
 * time: 2.45465087890625
   146     1.888364e+03     2.538572e-03
 * time: 2.4731850624084473
   147     1.888364e+03     4.361890e-03
 * time: 2.4838759899139404
   148     1.888364e+03     3.034827e-03
 * time: 2.4938900470733643
   149     1.888364e+03     5.961145e-04
 * time: 2.5123839378356934
FittedPumasModel

Successful minimization:                      true

Likelihood approximation:                     FOCE
Log-likelihood value:                   -1888.3638
Number of subjects:                             30
Number of parameters:         Fixed      Optimized
                                  0             13
Observation records:         Active        Missing
    DV:                         540              0
    Total:                      540              0

--------------------------
               Estimate
--------------------------
tvvc            3.976
tvq             0.04239
tvvp            3.7249
tvcl_hep_M      1.7174e-7
tvcl_hep_F      0.13348
tvcl_ren_M      0.19378
tvcl_ren_F      0.042211
Ω₁,₁            0.14046
Ω₂,₂            0.081349
σ_add           0.032171
σ_prop          0.061007
dCRCL_M         0.94821
dCRCL_F         1.9405
--------------------------

As before, our loglikelihood is higher (implying a lower objective function value). This is expected since the model now has three additional estimated parameters (13 versus 10 before).

1.5 Step 4 - Model Comparison

Now that we’ve fitted all of our models we need to compare them and choose one for our final model.

We begin by analyzing the model metrics. This can be done with the metrics_table function:

metrics_table(fit_base_model)
17×2 DataFrame
Row Metric Value
String Any
1 Successful true
2 Estimation Time 1.089
3 Subjects 30
4 Fixed Parameters 0
5 Optimized Parameters 8
6 DV Active Observations 540
7 DV Missing Observations 0
8 Total Active Observations 540
9 Total Missing Observations 0
10 Likelihood Approximation Pumas.FOCE
11 LogLikelihood (LL) -1901.82
12 -2LL 3803.63
13 AIC 3819.63
14 BIC 3853.96
15 (η-shrinkage) η₁ -0.015
16 (η-shrinkage) η₂ -0.013
17 (ϵ-shrinkage) DV 0.056
metrics_table(fit_covariate_model_wt)
17×2 DataFrame
Row Metric Value
String Any
1 Successful true
2 Estimation Time 0.701
3 Subjects 30
4 Fixed Parameters 0
5 Optimized Parameters 8
6 DV Active Observations 540
7 DV Missing Observations 0
8 Total Active Observations 540
9 Total Missing Observations 0
10 Likelihood Approximation Pumas.FOCE
11 LogLikelihood (LL) -1896.46
12 -2LL 3792.93
13 AIC 3808.93
14 BIC 3843.26
15 (η-shrinkage) η₁ -0.014
16 (η-shrinkage) η₂ -0.012
17 (ϵ-shrinkage) DV 0.056
metrics_table(fit_covariate_model_wt_crcl)
17×2 DataFrame
Row Metric Value
String Any
1 Successful true
2 Estimation Time 1.205
3 Subjects 30
4 Fixed Parameters 0
5 Optimized Parameters 10
6 DV Active Observations 540
7 DV Missing Observations 0
8 Total Active Observations 540
9 Total Missing Observations 0
10 Likelihood Approximation Pumas.FOCE
11 LogLikelihood (LL) -1892.41
12 -2LL 3784.82
13 AIC 3804.82
14 BIC 3847.73
15 (η-shrinkage) η₁ -0.014
16 (η-shrinkage) η₂ -0.012
17 (ϵ-shrinkage) DV 0.056
metrics_table(fit_covariate_model_wt_crcl_sex)
17×2 DataFrame
Row Metric Value
String Any
1 Successful true
2 Estimation Time 2.512
3 Subjects 30
4 Fixed Parameters 0
5 Optimized Parameters 13
6 DV Active Observations 540
7 DV Missing Observations 0
8 Total Active Observations 540
9 Total Missing Observations 0
10 Likelihood Approximation Pumas.FOCE
11 LogLikelihood (LL) -1888.36
12 -2LL 3776.73
13 AIC 3802.73
14 BIC 3858.52
15 (η-shrinkage) η₁ -0.013
16 (η-shrinkage) η₂ -0.012
17 (ϵ-shrinkage) DV 0.056

metrics_table outputs all of the model metrics we might be interested in for a given model. That includes metadata such as estimation time, number of subjects, how many parameters were optimized and fixed, and number of observations. It also includes common model metrics like AIC, BIC, the objective function value with constant (-2 loglikelihood), and so on.

We can also do an innerjoin (check our Data Wrangling Tutorials) to get all metrics into a single DataFrame:

all_metrics = innerjoin(
    metrics_table(fit_base_model),
    metrics_table(fit_covariate_model_wt),
    metrics_table(fit_covariate_model_wt_crcl),
    metrics_table(fit_covariate_model_wt_crcl_sex);
    on = :Metric,
    makeunique = true,
);
rename!(
    all_metrics,
    :Value => :Base_Model,
    :Value_1 => :Covariate_Model_WT,
    :Value_2 => :Covariate_Model_WT_CRCL,
    :Value_3 => :Covariate_Model_WT_CRCL_SEX,
)
17×5 DataFrame
Row Metric Base_Model Covariate_Model_WT Covariate_Model_WT_CRCL Covariate_Model_WT_CRCL_SEX
String Any Any Any Any
1 Successful true true true true
2 Estimation Time 1.089 0.701 1.205 2.512
3 Subjects 30 30 30 30
4 Fixed Parameters 0 0 0 0
5 Optimized Parameters 8 8 10 13
6 DV Active Observations 540 540 540 540
7 DV Missing Observations 0 0 0 0
8 Total Active Observations 540 540 540 540
9 Total Missing Observations 0 0 0 0
10 Likelihood Approximation Pumas.FOCE Pumas.FOCE Pumas.FOCE Pumas.FOCE
11 LogLikelihood (LL) -1901.82 -1896.46 -1892.41 -1888.36
12 -2LL 3803.63 3792.93 3784.82 3776.73
13 AIC 3819.63 3808.93 3804.82 3802.73
14 BIC 3853.96 3843.26 3847.73 3858.52
15 (η-shrinkage) η₁ -0.015 -0.014 -0.014 -0.013
16 (η-shrinkage) η₂ -0.013 -0.012 -0.012 -0.012
17 (ϵ-shrinkage) DV 0.056 0.056 0.056 0.056

We can also use specific functions to retrieve those. For example, in order to get a model’s AIC you can use the aic function:

aic(fit_base_model)
3819.6299849528205
aic(fit_covariate_model_wt)
3808.926460776502
aic(fit_covariate_model_wt_crcl)
3804.817947371705
aic(fit_covariate_model_wt_crcl_sex)
3802.7275243740733

We should favor lower values of AIC; hence, the covariate model with weight, creatinine clearance, and sex-specific effects on clearance, i.e. covariate_model_wt_crcl_sex, should be preferred. A compact comparison is shown below.
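For a compact side-by-side view, we can also collect the AIC values into a single DataFrame (a small convenience snippet, not part of the original workflow):

DataFrame(;
    model = ["base", "wt", "wt_crcl", "wt_crcl_sex"],
    aic = aic.([
        fit_base_model,
        fit_covariate_model_wt,
        fit_covariate_model_wt_crcl,
        fit_covariate_model_wt_crcl_sex,
    ]),
)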

1.5.1 Goodness of Fit Plots

Additionally, we should inspect the goodness of fit of the model. This is done with the plotting function goodness_of_fit, which takes the result of the inspect function. So, let’s first call inspect on our covariate_model_wt_crcl_sex candidate for best model:

inspect_covariate_model_wt_crcl_sex = inspect(fit_covariate_model_wt_crcl_sex)
[ Info: Calculating predictions.
[ Info: Calculating weighted residuals.
[ Info: Calculating empirical bayes.
[ Info: Evaluating individual parameters.
[ Info: Evaluating dose control parameters.
[ Info: Done.
FittedPumasModelInspection

Fitting was successful: true
Likehood approximation used for weighted residuals : Pumas.FOCE()

And now we pass inspect_covariate_model_wt_crcl_sex to the goodness_of_fit plotting function:

goodness_of_fit(inspect_covariate_model_wt_crcl_sex)

The idea is that the population predictions (preds) capture the general tendency of the observations while the individual predictions (ipreds) should coincide much more closely with the observations. That is exactly what we are observing in the top row subplots in our goodness of fit plot.

Regarding the bottom row, on the left we have the weighted population residuals (wres) against time, and on the right we have the weighted individual residuals (iwres) against ipreds. Here we should not see any discernible pattern, which indicates that the residual error has mean 0 and constant variance. Like before, this seems to be what we are observing in our goodness of fit plot.

Hence, our covariate model with allometric scaling and sex-specific creatinine clearance effects on clearance is a good candidate for our final model.

1.6 Time-Varying Covariates

Pumas can handle time-varying covariates. This happens automatically if, when parsing a dataset, read_pumas detects that covariate values change over time.

1.6.1 painord Dataset

Here’s an example of an ordinal regression using the painord dataset from PharmaDatasets.jl. :painord contains our observations, measuring perceived pain on a scale from 0 to 3, which we need to shift by 1 (to 1 to 4) so it can be modeled with a Categorical distribution. Additionally, we’ll use the concentration in plasma, :conc, as a covariate. Of course, :conc varies with time, thus it is a time-varying covariate:

painord = dataset("pumas/pain_remed")
first(painord, 5)
5×8 DataFrame
Row id arm dose time conc painord dv remed
Int64 Int64 Int64 Float64 Float64 Int64 Int64 Int64
1 1 2 20 0.0 0.0 3 0 0
2 1 2 20 0.5 1.15578 1 1 0
3 1 2 20 1.0 1.37211 0 1 0
4 1 2 20 1.5 1.30058 0 1 0
5 1 2 20 2.0 1.19195 1 1 0
@rtransform! painord :painord = :painord + 1;
describe(painord, :mean, :std, :first, :last, :eltype)
8×6 DataFrame
Row variable mean std first last eltype
Symbol Float64 Float64 Real Real DataType
1 id 80.5 46.1992 1 160 Int64
2 arm 1.5 1.11833 2 0 Int64
3 dose 26.25 31.9017 20 0 Int64
4 time 3.375 2.5183 0.0 8.0 Float64
5 conc 0.93018 1.49902 0.0 0.0 Float64
6 painord 2.50208 0.863839 4 4 Int64
7 dv 0.508333 0.500061 0 0 Int64
8 remed 0.059375 0.236387 0 0 Int64
unique(painord.dose)
4-element Vector{Int64}:
 20
 80
  0
  5

As we can see, we have 160 subjects who were given either 0, 5, 20, or 80 units of a certain painkiller drug.

:conc is the drug concentration in plasma and :painord is the perceived pain on a scale from 1 to 4 (after the shift).

First, we’ll parse the painord dataset into a Population. Note that we’ll be using event_data=false since we do not have any dosing rows:

pop_ord =
    read_pumas(painord; observations = [:painord], covariates = [:conc], event_data = false)
Note

We won’t be going into the details of the ordinal regression model in this tutorial. We highly encourage you to take a look at the Ordinal Regression Pumas Tutorial for an in-depth explanation.

We’ll build an ordinal regression model declaring :conc as a covariate. In the @derived block we’ll state that the likelihood of :painord follows a Categorical distribution:

ordinal_model = @model begin
    @param begin
        b₁ ∈ RealDomain(; init = 0)
        b₂ ∈ RealDomain(; init = 1)
        b₃ ∈ RealDomain(; init = 1)
        slope ∈ RealDomain(; init = 0)
    end

    @covariates conc # time-varying

    @pre begin
        effect = slope * conc

        # Logit of cumulative probabilities
        lge₁ = b₁ + effect
        lge₂ = lge₁ - b₂
        lge₃ = lge₂ - b₃

        # Probabilities of >=1 and >=2 and >=3
        pge₁ = logistic(lge₁)
        pge₂ = logistic(lge₂)
        pge₃ = logistic(lge₃)

        # Probabilities of Y=1,2,3,4
        p₁ = 1.0 - pge₁
        p₂ = pge₁ - pge₂
        p₃ = pge₂ - pge₃
        p₄ = pge₃
    end

    @derived begin
        painord ~ @. Categorical(p₁, p₂, p₃, p₄)
    end
end
PumasModel
  Parameters: b₁, b₂, b₃, slope
  Random effects: 
  Covariates: conc
  Dynamical variables: 
  Derived: painord
  Observed: painord

Finally, we’ll fit our model using the NaivePooled estimation method since the model does not have any random effects, i.e. no @random block:

ordinal_fit = fit(ordinal_model, pop_ord, init_params(ordinal_model), NaivePooled())
[ Info: Checking the initial parameter values.
[ Info: The initial negative log likelihood and its gradient are finite. Check passed.
Iter     Function value   Gradient norm 
     0     3.103008e+03     7.031210e+02
 * time: 6.413459777832031e-5
     1     2.994747e+03     1.083462e+03
 * time: 0.004961967468261719
     2     2.406265e+03     1.884408e+02
 * time: 0.009775161743164062
     3     2.344175e+03     7.741610e+01
 * time: 0.06689310073852539
     4     2.323153e+03     2.907642e+01
 * time: 0.07056307792663574
     5     2.318222e+03     2.273295e+01
 * time: 0.07421994209289551
     6     2.316833e+03     1.390527e+01
 * time: 0.07811093330383301
     7     2.316425e+03     4.490883e+00
 * time: 0.08174705505371094
     8     2.316362e+03     9.374519e-01
 * time: 0.08543205261230469
     9     2.316356e+03     1.928785e-01
 * time: 0.08985614776611328
    10     2.316355e+03     3.119615e-02
 * time: 0.09434199333190918
    11     2.316355e+03     6.215513e-03
 * time: 0.09905409812927246
    12     2.316355e+03     8.313206e-04
 * time: 0.10365414619445801
FittedPumasModel

Successful minimization:                      true

Likelihood approximation:              NaivePooled
Log-likelihood value:                   -2316.3554
Number of subjects:                            160
Number of parameters:         Fixed      Optimized
                                  0              4
Observation records:         Active        Missing
    painord:                   1920              0
    Total:                     1920              0

-------------------
          Estimate
-------------------
b₁         2.5112
b₂         2.1951
b₃         1.9643
slope     -0.38871
-------------------

As expected, the ordinal model fit estimates a negative effect of :conc on :painord measured by the slope parameter.
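To see what that slope means in practice, here is a small illustrative sketch that reuses the estimates shown above to compute the predicted category probabilities at two concentrations (the helper function pain_probs is purely hypothetical):

# Cumulative-logit probabilities for pain scores 1-4, mirroring the @pre block above
function pain_probs(conc; b₁ = 2.5112, b₂ = 2.1951, b₃ = 1.9643, slope = -0.38871)
    lge₁ = b₁ + slope * conc
    lge₂ = lge₁ - b₂
    lge₃ = lge₂ - b₃
    pge₁, pge₂, pge₃ = logistic(lge₁), logistic(lge₂), logistic(lge₃)
    return (1 - pge₁, pge₁ - pge₂, pge₂ - pge₃, pge₃)
end

pain_probs(0.0)  # probabilities of scores 1-4 at zero concentration
pain_probs(2.0)  # higher concentration shifts probability mass toward lower scores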

1.7 Missing Data in Covariates

The way Pumas handles missing values in covariates depends on whether the covariate is constant or time-varying. In both cases Pumas will interpolate the available values to fill in the missing values. If, for any subject, all of a covariate’s values are missing, Pumas will throw an error while parsing the data with read_pumas.

For both constant and time-varying covariates with missing values, Pumas, by default, performs piece-wise constant interpolation with "next observation carried backward" (NOCB, the NONMEM default). Of course, for constant covariates the interpolated values over the missing entries will simply be the constant value. This behavior can be adjusted with the covariates_direction keyword argument of read_pumas. The default value :right is NOCB and :left is "last observation carried forward" (LOCF, the Monolix default).

Hence, for LOCF, you can use the following:

pop = read_pumas(pkdata; covariates_direction = :left)

along with any other required keyword arguments for column mapping.
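For example, reusing the column mappings from Step 1 of this tutorial, an LOCF parse of the PK dataset would look like this:

pop_locf = read_pumas(
    pkdata;
    id = :ID,
    time = :TIME,
    amt = :AMT,
    covariates = [:WT, :AGE, :SEX, :CRCL, :GROUP],
    observations = [:DV],
    cmt = :CMT,
    evid = :EVID,
    rate = :RATE,
    covariates_direction = :left,
)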

Note

The same behavior for covariates_direction applies to time-varying covariates during the interpolation in the ODE solver. They will also be piece-wise constant interpolated following either NOCB or LOCF depending on the covariates_direction value.

1.8 Categorical Covariates

In some situations, you’ll find yourself with categorical covariates with multiple levels, instead of binary or continuous covariates. Categorical covariates are covariates that can take on a finite number of distinct values.

Pumas can easily address categorical covariates. In the @pre block you can use a nested if ... elseif ... else statement to handle the different categories.

For example:

@pre begin
    CL = if RACE == 1
        tvcl * (WT / 70)^dwtdcl * exp(η[1]) * drace1dcl
    elseif RACE == 2
        tvcl * (WT / 70)^dwtdcl * exp(η[1]) * drace2dcl
    elseif RACE == 3
        tvcl * (WT / 70)^dwtdcl * exp(η[1]) * drace3dcl
    end
end

Here we are conditioning the clearance (CL) on the RACE covariate by modulating which population-level parameter will be used for the clearance calculation: drace1dcl, drace2dcl, and drace3dcl.

There’s nothing wrong with the code above, but it can be a bit cumbersome to write and read. In order to make it more readable and maintainable, you can use the following example:

@pre begin
    raceoncl = race1cl^(race == 1) * race2cl^(race == 2) * race3cl^(race == 3)
    CL = tvcl * raceoncl
end

Here we are raising each of the parameters race1cl, race2cl, and race3cl to the power of a boolean indicator (race == 1, race == 2, race == 3). Whenever the indicator is false, the exponent is 0 and the factor becomes 1; whenever it is true, the exponent is 1 and the factor equals the respective parameter value. The product therefore selects exactly the parameter that matches the subject’s race.
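A tiny standalone illustration of this trick (with made-up parameter values) shows how the boolean exponents select exactly one factor:

# Hypothetical parameter values for the three race categories
race1cl, race2cl, race3cl = 0.8, 1.0, 1.2
race = 3
# false exponents give factors of 1; the true exponent keeps its parameter:
race1cl^(race == 1) * race2cl^(race == 2) * race3cl^(race == 3)  # == race3cl == 1.2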

1.9 Conclusion

This tutorial showed how to build covariate models in Pumas using a workflow approach. The main purpose was to show how to:

  • parse covariate data into a Population
  • add covariate information into a model

We also went over the differences between constant and time-varying covariates and how Pumas deals with missing data in covariates.