Load Model and Data

The first step is to locate and load the SBtab model files. In this example we use the SBtab files from an existing example model (“AKAR4”).

In the article “Build and simulate your own Model” you can read about how to set up your own SBtab model. You can list the files in a directory with the dir() command.

f <- uqsa_example("AKAR4")
m <- model_from_tsv(f)
o <- as_ode(m)
#> Loading required namespace: pracma
C <- generateCode(o)
#> Warning in generateCode(o): This function will start a background yacas process via Ryacas.
#> There is currently no working way to reset/restart that process.
#>  It is therefore not advisable to generate the code for two different models in the same R session.
#> The definitions for the two models will be mixed up.
modelName <- write_and_compile(C)
#> './AKAR4_gvf.c' was created.
#> building a shared library from c source, and using GSL odeiv2 as backend (pkg-config is used here).
#> cc -shared -fPIC `pkg-config --cflags gsl` -O2 -o './AKAR4.so' './AKAR4_gvf.c' `pkg-config --libs gsl`

The chain of functions that we use to build the model tracks the name of the model via comments:

print(comment(m))
#> [1] "AKAR4"
print(comment(o))
#> [1] "AKAR4"
print(comment(C))
#> [1] "AKAR4"
print(comment(modelName))
#> [1] "./AKAR4.so"

This comment is ultimately used to determine the automatic file-name of the shared library.
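The name travels along via base R's comment() attribute, which attaches a hidden character string to any object. A minimal sketch of that mechanism, using a plain list as a stand-in for a model object:

```r
# comment() attaches a character attribute to any R object;
# it survives assignment and can be read back with comment(x)
m <- list(species="AKAR4p")   # stand-in for a model object
comment(m) <- "AKAR4"
print(comment(m))
#> [1] "AKAR4"
```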

ex <- experiments(m,o)

Take a Parameter Sample for this Model

This is a tiny model, so we keep things simple: no MPI and no parallel tempering; the simulations for the experiments can still run in parallel. First we construct the prior:

mu <- values(m$Parameter)     # linear-scale parameter means
sigma <- m$Parameter$stdv     # linear-scale standard deviations

# moment-matched log-normal prior (the parameters are sampled in log-space)
dprior <- dNormalPrior(
    mean=log(mu^2/sqrt(mu^2+sigma^2)),
    sd=sqrt(log(1+sigma^2/mu^2))
)
rprior <- rNormalPrior(
    mean=log(mu^2/sqrt(mu^2+sigma^2)),
    sd=sqrt(log(1+sigma^2/mu^2))
)
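The prior is built by moment matching a log-normal distribution: a log-normal variable with linear-scale mean mu and standard deviation sigma has an underlying normal with mean log(mu^2/sqrt(mu^2+sigma^2)) and standard deviation sqrt(log(1+sigma^2/mu^2)). A quick base-R sanity check of these formulas (an illustration, not a uqsa function):

```r
# moment matching: recover log-scale parameters from linear-scale moments
mu <- 2.0; sigma <- 0.5
m_log <- log(mu^2/sqrt(mu^2+sigma^2))  # mean of log(X)
s_log <- sqrt(log(1+sigma^2/mu^2))     # sd of log(X)

# simulate and verify we get the requested linear-scale moments back
set.seed(1)
x <- exp(rnorm(1e6, mean=m_log, sd=s_log))
print(c(mean=mean(x), sd=sd(x)))  # approximately 2.0 and 0.5
```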

Next, we choose the MCMC method via metropolis_update, which determines the sampling algorithm we are going to use. Finally, we construct the sampler (MH below), which actually performs the MCMC run; this function takes the initial values, the number of steps, and the step size as arguments.

options(mc.cores=length(ex))
s <- simulator.c(ex,modelName,parMap=logParMap, omit=2)

MH <- mcmc(
    metropolis_update(
        s,
        dprior=dprior
    )
)
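At its core, a Metropolis update proposes a random-walk step and accepts it with probability min(1, target(proposal)/target(current)). A minimal self-contained sketch of that acceptance rule on a one-dimensional standard-normal target (illustration only; uqsa's metropolis_update additionally evaluates the simulator and the likelihood of the experimental data):

```r
# minimal random-walk Metropolis sampler on a 1-d standard-normal target
set.seed(42)
log_target <- function(x) dnorm(x, log=TRUE)
N <- 1e4; eps <- 1.0
x <- numeric(N); accepted <- 0
for (i in 2:N) {
    proposal <- x[i-1] + eps*rnorm(1)
    logA <- log_target(proposal) - log_target(x[i-1])
    if (log(runif(1)) < logA) {        # Metropolis acceptance rule
        x[i] <- proposal; accepted <- accepted + 1
    } else {
        x[i] <- x[i-1]                 # rejected: repeat the current point
    }
}
print(accepted/(N-1))                  # acceptance rate
```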

Finally, to obtain a sample of parameters:

x <- mcmc_init(
    beta=1.0,
    parMCMC=log(values(m$Parameter)),
    simulate=s,
    dprior=dprior)

h <- tune_step_size(MH,x)
#> acceptance rate: 0.62, step-size: 0.01;
#> acceptance rate: 0.46, step-size: 0.017203;
#> acceptance rate: 0.35, step-size: 0.0265607;
#> acceptance rate: 0.31, step-size: 0.035175;
#> acceptance rate: 0.2, step-size: 0.0426269;
#> acceptance rate: 0.34, step-size: 0.0332698;
N <- 5e4
X <- MH(x,N,eps=h)

print(attr(X,"acceptanceRate"))
#> [1] 0.22888
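As the tuning trace above shows, the step size is adjusted iteratively until the acceptance rate approaches the commonly recommended ~25% for random-walk samplers. One simple way to implement such an adjustment (an illustration of the idea, not uqsa's actual rule) is to scale the step size by the gap between observed and target acceptance:

```r
# toy tuning loop: eps grows when acceptance is too high, shrinks when too low
adjust <- function(eps, rate, target=0.25) eps * exp(rate - target)
eps <- 0.01
for (rate in c(0.62, 0.46, 0.35, 0.31, 0.20)) {
    eps <- adjust(eps, rate)
}
print(eps)   # larger than the initial 0.01, since acceptance was mostly high
```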

if (require(hexbin)){
    hexplom(X)
} else {
    pairs(X)
}

Better Sample

We can try to obtain a higher quality sample, through simple means:

  1. Combine several samples (replicas) into one big sample
  2. Use several random starting locations
  3. Remove the burn-in phase from each chain
  4. Adjust the step size at least once to get close to a 25% acceptance rate
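Steps 1–3 need only base R once the chains are stored as matrices with one row per iteration. A sketch with simulated stand-in chains (in practice these would be the outputs of MH started from different rprior draws):

```r
# three stand-in chains of 1000 iterations over 2 parameters
set.seed(7)
chains <- lapply(1:3, function(i) matrix(rnorm(1000*2), ncol=2))

burnin <- 200                                   # discard the first 200 rows
trimmed <- lapply(chains, function(X) X[-seq_len(burnin), , drop=FALSE])

X_all <- do.call(rbind, trimmed)                # combine replicas into one sample
print(dim(X_all))
#> [1] 2400    2
```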