File: checkConsistency.Rd

% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/checker.R
\name{checkConsistency}
\alias{checkConsistency}
\title{Check consistency and Laplace accuracy}
\usage{
checkConsistency(
  obj,
  par = NULL,
  hessian = FALSE,
  estimate = FALSE,
  n = 100,
  observation.name = NULL
)
}
\arguments{
\item{obj}{Object from \code{MakeADFun}}

\item{par}{Parameter vector (\eqn{\theta}) for simulation. If
unspecified, the best encountered parameter of the object is used.}

\item{hessian}{Calculate the Hessian matrix for each replicate?}

\item{estimate}{Estimate parameters for each replicate?}

\item{n}{Number of simulations}

\item{observation.name}{Optional; name of the simulated observation}
}
\value{
List with gradient simulations (joint and marginal)
}
\description{
Check consistency of various parts of a TMB implementation.
Requires that the user has implemented simulation code for the data
and, optionally, the random effects. (\emph{Beta version; may change
without notice})
}
\details{
This function checks that the simulation code of random effects and
data is consistent with the implemented negative log-likelihood
function. It also checks whether the approximate \emph{marginal}
score function is central, indicating whether the Laplace
approximation is suitable for parameter estimation.

Denote by \eqn{u} the random effects, by \eqn{\theta} the parameters,
and by \eqn{x} the data.  The main assumption is that the user has
implemented the joint negative log-likelihood \eqn{f_{\theta}(u,x)}
satisfying
\deqn{\int \int \exp( -f_{\theta}(u,x) ) \:du\:dx = 1}
It follows that the joint and marginal score functions are central:
\enumerate{
  \item \eqn{E_{u,x}\left[\nabla_{\theta}f_{\theta}(u,x)\right]=0}
  \item \eqn{E_{x}\left[\nabla_{\theta}\left(-\log \int \exp(-f_{\theta}(u,x))\:du \right)\right]=0}
}
For each replicate of \eqn{u} and \eqn{x}, joint and marginal
gradients are calculated. Appropriate centrality tests are carried
out by \code{\link{summary.checkConsistency}}.  An asymptotic
\eqn{\chi^2} test is used to verify the first identity. The power of
this test increases with the number of simulations \code{n}.  The
second identity holds only \emph{approximately} when the marginal
likelihood is replaced by its Laplace approximation, so a formal test
would eventually fail for large \code{n}. Instead, the gradient bias
is transformed to the parameter scale (using the estimated
information matrix) to provide an estimate of the parameter bias
caused by the Laplace approximation.
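
As a rough sketch, a typical workflow might look as follows (here
\code{data}, \code{parameters}, \code{"u"} and \code{"mymodel"} are
placeholders for a user's own model objects):

\preformatted{
obj <- MakeADFun(data, parameters, random = "u", DLL = "mymodel")
opt <- nlminb(obj$par, obj$fn, obj$gr)
chk <- checkConsistency(obj, n = 200)  # simulate 200 replicates
s   <- summary(chk)
s$marginal$p.value   # centrality test of the marginal (Laplace) score
}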
}
\section{Simulation/re-estimation}{

A full simulation/re-estimation study is performed when
\code{estimate=TRUE}. By default \link[stats]{nlminb} is used to
perform the minimization, and the output is stored in a separate list
component 'estimate' for each replicate. Should a custom optimizer be
needed, it can be passed as a user function via the same argument
(\code{estimate}). The function is called for each simulation as
\code{estimate(obj)}, where \code{obj} is the simulated model object.
The current default corresponds to
\code{estimate = function(obj) nlminb(obj$par,obj$fn,obj$gr)}.
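
For example, a custom optimizer could be supplied as follows (a
minimal sketch; the \code{control} settings are arbitrary and
\code{myEstimate} is only an illustrative name):

\preformatted{
myEstimate <- function(obj)
    nlminb(obj$par, obj$fn, obj$gr,
           control = list(iter.max = 1000, eval.max = 1000))
chk <- checkConsistency(obj, estimate = myEstimate, n = 50)
}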
}

\examples{
\dontrun{
runExample("simple")
chk <- checkConsistency(obj)
chk
## Get more details
s <- summary(chk)
s$marginal$p.value  ## Laplace exact for Gaussian models
}
}
\seealso{
\code{\link{summary.checkConsistency}}, \code{\link{print.checkConsistency}}
}