File: mixture.Rd

% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/mixture.R
\name{mixture}
\alias{mixture}
\title{Estimate mixture latent variable model.}
\usage{
mixture(
  x,
  data,
  k = length(x),
  control = list(),
  vcov = "observed",
  names = FALSE,
  ...
)
}
\arguments{
\item{x}{List of \code{lvm} objects. If only a single \code{lvm} object is
given, then a \code{k}-mixture of this model is fitted (free parameters
varying between mixture components).}

\item{data}{\code{data.frame}}

\item{k}{Number of mixture components}

\item{control}{Optimization parameters (see Details), including the
\code{type} element selecting the type of EM algorithm (standard,
classification, stochastic)}

\item{vcov}{Type of asymptotic covariance matrix estimate (\code{NULL} to omit)}

\item{names}{If \code{TRUE}, return the names of the parameters (useful for defining starting values)}

\item{...}{Additional arguments passed to lower-level functions}
}
\description{
Estimate mixture latent variable model
}
\details{
Estimate parameters in a mixture of latent variable models via the EM
algorithm.

The performance of the EM algorithm can be tuned via the \code{control}
argument, a list where a subset of the following members can be altered:

\describe{
\item{start}{Optional starting values}
\item{nstart}{Evaluate \code{nstart} different starting values and run the
EM algorithm on the parameters with the largest likelihood}
\item{tol}{Convergence tolerance of the EM algorithm. The algorithm is
stopped when the absolute change in likelihood and parameters (2-norm)
between successive iterations is less than \code{tol}}
\item{iter.max}{Maximum number of iterations of the EM algorithm}
\item{gamma}{Scale-down (i.e., a number between 0 and 1) of the step-size
of the Newton-Raphson algorithm in the M-step}
\item{trace}{Trace information on the EM algorithm is printed on every
\code{trace}th iteration}
}
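
As an illustration (a hypothetical call; the model \code{m} and data
\code{d} stand in for any \code{lvm} object and \code{data.frame}), the
tuning parameters above are supplied as named list elements:

\preformatted{
  ## Sketch: 5 random restarts, tighter tolerance, capped iterations,
  ## progress printed at every iteration
  mixture(m, data = d, k = 2,
          control = list(nstart = 5, tol = 1e-9,
                         iter.max = 500, trace = 1))
}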

Note that the algorithm can be interrupted at any time (\code{Ctrl-C}) and
the current estimates will still be saved (via an \code{on.exit} call).
}
\examples{

\donttest{
m0 <- lvm(list(y~x+z,x~z))
distribution(m0,~z) <- binomial.lvm()
d <- sim(m0,2000,p=c("y~z"=2,"y~x"=1),seed=1)

## unmeasured confounder example
m <- baptize(lvm(y~x, x~1))
intercept(m,~x+y) <- NA

if (requireNamespace('mets', quietly=TRUE)) {
  set.seed(42)
  M <- mixture(m,k=2,data=d,control=list(trace=1,tol=1e-6))
  summary(M)
  lm(y~x,d)
  estimate(M,"y~x")
  ## True slope := 1
}
}

}
\seealso{
\code{mvnmix}
}
\author{
Klaus K. Holst
}
\keyword{models}
\keyword{regression}