% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/knn3.R
\name{knn3}
\alias{knn3}
\alias{knn3.formula}
\alias{knn3.matrix}
\alias{knn3.data.frame}
\alias{knn3Train}
\alias{print.knn3}
\title{k-Nearest Neighbour Classification}
\usage{
knn3(x, ...)
\method{knn3}{formula}(formula, data, subset, na.action, k = 5, ...)
\method{knn3}{data.frame}(x, y, k = 5, ...)
\method{knn3}{matrix}(x, y, k = 5, ...)
\method{print}{knn3}(x, ...)
knn3Train(train, test, cl, k = 1, l = 0, prob = TRUE, use.all = TRUE)
}
\arguments{
\item{x}{a matrix of training set predictors.}
\item{...}{additional parameters to pass to \code{knn3Train}. However,
passing \code{prob = FALSE} will be overridden.}
\item{formula}{a formula of the form \code{lhs ~ rhs} where \code{lhs} is
the response variable and \code{rhs} a set of predictors.}
\item{data}{optional data frame containing the variables in the model
formula.}
\item{subset}{optional vector specifying a subset of observations to be
used.}
\item{na.action}{function which indicates what should happen when the data
contain \code{NA}s.}
\item{k}{number of neighbours considered.}
\item{y}{a factor vector of training set classes.}
\item{train}{matrix or data frame of training set cases.}
\item{test}{matrix or data frame of test set cases. A vector will be
interpreted as a row vector for a single case.}
\item{cl}{factor of true classifications of the training set.}
\item{l}{minimum vote for definite decision, otherwise \code{doubt}. (More
precisely, less than \code{k-l} dissenting votes are allowed, even if
\code{k} is increased by ties.)}
\item{prob}{if true, the proportion of the votes for each class is
returned as attribute \code{prob}.}
\item{use.all}{controls handling of ties. If true, all distances equal to
the \code{k}th largest are included. If false, a random selection of
distances equal to the \code{k}th is chosen to use exactly \code{k}
neighbours.}
}
\value{
An object of class \code{knn3}. See \code{\link{predict.knn3}}.
}
\description{
\eqn{k}-nearest neighbour classification that can return class votes for
all classes.
}
\details{
\code{knn3} is essentially the same code as \code{\link[ipred]{ipredknn}}
and \code{knn3Train} is a copy of \code{\link[class]{knn}}. The underlying C
code from the \code{class} package has been modified to return the vote
percentages for each class (previously the percentage for the winning class
was returned).
}
\examples{
irisFit1 <- knn3(Species ~ ., iris)
irisFit2 <- knn3(as.matrix(iris[, -5]), iris[, 5])
data(iris3)
train <- rbind(iris3[1:25,,1], iris3[1:25,,2], iris3[1:25,,3])
test <- rbind(iris3[26:50,,1], iris3[26:50,,2], iris3[26:50,,3])
cl <- factor(c(rep("s",25), rep("c",25), rep("v",25)))
knn3Train(train, test, cl, k = 5, prob = TRUE)
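
## A brief sketch of scoring new data with a fitted knn3 object.
## This assumes the predict method described in \link{predict.knn3},
## with type = "prob" returning per-class vote proportions.
predict(irisFit1, iris[1:5, -5], type = "prob")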
}
\author{
\code{\link[class]{knn}} by W. N. Venables and B. D. Ripley and
\code{\link[ipred]{ipredknn}} by Torsten Hothorn
<Torsten.Hothorn@rzmail.uni-erlangen.de>, modifications by Max Kuhn and
Andre Williams
}
\keyword{multivariate}