File: step_depth.Rd

% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/depth.R
\name{step_depth}
\alias{step_depth}
\title{Data Depths}
\usage{
step_depth(
  recipe,
  ...,
  class,
  role = "predictor",
  trained = FALSE,
  metric = "halfspace",
  options = list(),
  data = NULL,
  prefix = "depth_",
  skip = FALSE,
  id = rand_id("depth")
)
}
\arguments{
\item{recipe}{A recipe object. The step will be added to the
sequence of operations for this recipe.}

\item{...}{One or more selector functions to choose variables
for this step. See \code{\link[=selections]{selections()}} for more details.}

\item{class}{A single character string that specifies a single
categorical variable to be used as the class.}

\item{role}{For model terms created by this step, what analysis role should
they be assigned? By default, the new columns created by this step from
the original variables will be used as \emph{predictors} in a model.}

\item{trained}{A logical to indicate if the quantities for
preprocessing have been estimated.}

\item{metric}{A character string specifying the depth metric.
Possible values are "potential", "halfspace", "Mahalanobis",
"simplicialVolume", "spatial", and "zonoid".}

\item{options}{A list of options to pass to the underlying
depth functions. See \code{\link[ddalpha:depth.halfspace]{ddalpha::depth.halfspace()}},
\code{\link[ddalpha:depth.Mahalanobis]{ddalpha::depth.Mahalanobis()}},
\code{\link[ddalpha:depth.potential]{ddalpha::depth.potential()}},
\code{\link[ddalpha:depth.projection]{ddalpha::depth.projection()}},
\code{\link[ddalpha:depth.simplicial]{ddalpha::depth.simplicial()}},
\code{\link[ddalpha:depth.simplicialVolume]{ddalpha::depth.simplicialVolume()}},
\code{\link[ddalpha:depth.spatial]{ddalpha::depth.spatial()}},
\code{\link[ddalpha:depth.zonoid]{ddalpha::depth.zonoid()}}.}

\item{data}{The training data are stored here once after
\code{\link[=prep]{prep()}} is executed.}

\item{prefix}{A character string for the prefix of the resulting new
variables. See notes below.}

\item{skip}{A logical. Should the step be skipped when the
recipe is baked by \code{\link[=bake]{bake()}}? While all operations are baked
when \code{\link[=prep]{prep()}} is run, some operations may not be able to be
conducted on new data (e.g. processing the outcome variable(s)).
Care should be taken when using \code{skip = TRUE} as it may affect
the computations for subsequent operations.}

\item{id}{A character string that is unique to this step to identify it.}
}
\value{
An updated version of \code{recipe} with the new step added to the
sequence of any existing operations.
}
\description{
\code{step_depth} creates a \emph{specification} of a recipe
step that will convert numeric data into a measurement of
\emph{data depth}. This is done for each value of a categorical
class variable.
}
\details{
Data depth metrics attempt to measure how close a data point
is to the center of its distribution. There are a number of
methods for calculating depth, but a simple example is the
inverse of the distance of a data point to the centroid of
the distribution. Generally, small values indicate that a data
point is not close to the centroid. \code{step_depth} can compute a
class-specific depth for a new data point based on the proximity
of the new value to the training set distribution.
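
A minimal sketch of that simple example (plain R, independent of this
step and of the \pkg{ddalpha} metrics; the helper name \code{toy_depth}
is purely illustrative):

\preformatted{
# inverse distance to a class centroid (toy illustration only);
# 1 is added to the distance to avoid division by zero at the centroid
toy_depth <- function(x, train) {
  centroid <- colMeans(train)
  1 / (1 + sqrt(sum((x - centroid)^2)))
}

# depth of a new point relative to the setosa training rows
toy_depth(c(5.0, 3.4), iris[iris$Species == "setosa", 1:2])
}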

This step requires the \pkg{ddalpha} package. If not installed, the
step will stop with a note about installing the package.

Note that the entire training set is saved to compute future
depth values. The saved data have been trained (i.e. prepared)
and baked (i.e. processed) up to the point before the location
that \code{step_depth} occupies in the recipe. Also, the data
requirements for the different step methods may vary. For
example, using \code{metric = "Mahalanobis"} requires that each
class should have at least as many rows as variables listed in
the \code{terms} argument.
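
As a quick illustration of that requirement (plain R, a hypothetical
check that is not performed by this step itself):

\preformatted{
# rows available per class versus number of selected predictors
n_per_class <- table(iris$Species)
n_predictors <- ncol(iris[, 1:4])
n_per_class >= n_predictors  # all TRUE, so "Mahalanobis" is feasible here
}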

The function will create a new column for every unique value of
the \code{class} variable. The resulting variables will not
replace the original values and by default have the prefix \code{depth_}. The
naming format can be changed using the \code{prefix} argument.
}
\section{Tidying}{
When you \code{\link[=tidy.recipe]{tidy()}} this step, a tibble with columns
\code{terms} (the selectors or variables selected) and \code{class} is returned.
}

\section{Case weights}{


The underlying operation does not allow for case weights.
}

\examples{
\dontshow{if (rlang::is_installed("ddalpha")) (if (getRversion() >= "3.4") withAutoprint else force)(\{ # examplesIf}

# halfspace depth is the default
rec <- recipe(Species ~ ., data = iris) \%>\%
  step_depth(all_numeric_predictors(), class = "Species")

# use zonoid metric instead
# also, define naming convention for new columns
rec <- recipe(Species ~ ., data = iris) \%>\%
  step_depth(all_numeric_predictors(),
    class = "Species",
    metric = "zonoid", prefix = "zonoid_"
  )

rec_dists <- prep(rec, training = iris)

dists_to_species <- bake(rec_dists, new_data = iris)
dists_to_species

tidy(rec, number = 1)
tidy(rec_dists, number = 1)
\dontshow{\}) # examplesIf}
}
\seealso{
Other multivariate transformation steps: 
\code{\link{step_classdist}()},
\code{\link{step_geodist}()},
\code{\link{step_ica}()},
\code{\link{step_isomap}()},
\code{\link{step_kpca_poly}()},
\code{\link{step_kpca_rbf}()},
\code{\link{step_kpca}()},
\code{\link{step_mutate_at}()},
\code{\link{step_nnmf_sparse}()},
\code{\link{step_nnmf}()},
\code{\link{step_pca}()},
\code{\link{step_pls}()},
\code{\link{step_ratio}()},
\code{\link{step_spatialsign}()}
}
\concept{multivariate transformation steps}