Source: lua-torch-optim
Section: interpreters
Priority: optional
Maintainer: Debian Science Maintainers <debian-science-maintainers@lists.alioth.debian.org>
Uploaders: Mo Zhou <cdluminate@gmail.com>
Build-Depends: debhelper (>= 11),
               dh-lua,
# lua-torch-torch7 is not a real build dependency, but an explicit runtime dependency
               lua-torch-torch7,
Standards-Version: 4.1.4
Homepage: https://github.com/torch/optim
Vcs-Browser: https://salsa.debian.org/science-team/lua-torch-optim
Vcs-Git: https://salsa.debian.org/science-team/lua-torch-optim.git

Package: lua-torch-optim
Architecture: all
Multi-Arch: foreign
Depends: ${misc:Depends},
         lua5.1 | luajit,
         lua-torch-torch7,
         lua-torch-xlua
XB-Lua-Versions: ${lua:Versions}
Description: Numeric Optimization Package for Torch Framework
 This package contains several optimization routines and a logger for Torch.
 .
 The following algorithms are provided:
  * Stochastic Gradient Descent
  * Averaged Stochastic Gradient Descent
  * L-BFGS
  * Conjugate Gradients
  * AdaDelta
  * AdaGrad
  * Adam
  * AdaMax
  * FISTA with backtracking line search
  * Nesterov's Accelerated Gradient method
  * RMSprop
  * Rprop
  * CMA-ES
 .
 All these algorithms are designed to support batch optimization as well
 as stochastic optimization. It is up to the user to construct an objective
 function that represents the batch, mini-batch, or single sample on which
 to evaluate the objective.
 .
 This package also provides logging and live plotting capabilities via the
 `optim.Logger()` function. Live logging makes it possible to monitor the
 network accuracy and cost function during training and testing, to spot
 under- and over-fitting, to enable early stopping, or simply to track the
 health of the current optimization task.
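Outside the control file itself, the objective-function contract the description refers to can be sketched in plain Lua. This toy scalar version needs no torch installation; with the real package, a closure of exactly this shape (returning the loss and its gradient) is what one passes to routines such as `optim.sgd`, and the loss values can be recorded with `optim.Logger`. The learning rate and step count below are illustrative choices, not package defaults.

```lua
-- Minimise f(x) = (x - 3)^2 with plain gradient descent, using the
-- same closure contract that lua-torch-optim routines expect:
-- feval(x) must return the loss f(x) and its gradient df/dx.
local function feval(x)
  local fx   = (x - 3) ^ 2
  local dfdx = 2 * (x - 3)
  return fx, dfdx
end

local x  = 0.0   -- parameter (a torch Tensor in real optim usage)
local lr = 0.1   -- learning rate (illustrative value)

for i = 1, 100 do
  local fx, dfdx = feval(x)
  x = x - lr * dfdx  -- one plain SGD step; optim.sgd adds momentum, decay, etc.
end

print(string.format("x = %.4f", x))  -- converges near the minimum at 3
```

With torch installed, the loop body would instead call `x, fx = optim.sgd(feval, x, {learningRate = 0.1})` and log each loss with `logger:add{['loss'] = fx[1]}` on an `optim.Logger` instance, which is what enables the live plotting mentioned above.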