File: control

Source: skorch
Section: science
Homepage: https://github.com/skorch-dev/skorch
Priority: optional
Standards-Version: 4.6.2
Vcs-Git: https://salsa.debian.org/deeplearning-team/skorch.git
Vcs-Browser: https://salsa.debian.org/deeplearning-team/skorch
Maintainer: Debian Deep Learning Team <debian-ai@lists.debian.org>
Uploaders: Mo Zhou <lumin@debian.org>
Build-Depends: debhelper-compat (= 13),
               dh-python,
               python3-all,
               python3-coverage <!nocheck>,
               python3-flaky <!nocheck>,
               python3-numpy,
               python3-pandas <!nocheck>,
               python3-pytest <!nocheck>,
               python3-pytest-cov <!nocheck>,
               python3-scipy,
               python3-setuptools,
               python3-sklearn,
               python3-tabulate,
               python3-torch (>= 1.3.1),
               python3-tqdm

Package: python3-skorch
Architecture: all
Depends: python3-torch (>= 1.3.1) | python3-torch-cuda (>= 1.3.1),
         ${misc:Depends}, ${python3:Depends}
Description: scikit-learn compatible neural network library that wraps PyTorch
 The goal of skorch is to make it possible to use PyTorch with sklearn. This is
 achieved by providing a wrapper around PyTorch that has an sklearn interface.
 In that sense, skorch is the spiritual successor to nolearn, but instead of
 using Lasagne and Theano, it uses PyTorch.
 .
 skorch does not reinvent the wheel; instead, it stays out of your way as
 much as possible. If you are familiar with sklearn and PyTorch, you don't
 have to learn any new concepts, and the syntax will feel familiar. (If you
 are not yet familiar with those libraries, they are worth learning.)
 .
 Additionally, skorch abstracts away the training loop, making a lot of
 boilerplate code unnecessary: a simple net.fit(X, y) is enough. Out of the
 box, skorch works with many kinds of data, whether PyTorch tensors, NumPy
 arrays, Python dicts, or something else; and if your data is not supported
 out of the box, skorch is easy to extend to handle it.
 .
 Overall, skorch aims to be as flexible as PyTorch while offering an
 interface as clean as sklearn's.
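
A minimal sketch of the sklearn-style workflow described above, illustrating
that net.fit(X, y) replaces a hand-written PyTorch training loop. It is not
part of the package metadata; the module layout, layer sizes, hyperparameters,
and the explicit CrossEntropyLoss criterion are illustrative assumptions.

import numpy as np
from torch import nn
from skorch import NeuralNetClassifier

# Illustrative module: 20 input features, 2 output classes (raw logits).
class SimpleModule(nn.Module):
    def __init__(self, num_units=10):
        super().__init__()
        self.dense = nn.Linear(20, num_units)
        self.nonlin = nn.ReLU()
        self.output = nn.Linear(num_units, 2)

    def forward(self, X):
        return self.output(self.nonlin(self.dense(X)))

# The wrapper turns the module into an sklearn-compatible estimator.
net = NeuralNetClassifier(
    SimpleModule,
    criterion=nn.CrossEntropyLoss,  # raw logits in, class indices as targets
    max_epochs=5,
    lr=0.1,
)

# Toy data; skorch also accepts PyTorch tensors, dicts, Datasets, and more.
X = np.random.randn(200, 20).astype(np.float32)
y = np.random.randint(0, 2, size=200).astype(np.int64)

net.fit(X, y)            # training loop handled by skorch
y_pred = net.predict(X)  # sklearn-style predict / predict_proba also work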