File: control (source package pytorch-geometric 2.6.1-7)

Source: pytorch-geometric
Section: science
Priority: optional
Maintainer: Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
Uploaders:
 Andrius Merkys <merkys@debian.org>,
Rules-Requires-Root: no
Build-Depends:
 debhelper-compat (= 13),
 dh-sequence-python3,
 flit,
 libtorch-dev <!nocheck>,
 libsleef-dev [arm64 ppc64el] <!nocheck>,
 pybuild-plugin-pyproject,
 python3,
 python3-ase <!nocheck>,
 python3-jinja2 <!nocheck>,
 python3-pandas <!nocheck>,
 python3-pyparsing <!nocheck>,
 python3-pytest <!nocheck>,
 python3-rdkit <!nocheck>,
 python3-setuptools,
 python3-sklearn <!nocheck>,
 python3-torch <!nocheck>,
 python3-torch-cluster <!nocheck>,
 python3-torch-sparse <!nocheck>,
 python3-tqdm <!nocheck>,
Testsuite: autopkgtest-pkg-pybuild
Standards-Version: 4.6.2
Homepage: https://github.com/pyg-team/pytorch_geometric
Vcs-Browser: https://salsa.debian.org/deeplearning-team/pytorch-geometric
Vcs-Git: https://salsa.debian.org/deeplearning-team/pytorch-geometric.git

Package: python3-torch-geometric
Architecture: all
Depends:
 ${misc:Depends},
 ${python3:Depends},
Description: Graph Neural Network Library for PyTorch
 PyG (PyTorch Geometric) is a library built upon PyTorch to easily write and
 train Graph Neural Networks (GNNs) for a wide range of applications related to
 structured data.
 .
 It consists of various methods for deep learning on graphs and other irregular
 structures, also known as geometric deep learning, from a variety of published
 papers. In addition, it provides easy-to-use mini-batch loaders for operating
 on many small graphs as well as on single giant graphs, multi-GPU support,
 DataPipe support, distributed graph learning via Quiver, a large number of
 common benchmark datasets (based on simple interfaces to create your own), the
 GraphGym experiment manager, and helpful transforms, both for learning on
 arbitrary graphs as well as on 3D meshes or point clouds.
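
The library's central abstractions are the torch_geometric.data.Data graph
container and message-passing layers such as torch_geometric.nn.GCNConv. The
following is a minimal, illustrative sketch of what the installed package
provides (a toy three-node graph and arbitrary layer widths, chosen only for
demonstration; it assumes python3-torch is installed as well):

    import torch
    from torch_geometric.data import Data
    from torch_geometric.nn import GCNConv

    # Toy graph: 3 nodes with 1-dimensional features and 4 directed edges.
    edge_index = torch.tensor([[0, 1, 1, 2],
                               [1, 0, 2, 1]], dtype=torch.long)
    x = torch.tensor([[-1.0], [0.0], [1.0]])
    data = Data(x=x, edge_index=edge_index)

    class TwoLayerGCN(torch.nn.Module):
        def __init__(self):
            super().__init__()
            # Layer widths (16 hidden units, 2 output classes) are arbitrary
            # choices for this example.
            self.conv1 = GCNConv(1, 16)
            self.conv2 = GCNConv(16, 2)

        def forward(self, x, edge_index):
            x = self.conv1(x, edge_index).relu()
            return self.conv2(x, edge_index)

    model = TwoLayerGCN()
    out = model(data.x, data.edge_index)   # per-node scores, shape [3, 2]
    print(out.shape)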