Source: sentencepiece
Section: science
Priority: optional
Maintainer: Debian Science Maintainers <debian-science-maintainers@lists.alioth.debian.org>
Uploaders:
 TSUCHIYA Masatoshi <tsuchiya@namazu.org>,
 Kentaro Hayashi <kenhys@xdump.org>
Build-Depends:
 debhelper-compat (= 13),
 protobuf-compiler,
 libprotobuf-dev,
 dh-python,
 python3-all-dev,
 quilt,
 cmake,
 python3-setuptools
Standards-Version: 4.5.0
Homepage: https://github.com/google/sentencepiece
Vcs-Browser: https://salsa.debian.org/science-team/sentencepiece
Vcs-Git: https://salsa.debian.org/science-team/sentencepiece.git
Rules-Requires-Root: no

Package: sentencepiece
Architecture: any
Depends: ${shlibs:Depends}, ${misc:Depends}
Description: Unsupervised text tokenizer and detokenizer
 SentencePiece is an unsupervised text tokenizer/detokenizer mainly
 designed for Neural Network-based text generation systems where the
 vocabulary size is predetermined prior to the neural model training.
 .
 This package contains the command line tools.

Package: libsentencepiece0
Section: libs
Architecture: any
Depends: ${shlibs:Depends}, ${misc:Depends}
Description: Library files of SentencePiece
 SentencePiece is an unsupervised text tokenizer/detokenizer mainly
 designed for Neural Network-based text generation systems where the
 vocabulary size is predetermined prior to the neural model training.
 .
 This package contains the shared library.

Package: libsentencepiece-dev
Section: libdevel
Architecture: any
Depends: libsentencepiece0 (= ${binary:Version}), ${misc:Depends}
Description: Header files of SentencePiece
 SentencePiece is an unsupervised text tokenizer/detokenizer mainly
 designed for Neural Network-based text generation systems where the
 vocabulary size is predetermined prior to the neural model training.
 .
 This package contains the header files for developing against the
 SentencePiece library.

Package: python3-sentencepiece
Section: python
Architecture: any
Depends:
 ${shlibs:Depends},
 ${misc:Depends},
 ${python3:Depends}
Description: SentencePiece binding for Python 3
 SentencePiece is an unsupervised text tokenizer/detokenizer mainly
 designed for Neural Network-based text generation systems where the
 vocabulary size is predetermined prior to the neural model training.
 .
 This package contains the binding for Python 3.
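
# A minimal usage sketch of the Python 3 binding, kept as a comment so this
# control file stays valid. The sentencepiece module API below is the
# upstream one; 'corpus.txt' and the 'm.*' model files are hypothetical
# example names, not files shipped by this package.
#
#   import sentencepiece as spm
#
#   # Train a subword model with a fixed vocabulary size.
#   spm.SentencePieceTrainer.train(
#       input='corpus.txt', model_prefix='m', vocab_size=8000)
#
#   # Load the trained model and round-trip a sentence.
#   sp = spm.SentencePieceProcessor(model_file='m.model')
#   pieces = sp.encode('Hello world.', out_type=str)
#   text = sp.decode(pieces)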