File: control

package: python-tokenize-rt 6.1.0-1
Source: python-tokenize-rt
Maintainer: Debian Python Team <team+python@tracker.debian.org>
Uploaders: Jelmer Vernooij <jelmer@debian.org>
Section: python
Priority: optional
Build-Depends: debhelper-compat (= 13), dh-sequence-python3, pybuild-plugin-pyproject, python3-all, python3-setuptools, python3-pytest
Standards-Version: 4.7.0
Rules-Requires-Root: no
Testsuite: autopkgtest-pkg-pybuild
Homepage: https://github.com/asottile/tokenize-rt
Vcs-Git: https://salsa.debian.org/python-team/packages/python-tokenize-rt.git
Vcs-Browser: https://salsa.debian.org/python-team/packages/python-tokenize-rt

Package: python3-tokenize-rt
Depends: ${python3:Depends}, ${misc:Depends}
Architecture: all
Description: wrapper around the Python stdlib `tokenize` which roundtrips
 The stdlib tokenize module does not properly roundtrip.  This wrapper
 around the stdlib provides two additional tokens ESCAPED_NL and
 UNIMPORTANT_WS, and a Token data type.  Use src_to_tokens and
 tokens_to_src to roundtrip.
 .
 This library is useful if you're writing a refactoring tool based on
 Python tokenization.