File: tokenize_test.py

package: python-enum-tools 0.12.0-3 (suites: sid, trixie)
# stdlib
import tokenize
from io import StringIO
from pprint import pprint

# Tokenize a one-line assignment carrying a "# doc:" comment and dump the token stream.
tokens = tokenize.generate_tokens(StringIO('foo = "abcdefg"  # doc: a docstring').readline)
pprint(list(tokens))
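A minimal follow-on sketch (not part of the original file) showing how the COMMENT token in the stream above could be filtered to recover the text after the `# doc:` marker; the `doc_prefix` name is illustrative, not taken from enum-tools:

```python
# stdlib
import tokenize
from io import StringIO

source = 'foo = "abcdefg"  # doc: a docstring'
tokens = tokenize.generate_tokens(StringIO(source).readline)

# Keep only COMMENT tokens that start with the "# doc:" marker,
# stripping the marker itself to leave the documentation text.
doc_prefix = "# doc:"
docs = [
    tok.string[len(doc_prefix):].strip()
    for tok in tokens
    if tok.type == tokenize.COMMENT and tok.string.startswith(doc_prefix)
]
print(docs)
```

Each `TokenInfo` yielded by `tokenize.generate_tokens` carries `type` and `string` attributes, so comment-based doc markers can be collected without executing the source.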