
A wrapper around the stdlib `tokenize` which roundtrips.

Project description


tokenize-rt

The stdlib tokenize module does not properly roundtrip. This wrapper around the stdlib provides two additional tokens ESCAPED_NL and UNIMPORTANT_WS, and a Token data type. Use src_to_tokens and tokens_to_src to roundtrip.

This library is useful if you're writing a refactoring tool based on Python's tokenization.
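For example, the stdlib's untokenize rebuilds inter-token spacing from column offsets using space characters, so a tab between tokens is silently lost. A minimal, stdlib-only demonstration of the roundtrip failure:

```python
import io
import tokenize

# source with a tab between '=' and '1'
src = 'x =\t1\n'
tokens = list(tokenize.generate_tokens(io.StringIO(src).readline))
out = tokenize.untokenize(tokens)
# the tab has been replaced by a single space: 'x = 1\n'
assert out != src
```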

Installation

pip install tokenize-rt

Usage

tokenize_rt.src_to_tokens(text) -> List[Token]

tokenize_rt.tokens_to_src(Sequence[Token]) -> text
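Conceptually, the wrapper works by recovering the text the stdlib tokenizer skips over (whitespace and backslash continuations) and attaching it as extra tokens, so that joining every token's src reproduces the input exactly. A simplified, self-contained sketch of the idea, not the library's actual code (the real Token also carries line/offset information):

```python
import collections
import io
import tokenize

Token = collections.namedtuple('Token', ('name', 'src'))
ESCAPED_NL = 'ESCAPED_NL'
UNIMPORTANT_WS = 'UNIMPORTANT_WS'


def src_to_tokens(src):
    """Tokenize, keeping the text the stdlib tokenizer skips over."""
    tokens = []
    # 1-indexed lines, with a sentinel for the ENDMARKER row
    lines = ('',) + tuple(src.splitlines(True)) + ('',)
    end_row, end_col = 1, 0
    gen = tokenize.generate_tokens(io.StringIO(src).readline)
    for tok_type, tok_src, (srow, scol), end, _ in gen:
        if (srow, scol) != (end_row, end_col):
            # recover the gap between the previous token's end and this start
            if srow == end_row:
                gap = lines[srow][end_col:scol]
            else:
                gap = lines[end_row][end_col:]
                for row in range(end_row + 1, srow):
                    gap += lines[row]
                gap += lines[srow][:scol]
            # backslash continuations become ESCAPED_NL, the rest UNIMPORTANT_WS
            while '\\\n' in gap:
                ws, _sep, gap = gap.partition('\\\n')
                if ws:
                    tokens.append(Token(UNIMPORTANT_WS, ws))
                tokens.append(Token(ESCAPED_NL, '\\\n'))
            if gap:
                tokens.append(Token(UNIMPORTANT_WS, gap))
        tokens.append(Token(tokenize.tok_name[tok_type], tok_src))
        end_row, end_col = end
    return tokens


def tokens_to_src(tokens):
    return ''.join(tok.src for tok in tokens)


src = 'x = \\\n    1  # tab:\there\n'
assert tokens_to_src(src_to_tokens(src)) == src
```

Because every skipped character is preserved in some token's src, the roundtrip holds even for tabs, escaped newlines, and comments that defeat the stdlib's untokenize.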

tokenize_rt.ESCAPED_NL

The token name used for a backslash-escaped newline (line continuation), which the stdlib tokenizer skips.

tokenize_rt.UNIMPORTANT_WS

The token name used for inter-token whitespace that the stdlib tokenizer discards.

tokenize_rt.Offset(line=None, utf8_byte_offset=None)

A token offset, useful as a key when cross-referencing the AST and the tokenized source.

tokenize_rt.Token(name, src, line=None, utf8_byte_offset=None)

Construct a token.

  • name: one of the token names listed in token.tok_name or ESCAPED_NL or UNIMPORTANT_WS
  • src: token's source as text
  • line: the line number this token appears on. This will be None for ESCAPED_NL and UNIMPORTANT_WS tokens.
  • utf8_byte_offset: the UTF-8 byte offset at which this token starts within its line. This will be None for ESCAPED_NL and UNIMPORTANT_WS tokens.

tokenize_rt.Token.offset

Retrieves an Offset for this token.
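Token and Offset behave like namedtuples, with Token.offset projecting out the positional fields. A sketch of the shape of the API (not the library's source):

```python
import collections

Offset = collections.namedtuple('Offset', ('line', 'utf8_byte_offset'))

_TokenBase = collections.namedtuple(
    'Token', ('name', 'src', 'line', 'utf8_byte_offset'),
)


class Token(_TokenBase):
    __slots__ = ()

    def __new__(cls, name, src, line=None, utf8_byte_offset=None):
        # line / utf8_byte_offset default to None, matching the
        # synthetic ESCAPED_NL and UNIMPORTANT_WS tokens
        return super().__new__(cls, name, src, line, utf8_byte_offset)

    @property
    def offset(self):
        return Offset(self.line, self.utf8_byte_offset)


tok = Token('NAME', 'print', line=1, utf8_byte_offset=0)
assert tok.offset == Offset(1, 0)
assert Token('UNIMPORTANT_WS', ' ').line is None
```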

tokenize_rt.reversed_enumerate(Sequence[Token]) -> Iterator[Tuple[int, Token]]

Yields (index, token) pairs. Useful for rewriting source.
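Iterating in reverse matters when rewriting: deleting or inserting at index i while walking forward shifts every upcoming index, whereas walking backward keeps all not-yet-visited indices valid. A sketch of the helper and the pattern it enables:

```python
def reversed_enumerate(tokens):
    """Like enumerate(), but from the last index down to 0."""
    for i in reversed(range(len(tokens))):
        yield i, tokens[i]


# deleting while iterating forward would shift upcoming indices;
# iterating in reverse leaves every not-yet-visited index valid
items = ['a', 'DEL', 'b', 'DEL', 'c']
for i, item in reversed_enumerate(items):
    if item == 'DEL':
        del items[i]
assert items == ['a', 'b', 'c']
```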

Sample usage
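As a hypothetical illustration of the intended workflow (a hand-built token list stands in for src_to_tokens output here, so the snippet runs without the package): a rewrite pass that strips u string prefixes by editing token src in reverse, then joins the srcs back into source text.

```python
import collections

# minimal stand-in for tokenize_rt.Token (only the fields used here)
Token = collections.namedtuple('Token', ('name', 'src'))

# hand-built stand-in for src_to_tokens("print u'hi'\n")
tokens = [
    Token('NAME', 'print'),
    Token('UNIMPORTANT_WS', ' '),
    Token('STRING', "u'hi'"),
    Token('NEWLINE', '\n'),
    Token('ENDMARKER', ''),
]

# rewrite in reverse so indices stay valid, as with reversed_enumerate
for i in reversed(range(len(tokens))):
    token = tokens[i]
    if token.name == 'STRING' and token.src.startswith(('u', 'U')):
        tokens[i] = token._replace(src=token.src[1:])

# joining srcs is exactly what tokens_to_src does
assert ''.join(tok.src for tok in tokens) == "print 'hi'\n"
```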



Download files

Download the file for your platform.

Source Distribution

tokenize_rt-2.2.0.tar.gz (3.8 kB)


Built Distribution


tokenize_rt-2.2.0-py2.py3-none-any.whl (4.3 kB)


File details

Details for the file tokenize_rt-2.2.0.tar.gz.

File metadata

  • Download URL: tokenize_rt-2.2.0.tar.gz
  • Upload date:
  • Size: 3.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.6.7

File hashes

Hashes for tokenize_rt-2.2.0.tar.gz

  • SHA256: 56a0a093db01e984297e4a15813b4196764dc09c5c352617f22ad6c26d4c8042
  • MD5: bc6c1fb937821f29c969ddcdba5cf383
  • BLAKE2b-256: d6140168c11b729705fa64e531ae33c54c3b11a1e0f0b64fbe1d6c3d61898bb9


File details

Details for the file tokenize_rt-2.2.0-py2.py3-none-any.whl.

File metadata

  • Download URL: tokenize_rt-2.2.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 4.3 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.6.7

File hashes

Hashes for tokenize_rt-2.2.0-py2.py3-none-any.whl

  • SHA256: 6b845036b52d430d395b02981fa4adaeb279c6914b57d8019be0d7d0a98b8a03
  • MD5: 73007d91a928060240fb486b71938a6b
  • BLAKE2b-256: 434bc5df89ff5b38afffc04fb208c9b1fce30c1426788a368d7039b4cbcf524e

