Tokenize code snippets.
Project description
Tokenize-All
Tokenize blocks of code in Python. Used by manim-code-blocks to syntax-highlight blocks of code.
Example Usage
from tokenize_all import Java
tokens = Java.tokenize(
    """
    public class Main {
        public static void main(String[] args) {
            System.out.println("Hello world!");
        }
    }
    """
)
for token in tokens: print(token)
Output:
>> Token[ type = keyword, value = public, start = 4, end = 10 ]
>> Token[ type = keyword, value = class, start = 11, end = 16 ]
>> Token[ type = class name, value = Main, start = 17, end = 21 ]
>> Token[ type = left brace, value = {, start = 22, end = 23 ]
>> Token[ type = keyword, value = public, start = 31, end = 37 ]
>> Token[ type = keyword, value = static, start = 38, end = 44 ]
>> Token[ type = keyword, value = void, start = 45, end = 49 ]
>> Token[ type = function, value = main, start = 50, end = 54 ]
>> Token[ type = left parentheses, value = (, start = 54, end = 55 ]
>> Token[ type = class name, value = String, start = 55, end = 61 ]
>> Token[ type = left bracket, value = [, start = 61, end = 62 ]
>> Token[ type = right bracket, value = ], start = 62, end = 63 ]
>> Token[ type = identifier, value = args, start = 64, end = 68 ]
>> Token[ type = right parentheses, value = ), start = 68, end = 69 ]
>> Token[ type = left brace, value = {, start = 70, end = 71 ]
>> Token[ type = class name, value = System, start = 83, end = 89 ]
>> Token[ type = dot, value = ., start = 89, end = 90 ]
>> Token[ type = identifier, value = out, start = 90, end = 93 ]
>> Token[ type = dot, value = ., start = 93, end = 94 ]
>> Token[ type = function, value = println, start = 94, end = 101 ]
>> Token[ type = left parentheses, value = (, start = 101, end = 102 ]
>> Token[ type = string, value = "Hello world!", start = 102, end = 116 ]
>> Token[ type = right parentheses, value = ), start = 116, end = 117 ]
>> Token[ type = semicolon, value = ;, start = 117, end = 118 ]
>> Token[ type = right brace, value = }, start = 126, end = 127 ]
>> Token[ type = right brace, value = }, start = 131, end = 132 ]
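Because every token carries the start and end offsets of its span in the input string, the tokens can be mapped straight back onto the source for highlighting, which is how manim-code-blocks uses them. The sketch below is a minimal illustration of that idea, not part of the library itself; it assumes the fields shown in the printed repr (type, start, end) are available as attributes of the same names on the Token objects, and the colour table is purely hypothetical.

from tokenize_all import Java

source = """
public class Main {
    public static void main(String[] args) {
        System.out.println("Hello world!");
    }
}
"""

# Hypothetical colour map keyed by the token types seen in the output above;
# anything not listed falls back to a default colour.
COLOURS = {"keyword": "blue", "class name": "teal", "function": "yellow", "string": "green"}

for token in Java.tokenize(source):
    # Assumes Token exposes .type, .start and .end, matching the printed repr.
    text = source[token.start:token.end]
    print(f"{text!r} -> {COLOURS.get(token.type, 'white')}")

The slice boundaries come straight from each token, so the source never has to be re-scanned to decide what to colour.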
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
tokenize_all-1.0.11.tar.gz (6.2 kB)
Built Distribution
tokenize_all-1.0.11-py3-none-any.whl (7.0 kB)
File details
Details for the file tokenize_all-1.0.11.tar.gz.
File metadata
- Download URL: tokenize_all-1.0.11.tar.gz
- Upload date:
- Size: 6.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | 546e97f08c5968e4dee56303e1c0377c94d552dd93422c6ab6acf5aa87fb7fbc
MD5 | 803abc10d0d453ea174337b4e5e4d3ea
BLAKE2b-256 | 0ac11d8ece03a328f04634489dc4c41173138354f34c37168206da4df6a5e15d
File details
Details for the file tokenize_all-1.0.11-py3-none-any.whl.
File metadata
- Download URL: tokenize_all-1.0.11-py3-none-any.whl
- Upload date:
- Size: 7.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | 4161bf15f094cac55483e36e4db1059a5a50626864bbcb6cd5b62249bff3e484
MD5 | 364404179560aae023543837a68d73bb
BLAKE2b-256 | aade9364e2a24bda50546ecd7f471529ade54827e6d204b7327ccd88859d6ee4
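The digests above can be used to check a download before installing it. A minimal sketch using Python's standard hashlib, assuming the sdist has been saved to the working directory under its original filename:

import hashlib

# Expected SHA256 of tokenize_all-1.0.11.tar.gz, copied from the table above.
EXPECTED_SHA256 = "546e97f08c5968e4dee56303e1c0377c94d552dd93422c6ab6acf5aa87fb7fbc"

with open("tokenize_all-1.0.11.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else "SHA256 mismatch")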