Tokenize code snippets.
Project description
Tokenize-All
Tokenize blocks of code in Python. Used by manim-code-blocks to syntax-highlight blocks of code.
Example Usage
from tokenize_all import Java

tokens = Java.tokenize(
    """
   public class Main {
       public static void main(String[] args) {
           System.out.println("Hello world!");
       }
   }
   """
)

for token in tokens:
    print(token)
Output:
>> Token[ type = keyword, value = public, start = 4, end = 10 ]
>> Token[ type = keyword, value = class, start = 11, end = 16 ]
>> Token[ type = class name, value = Main, start = 17, end = 21 ]
>> Token[ type = left brace, value = {, start = 22, end = 23 ]
>> Token[ type = keyword, value = public, start = 31, end = 37 ]
>> Token[ type = keyword, value = static, start = 38, end = 44 ]
>> Token[ type = keyword, value = void, start = 45, end = 49 ]
>> Token[ type = function, value = main, start = 50, end = 54 ]
>> Token[ type = left parentheses, value = (, start = 54, end = 55 ]
>> Token[ type = class name, value = String, start = 55, end = 61 ]
>> Token[ type = left bracket, value = [, start = 61, end = 62 ]
>> Token[ type = right bracket, value = ], start = 62, end = 63 ]
>> Token[ type = identifier, value = args, start = 64, end = 68 ]
>> Token[ type = right parentheses, value = ), start = 68, end = 69 ]
>> Token[ type = left brace, value = {, start = 70, end = 71 ]
>> Token[ type = class name, value = System, start = 83, end = 89 ]
>> Token[ type = dot, value = ., start = 89, end = 90 ]
>> Token[ type = identifier, value = out, start = 90, end = 93 ]
>> Token[ type = dot, value = ., start = 93, end = 94 ]
>> Token[ type = function, value = println, start = 94, end = 101 ]
>> Token[ type = left parentheses, value = (, start = 101, end = 102 ]
>> Token[ type = string, value = "Hello world!", start = 102, end = 116 ]
>> Token[ type = right parentheses, value = ), start = 116, end = 117 ]
>> Token[ type = semicolon, value = ;, start = 117, end = 118 ]
>> Token[ type = right brace, value = }, start = 126, end = 127 ]
>> Token[ type = right brace, value = }, start = 131, end = 132 ]
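The printed tokens suggest each Token carries a type, a value, and start/end offsets into the source string, which is exactly what a consumer such as manim-code-blocks needs to syntax-highlight code. Below is a minimal sketch of how the tokens might be grouped by type to drive a highlighter. The attribute names (token.type, token.value) are assumed from the printed output above, and the COLORS mapping is purely illustrative, not part of tokenize-all.

from collections import defaultdict
from tokenize_all import Java

# Hypothetical color scheme for illustration only; not part of tokenize-all.
COLORS = {
    "keyword": "#ff79c6",
    "class name": "#8be9fd",
    "function": "#50fa7b",
    "string": "#f1fa8c",
}

def group_by_type(tokens):
    # Bucket tokens by their reported type so a renderer can style each group.
    # Assumes Token exposes .type and .value attributes, as the repr above suggests.
    groups = defaultdict(list)
    for token in tokens:
        groups[token.type].append(token)
    return groups

tokens = Java.tokenize('System.out.println("Hello world!");')
for token_type, group in group_by_type(tokens).items():
    color = COLORS.get(token_type, "#f8f8f2")  # fall back to a default color
    print(token_type, color, [t.value for t in group])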
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
tokenize_all-1.0.17.tar.gz (6.5 kB)
Built Distribution
tokenize_all-1.0.17-py3-none-any.whl (7.3 kB)
File details
Details for the file tokenize_all-1.0.17.tar.gz.
File metadata
- Download URL: tokenize_all-1.0.17.tar.gz
- Upload date:
- Size: 6.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | dca44dc2443c1cca6410743f2bc0d43168433bfeeee1df4c1a70b233f66a3e0b
MD5 | 102d39ab40838d183a3d20b2da375123
BLAKE2b-256 | 0f4b5dc9e67bfcc2169495707d5b27957485d4e4982837b0b76e884928911eeb
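To check a downloaded archive against the SHA256 digest listed above, the standard-library hashlib module is enough. This is a minimal sketch assuming the sdist has been downloaded into the current directory; the filename and digest are taken from the listing above.

import hashlib

EXPECTED_SHA256 = "dca44dc2443c1cca6410743f2bc0d43168433bfeeee1df4c1a70b233f66a3e0b"

# Hash the downloaded sdist in chunks and compare with the published digest.
digest = hashlib.sha256()
with open("tokenize_all-1.0.17.tar.gz", "rb") as f:
    for chunk in iter(lambda: f.read(8192), b""):
        digest.update(chunk)

print("OK" if digest.hexdigest() == EXPECTED_SHA256 else "Hash mismatch!")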
File details
Details for the file tokenize_all-1.0.17-py3-none-any.whl.
File metadata
- Download URL: tokenize_all-1.0.17-py3-none-any.whl
- Upload date:
- Size: 7.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | c887877a18b0aa67a69a51b2a471933253bf840534964d5674fd41e392827da3
MD5 | 50cb8c51618001c2930d9afc185702c9
BLAKE2b-256 | 08725ddeb1a9a6dec5ed3d4eb335c0f8ab6244f2ccf9f5dc6838283c5c0cbfd6