Tokenize code snippets.
Project description
Tokenize-All
Tokenize blocks of code in Python. Used by manim-code-blocks to syntax-highlight blocks of code.
Example Usage
from tokenize_all import Java

tokens = Java.tokenize(
   """
   public class Main {
       public static void main(String[] args) {
           System.out.println("Hello world!");
       }
   }
   """
)

for token in tokens: print(token)
Output:
>> Token[ type = keyword, value = public, start = 4, end = 10 ]
>> Token[ type = keyword, value = class, start = 11, end = 16 ]
>> Token[ type = class name, value = Main, start = 17, end = 21 ]
>> Token[ type = left brace, value = {, start = 22, end = 23 ]
>> Token[ type = keyword, value = public, start = 31, end = 37 ]
>> Token[ type = keyword, value = static, start = 38, end = 44 ]
>> Token[ type = keyword, value = void, start = 45, end = 49 ]
>> Token[ type = function, value = main, start = 50, end = 54 ]
>> Token[ type = left parentheses, value = (, start = 54, end = 55 ]
>> Token[ type = class name, value = String, start = 55, end = 61 ]
>> Token[ type = left bracket, value = [, start = 61, end = 62 ]
>> Token[ type = right bracket, value = ], start = 62, end = 63 ]
>> Token[ type = identifier, value = args, start = 64, end = 68 ]
>> Token[ type = right parentheses, value = ), start = 68, end = 69 ]
>> Token[ type = left brace, value = {, start = 70, end = 71 ]
>> Token[ type = class name, value = System, start = 83, end = 89 ]
>> Token[ type = dot, value = ., start = 89, end = 90 ]
>> Token[ type = identifier, value = out, start = 90, end = 93 ]
>> Token[ type = dot, value = ., start = 93, end = 94 ]
>> Token[ type = function, value = println, start = 94, end = 101 ]
>> Token[ type = left parentheses, value = (, start = 101, end = 102 ]
>> Token[ type = string, value = "Hello world!", start = 102, end = 116 ]
>> Token[ type = right parentheses, value = ), start = 116, end = 117 ]
>> Token[ type = semicolon, value = ;, start = 117, end = 118 ]
>> Token[ type = right brace, value = }, start = 126, end = 127 ]
>> Token[ type = right brace, value = }, start = 131, end = 132 ]
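Each printed start and end is a character offset into the original source string, which is what makes the tokens usable for highlighting: slicing the source with those offsets gives back exactly the token text. A minimal sketch of that idea, assuming (not confirmed here) that the Token objects expose type, value, start, and end as plain attributes matching the printed form above:

from tokenize_all import Java

source = 'public class Main { }'
tokens = Java.tokenize(source)

for token in tokens:
    # Recover the token's text from its offsets; attribute names are assumed
    # from the repr shown above (type / value / start / end).
    snippet = source[token.start:token.end]
    print(token.type, repr(snippet))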
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
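The distributions below are the artifacts published for release 1.0.7. If you just want to use the library, installing from PyPI with pip is the usual route; the package name here matches the filenames below:

pip install tokenize_all==1.0.7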
Source Distribution
tokenize_all-1.0.7.tar.gz (6.7 kB)
Built Distribution
tokenize_all-1.0.7-py3-none-any.whl (7.5 kB)
File details
Details for the file tokenize_all-1.0.7.tar.gz.
File metadata
- Download URL: tokenize_all-1.0.7.tar.gz
- Upload date:
- Size: 6.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | ee0bf480e641e091f6d49fb7cfc565b2fdc88abfdba5dbd5a7b4ca3c6717109c
MD5 | d07e04c4ed1a7e774a3c40c4002f3155
BLAKE2b-256 | 314934e7435748fa0be82d045936451d4945144a427becf281ff2403d6749ac8
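After downloading the sdist, the SHA256 digest listed above can be re-computed locally with Python's standard hashlib module. A small sketch, assuming the downloaded file sits in the current directory:

import hashlib

EXPECTED_SHA256 = "ee0bf480e641e091f6d49fb7cfc565b2fdc88abfdba5dbd5a7b4ca3c6717109c"

# Hash the downloaded archive and compare against the published digest.
with open("tokenize_all-1.0.7.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else "hash mismatch")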
File details
Details for the file tokenize_all-1.0.7-py3-none-any.whl.
File metadata
- Download URL: tokenize_all-1.0.7-py3-none-any.whl
- Upload date:
- Size: 7.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | 6a787185e7447d32b4c5149ef052931d700bd27325a9340ed15a72850ffec094
MD5 | 20ebfbb31582cd0f40603f89aca802af
BLAKE2b-256 | 14786d4fdf7a00342ed8c4f39d141f39223b2238bc9b55e33e325059cb18c681