Lexikanon
A Python Library for Tokenizers
- Documentation: https://lexikanon.entelecheia.ai
- GitHub: https://github.com/entelecheia/lexikanon
- PyPI: https://pypi.org/project/lexikanon
Lexikanon is a robust and efficient Python library designed for creating, training, and deploying tokenizers, an essential component in natural language processing (NLP) and artificial intelligence (AI) applications. The name Lexikanon originates from the Greek words λέξη (word) and κάνων (maker), reflecting the library's purpose in enabling users to build powerful tokenizers for various languages and tasks.
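To make the idea of "building a tokenizer" concrete, here is a minimal, self-contained sketch of byte-pair encoding (BPE), one of the standard subword-tokenization algorithms a library like this supports. This is an illustrative toy implementation, not Lexikanon's actual API; the function names `train_bpe` and `tokenize` are hypothetical.

```python
from collections import Counter

def train_bpe(corpus, num_merges):
    """Learn BPE merge rules from a list of words (toy illustration)."""
    # Represent each word as a tuple of symbols, starting from characters.
    vocab = Counter(tuple(word) for word in corpus)
    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Replace every occurrence of the best pair with a merged symbol.
        new_vocab = Counter()
        for symbols, freq in vocab.items():
            merged, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    merged.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    merged.append(symbols[i])
                    i += 1
            new_vocab[tuple(merged)] += freq
        vocab = new_vocab
    return merges

def tokenize(word, merges):
    """Apply the learned merges, in training order, to a new word."""
    symbols = list(word)
    for a, b in merges:
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        symbols = out
    return symbols

# Example: with one merge learned from a tiny corpus, "a" + "b" fuses first.
merges = train_bpe(["abab", "ab"], num_merges=1)
print(merges)                    # → [('a', 'b')]
print(tokenize("abab", merges))  # → ['ab', 'ab']
```

A production tokenizer adds pre-tokenization, an explicit vocabulary, special tokens, and fast serialization, but the train-then-apply loop above is the core of the workflow.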
Changelog
See the CHANGELOG for more information.
Contributing
Contributions are welcome! Please see the contributing guidelines for more information.
License
This project is released under the MIT License.
Hashes for lexikanon-0.5.1-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 4a03f19f1bf746a67dd891ce3d46abfa68aeec5cdc60884e5794ee5a241f3851 |
| MD5 | 480d20cd97b0917aa09942cd6b9d61a8 |
| BLAKE2b-256 | 405c4bbc63865a8d364df0f512871681756cab42525496819c7e32b3469fd979 |