Japanese tokenizer with transformers library
jptranstokenizer: Japanese Tokenizer for transformers
This is a repository for a Japanese tokenizer that works with the Hugging Face Transformers library.
You can use JapaneseTransformerTokenizer in the same way as transformers.BertJapaneseTokenizer.
Issues may be written in Japanese as well.
Documentation
Documentation is available on Read the Docs.
Install
pip install jptranstokenizer
Quickstart
This example uses jptranstokenizer.JapaneseTransformerTokenizer
with the SentencePiece model of nlp-waseda/roberta-base-japanese and Juman++.
Before running the following steps, you need to install pyknp and Juman++.
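As a rough sketch, the dependencies for this particular model can be set up as follows. The Juman++ version, URL, and install prefix below are assumptions; check the Juman++ repository for the current release and instructions.

```shell
# Python binding for Juman++ (assumed to come from PyPI)
pip install pyknp

# Build and install Juman++ itself (version is illustrative)
wget https://github.com/ku-nlp/jumanpp/releases/download/v2.0.0-rc3/jumanpp-2.0.0-rc3.tar.xz
tar xf jumanpp-2.0.0-rc3.tar.xz
cd jumanpp-2.0.0-rc3
mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=/usr/local
sudo make install
```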
>>> from jptranstokenizer import JapaneseTransformerTokenizer
>>> tokenizer = JapaneseTransformerTokenizer.from_pretrained("nlp-waseda/roberta-base-japanese")
>>> tokens = tokenizer.tokenize("外国人参政権")
# tokens: ['▁外国', '▁人', '▁参政', '▁権']
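Conceptually, tokenizers of this kind run two stages: a word-level tokenizer (here Juman++) segments raw text into words, then a subword tokenizer (here SentencePiece) splits each word into vocabulary pieces. The sketch below illustrates that pipeline with toy stand-ins; the segmentation dictionary, vocabulary, and greedy longest-match splitting are all illustrative, not the library's actual implementation.

```python
# Toy sketch of a two-stage (word -> subword) tokenization pipeline.
# Both stages are stand-ins: real usage relies on Juman++ and a
# trained SentencePiece model, not these hand-written tables.

def word_tokenize(text: str) -> list[str]:
    # Stand-in for a morphological analyzer such as Juman++;
    # the segmentation is simply looked up here.
    segmentation = {"外国人参政権": ["外国", "人", "参政", "権"]}
    return segmentation.get(text, [text])

def subword_tokenize(word: str, vocab: set[str]) -> list[str]:
    # Greedy longest-match split over a toy vocabulary, prefixing
    # word-initial pieces with "▁" in the SentencePiece style.
    pieces, start = [], 0
    while start < len(word):
        for end in range(len(word), start, -1):
            piece = word[start:end]
            if piece in vocab:
                pieces.append(("▁" if start == 0 else "") + piece)
                start = end
                break
        else:  # character not covered by the vocabulary
            pieces.append(word[start])
            start += 1
    return pieces

def tokenize(text: str, vocab: set[str]) -> list[str]:
    return [p for w in word_tokenize(text) for p in subword_tokenize(w, vocab)]

vocab = {"外国", "人", "参政", "権"}
print(tokenize("外国人参政権", vocab))
# ['▁外国', '▁人', '▁参政', '▁権']
```

The toy output matches the Quickstart example above: each word from the first stage surfaces as a "▁"-prefixed subword piece.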
Note that different dependencies are required depending on the type of tokenizer you use.
See also the Quickstart on Read the Docs.
Citation
A paper is forthcoming; please check back here before citing.
This Implementation
@misc{suzuki-2022-github,
  author       = {Masahiro Suzuki},
  title        = {jptranstokenizer: Japanese Tokenizer for transformers},
  year         = {2022},
  publisher    = {GitHub},
  journal      = {GitHub repository},
  howpublished = {\url{https://github.com/retarfi/jptranstokenizer}}
}
Related Work
- Pretrained Japanese BERT models (containing Japanese tokenizer)
- Author NLP Lab. at Tohoku University
- https://github.com/cl-tohoku/bert-japanese
- SudachiTra
- Author Works Applications
- https://github.com/WorksApplications/SudachiTra
- UD_Japanese-GSD
- Author megagonlabs
- https://github.com/megagonlabs/UD_Japanese-GSD
- Juman++
- Author Kurohashi Lab. at Kyoto University
- https://github.com/ku-nlp/jumanpp