jptranstokenizer: Japanese Tokenizer for transformers
This is a repository for a Japanese tokenizer that works with the HuggingFace transformers library.
You can use JapaneseTransformerTokenizer like transformers.BertJapaneseTokenizer.
Issues may be written in Japanese.
Documentation
Documentation is available on Read the Docs.
Install
pip install jptranstokenizer
Quickstart
This example uses jptranstokenizer.JapaneseTransformerTokenizer with the sentencepiece model of nlp-waseda/roberta-base-japanese and Juman++.
Before running the following steps, you need to install pyknp and Juman++.
>>> from jptranstokenizer import JapaneseTransformerTokenizer
>>> tokenizer = JapaneseTransformerTokenizer.from_pretrained("nlp-waseda/roberta-base-japanese")
>>> tokens = tokenizer.tokenize("外国人参政権")
# tokens: ['▁外国', '▁人', '▁参政', '▁権']
Note that different dependencies are required depending on the type of tokenizer you use.
See also the Quickstart on Read the Docs.
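As a side note on the output above: the "▁" prefix is the sentencepiece word-boundary marker, not part of the text itself. A minimal sketch in plain Python (independent of jptranstokenizer, shown only to illustrate the token format) of recovering the surface string from such tokens:

```python
# Illustration only: how sentencepiece-style tokens with the "▁" (U+2581)
# word-boundary marker map back to the original surface string.
tokens = ['▁外国', '▁人', '▁参政', '▁権']

def detokenize(pieces):
    """Concatenate sentencepiece pieces and drop the boundary markers."""
    return "".join(pieces).replace("▁", "")

print(detokenize(tokens))  # 外国人参政権
```

Real round-tripping should of course go through the tokenizer's own decoding methods; this sketch only shows why the markers appear in the tokenize output.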
Citation
Another paper is forthcoming; please check back here before citing.
This Implementation
@inproceedings{Suzuki-2023-nlp,
jtitle = {{異なる単語分割システムによる日本語事前学習言語モデルの性能評価}},
title = {{Performance Evaluation of Japanese Pre-trained Language Models with Different Word Segmentation Systems}},
jauthor = {鈴木, 雅弘 and 坂地, 泰紀 and 和泉, 潔},
author = {Suzuki, Masahiro and Sakaji, Hiroki and Izumi, Kiyoshi},
jbooktitle = {言語処理学会 第29回年次大会 (NLP2023)},
booktitle = {29th Annual Meeting of the Association for Natural Language Processing (NLP)},
year = {2023},
pages = {894-898}
}
Related Work
- Pretrained Japanese BERT models (including a Japanese tokenizer)
- Author NLP Lab. in Tohoku University
- https://github.com/cl-tohoku/bert-japanese
- SudachiTra
- Author Works Applications
- https://github.com/WorksApplications/SudachiTra
- UD_Japanese-GSD
- Author megagonlabs
- https://github.com/megagonlabs/UD_Japanese-GSD
- Juman++
- Author Kurohashi Lab. in Kyoto University
- https://github.com/ku-nlp/jumanpp