
Calculate readability scores for Japanese texts.

Project description


Text readability calculator for Japanese learners 🇯🇵


jReadability allows Python developers to calculate the readability of Japanese text using the model developed by Jae-ho Lee and Yoichiro Hasebe in "Introducing a readability evaluation system for Japanese language education" and "Readability measurement of Japanese texts based on levelled corpora". Note that this is not an official implementation.

Demo

You can play with an interactive demo here.

Installation

pip install jreadability

Quickstart

from jreadability import compute_readability

# "Good morning! The weather is nice today."
text = 'おはようございます!今日は天気がいいですね。' 

score = compute_readability(text)

print(score) # 6.438000000000001

Readability scores

Level               Readability score range
Upper-advanced      [0.5, 1.5)
Lower-advanced      [1.5, 2.5)
Upper-intermediate  [2.5, 3.5)
Lower-intermediate  [3.5, 4.5)
Upper-elementary    [4.5, 5.5)
Lower-elementary    [5.5, 6.5)

Note that this readability calculator is designed specifically for non-native speakers learning to read Japanese; it should not be confused with grade-level measures or other readability scores intended for native speakers.
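
For reference, a minimal hypothetical helper that maps a score to the bands in the table above might look like the sketch below. The function name and the out-of-range fallback are illustrative only and are not part of jreadability.

# Hypothetical helper (not part of jreadability) mapping a score to the
# level bands listed above; scores outside [0.5, 6.5) fall into no band.
def score_to_level(score: float) -> str:
    bands = [
        (0.5, 1.5, "Upper-advanced"),
        (1.5, 2.5, "Lower-advanced"),
        (2.5, 3.5, "Upper-intermediate"),
        (3.5, 4.5, "Lower-intermediate"),
        (4.5, 5.5, "Upper-elementary"),
        (5.5, 6.5, "Lower-elementary"),
    ]
    for lo, hi, label in bands:
        if lo <= score < hi:
            return label
    return "Out of range"

print(score_to_level(6.438000000000001))  # Lower-elementary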

Model

readability = {mean number of words per sentence} * -0.056
            + {percentage of kango} * -0.126
            + {percentage of wago} * -0.042
            + {percentage of verbs} * -0.145
            + {percentage of particles} * -0.044
            + 11.724

* "kango" (漢語) means Japanese word of Chinese origin while "wago" (和語) means native Japanese word.

Note on model consistency

The readability scores produced by this Python package tend to differ slightly from those produced by the official jReadability website. This is likely due to a difference in UniDic versions between the two implementations: this package uses UniDic 2.1.2, while the official site uses UniDic 2.2.0. This discrepancy may be resolved in the future.

Batch processing

jreadability uses fugashi's tagger under the hood and initializes a new tagger every time compute_readability is invoked. If you are processing a large number of texts, it is recommended to initialize the tagger yourself once and pass it as an argument to each subsequent compute_readability call.

from fugashi import Tagger
from jreadability import compute_readability

texts = [...]

# Initialize the tagger once and reuse it for every text
tagger = Tagger()

for text in texts:
    score = compute_readability(text, tagger)  # fast :D
    # score = compute_readability(text)  # slow :'(
    ...

Documentation

You can find this repo's (very minimal) documentation here.

Other implementations

The official jReadability implementation can be found at jreadability.net.

A Node.js implementation can also be found here.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

jreadability-1.1.4rc1.tar.gz (12.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

jreadability-1.1.4rc1-py3-none-any.whl (6.3 kB)

Uploaded Python 3

File details

Details for the file jreadability-1.1.4rc1.tar.gz.

File metadata

  • Download URL: jreadability-1.1.4rc1.tar.gz
  • Upload date:
  • Size: 12.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.22

File hashes

Hashes for jreadability-1.1.4rc1.tar.gz
Algorithm Hash digest
SHA256 11f86f7f3587c695208936e754011da9306768e1f8d9e4b065b72359baa044f3
MD5 384e96418dfed94bbc850ad12e58508d
BLAKE2b-256 7d2826e9c763680ca7739a6b41a108d5fc09b2dbf823e5253cf0892c5154ef1e

See more details on using hashes here.
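
If you want to check a downloaded archive against the SHA256 digest listed above, one way (assuming the file sits in the current directory) is:

# Verify a downloaded file against the SHA256 digest listed above
import hashlib

expected = "11f86f7f3587c695208936e754011da9306768e1f8d9e4b065b72359baa044f3"

with open("jreadability-1.1.4rc1.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "MISMATCH")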

File details

Details for the file jreadability-1.1.4rc1-py3-none-any.whl.

File metadata

File hashes

Hashes for jreadability-1.1.4rc1-py3-none-any.whl
Algorithm Hash digest
SHA256 1d7d5720fb429a38b22f90c1daf6b8321313c9a8add9435a8457bb25121ab1a5
MD5 b299b80944003085c0f6d9b3f9abf4ab
BLAKE2b-256 cc03f849e6074cd386dac73c60060c99d6a42c716dbf84f784519ea8096267a9

See more details on using hashes here.
