Calculate readability scores for Japanese texts.

Text readability calculator for Japanese learners 🇯🇵


jReadability allows Python developers to calculate the readability of Japanese text using the model developed by Jae-ho Lee and Yoichiro Hasebe in "Introducing a readability evaluation system for Japanese language education" and "Readability measurement of Japanese texts based on levelled corpora". Note that this is not an official implementation.

Demo

You can play with an interactive demo here.

Installation

pip install jreadability

Quickstart

from jreadability import compute_readability

# "Good morning! The weather is nice today."
text = 'おはようございます!今日は天気がいいですね。' 

score = compute_readability(text)

print(score) # 6.438000000000001

Readability scores

Level               Readability score range
Upper-advanced      [0.5, 1.5)
Lower-advanced      [1.5, 2.5)
Upper-intermediate  [2.5, 3.5)
Lower-intermediate  [3.5, 4.5)
Upper-elementary    [4.5, 5.5)
Lower-elementary    [5.5, 6.5)

Note that this readability calculator is specifically for non-native speakers learning to read Japanese. This is not to be confused with something like grade level or other readability scores meant for native speakers.
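For convenience, a score can be mapped back to one of the level labels above. The helper below is a hypothetical sketch based on the table; it is not part of the jreadability API:

```python
def score_to_level(score: float) -> str:
    """Map a jReadability score to a learner level using the ranges above.

    Hypothetical helper, not part of the jreadability package. Scores
    outside [0.5, 6.5) fall back to the nearest boundary label.
    """
    levels = [
        (0.5, 1.5, "Upper-advanced"),
        (1.5, 2.5, "Lower-advanced"),
        (2.5, 3.5, "Upper-intermediate"),
        (3.5, 4.5, "Lower-intermediate"),
        (4.5, 5.5, "Upper-elementary"),
        (5.5, 6.5, "Lower-elementary"),
    ]
    for lo, hi, label in levels:
        if lo <= score < hi:
            return label
    # Clamp out-of-range scores to the nearest end of the scale.
    return "Lower-elementary" if score >= 6.5 else "Upper-advanced"

print(score_to_level(6.438))  # Lower-elementary
```

Note that lower scores mean harder texts: the quickstart example above (6.438) lands just past the lower-elementary band.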

Model

readability = {mean number of words per sentence} * -0.056
            + {percentage of kango} * -0.126
            + {percentage of wago} * -0.042
            + {percentage of verbs} * -0.145
            + {percentage of particles} * -0.044
            + 11.724

* "kango" (漢語) means a Japanese word of Chinese origin, while "wago" (和語) means a native Japanese word.
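Once the five features have been extracted (sentence splitting and POS tagging happen elsewhere, via fugashi/UniDic), the model is a plain linear combination. A minimal sketch of just that final step, using made-up feature values rather than a real text analysis:

```python
def readability_from_features(
    mean_words_per_sentence: float,
    pct_kango: float,
    pct_wago: float,
    pct_verbs: float,
    pct_particles: float,
) -> float:
    """Apply the jReadability linear model to precomputed features.

    Hypothetical helper, not the package's internal API. Percentages
    are on a 0-100 scale; feature extraction is assumed done elsewhere.
    """
    return (
        mean_words_per_sentence * -0.056
        + pct_kango * -0.126
        + pct_wago * -0.042
        + pct_verbs * -0.145
        + pct_particles * -0.044
        + 11.724
    )

# Illustrative feature values only, not measured from a real text:
print(round(readability_from_features(10.0, 20.0, 40.0, 15.0, 25.0), 3))  # 3.689
```

All five coefficients are negative, so longer sentences and higher proportions of kango, wago, verbs, or particles each push the score down (toward the advanced end of the scale).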

Note on model consistency

The readability scores produced by this Python package tend to differ slightly from the scores produced by the official jReadability website. This is likely due to a difference in UniDic versions between the two implementations: this package uses UniDic 2.1.2 while the website uses UniDic 2.2.0. This discrepancy may be resolved in the future.

Batch processing

jreadability uses fugashi's tagger under the hood and initializes a new tagger every time compute_readability is invoked. If you are processing a large number of texts, it is recommended to initialize a tagger yourself once, then pass it as an argument to each subsequent compute_readability call.

from fugashi import Tagger
from jreadability import compute_readability

texts = [...]

# Initialize the tagger once, outside the loop.
tagger = Tagger()

for text in texts:
    score = compute_readability(text, tagger)  # fast :D
    # score = compute_readability(text)  # slow :'(
    ...

Documentation

You can find this repo's (very minimal) documentation here.

Other implementations

The official jReadability implementation can be found at jreadability.net.

A node.js implementation can also be found here.
