
Meitei Senter

A lightweight sentence boundary detector for Meitei Mayek (Manipuri) text.


Features

  • 🚀 Lightweight - Only ~1MB model, minimal dependencies
  • 🎯 Accurate - 94.7% F-Score on Meitei text
  • 🔧 Easy to use - Simple Python API and CLI
  • ⚡ Fast - Optimized for quick inference

Installation

pip install meitei-senter

Optional: spaCy Backend (for higher accuracy)

pip install meitei-senter[spacy]

Quick Start

Python API

from meitei_senter import MeiteiSentenceSplitter

# Initialize the splitter
splitter = MeiteiSentenceSplitter()

# Split text into sentences
text = "ꯆꯦꯔꯣꯀꯤ ꯑꯁꯤ ꯑꯣꯀ꯭ꯂꯥꯍꯣꯃꯥꯒꯤ ꯁꯍꯔꯅꯤ ꯫ ꯃꯁꯤ ꯌꯥꯝꯅ ꯆꯥꯎꯏ ꯫"
sentences = splitter.split_sentences(text)

for i, sent in enumerate(sentences, 1):
    print(f"{i}. {sent}")

Output:

1. ꯆꯦꯔꯣꯀꯤ ꯑꯁꯤ ꯑꯣꯀ꯭ꯂꯥꯍꯣꯃꯥꯒꯤ ꯁꯍꯔꯅꯤ ꯫
2. ꯃꯁꯤ ꯌꯥꯝꯅ ꯆꯥꯎꯏ ꯫

Command Line

# Interactive mode
meitei-senter --interactive

# Direct text input
meitei-senter --text "ꯆꯦꯔꯣꯀꯤ ꯑꯁꯤ ꯑꯣꯀ꯭ꯂꯥꯍꯣꯃꯥꯒꯤ ꯁꯍꯔꯅꯤ ꯫ ꯃꯁꯤ ꯌꯥꯝꯅ ꯆꯥꯎꯏ ꯫"

# Show version
meitei-senter --version

Advanced Usage

Using the Convenient Loader

from meitei_senter import load_splitter

# Load with default (delimiter-based) backend
splitter = load_splitter()

# Or with spaCy backend (requires spacy extra)
splitter = load_splitter(use_spacy=True)

sentences = splitter.split_sentences("Your Meitei text here ꯫")
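The default backend splits on the Meitei Mayek full stop (cheikhei, ꯫, U+ABEB). As a rough illustration of the idea (this is not the package's actual implementation, just a simplified sketch):

```python
# Illustrative sketch of delimiter-based sentence splitting on the
# Meitei Mayek cheikhei (U+ABEB). Not the package's internal code.
CHEIKHEI = "\uABEB"  # ꯫

def naive_split_sentences(text: str) -> list[str]:
    """Split on the cheikhei, keeping the delimiter with each sentence."""
    sentences = []
    for chunk in text.split(CHEIKHEI):
        chunk = chunk.strip()
        if chunk:
            sentences.append(chunk + " " + CHEIKHEI)
    return sentences

text = "ꯆꯦꯔꯣꯀꯤ ꯑꯁꯤ ꯑꯣꯀ꯭ꯂꯥꯍꯣꯃꯥꯒꯤ ꯁꯍꯔꯅꯤ ꯫ ꯃꯁꯤ ꯌꯥꯝꯅ ꯆꯥꯎꯏ ꯫"
print(naive_split_sentences(text))  # two sentences, each ending in ꯫
```

A pure delimiter split like this cannot handle ambiguous boundaries, which is where the spaCy and neural backends come in.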

Using Neural Network Mode

from meitei_senter import MeiteiSentenceSplitter

# Enable neural mode for context-aware splitting
splitter = MeiteiSentenceSplitter(use_neural=True)
sentences = splitter.split_sentences(text)

Direct Callable Interface

from meitei_senter import MeiteiSentenceSplitter

splitter = MeiteiSentenceSplitter()

# Call splitter directly
sentences = splitter("ꯆꯦꯔꯣꯀꯤ ꯑꯁꯤ... ꯫ ꯃꯁꯤ ꯌꯥꯝꯅ ꯆꯥꯎꯏ ꯫")

With spaCy (Custom Tokenizer)

import spacy
import os
from meitei_senter import MeiteiTokenizer, get_model_path

# Get path to bundled model
model_path = os.path.join(get_model_path(), 'meitei_tokenizer.model')

# Create blank spaCy model with custom tokenizer
nlp = spacy.blank("xx")
nlp.tokenizer = MeiteiTokenizer(model_path, nlp.vocab)

doc = nlp("ꯆꯦꯔꯣꯀꯤ ꯑꯁꯤ ꯑꯣꯀ꯭ꯂꯥꯍꯣꯃꯥꯒꯤ ꯁꯍꯔꯅꯤ ꯫")
print([token.text for token in doc])
# Output: ['ꯆꯦ', 'ꯔꯣ', 'ꯀꯤ', 'ꯑꯁꯤ', 'ꯑꯣꯀ꯭ꯂꯥꯍꯣꯃꯥ', 'ꯒꯤ', 'ꯁꯍꯔ', 'ꯅꯤ', '꯫']

📊 Model Details

| Feature      | Specification                    |
| ------------ | -------------------------------- |
| Model Size   | ~1 MB                            |
| Tokenizer    | SentencePiece (Unigram, 8K vocab)|
| Architecture | CNN (HashEmbedCNN)               |
| F-Score      | 94.71%                           |
| Precision    | 93.94%                           |
| Recall       | 95.49%                           |
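The reported F-score is the harmonic mean of the precision and recall above; a quick sanity check:

```python
# Sanity-check the reported F-score as the harmonic mean of
# precision and recall: F1 = 2 * P * R / (P + R).
precision = 93.94
recall = 95.49

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # 94.71, matching the table
```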

📂 Repository Structure

mni_tokenizer/
├── meitei_senter/              # Main package
│   ├── __init__.py             # Package exports
│   ├── cli.py                  # Command-line interface
│   ├── model.py                # PyTorch model & splitter
│   ├── tokenizer.py            # spaCy tokenizer
│   ├── meitei_tokenizer.model  # SentencePiece model
│   ├── meitei_senter.pth       # PyTorch weights
│   └── meitei_senter.json      # Model config
├── pyproject.toml              # Build configuration
└── README.md                   # This file

API Reference

MeiteiSentenceSplitter

Main class for sentence splitting.

MeiteiSentenceSplitter(
    pth_path: str = None,      # Path to PyTorch model
    spm_path: str = None,      # Path to SentencePiece model
    config_path: str = None,   # Path to config JSON
    use_neural: bool = False   # Enable neural network mode
)

Methods:

| Method                 | Description                          |
| ---------------------- | ------------------------------------ |
| split_sentences(text)  | Split text into a list of sentences  |
| tokenize(text)         | Tokenize text into pieces and IDs    |
| __call__(text)         | Direct callable interface            |
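The callable interface follows the usual Python pattern of delegating __call__ to the main method. A simplified, hypothetical sketch of the structure (not the package's actual source; the toy splitter here just splits on the cheikhei):

```python
# Hypothetical sketch of a splitter class exposing both a named method
# and a direct callable interface. Simplified; not the package's source.
class ToySplitter:
    DELIM = "\uABEB"  # Meitei Mayek cheikhei ꯫

    def split_sentences(self, text: str) -> list[str]:
        parts = [p.strip() for p in text.split(self.DELIM)]
        return [p + " " + self.DELIM for p in parts if p]

    def __call__(self, text: str) -> list[str]:
        # Delegate, so splitter(text) == splitter.split_sentences(text)
        return self.split_sentences(text)

splitter = ToySplitter()
print(splitter("ꯑ ꯫ ꯅ ꯫"))
```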

MeiteiTokenizer

spaCy-compatible tokenizer using SentencePiece.

MeiteiTokenizer(model_path: str, vocab: spacy.Vocab)

load_splitter

Convenience function to load a pre-configured splitter.

load_splitter(use_spacy: bool = False)

🔧 Development

# Clone repository
git clone https://github.com/Okramjimmy/mni_tokenizer.git
cd mni_tokenizer

# Install in development mode
pip install -e ".[dev]"

# Run tests
pytest

# Build package
python -m build

# Upload to PyPI
twine upload dist/*

📜 License

MIT License - see LICENSE for details.


📚 Citation

If you use this in your research, please cite:

@software{meitei_senter,
  author = {Okram Jimmy},
  title = {Meitei Senter: Sentence Boundary Detection for Meitei Mayek},
  year = {2024},
  url = {https://github.com/Okramjimmy/mni_tokenizer}
}

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📧 Contact
