
Project description

Malaya is a Natural-Language-Toolkit library for bahasa Malaysia, powered by TensorFlow and PyTorch.

Documentation

Proper documentation is available at https://malaya.readthedocs.io/

Installing from PyPI

$ pip install malaya

This installs all dependencies except TensorFlow and PyTorch, so you can choose your own TensorFlow CPU / GPU and PyTorch CPU / GPU builds.

Only Python >= 3.6.0, TensorFlow >= 1.15.0, and PyTorch >= 1.10 are supported.
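For example, you might install Malaya and then pin specific CPU builds of both backends. The exact version pins below are illustrative only, not a recommendation; pick whichever supported versions match your hardware:

```shell
# Install Malaya, then explicitly chosen backend builds
# (version pins below are examples, not recommendations)
pip install malaya
pip install tensorflow==1.15.5   # example: a TensorFlow 1.15.x CPU build
pip install torch==1.10.2        # example: a PyTorch 1.10.x build
```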

Development Release

To install from the master branch:

$ pip install git+https://github.com/huseinzol05/malaya.git

We recommend using virtualenv for development.

Documentation at https://malaya.readthedocs.io/en/latest/

Features

  • Alignment, translation word alignment using Eflomal and pretrained Transformer models.

  • Augmentation, augment any text using a synonym dictionary, Wordvector or Transformer-Bahasa.

  • Constituency Parsing, breaking a text into sub-phrases using finetuned Transformer-Bahasa.

  • Coreference Resolution, finding all expressions that refer to the same entity in a text using Dependency Parsing models.

  • Dependency Parsing, extracting a dependency parse of a sentence using finetuned Transformer-Bahasa.

  • Emotion Analysis, detect and recognize 6 different emotions in text using finetuned Transformer-Bahasa.

  • Entities Recognition, locate and classify named entities mentioned in text using finetuned Transformer-Bahasa.

  • Generator, generate any texts given a context using T5-Bahasa, GPT2-Bahasa or Transformer-Bahasa.

  • Jawi-to-Rumi, convert from Jawi to Rumi using Transformer.

  • KenLM, provide an easy interface to load pretrained Malaya KenLM models.

  • Keyword Extraction, provide RAKE, TextRank and an Attention Mechanism hybrid with Transformer-Bahasa.

  • Knowledge Graph, generate a Knowledge Graph using T5-Bahasa or parse one from Dependency Parsing models.

  • Language Detection, using Fast-text and a sparse deep learning model to classify Malay (formal and social media), Indonesian (formal and social media), Rojak language and Manglish.

  • Normalizer, using local Malaysian NLP research hybrid with Transformer-Bahasa to normalize any bahasa texts.

  • Num2Word, convert from numbers to cardinal or ordinal representation.

  • Paraphrase, provide abstractive paraphrasing using T5-Bahasa and Transformer-Bahasa.

  • Grapheme-to-Phoneme, convert from grapheme to phoneme (DBP or IPA) using a state-of-the-art LSTM Seq2Seq model with attention.

  • Part-of-Speech Recognition, grammatical tagging of words in a text using finetuned Transformer-Bahasa.

  • Question Answer, reading comprehension using finetuned Transformer-Bahasa.

  • Relevancy Analysis, detect and recognize relevancy of texts using finetuned Transformer-Bahasa.

  • Rumi-to-Jawi, convert from Rumi to Jawi using Transformer.

  • Sentiment Analysis, detect and recognize polarity of texts using finetuned Transformer-Bahasa.

  • Text Similarity, provide an interface for lexical similarity and deep semantic similarity using finetuned Transformer-Bahasa.

  • Spelling Correction, using local Malaysian NLP research hybrid with Transformer-Bahasa to auto-correct any bahasa words, and NeuSpell using T5-Bahasa.

  • Stemmer, using a state-of-the-art BPE LSTM Seq2Seq model with attention to stem bahasa text, including local language structure.

  • Subjectivity Analysis, detect and recognize self-opinion polarity of texts using finetuned Transformer-Bahasa.

  • Kesalahan Tatabahasa (grammatical errors), fix grammatical errors using TransformerTag-Bahasa.

  • Summarization, provide abstractive summarization using T5-Bahasa and an extractive interface using Transformer-Bahasa, skip-thought and Doc2Vec.

  • Tokenizer, provide word, sentence and syllable tokenizers.

  • Topic Modelling, provide Transformer-Bahasa, LDA2Vec, LDA, NMF and LSA interface for easy topic modelling with topics visualization.

  • Toxicity Analysis, detect and recognize 27 different toxicity patterns of texts using finetuned Transformer-Bahasa.

  • Transformer, provide an easy interface to load pretrained Malaya language models.

  • Translation, provide Neural Machine Translation using Transformer for EN to MS and MS to EN.

  • Word2Num, convert from cardinal or ordinal representation to numbers.

  • Word2Vec, provide pretrained bahasa wikipedia and bahasa news Word2Vec, with easy interface and visualization.

  • Zero-shot classification, provide Zero-shot classification interface using Transformer-Bahasa to recognize texts without any labeled training data.

  • Hybrid 8-bit Quantization, provide hybrid 8-bit quantization for all models to reduce inference time by up to 2x and model size by up to 4x.

  • Longer Sequences Transformer, provide BigBird, BigBird + Pegasus and Fastformer for longer sequence tasks.
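To illustrate what a feature like Num2Word does, here is a minimal pure-Python sketch of cardinal conversion for Malay numbers up to 99. This is not Malaya's implementation, and the function name `to_cardinal` is hypothetical; it only shows the kind of mapping the feature performs:

```python
# Minimal sketch of Malay cardinal number words for 0-99.
# NOT Malaya's implementation; illustrative only.

UNITS = ["kosong", "satu", "dua", "tiga", "empat",
         "lima", "enam", "tujuh", "lapan", "sembilan"]

def to_cardinal(n: int) -> str:
    """Return the Malay cardinal words for an integer 0 <= n <= 99."""
    if not 0 <= n <= 99:
        raise ValueError("sketch only covers 0-99")
    if n < 10:
        return UNITS[n]
    if n == 10:
        return "sepuluh"
    if n == 11:
        return "sebelas"
    if n < 20:                       # 12-19: "<unit> belas"
        return f"{UNITS[n % 10]} belas"
    tens = f"{UNITS[n // 10]} puluh"  # 20, 30, ...: "<unit> puluh"
    return tens if n % 10 == 0 else f"{tens} {UNITS[n % 10]}"

print(to_cardinal(21))  # -> dua puluh satu
```

Malaya's actual Num2Word module additionally handles ordinals and arbitrarily large numbers; see its documentation for the real API.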

Pretrained Models

Malaya also releases Bahasa pretrained models; see Malaya/pretrained-model.

References

If you use our software for research, please cite:

@misc{Malaya,
  author = {Husein, Zolkepli},
  title = {Malaya},
  year = {2018},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/huseinzol05/malaya}}
}

Acknowledgement

Thanks to KeyReply for sponsoring the private cloud used to train Malaya models; without it, this library would not exist.

Also, thanks to TensorFlow Research Cloud for free TPU access.

Contributing

Thank you for contributing to this library, it really helps a lot. Feel free to contact me to suggest anything, or to contribute in other forms; we accept everything, not just code!

Project details



This version

4.9.1

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distribution


malaya-4.9.1-py3-none-any.whl (2.7 MB)

Uploaded Python 3

File details

Details for the file malaya-4.9.1-py3-none-any.whl.

File metadata

  • Download URL: malaya-4.9.1-py3-none-any.whl
  • Upload date:
  • Size: 2.7 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.8.10

File hashes

Hashes for malaya-4.9.1-py3-none-any.whl:

  • SHA256: b678da700fc58ee4d555fdddfab19e8b3a5b67a6076616c9e364994561c4c9c4
  • MD5: 29cf80bf8c47d34b1b2c57e31989c0cf
  • BLAKE2b-256: 5b3a31d059ff3ed9d39d187a78c46ec81348759799f9dd4118aff5a4753ff82c

