Deep learning for Arabic text Vocalization - التشكيل الالي للنصوص العربية

Shakkala Project V 2.1 مشروع شكّالة

Introduction

The Shakkala project presents a recurrent neural network for Arabic text vocalization that automatically forms Arabic characters (تشكيل الحروف) to enhance text-to-speech systems. The model can also be used in other applications such as improving search results. In the beta version, the model was trained on over a million sentences, including a majority of historical Arabic data from books and some modern data from the internet. The accuracy of the model reached up to 95%, and in some data sets it achieved even higher levels of accuracy depending on complexity and data distribution. This innovative approach has the potential to significantly improve the quality of writing and text-to-speech systems for the Arabic language.

Requirements

pip install shakkala

Note: Shakkala has been tested with TensorFlow 2.9.3.
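To reproduce the tested environment, one option (a suggestion, not a hard requirement of the package) is to pin TensorFlow to the version mentioned above:

pip install shakkala tensorflow==2.9.3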

Code Examples (How to)

Check the full example in the demo.py file; a consolidated sketch also appears after the steps below.

  1. Import:

     from shakkala import Shakkala

  2. Create the Shakkala object:

     sh = Shakkala()

     OR for advanced usage:

     sh = Shakkala(version={version_num})

  3. Prepare the input:

     input_text = "فإن لم يكونا كذلك أتى بما يقتضيه الحال وهذا أولى"
     input_int = sh.prepare_input(input_text)

  4. Call the neural network:

     model, graph = sh.get_model()
     logits = model.predict(input_int)[0]

  5. Predict the output:

     predicted_harakat = sh.logits_to_text(logits)
     final_output = sh.get_final_text(input_text, predicted_harakat)
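Putting the steps together, a minimal end-to-end sketch (assuming the package is installed and using the default model version) looks like this:

from shakkala import Shakkala

# Load the library and the default model once.
sh = Shakkala()
model, graph = sh.get_model()

# Vocalize a single sentence.
input_text = "فإن لم يكونا كذلك أتى بما يقتضيه الحال وهذا أولى"
input_int = sh.prepare_input(input_text)
logits = model.predict(input_int)[0]

# Decode the predicted harakat and merge them back into the text.
predicted_harakat = sh.logits_to_text(logits)
final_output = sh.get_final_text(input_text, predicted_harakat)
print(final_output)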

Available models:

  • version_num=1: First test of the solution.
  • version_num=2: Main release version.
  • version_num=3: Some enhancements over version 2.

It is worth trying both version_num=2 and version_num=3.
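To compare the two recommended versions on your own sentences, a small sketch (using only the calls shown in the steps above) could loop over both:

from shakkala import Shakkala

text = "فإن لم يكونا كذلك أتى بما يقتضيه الحال وهذا أولى"

for version_num in (2, 3):
    sh = Shakkala(version=version_num)   # load the requested model version
    model, graph = sh.get_model()
    logits = model.predict(sh.prepare_input(text))[0]
    harakat = sh.logits_to_text(logits)
    print(version_num, sh.get_final_text(text, harakat))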

Performance Tips

Shakkala is built in an object-oriented way so the model is loaded into memory once for faster prediction. To make sure you don't load it multiple times in your service or application, follow these steps (a consolidated sketch follows the list):

  • Load the model into a global variable:

    sh = Shakkala(folder_location, version={version_num})
    model, graph = sh.get_model()

  • Then, inside your request function or loop, add:

    input_int = sh.prepare_input(input_text)
    logits = model.predict(input_int)[0]
    predicted_harakat = sh.logits_to_text(logits)
    final_output = sh.get_final_text(input_text, predicted_harakat)
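For example, a simple handler that reuses the globally loaded model might look like the following sketch (the vocalize function name and its module-level sh and model are illustrative, not part of the library):

from shakkala import Shakkala

# Loaded once at import time (see the first bullet above).
sh = Shakkala(version=2)
model, graph = sh.get_model()

def vocalize(input_text):
    """Vocalize one sentence, reusing the globally loaded model."""
    input_int = sh.prepare_input(input_text)
    logits = model.predict(input_int)[0]
    predicted_harakat = sh.logits_to_text(logits)
    return sh.get_final_text(input_text, predicted_harakat)

# Call vocalize() from your request handler or loop:
print(vocalize("فإن لم يكونا كذلك أتى بما يقتضيه الحال وهذا أولى"))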

Accuracy

In this beta version 2, accuracy reached up to 95%, and on some data it was higher, depending on complexity and data distribution. This beta version was trained on more than a million sentences, with a majority of historical Arabic data from books and some of the available vocalized modern data on the internet.

Prediction Example

For a live demo based on the Shakkala library, click the link.

Real output:      فَإِنْ لَمْ يَكُونَا كَذَلِكَ أَتَى بِمَا يَقْتَضِيهِ الْحَالُ وَهَذَا أَوْلَى
Predicted output: فَإِنْ لَمْ يَكُونَا كَذَلِكَ أَتَى بِمَا يَقْتَضِيهِ الْحَالُ وَهَذَا أَوْلَى

Real output:      قَالَ الْإِسْنَوِيُّ وَسَوَاءٌ فِيمَا قَالُوهُ مَاتَ فِي حَيَاةِ أَبَوَيْهِ أَمْ لَا
Predicted output: قَالَ الْإِسْنَوِيُّ وَسَوَاءٌ فِيمَا قَالُوهُ مَاتَ فِي حَيَاةِ أَبَوَيْهِ أَمْ لَا

Real output:      طَابِعَةٌ ثُلَاثِيَّةُ الْأَبْعَاد
Predicted output: طَابِعَةٌ ثَلَاثِيَّةُ الْأَبْعَادِ

Accuracy Enhancements

The model can be enhanced to reach more than 95% accuracy with the following:

  • Availability of more vocalized modern data to train the network (the current version was trained mostly on available historical Arabic data and some modern data).
  • Stack different models (see the sketch after this list).
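As a rough illustration of the stacking idea, one could average the predictions of versions 2 and 3 before decoding, using NumPy for the averaging. This is only a sketch; it assumes both versions share the same output classes and a compatible input encoding, which is not stated in the documentation:

import numpy as np
from shakkala import Shakkala

text = "فإن لم يكونا كذلك أتى بما يقتضيه الحال وهذا أولى"

sh2 = Shakkala(version=2)
sh3 = Shakkala(version=3)
model2, _ = sh2.get_model()
model3, _ = sh3.get_model()

logits2 = model2.predict(sh2.prepare_input(text))[0]
logits3 = model3.predict(sh3.prepare_input(text))[0]

# Truncate to a common length before averaging, in case the two
# versions pad their inputs differently (an assumption, not verified).
n = min(len(logits2), len(logits3))
avg = (np.asarray(logits2[:n]) + np.asarray(logits3[:n])) / 2.0

# Decode the averaged scores with either object (assumes a shared
# haraka vocabulary across versions).
harakat = sh3.logits_to_text(avg)
print(sh3.get_final_text(text, harakat))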

Citation

For academic work, use:

Shakkala, Arabic text vocalization, Barqawi & Zerrouki

OR in BibTeX format:

@misc{shakkala2017,
  title={Shakkala, Arabic text vocalization},
  author={Barqawi and Zerrouki},
  url={https://github.com/Barqawiz/Shakkala},
  year={2017}
}

Contribution

Core Team

  1. Ahmad Barqawi: Neural Network Developer.
  2. Taha Zerrouki: Mentor, Data and Results.

Contributors

  1. Zaid Farekh & propellerinc.me: Provided infrastructure and consultation support.
  2. Mohammad Issam Aklik: Artist.
  3. Brahim Sidi: Formed new sentences.
  4. Fadi Bakoura: Aggregated online content.
  5. Ola Ghanem: Testing.
  6. Ali Hamdi Ali Fadel: Contributed code.

License

Free to use and distribute; just mention the original project, Shakkala, as the base model.

The MIT License (MIT)

Copyright (c) 2017 Shakkala Project

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
