
A Python NLP library for the Persian language, by PartDP AI

Project description

https://raw.githubusercontent.com/partdpai/PartNLP/master/images/PartAiLogo.png

PartNLP Project

Overview

This documentation covers the PartNLP package. PartNLP is designed to help developers preprocess their text automatically, and it offers many useful features that make preprocessing easier. This is not an exhaustive description, but it should show you how to use the package effortlessly.

Introduction

PartNLP is an integrated package built on top of several well-known NLP packages, and it supports multiple languages. The table below lists every operation PartNLP can perform, the keyword used to request it, and the packages that provide it.

Operation        Keyword       Packages
normalize        NORMALIZE     HAZM, PARSIVAR
sent tokenize    S_TOKENIZE    HAZM, PARSIVAR, STANZA
word tokenize    W_TOKENIZE    HAZM, PARSIVAR, STANZA
lemmatize        LEMMATIZE     HAZM, STANZA
stem             STEM          HAZM, PARSIVAR, STANZA

Installation

To install the package, simply use pip:

pip install PartNLP

Pipeline Usage Example

>>> from PartNLP import Pipeline

>>> Pipeline(lang='persian', package='hazm', processors=['W_TOKENIZE', 'LEMMATIZE'], text='این متن، جهت بررسی عملکرد بسته نوشته شده است.')
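
Using the same import, any keyword from the operations table can be passed in processors, provided the chosen package supports it. As an illustrative sketch (assuming the package name is given in lowercase, as in the hazm example above), normalization and sentence tokenization with the parsivar back end would look like this:

>>> Pipeline(lang='persian', package='parsivar', processors=['NORMALIZE', 'S_TOKENIZE'], text='این متن، جهت بررسی عملکرد بسته نوشته شده است.')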

Pipeline can also handle missing required arguments that should have been passed by the user. In the example below, no package, language, or processors are given, so Pipeline prompts you to fill them in.

 >>> Pipeline(text='این متن، جهت بررسی عملکرد بسته نوشته شده است.')
Warning: no package selected. List of supported packages:['HAZM', 'PARSIVAR', 'STANZA']
please enter a valid value: 'hazm'
Warning: no language selected. List of supported languages:['ENGLISH', 'PERSIAN']
please enter a valid value: 'persian'
Warning: no operator selected. List of supported operations for 'hazm' package :['NORMALIZE', 'S_TOKENIZE', 'STEM', 'W_TOKENIZE', 'LEMMATIZE']

Interface Usage Example

This section shows the simple usage of the PartNLP interface.

https://gitlab.partdp.ai/naturallanguageprocessing/preprocess/preprocess_ai/raw/version-0.1/images/Interface.gif

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

PartNLP-0.1.12.tar.gz (14.9 kB)

Uploaded Source

Built Distribution

PartNLP-0.1.12-py3-none-any.whl (24.8 kB)

Uploaded Python 3

File details

Details for the file PartNLP-0.1.12.tar.gz.

File metadata

  • Download URL: PartNLP-0.1.12.tar.gz
  • Upload date:
  • Size: 14.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for PartNLP-0.1.12.tar.gz
Algorithm Hash digest
SHA256 b4b6f8f4cb4ddd4a70c9cc1358f9eb7eae9e945cdff25154e0622648b170b4d4
MD5 d582e27dfd3bd8f26bcf4584271802de
BLAKE2b-256 953dfb38fa8055049ffd9b36fbb46fb2ce584df72614aae8365476b6a8454e25

See more details on using hashes here.
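
For example, a minimal sketch for checking the SHA256 digest of a downloaded copy of PartNLP-0.1.12.tar.gz against the value listed above, assuming the archive sits in the current working directory:

>>> import hashlib
>>> with open('PartNLP-0.1.12.tar.gz', 'rb') as f:    # read the downloaded archive as bytes
...     digest = hashlib.sha256(f.read()).hexdigest()  # compute its SHA256 hex digest
...
>>> digest == 'b4b6f8f4cb4ddd4a70c9cc1358f9eb7eae9e945cdff25154e0622648b170b4d4'
True

If the comparison prints False, the download is corrupted or incomplete and should be discarded.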

File details

Details for the file PartNLP-0.1.12-py3-none-any.whl.

File metadata

  • Download URL: PartNLP-0.1.12-py3-none-any.whl
  • Upload date:
  • Size: 24.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for PartNLP-0.1.12-py3-none-any.whl
Algorithm Hash digest
SHA256 fb31b6884c596e9ce7e8f31740018b634b59cc44de6fd566b73c7c76308b382b
MD5 699ccb45593cc693102ea772012800b5
BLAKE2b-256 6d0bc8f4a60467ef78335ebe5e01181f83b472f8b54f28577a7c2a764757f0dc

See more details on using hashes here.
