
Phrase recognizer component for the spaCy pipeline

Project description

Installation with pip3

pip3 install --verbose phrase_detective 
python -m spacy download en_core_web_trf
python -m spacy download es_dep_news_trf
python -m spacy download de_dep_news_trf

Usage

Please refer to the API docs

Detect noun phrases

import spacy
from spacy import Language
from phrase_detective import NounPhraseRecognizer, PKG_INDICES

@Language.factory("nprecog")
def create_np_parser(nlp: Language, name: str):
  return NounPhraseRecognizer(nlp) 

def noun_phrase(lang, sentence):
  nlp = spacy.load(PKG_INDICES[lang])
  nlp.add_pipe("nprecog")
  doc = nlp(sentence)
  for np in doc._.noun_phrases:
    print(np.text)
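The @Language.factory decorator registers the component constructor under a string name so that nlp.add_pipe("nprecog") can look it up later. A stdlib-only sketch of that registration pattern (not spaCy's actual internals; all names below are illustrative):

```python
# Minimal registry mimicking the @Language.factory / add_pipe pattern.
FACTORIES = {}

def factory(name):
  """Decorator that stores a component constructor under a name."""
  def register(create):
    FACTORIES[name] = create
    return create
  return register

@factory("nprecog")
def create_np_parser(nlp, name):
  # Stand-in for NounPhraseRecognizer(nlp); returns a tuple for illustration.
  return ("NounPhraseRecognizer", nlp)

def add_pipe(nlp, name):
  """Instantiate a registered component by name, as add_pipe does."""
  return FACTORIES[name](nlp, name)

component = add_pipe("fake-nlp", "nprecog")
```

The real factory receives the loaded Language object and the component's registered name, exactly as in the snippets above.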

Detect preposition phrases

import spacy
from spacy import Language
from phrase_detective import PrepPhraseRecognizer, PKG_INDICES

@Language.factory("pprecog")
def create_pp_parser(nlp: Language, name: str):
  return PrepPhraseRecognizer(nlp) 

def prep_phrase(lang, sentence):
  nlp = spacy.load(PKG_INDICES[lang])
  nlp.add_pipe("pprecog")
  doc = nlp(sentence)
  for pp in doc._.prep_phrases:
    print(pp.text)

Detect verb phrases

import spacy
from spacy import Language
from phrase_detective import VerbKnowledgeRecognizer, PKG_INDICES

@Language.factory("vkbrecog")
def create_vkb_parser(nlp: Language, name: str):
  return VerbKnowledgeRecognizer(nlp) 

def verb_knowledge(lang, sentence):
  nlp = spacy.load(PKG_INDICES[lang])
  nlp.add_pipe("vkbrecog")
  doc = nlp(sentence)
  for v in doc._.verbs:
    print("TEXT: {}, TAG: {}, FORM: {}, ORIGINAL: {}".format(v.text, v.tag_, spacy.explain(v.tag_), v.lemma_))
  for pp in doc._.passive_phrases:
    print(pp.text)
  for vp in doc._.verb_phrases:
    print(vp)
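Results such as doc._.verbs and doc._.verb_phrases live on spaCy's custom extension namespace (doc._), which each recognizer populates when the pipeline runs. A stdlib-only stand-in showing the idea (the Doc and component below are toy versions, not the real spaCy API):

```python
# Toy model of spaCy's doc._ extension namespace.
class Underscore:
  """Empty attribute bag, standing in for spaCy's doc._ namespace."""
  pass

class Doc:
  def __init__(self, text):
    self.text = text
    self._ = Underscore()

def verb_component(doc):
  # A real component would use the dependency parse and POS tags;
  # here we fake "verbs" as words ending in "s" purely for illustration.
  doc._.verbs = [w for w in doc.text.split() if w.endswith("s")]
  return doc

doc = verb_component(Doc("she runs and jumps"))
```

In spaCy proper, such attributes are declared with Doc.set_extension and filled in by the component's __call__.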

Development

Clone project

git clone https://github.com/qishe-nlp/phrase-detective.git

Install poetry

Install dependencies

poetry update

Test and Issue

poetry run pytest -rP

which runs the tests under tests/*

Create Sphinx docs

poetry shell
cd apidocs
sphinx-apidoc -f -o source ../phrase_detective
make html
python -m http.server -d build/html

Host docs on GitHub Pages

cp -rf apidocs/build/html/* docs/

Build

  • Change the version in pyproject.toml and phrase_detective/__init__.py
  • Build the python package with poetry build
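Since the version must be bumped in two places, a small check that they agree can catch a mismatch before building. A sketch (the file paths and the __version__ convention are assumptions about this repo's layout):

```python
import re

def extract_versions(pyproject_text, init_text):
  """Return (pyproject version, __init__ version) from the two files' contents."""
  py = re.search(r'^version\s*=\s*"([^"]+)"', pyproject_text, re.M)
  init = re.search(r'^__version__\s*=\s*"([^"]+)"', init_text, re.M)
  return (py and py.group(1), init and init.group(1))

# In practice you would read pyproject.toml and phrase_detective/__init__.py;
# inline strings keep the sketch self-contained.
pyproject = 'name = "phrase-detective"\nversion = "0.1.15"\n'
init = '__version__ = "0.1.15"\n'
a, b = extract_versions(pyproject, init)
assert a == b, "version mismatch between pyproject.toml and __init__.py"
```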

Git commit and push

Publish from local dev env

  • Set PyPI test environment variables in poetry; refer to the poetry docs
  • Publish to PyPI test with poetry publish -r test

Publish through CI

git tag [x.x.x]
git push origin master

Project details


Download files

Download the file for your platform.

Source Distribution

phrase-detective-0.1.15.tar.gz (7.0 kB)

Uploaded Source

Built Distribution


phrase_detective-0.1.15-py3-none-any.whl (11.2 kB)

Uploaded Python 3

File details

Details for the file phrase-detective-0.1.15.tar.gz.

File metadata

  • Download URL: phrase-detective-0.1.15.tar.gz
  • Upload date:
  • Size: 7.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.7 CPython/3.8.2 Linux/5.8.0-1036-azure

File hashes

Hashes for phrase-detective-0.1.15.tar.gz

  • SHA256: dc5d5c195ef6fee6e7bc6f218535244df646ae03da3dc73442847a3c73d934de
  • MD5: 2b591c54daac25c143a73ab4e22aad43
  • BLAKE2b-256: 67b15d11f16b07707fb8e2f977688843596e04ad940e41c9cd4740cb40c1e4ac
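A published hash can be checked locally before installing, e.g. with Python's hashlib (a minimal sketch; reading the downloaded sdist is shown as a comment since the file path depends on where you saved it):

```python
import hashlib

def sha256_of(data: bytes) -> str:
  """Return the hex SHA256 digest of the given bytes."""
  return hashlib.sha256(data).hexdigest()

# With a real download, compare against the published SHA256 above:
# digest = sha256_of(open("phrase-detective-0.1.15.tar.gz", "rb").read())
```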


File details

Details for the file phrase_detective-0.1.15-py3-none-any.whl.

File metadata

  • Download URL: phrase_detective-0.1.15-py3-none-any.whl
  • Upload date:
  • Size: 11.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.7 CPython/3.8.2 Linux/5.8.0-1036-azure

File hashes

Hashes for phrase_detective-0.1.15-py3-none-any.whl

  • SHA256: 46772f0dcce86c9e277d2c266d45585a321260741a3377b946264e0923009440
  • MD5: 2bc365aed30ef4868775b5a6ea9e58ac
  • BLAKE2b-256: a17fd6a75f6da24165cb5d5cca8a7eb73b21bd218b5ce96c556a9ced516303eb

