
A Python wrapper for Stanford CoreNLP, simple and customizable.

Project description

corenlp-client

Prerequisites

Java 8

Python >= 3.6

Description

A simple, user-friendly Python wrapper for Stanford CoreNLP, a natural language processing toolkit written in Java. CoreNLP provides a linguistic annotation pipeline: tokenization, sentence splitting (ssplit), POS tagging, NER, constituency parsing, dependency parsing, Open IE, and more. However, because it is written in Java, it cannot be called directly from Python programs. That is why we developed this CoreNLP client tool in Python. The corenlp-client can start a CoreNLP server for you once you have downloaded the official release and the corresponding models. Alternatively, if a server is already running, all you need to do is specify the server's URL and call the annotate method.

Installation

pip install corenlp_client

Usage

Quick start:

Sometimes you just want tokenization (or POS tags, or named entities) as plain Python lists without any extra work. For that, the tokenize(), pos_tag(), and ner() methods simplify the whole process.

from corenlp_client import CoreNLP
# max_mem: maximum memory to use, default is 4. threads: number of threads to use, default is the number of CPU cores.
annotator = CoreNLP(annotators="tokenize", corenlp_dir="/path/to/corenlp", local_port=9000, max_mem=4, threads=2)
data = "Your raw text goes here."  # any plain string to annotate
# set `clean_text=True` to remove extra spaces in your text
# with `ssplit=False`, you get a flat list of tokens without sentence splitting
tokenized_text = annotator.tokenize(data, ssplit=False, clean_text=True)
pos_tags = annotator.pos_tag(data)
ners = annotator.ner(data)
annotator.close()

Start a new server and annotate text:

If you want to start a server locally, it is more graceful to use with ... as ... so that exceptions are handled and the server is shut down properly.

from corenlp_client import CoreNLP
# max_mem: maximum memory to use, default is 4. threads: number of threads to use, default is the number of CPU cores.
with CoreNLP(annotators="tokenize", corenlp_dir="/path/to/corenlp", local_port=9000, max_mem=4, threads=2) as annotator:
    # your code here
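    # for example (illustrative only), tokenize a sample sentence:
    print(annotator.tokenize("CoreNLP is your one stop shop for natural language processing in Java!"))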

Use an existing server:

You can also use an existing server by providing its URL.

from corenlp_client import CoreNLP
# lang: language of the text, default is "en".
# you can specify which annotators to use by passing `annotators="tokenize,ssplit"` to CoreNLP. If not provided, all available annotators will be used.
with CoreNLP(url="https://corenlp.run", lang="en") as annotator:
    # your code here
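    # for example (illustrative only), annotate a sample sentence on the remote server:
    anno = annotator.annotate("CoreNLP is your one stop shop for natural language processing in Java!")
    print(anno.tokens)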

Advanced Usage

For advanced users who want access to the server's original response as a dict:

anno = annotator.annotate("CoreNLP is your one stop shop for natural language processing in Java! Enjoy yourself! ")
print(anno.tokens) # tokens
print(anno.parse_tree) # parse
print(anno.bi_parse_tree) # binaryParse
print(anno.basic_dep) # basicDependencies
print(anno.enhanced_dep) # enhancedDependencies
print(anno.enhanced_pp_dep) # enhancedPlusPlusDependencies
print(anno.entities) # entitymentions
print(anno.openie) # openie

print(anno.ann_result) # original server's response format
print(anno.pretty_print_tree(anno.parse_tree[0])) # pretty print parse tree's structure
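
As a rough sketch (assuming ann_result mirrors CoreNLP's standard JSON response, i.e. a dict with a "sentences" list whose entries each carry a "tokens" list of token dicts), you could also walk the raw response yourself:

for sentence in anno.ann_result["sentences"]:
    for token in sentence["tokens"]:
        # "pos" and "ner" keys are only present if those annotators were run
        print(token["word"], token.get("pos"), token.get("ner"))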

Extra Notes

Note that if you choose to start the server locally, it will take a while to load the models the first time. Also, if "with" is not used, remember to call the close() method to stop the Java CoreNLP server.
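
For reference, a minimal sketch of that manual pattern (the path and sample text are placeholders):

from corenlp_client import CoreNLP

annotator = CoreNLP(annotators="tokenize", corenlp_dir="/path/to/corenlp", local_port=9000)
try:
    print(annotator.tokenize("Hello world."))
finally:
    annotator.close()  # stop the Java CoreNLP server even if an error occurred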

