The Python Xrenner JSON-NLP package
Project description
Xrenner to JSON-NLP
(C) 2019 by Damir Cavar, Oren Baldinger, Maanvitha Gongalla, Anurag Kumar, Murali Kammili, Boli Fang
Brought to you by the NLP-Lab.org!
Introduction
An Xrenner wrapper for JSON-NLP. Xrenner specializes in coreference and anaphora resolution, and its output is more richly annotated than a plain coreference chain.
Required Dependency Parse
Xrenner requires a dependency parse in CoNLL-U format. This can come from CoreNLP or from another parser that provides Universal Dependencies in CoNLL-U format. There are two ways to accomplish this:
CoreNLP Server
The XrennerPipeline class will take care of the details; however, it requires an available CoreNLP server.
The easiest way to create one is with Docker:
docker pull nlpbox/corenlp
docker run -p 9000:9000 -ti nlpbox/corenlp
To test this, open a new terminal and run:
wget -q --post-data "Although they didn't like it, they accepted the offer." 'localhost:9000/?properties={"annotators":"depparse","outputFormat":"conll"}' -O /dev/stdout
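The same check can also be made from Python. Here is a minimal sketch using the requests library, assuming the Docker container above is listening on localhost:9000:

import requests

# Ask the CoreNLP server for a dependency parse in CoNLL format.
response = requests.post(
    "http://localhost:9000/",
    params={"properties": '{"annotators": "depparse", "outputFormat": "conll"}'},
    data="Although they didn't like it, they accepted the offer.".encode("utf-8"),
)
print(response.text)  # a CoNLL-formatted dependency parse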
You then need to create a .env file in the root of the project; follow the example in sample_env. The default entry that corresponds to the Docker command above is:
CORENLP_SERVER=http://localhost:9000
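With the server configured, processing text from Python might look like the sketch below; the process method name and text keyword follow the usual JSON-NLP pipeline convention and are assumptions here, not confirmed API:

import json
from xrennerjsonnlp import XrennerPipeline

# Assumed entry point: takes raw text and returns a JSON-NLP dictionary.
# The CoreNLP server named in .env supplies the dependency parse.
json_nlp = XrennerPipeline.process(text="John went to the store. He bought some milk.")
print(json.dumps(json_nlp, indent=2))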
Provide your own CoNLL-U
Use the XrennerPipeline.process_conll function, passing your CoNLL-U data as a string via the conll argument. You may find the pyjsonnlp.conversion.to_conllu function helpful for converting JSON-NLP, for example output from spaCy, to CoNLL-U.
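As a sketch (parse.conllu is a hypothetical file holding a CoNLL-U dependency parse):

from xrennerjsonnlp import XrennerPipeline

# Read previously produced CoNLL-U data, e.g. converted with
# pyjsonnlp.conversion.to_conllu from a spaCy JSON-NLP document.
with open("parse.conllu", encoding="utf-8") as f:
    conll_string = f.read()

# Pass the CoNLL-U data as a string via the conll argument.
json_nlp = XrennerPipeline.process_conll(conll=conll_string)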
Microservice
The JSON-NLP repository provides a Microservice class with a pre-built Flask implementation. To run it, execute:
python xrennerjsonnlp/server.py
Since server.py extends the Flask app, a WSGI file would contain:
from xrennerjsonnlp.server import app as application
Text is provided to the microservice with the text parameter, via either GET or POST. If you pass url as a parameter, the microservice will scrape that URL and process the text of the website. Here is an example GET call:
http://localhost:5000?text=John went to the store. He bought some milk.
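The same call from Python, as a sketch with the requests library (assuming the Flask server is running on port 5000):

import requests

# Query the microservice with the text parameter via GET.
response = requests.get(
    "http://localhost:5000/",
    params={"text": "John went to the store. He bought some milk."},
)
json_nlp = response.json()  # the JSON-NLP output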
The process_conll endpoint mentioned above is available at the /process_conll URI. Instead of passing text, pass conll. A POST request will be easier than GET in this situation.
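A sketch of such a POST with the requests library; parse.conllu is a hypothetical file, and sending conll as form data is an assumption:

import requests

# CoNLL-U data spans many lines, so send it in the request body via POST.
with open("parse.conllu", encoding="utf-8") as f:
    conll_data = f.read()

response = requests.post(
    "http://localhost:5000/process_conll",
    data={"conll": conll_data},
)
json_nlp = response.json()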
File details
Details for the file xrennerjsonnlp-0.0.5.tar.gz.
File metadata
- Download URL: xrennerjsonnlp-0.0.5.tar.gz
- Upload date:
- Size: 14.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | b8e8290b2d3499b87af9eb126aa47360f2da5a386a6cd2fa90866730a5784d7a
MD5 | 59a93fc9a59172922f397983e66f39f3
BLAKE2b-256 | a4e7dfc5ba641d865bc4bfcc8469e790a1ba4ebc9cb728cc2ca0b09702095dc3
File details
Details for the file xrennerjsonnlp-0.0.5-py3-none-any.whl.
File metadata
- Download URL: xrennerjsonnlp-0.0.5-py3-none-any.whl
- Upload date:
- Size: 18.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | a190be4470d5f352d4ed51e7bd634b584bddd6555078260bd00356b73048db9f
MD5 | 0422d51894443f05fc0e3fdbc17e96ea
BLAKE2b-256 | d8b096196b72e87fcc67072ec07dd0485d68780ae22d4c9df1cd3ba094da2bb8