Utility functions for the Impact and Fiction project

impfic-core

Project status: WIP – initial development is in progress, but there has not yet been a stable release suitable for public use.

Core code base for common functionalities

Installing

pip install impfic-core

Usage

Dealing with output from different parsers

The Doc class of impfic-core offers a unified API to parsed documents from different parsers (currently SpaCy and Trankit).

import spacy
from trankit import Pipeline

import impfic_core.parse.doc as parse_doc

spacy_nlp = spacy.load('en_core_web_lg')

trankit_nlp = Pipeline('english')

# First paragraph of Moby Dick, taken from Project Gutenberg (https://www.gutenberg.org/cache/epub/2701/pg2701-images.html)
text = """Call me Ishmael. Some years ago—never mind how long precisely—having little or no money in my purse, and nothing particular to interest me on shore, I thought I would sail about a little and see the watery part of the world. It is a way I have of driving off the spleen and regulating the circulation. Whenever I find myself growing grim about the mouth; whenever it is a damp, drizzly November in my soul; whenever I find myself involuntarily pausing before coffin warehouses, and bringing up the rear of every funeral I meet; and especially whenever my hypos get such an upper hand of me, that it requires a strong moral principle to prevent me from deliberately stepping into the street, and methodically knocking people’s hats off—then, I account it high time to get to sea as soon as I can. This is my substitute for pistol and ball. With a philosophical flourish Cato throws himself upon his sword; I quietly take to the ship. There is nothing surprising in this. If they but knew it, almost all men in their degree, some time or other, cherish very nearly the same feelings towards the ocean with me."""

Document objects have the following properties: text (the whole text string), sentences, tokens, entities, and an optional metadata dictionary (with arbitrary keys and values).

# parse with both SpaCy and Trankit
spacy_doc = spacy_nlp(text)
trankit_doc = trankit_nlp(text)

# First, turn the SpaCy document object into an impfic Doc
impfic_doc1 = parse_doc.spacy_json_to_doc(spacy_doc.to_json())

# Next, turn the Trankit document object into an impfic Doc
impfic_doc2 = parse_doc.trankit_json_to_doc(trankit_doc)

# Show type and length of impfic_core Doc
# Doc length is number of tokens
print('impfic Doc of SpaCy parse:', type(impfic_doc1), len(impfic_doc1))

print('impfic Doc of Trankit parse:', type(impfic_doc2), len(impfic_doc2))

Outputs:

>>> impfic Doc of SpaCy parse: <class 'impfic_core.parse.doc.Doc'> 190
>>> impfic Doc of Trankit parse: <class 'impfic_core.parse.doc.Doc'> 226

Sentence objects have the following properties:

  • id: ID of the sentence in the document (running numbers)
  • tokens: a list of Token objects
  • entities: a list of Entity objects (named entities identified by the parser)
  • text: the sentence as text string
  • start: the character offset of the start of the sentence within the document
  • end: the character offset of the end of the sentence within the document
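The layout above can be sketched with plain dataclasses. Note this is an illustrative sketch of the described properties only, not the library's actual class definitions; the real Token and Entity classes carry more attributes than shown here.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Token:
    # minimal stand-in for a token; the real class has more attributes
    text: str


@dataclass
class Entity:
    # minimal stand-in for a named-entity span
    text: str
    label: str


@dataclass
class Sentence:
    id: int                 # running sentence number within the document
    tokens: List[Token]     # the sentence's tokens
    entities: List[Entity]  # named entities identified by the parser
    text: str               # the sentence as a text string
    start: int              # character offset of sentence start in the document
    end: int                # character offset of sentence end in the document


sent = Sentence(
    id=0,
    tokens=[Token('Call'), Token('me'), Token('Ishmael'), Token('.')],
    entities=[Entity('Ishmael', 'PERSON')],
    text='Call me Ishmael.',
    start=0,
    end=16,
)
print(sent.id, len(sent.tokens), sent.text)
```

The start and end offsets refer to the document-level string, so `doc.text[sent.start:sent.end]` recovers the sentence text.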

Extracting Clausal Units

# `pattern` is assumed to provide the clause extraction utilities of impfic-core
sent = doc.sentences[5]
print(sent.text)
clauses = pattern.get_verb_clauses(sent)
for clause in clauses:
    print('clause:', [t.text for t in clause])

Outputs:

With a philosophical flourish Cato throws himself upon his sword; I quietly take to the ship.
clause: ['With', 'a', 'philosophical', 'flourish', 'Cato', 'throws', 'himself', 'upon', 'his', 'sword', ';', '.']
clause: ['I', 'quietly', 'take', 'to', 'the', 'ship']

External Resources

To use utilities for external resources such as the RBN, you need to point to your copy of those resources in the settings (settings.py). Once you have done that, you can use them with:

from settings import rbn_file
from impfic_core.resources.rbn import RBN

rbn = RBN(rbn_file)

rbn.has_term('aanbiddelijk') # returns True

Anonymisation

For review anonymisation you need a salt in a file called impfic_core/secrets.py. The repository doesn't contain this file, to ensure others cannot recreate the user ID mapping. An example file is available as impfic_core/secrets_example.py. Copy this file to impfic_core/secrets.py and update the salt to do your own user ID mapping.
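The mapping itself amounts to hashing each user ID together with the secret salt. The sketch below shows the idea with hashlib; the function name and salt value are illustrative, not the library's actual API, and in the real setup the salt lives in impfic_core/secrets.py.

```python
import hashlib

# Purely illustrative; the real salt is kept in impfic_core/secrets.py,
# which is deliberately excluded from the repository.
SALT = 'replace-with-your-own-secret-salt'


def anonymise_user_id(user_id: str, salt: str = SALT) -> str:
    """Map a user ID to a stable pseudonym via a salted SHA-256 hash."""
    return hashlib.sha256((salt + user_id).encode('utf-8')).hexdigest()


# The same input always yields the same pseudonym, but without the salt
# the mapping cannot be recreated from the user IDs alone.
print(anonymise_user_id('user-123'))
```

Keeping the salt out of version control is what makes the pseudonyms irreversible for anyone without access to secrets.py.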
