

🔮 weak-nlp

Intelligent information integration based on weak supervision (Python 3.9, PyPI 0.0.13)

Installation

You can set up this library either by running $ pip install weak-nlp, or by cloning this repository and running $ pip install -r requirements.txt inside the cloned repository.

A sample installation would be:

$ conda create --name weak-nlp python=3.9
$ conda activate weak-nlp
$ pip install weak-nlp

Usage

The library consists of three main entities:

  • Associations: an association contains the information of one record <> label mapping. This does not have to be the ground truth label for a given record; it can also come from e.g. a labeling function (see below for an example).
  • Source vectors: A source vector combines the created associations from one logical source. Additionally, it marks whether the respective source vector can be seen as a reference vector, such as a manually labeled source vector containing the true record <> label mappings.
  • Noisy label matrices: Collection of source vectors that can be analyzed w.r.t. quality metrics (such as the confusion matrix, i.e., true positives etc.), quantity metrics (intersections and conflicts) or weakly supervisable labels.

The following is an example of building a noisy label matrix for a classification task:

import weak_nlp

def contains_keywords(text):
    # Labeling function: returns "regular" if the text contains one of the keywords, else None
    if any(term in text for term in ["val1", "val2", "val3"]):
        return "regular"

texts = [...]

lf_associations = []
for text_id, text in enumerate(texts):
    label = contains_keywords(text)
    if label is not None:
        association = weak_nlp.ClassificationAssociation(text_id + 1, label)
        lf_associations.append(association)

lf_vector = weak_nlp.SourceVector(contains_keywords.__name__, False, lf_associations)  # False: not a reference vector

ground_truths = [
    weak_nlp.ClassificationAssociation(1, "clickbait"),
    weak_nlp.ClassificationAssociation(2, "regular"),
    weak_nlp.ClassificationAssociation(3, "regular")
]

gt_vector = weak_nlp.SourceVector("ground_truths", True, ground_truths)  # True: reference vector with the true labels

cnlm = weak_nlp.CNLM([gt_vector, lf_vector])
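
To make the quality metrics mentioned in the list above tangible, here is a minimal, library-independent sketch of the kind of comparison a noisy label matrix enables between a labeling function vector and the reference vector. The record <> label dictionaries and the resulting counts are illustrative assumptions, not the weak_nlp API; consult the repository for the actual CNLM methods.

# Library-independent sketch: count how often the labeling function agrees
# with the manually labeled reference. The mappings below are hypothetical.
lf_labels = {1: "regular", 2: "regular"}                   # what contains_keywords might have produced
gt_labels = {1: "clickbait", 2: "regular", 3: "regular"}   # the manually labeled reference

true_positives = sum(
    1 for record_id, label in lf_labels.items()
    if gt_labels.get(record_id) == label
)
false_positives = sum(
    1 for record_id, label in lf_labels.items()
    if record_id in gt_labels and gt_labels[record_id] != label
)
print(true_positives, false_positives)  # -> 1 1 for the hypothetical data above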

For extraction tasks, on the other hand, your code snippet could look as follows:

import weak_nlp

def match_keywords(text):
    for idx, token in enumerate(text.split()):
        if token in ["val1", "val2", "val3"]:
            yield "person", idx, idx+1 # label, from_idx, to_idx

texts = [...]

lf_associations = []
for text_id, text in enumerate(texts):
    for triplet in match_keywords(text):
        label, from_idx, to_idx = triplet
        association = weak_nlp.ExtractionAssociation(text_id + 1, label, from_idx, to_idx)
        lf_associations.append(association)

lf_vector = weak_nlp.SourceVector(match_keywords.__name__, False, lf_associations)

ground_truths = [
    weak_nlp.ExtractionAssociation(1, "person", 1, 2),
    weak_nlp.ExtractionAssociation(2, "person", 4, 5),
]

gt_vector = weak_nlp.SourceVector("ground_truths", True, ground_truths)

enlm = weak_nlp.ENLM([gt_vector, lf_vector])
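
To illustrate the quantity metrics (intersections and conflicts) for extraction, the sketch below compares hypothetical spans from two sources on the same record. The span data and the half-open token-range convention are assumptions made for illustration only, not part of the weak_nlp API.

# Library-independent sketch: overlapping spans with the same label count as an
# intersection, overlapping spans with different labels as a conflict.
# record_id -> (label, from_idx, to_idx); spans treated as half-open token ranges (assumption).
source_a = {1: ("person", 1, 3)}
source_b = {1: ("person", 2, 4)}

for record_id, (label_a, from_a, to_a) in source_a.items():
    if record_id not in source_b:
        continue
    label_b, from_b, to_b = source_b[record_id]
    overlaps = from_a < to_b and from_b < to_a  # token ranges share at least one index
    if overlaps and label_a == label_b:
        print(f"record {record_id}: intersection on '{label_a}'")
    elif overlaps:
        print(f"record {record_id}: conflict ({label_a} vs. {label_b})")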

Roadmap

If you want to have something added, feel free to open an issue.

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

And please don't forget to leave a ⭐ if you like the work!

License

Distributed under the Apache 2.0 License. See LICENSE.txt for more information.

Contact

This library is developed and maintained by kern.ai. If you want to provide us with feedback or have some questions, don't hesitate to contact us. We're super happy to help ✌️

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

weak_nlp-0.0.13-py2.py3-none-any.whl (18.5 kB)

Uploaded: Python 2, Python 3

File details

Details for the file weak_nlp-0.0.13-py2.py3-none-any.whl.

File metadata

  • Download URL: weak_nlp-0.0.13-py2.py3-none-any.whl
  • Upload date:
  • Size: 18.5 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.10.3

File hashes

Hashes for weak_nlp-0.0.13-py2.py3-none-any.whl

  • SHA256: 5a511d52bd4f339624803a230f8d876442106519b0287a62432c21e6123d571a
  • MD5: 523d4528d993d6f0de3deb1e3b08580d
  • BLAKE2b-256: a9889dee6f60189933919d13a1562b0b2664ac0699d6c0d15540f7491a2ff8d4

