BPE Summarizer

This summarizer attempts to leverage Byte Pair Encoding (BPE) tokenization and the Bart vocabulary to filter text by semantic meaningfulness.

BPE text representation is a subword-level approach to tokenization that aims to efficiently reuse parts of words while retaining semantic value.

The algorithm is based on the frequency of n-gram pairs. More frequent pairs are represented by larger tokens.
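As a quick illustration, here is a sketch using the Bart tokenizer the project builds on: frequent words survive as single tokens, while rarer words fragment into smaller subword pieces (the exact splits depend on the learned vocabulary).

from transformers import BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")

# A frequent word survives as a single token.
print(tokenizer.tokenize("the"))           # ['the']

# A rarer word is split into reusable subword pieces (exact split may vary).
print(tokenizer.tokenize("tokenization"))  # e.g. ['token', 'ization']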

This project explores the assumption that token size correlates strongly with semantic meaningfulness. The summarization approach surfaces the most meaningful sentences by comparing token values and retaining those sentences from the original text that contain meaningful tokens within a specified percentile.
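A minimal sketch of that idea, scoring each sentence by the length of its largest BPE token; the helper name, the period-based sentence split, and the length-based score are illustrative assumptions, not the package's actual implementation.

import numpy as np
from transformers import BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")

def sketch_summarize(document, percentile=99):
    # Naive sentence split; the package expects punctuation-delineated sentences.
    sentences = [s.strip() for s in document.split(".") if s.strip()]
    # Proxy for "token size": the length of the largest subword in the sentence.
    scores = [max(len(tok) for tok in tokenizer.tokenize(s)) for s in sentences]
    # Retain only sentences whose score reaches the top kth percentile.
    threshold = np.percentile(scores, percentile)
    return ". ".join(s for s, sc in zip(sentences, scores) if sc >= threshold) + "."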

Install

pip install bpe-summarizer

Usage

from bpe_summarizer import bpe_summarize

bpe_summarize(article, percentile=99)

Parameters

| Parameter | Definition | Default | Type |
| --- | --- | --- | --- |
| document | A text blob with sentences delineated by punctuation | None | String |
| percentile | Sentences that include tokens in the top kth percentile remain after summarization | 99 | Float |
| tokenizer | A Hugging Face PreTrainedTokenizer instance that relies on byte-pair encoding | BartTokenizer.from_pretrained("facebook/bart-large") | transformers.PreTrainedTokenizer |
| apply_intra_sentence | If True, summarization is applied at both the document level and the sentence level | False | Boolean |
| intra_sentence_percentile | When apply_intra_sentence is True, this percentile is applied to individual sentences | 50* | Float |

  • Note: intra_sentence_percentile is ignored if its value is below the percentile score of the mean token value; in that case the percentile score of the mean is used instead.
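Putting the parameters together, a sketch of a call that enables intra-sentence trimming; keyword names follow the table above, and article is any plain-text string.

from transformers import BartTokenizer
from bpe_summarizer import bpe_summarize

summary = bpe_summarize(
    article,                       # punctuation-delineated text blob
    percentile=95,                 # keep sentences with tokens in the top 5%
    tokenizer=BartTokenizer.from_pretrained("facebook/bart-large"),
    apply_intra_sentence=True,     # also summarize within each surviving sentence
    intra_sentence_percentile=50,  # see note above: the mean's percentile score may override this
)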

Examples

Human Summary

Building Deep Dependency Structures Using A Wide-Coverage CCG Parser

This paper describes a wide-coverage statistical parser that uses Combinatory Categorial Grammar (CCG) to derive dependency structures.

The parser differs from most existing wide-coverage treebank parsers in capturing the long-range dependencies inherent in constructions such as coordination, extraction, raising and control, as well as the standard local predicate-argument dependencies.

A set of dependency structures used for training and testing the parser is obtained from a treebank of CCG normal-form derivations, which have been derived (semi-)automatically from the Penn Treebank.

The parser correctly recovers over 80% of labelled dependencies, and around 90% of unlabelled dependencies.

We provide examples showing how heads can fill dependency slots during a derivation, and how long-range dependencies can be recovered through unification of co-indexed head variables.

We define predicate argument structure for CCG in terms of the dependencies that hold between words with lexical functor categories and their arguments.

BPE Summary

Building Deep Dependency Structures Using A Wide-Coverage CCG Parser

This paper describes a wide-coverage statistical parser that uses Combinatory Categorial Grammar (CCG) to derive dependency structures.

The parser differs from most existing wide-coverage treebank parsers in capturing the long-range dependencies inherent in constructions such as coordination, extraction, raising and control, as well as the standard local predicate-argument dependencies.

A set of dependency structures used for training and testing the parser is obtained from a treebank of CCG normal-form derivations, which have been derived (semi-) automatically from the Penn Treebank. The parser correctly recovers over 80% of labelled dependencies, and around 90% of unlabelled dependencies. However, the dependencies are typically derived from a context-free phrase structure.

Evaluation

To evaluate the quality of the summarization, we apply a semantic similarity metric to compare auto-summarized examples with human summaries from the scisummnet dataset. Text was represented using sentence-level embeddings. Figure 1 charts the results from the BPE Summarizer against widely used summarization techniques. It performed competitively, and over 100 samples it completed summarization in one one-hundredth of a second, compared with 55 seconds* for the competing technique.

Fig 1. Evaluation alongside a widely used summarizer

*Performance was measured on a CPU, and the competing technique was stripped down to its summarization component before timing.
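For reference, a sketch of that style of comparison, assuming the sentence-transformers library and an off-the-shelf embedding model; the project's exact embedding setup is not specified here.

from sentence_transformers import SentenceTransformer, util

# Any sentence-embedding model will do for the sketch; this choice is an assumption.
model = SentenceTransformer("all-MiniLM-L6-v2")

def summary_similarity(auto_summary, human_summary):
    # Embed both summaries and compare them with cosine similarity.
    embeddings = model.encode([auto_summary, human_summary], convert_to_tensor=True)
    return util.cos_sim(embeddings[0], embeddings[1]).item()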
