
A set of tools for connecting to the Augmented Criticism Lab


Augmented Criticism Lab Toolkit and Connectors

This set of tools is designed for interfacing with the Augmented Criticism Lab's API, https://acriticismlab.org. The toolkit can be installed with pip:

pip install Augmented-Criticism-Lab-Toolkit

Using the Connectors

Connectors are used to pull data from the database over the API. Here are some examples:

from connectors.poem import Poem

# To get a list of all poems:
all_poems = Poem().all()

# To get a specific poem by database id:
single_poem = Poem().by_id(1)

from connectors.book import Book

# To get a list of all books:
all_books = Book().all()

# To get a specific book by database id:
single_book = Book().by_id(1)

The included connectors are book, poem, and section. Each connector works on the same principle.
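
Following the same pattern, the section connector can presumably be used like the two examples above; the class name Section below is an assumption based on the poem and book connectors, not confirmed by the examples:

from connectors.section import Section

# To get a list of all sections (assuming the class is named Section):
all_sections = Section().all()

# To get a specific section by database id:
single_section = Section().by_id(1)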

Using the Tools

API-based tools:

from tools.api import Tools
# Lemmatize text:
lemmas = Tools().lemmatize("text to lemmatize")

# Part of speech tags:
tags = Tools().pos_tag("text to tag")

# Frequency distribution:
freqdist = Tools().frequency_distribution("text to get distribution for")

# Topic model:
model = Tools().topic_model("text to model")

Note: Topic models take about a minute to run.

Python-based tools:

Rhyme Scheme Analyzer:

from tools.rhyme import Rhyme
from tools.rhyme import classify_sonnet

# Initialize a Rhyme object with the text you want to analyze.
# The text must be separated into lines; you can define a delimiter
# (the default is '\n'). find_rhyme_scheme() returns the rhyme scheme
# as a list of letters, one per line, e.g.:
# ['A','B','B','A','C','D','D','C','E','F','E','F','G','G']
rhyme = Rhyme("text\n broken\n into lines", delimiter='\n').find_rhyme_scheme()

# To classify the rhyme scheme (only works for sonnets) run classify_sonnet.
# It returns a tuple of probabilities, one per sonnet type, in the order:
# (Petrarchan 1, Petrarchan 2, Petrarchan 3, Shakespearean, Spenserian)
sonnet_type = classify_sonnet(rhyme)
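
Since the tuple is ordered as listed above, one way to read off the most likely classification is to take the index of the largest probability. A minimal sketch (assumes the tuple contains plain numeric values):

# Map each probability back to its sonnet type and pick the most likely one.
sonnet_labels = ['Petrarchan 1', 'Petrarchan 2', 'Petrarchan 3', 'Shakespearean', 'Spenserian']
probabilities = classify_sonnet(rhyme)
best = max(range(len(probabilities)), key=lambda i: probabilities[i])
print(sonnet_labels[best], probabilities[best])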

Syllable Counter:

from tools.syllable import SyllableCounter

# Initialize a counter:
syllable_counter = SyllableCounter()

# Run a line of poetry through the counter:
syllable_count_for_line = syllable_counter.count_syllables_by_line("line of text")
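
For a multi-line string, one option is to split on newlines and feed each line through the counter separately. This is only a sketch; it assumes count_syllables_by_line returns the count for the single line it is given:

# Count syllables for each line of a multi-line poem.
poem_text = "Shall I compare thee to a summer's day?\nThou art more lovely and more temperate:"
counts_per_line = [syllable_counter.count_syllables_by_line(line)
                   for line in poem_text.split('\n')]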

It is also possible to run the syllable counter on a poem from the ACL database directly:

from tools.syllable import SyllableCounter
from connectors.poem import Poem

# Initialize a counter:
syllable_counter = SyllableCounter()

# Get a poem:
poem = Poem().by_id(1)

# Get counts for the poem:
counts = syllable_counter.count_syllables_poem(poem)

Extending the OUTLIERS (i.e. words for which the counter does not give correct syllable counts):

Create a CSV file formatted as:

WORD, NUMBER_OF_SYLLABLES

# Example
apple, 2
orange, 2
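
If you prefer to generate the file programmatically, you can write rows in the same WORD, NUMBER_OF_SYLLABLES format; the filename, words, and counts below are just placeholders:

# Write a custom outliers file matching the format shown above.
outliers = {'apple': 2, 'orange': 2}
with open('custom_outliers.csv', 'w') as outfile:
    for word, syllables in outliers.items():
        outfile.write(f"{word}, {syllables}\n")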

Then load the CSV file:

from tools.syllable import SyllableCounter

# Initialize a SyllableCounter.
syllable_counter = SyllableCounter()

# Load the custom outliers file into the counter.
syllable_counter.load_custom_outliers('PATH_TO_FILE')
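
Once the custom outliers are loaded, later counts should use the corrected values. A minimal end-to-end sketch (the file path is a placeholder):

from tools.syllable import SyllableCounter

# Load corrections, then count a line that contains one of the outlier words.
syllable_counter = SyllableCounter()
syllable_counter.load_custom_outliers('custom_outliers.csv')
count = syllable_counter.count_syllables_by_line("an apple and an orange")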

Files for Augmented-Criticism-Lab-Toolkit, version 1.1.9:

Augmented_Criticism_Lab_Toolkit-1.1.9-py3-none-any.whl (19.0 kB, wheel, py3)
Augmented Criticism Lab Toolkit-1.1.9.tar.gz (14.1 kB, source)
