
Reproducible Experimentation for Computational Linguistics Use

Recluse

Author: L. Amber Wilcox-O'Hearn

Contact: amber@cs.toronto.edu

Released under the GNU Affero General Public License; see the COPYING file for details.

==============
Introduction
==============

Recluse (Reproducible Experimentation for Computational Linguistics Use) is a set of tools for running computational linguistics experiments reproducibly.

This version contains:

* utils, which has three functions:
** open_with_unicode, for reading and writing unicode text in either regular or compressed files.
** split_file_into_chunks, for splitting a file into smaller pieces. This is needed for tools that load everything into RAM, or that train on all the data when training on a subset would suffice.
** partition_by_list, which works like a combination of the string methods partition and split: it splits the whole string into a list, but keeps the separators as elements.
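
As a concrete illustration of the first and third utilities, the described behaviour can be sketched in plain Python. These are hypothetical reimplementations for illustration, not the package's actual code; the function names here are made up:

```python
import gzip
import re

def open_text(path, mode="r", encoding="utf-8"):
    """Open a plain or gzip-compressed file in text mode, in the
    spirit of open_with_unicode: .gz paths are (de)compressed
    transparently, and contents are decoded/encoded as unicode."""
    if path.endswith(".gz"):
        return gzip.open(path, mode + "t", encoding=encoding)
    return open(path, mode, encoding=encoding)

def partition_by_separators(text, separators):
    """Split text on any of the separator strings, keeping the
    separators themselves as elements of the result -- like
    str.partition applied repeatedly, or str.split without
    discarding what it split on."""
    # A capturing group makes re.split return the matched separators.
    pattern = "(" + "|".join(re.escape(s) for s in separators) + ")"
    return [piece for piece in re.split(pattern, text) if piece != ""]

print(partition_by_separators("a, b; c", [", ", "; "]))
# -> ['a', ', ', 'b', '; ', 'c']
```

Keeping the separators matters when the pieces must later be rejoined losslessly, e.g. when rewriting only some tokens of a line.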

* article_randomiser, which reproducibly and randomly divides a corpus into training, development, and test sets.
* nltk_based_segmenter_tokeniser, which performs sentence segmentation and word tokenisation, optimised for Wikipedia-style text.
* vocabulary_generator and its helper class vocabulary_cutter, which wrap SRILM to produce unigram counts and then select the most frequent words.
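
The reproducibility of the randomisation step comes from seeding the random number generator, so the same corpus and seed always yield the same partition. A minimal sketch of that idea follows; the function name, seed, and 80/10/10 ratios are assumptions for illustration, not article_randomiser's actual interface:

```python
import random

def split_corpus(articles, seed=0, weights=(0.8, 0.1, 0.1)):
    """Assign each article to train/dev/test with the given
    probabilities, deterministically for a fixed seed."""
    rng = random.Random(seed)  # fixed seed => reproducible draws
    sets = {"train": [], "dev": [], "test": []}
    for article in articles:
        bucket = rng.choices(("train", "dev", "test"), weights=weights)[0]
        sets[bucket].append(article)
    return sets

articles = ["article_%d" % i for i in range(100)]
# Rerunning with the same seed reproduces the partition exactly.
assert split_corpus(articles, seed=7) == split_corpus(articles, seed=7)
```

Using a private random.Random instance, rather than the module-level functions, keeps the split independent of any other randomness in the experiment.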

Download files


Source Distribution

recluse-0.2.1.tar.gz (459.2 kB)


File details

File metadata for recluse-0.2.1.tar.gz:

  • Download URL: recluse-0.2.1.tar.gz
  • Upload date:
  • Size: 459.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for recluse-0.2.1.tar.gz:

  • SHA256: b06ddc386f11df6aa4556ac7889f5ea70ea3b58261057c03aa55aed6677b0aef
  • MD5: eb8b1f8f9735f826acb79af81b8dfc01
  • BLAKE2b-256: 32eed10ef1fde01765d54a27b30c1e56e58e85c0a7b2f5d5adc17b3f5ed0d87b

