
=======
Recluse
=======

Reproducible Experimentation for Computational Linguistics Use

Author: L. Amber Wilcox-O'Hearn

Contact: amber@cs.toronto.edu

Released under the GNU Affero General Public License; see the COPYING file for details.

==============
Introduction
==============

Recluse (Reproducible Experimentation for Computational Linguistics Use) is a set of tools for running computational linguistics experiments reproducibly.

This version contains:

* utils, which has three functions:
** open_with_unicode, for reading and writing Unicode text in plain or compressed files
** split_file_into_chunks, for splitting a file into smaller pieces. This is useful for tools that load everything into RAM, or that train on all of the data when training on a subset would suffice.
** partition_by_list, which works like a combination of the string methods partition and split: like partition, it keeps the separators, but like split, it returns a list of all the pieces.
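The intended behaviour of partition_by_list can be sketched as follows. This is a hypothetical re-implementation based only on the description above, not the package's actual code:

```python
import re

def partition_by_list(text, separators):
    """Split text on any of the given separators, keeping each
    separator as its own element, like a repeated str.partition."""
    # A capturing group in the pattern makes re.split keep the
    # matched separators in the result list.
    pattern = "(" + "|".join(re.escape(s) for s in separators) + ")"
    # Drop the empty strings re.split emits around leading or
    # trailing separators.
    return [piece for piece in re.split(pattern, text) if piece]

# e.g. partition_by_list("a, b; c", [", ", "; "])
# → ["a", ", ", "b", "; ", "c"]
```

The separators are escaped before being joined into the pattern, so they are matched literally even if they contain regex metacharacters.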

* article_randomiser, which reproducibly and randomly divides a corpus into training, development, and test sets.
* nltk_based_segmenter_tokeniser, which performs sentence segmentation and word tokenisation, optimised for Wikipedia-style text.
* vocabulary_generator and its helper class vocabulary_cutter, which wrap SRILM to compute unigram counts and then select the most frequent words as the vocabulary.
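A reproducible random split of the kind article_randomiser performs can be illustrated with a seeded random number generator. This is a minimal sketch, not the package's actual implementation; the function name, seed, and weights are assumptions for illustration:

```python
import random

def split_articles(articles, seed=37, weights=(0.8, 0.1, 0.1)):
    """Assign each article to the training, development, or test set.
    A local RNG with a fixed seed makes the split reproducible: the
    same seed and the same input order always yield the same split."""
    rng = random.Random(seed)  # local RNG; global random state is untouched
    train, dev, test = [], [], []
    for article in articles:
        r = rng.random()
        if r < weights[0]:
            train.append(article)
        elif r < weights[0] + weights[1]:
            dev.append(article)
        else:
            test.append(article)
    return train, dev, test
```

Because the generator is seeded explicitly, running the split twice over the same corpus produces identical training, development, and test sets, which is the property that makes downstream experiments repeatable.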
