An end-to-end solution for processing Capture-C, Tri-C and Tiled-C data

Project description

CapCruncher

Analysis software for Capture-C, Tri-C and Tiled-C data.

CapCruncher is a tool designed to automate the processing of Capture-C, Tri-C and Tiled-C data from FASTQ files. The package is written in Python and consists of an end-to-end data-processing pipeline together with a supporting command-line interface for finer-grained control. The pipeline is fast, robust and scales from a laptop to a computational cluster.

For further information, see the documentation.

Changelog

CapCruncher v0.2.0 - (2022-02-07)

Bug Fixes

  • CLI: Fixed the help option not being displayed when running capcruncher pipeline (#129) (9f09093)
  • Deduplication: Prevented excessive memory usage during duplicate removal (#136) (b175978)
  • Packaging: Fixed a bug introduced by updating pyyaml to the latest version (#122) (7d76b5f)
  • Packaging: Added missing dependencies (seaborn and trackhub) to setup.cfg (550a882)
  • Packaging: Fixed the packaging long description (#115) (6f716d1)
  • Pipeline: Fixed an issue with tasks exceeding their allotted number of cores (#133) (27cd193)
  • Pipeline: Fixed an error during deduplication when using gzip compression (#134) (01ff56b)
  • Pipeline: Re-partitioned reporter slices after filtering (#124) (db72c56)
  • Reporter comparisons: Fixed an issue when no data exists for a viewpoint in a given sample (#139) (e720029)
  • Storage: Fixed linking of common cooler tables (#137) (4836fbe)

Features

  • CLI: Enabled pileup normalisation using a set of regions supplied as a BED file (#121) (9c587ff)
  • Packaging: Moved all configuration from setup.py to setup.cfg (#114) (4835da4)
  • Pipeline: Expanded the number of viewpoints that can be processed (#128) (8fcb576)
  • Pipeline: Added the capability to normalise pileups (bedgraphs/bigwigs) by a set of supplied regions (#125) (bab07ea)
  • Pipeline: Enabled optional compression during FASTQ split and deduplication (#131) (0c32b73)
  • Pipeline: Enabled the use of custom filtering orders (#119) (b57ebe8)
  • Pipeline: Reduced the disk space required by the pipeline by removing intermediate files (#135) (d6c4302)
  • Pipeline: Reporter counting is now performed in parallel on separate partitions before collating (#117) (aae5356)
  • Pipeline: Reverted without_cluster for reporter comparisons (#140) (f847d28)
  • Storage: Reduced the disk space taken up by reporters (slices and counts) (#138) (7659a8c)

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
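As an alternative to downloading a distribution file manually, the release can be installed directly from PyPI with pip. This is a standard pip invocation, not a project-specific command:

```shell
# Install the 0.2.0 release of CapCruncher from PyPI
pip install capcruncher==0.2.0
```

pip will select the appropriate distribution (wheel or sdist) automatically.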

Source Distribution

capcruncher-0.2.0.tar.gz (128.2 kB)

Uploaded Source

Built Distribution

capcruncher-0.2.0-py3-none-any.whl (148.2 kB)

Uploaded Python 3

File details

Details for the file capcruncher-0.2.0.tar.gz.

File metadata

  • Download URL: capcruncher-0.2.0.tar.gz
  • Upload date:
  • Size: 128.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.1 requests/2.26.0 setuptools/57.4.0 requests-toolbelt/0.9.1 tqdm/4.62.2 CPython/3.9.6

File hashes

Hashes for capcruncher-0.2.0.tar.gz
  • SHA256: db15874b5bd3c31a64b4f0de7e19a153ae342345ff25c76958a5c381496664df
  • MD5: 717440d5687ac01c06728bbb82f48a42
  • BLAKE2b-256: ab8507383d62359c33bab143b6f85173a09852963625a5c6e23bbfd5c54d81d4

See more details on using hashes here.
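The published digests can be checked locally before installing. A minimal sketch using Python's standard hashlib; the filename and expected SHA256 value are taken from the listing above:

```python
import hashlib

# SHA256 digest published above for capcruncher-0.2.0.tar.gz
EXPECTED_SHA256 = "db15874b5bd3c31a64b4f0de7e19a153ae342345ff25c76958a5c381496664df"


def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    """Stream a file through SHA-256 in chunks and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Example: compare against the published digest after downloading the sdist
# ok = sha256_of("capcruncher-0.2.0.tar.gz") == EXPECTED_SHA256
```

pip can also enforce this automatically via its hash-checking mode, by pinning `capcruncher==0.2.0 --hash=sha256:...` in a requirements file.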

File details

Details for the file capcruncher-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: capcruncher-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 148.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.1 requests/2.26.0 setuptools/57.4.0 requests-toolbelt/0.9.1 tqdm/4.62.2 CPython/3.9.6

File hashes

Hashes for capcruncher-0.2.0-py3-none-any.whl
  • SHA256: f09e008077e117a2be5a140ea55adefe98b1b5b8101b74e6902449448697cedd
  • MD5: 724b91564734037954d23583265e2779
  • BLAKE2b-256: d8b8f378ab302aba8e011a3548bc8e12ca018dd51bae5e175f2aeac9b7023dbe

See more details on using hashes here.
