A toolbox for processing data from long-slit spectrographs

Project description

Welcome to specsuite!

For help with getting specsuite running on your own data, please check out the documentation page!

Introduction

Although other spectroscopic reduction tools exist, they are often designed for a small subset of instruments, have hard-to-read documentation, or are difficult to debug. specsuite was designed to address all three of these concerns, providing a set of robust, generalized, and user-friendly reduction tools! As of writing, this reduction pipeline has been tested against data from...

  • Gemini North (GMOS-N)
  • Apache Point Observatory (KOSMOS)
  • Sommers-Bausch Observatory (SBO)

...but we are constantly testing on data from other telescopes!

Another advantage of specsuite is its modularity. All functions were designed to be easy to slot into an existing reduction pipeline (assuming the data is formatted correctly). If there are features you would like to see added to specsuite, please feel free to add an issue on this repository for our developers to address!
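To illustrate this modular design, each reduction step can be written as a plain function over NumPy arrays and chained in whatever order your pipeline needs. The function names below are placeholders for illustration only, not specsuite's actual API:

```python
import numpy as np

# Placeholder reduction steps -- illustrative only, not specsuite's real API.
def subtract_bias(frame: np.ndarray, bias: np.ndarray) -> np.ndarray:
    """Remove the detector's bias level from a raw frame."""
    return frame - bias

def flat_field(frame: np.ndarray, flat: np.ndarray) -> np.ndarray:
    """Divide out pixel-to-pixel sensitivity variations (guarding zeros)."""
    return frame / np.where(flat == 0, 1.0, flat)

# Chain the steps into a simple pipeline.
raw = np.full((4, 4), 110.0)   # mock raw exposure
bias = np.full((4, 4), 10.0)   # mock bias frame
flat = np.full((4, 4), 2.0)    # mock flat frame

reduced = flat_field(subtract_bias(raw, bias), flat)
print(reduced[0, 0])  # 50.0
```

Because each step only takes and returns arrays, any one of them can be swapped for a step from your own pipeline, which is the kind of drop-in composition the modularity claim refers to.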

Installation

To install the most recent version of specsuite, run the following command from your terminal...

pip install specsuite

OR if you would like to install a version from this repository, run...

git clone https://github.com/Autumn10677/specsuite.git
cd specsuite
pip install .
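To confirm the install worked, you can query the installed version with Python's standard importlib.metadata module (this checks package metadata only; it does not exercise specsuite's API):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(package: str):
    """Return the installed version of *package*, or None if it is absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

print(installed_version("specsuite") or "specsuite is not installed")
```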

How can I test that specsuite runs on my computer?

We have provided a handful of files and scripts that should help you get started on processing your data.

  • specsuite_env.yml ~ A working Conda environment for the current version of the package.
  • workflow.smk ~ A Snakemake workflow configured to run on sample data taken from APO's long-slit spectrograph.

To run this workflow on your own computer, first clone the repository using...

git clone https://github.com/Autumn10677/specsuite.git
cd specsuite

Then run...

conda env create -f specsuite_env.yml
conda activate specsuite_env
snakemake --cores 1

This should deposit a set of files in an 'output/' folder that you can use to check out how the pipeline works at various steps in the analysis. These outputs include both images and '.npy' files used for storing exposure data between steps of the pipeline. If you see...

Finished jobid: 0 (Rule: all)
4 of 4 steps (100%) done

...then the pipeline ran successfully!
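To inspect one of the intermediate '.npy' files, numpy.load works directly. The snippet below saves and reloads a small array to mimic one of the pipeline's products; in practice you would point np.load at a real file in the 'output/' folder (the filename here is hypothetical):

```python
import numpy as np

# Save a small array to stand in for an intermediate exposure product,
# then load it back. The filename is hypothetical; substitute a real
# file from the output/ folder.
example = np.arange(12, dtype=np.float64).reshape(3, 4)
np.save("example_exposure.npy", example)

frame = np.load("example_exposure.npy")
print(frame.shape, frame.dtype)  # (3, 4) float64
```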

Download files

Download the file for your platform.

Source Distribution

specsuite-1.1.1.tar.gz (36.3 kB)

Uploaded Source

Built Distribution

specsuite-1.1.1-py3-none-any.whl (38.5 kB)

Uploaded Python 3

File details

Details for the file specsuite-1.1.1.tar.gz.

File metadata

  • Download URL: specsuite-1.1.1.tar.gz
  • Upload date:
  • Size: 36.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.7

File hashes

Hashes for specsuite-1.1.1.tar.gz
Algorithm Hash digest
SHA256 780f21eb5163514c60c5bd11646afac4b9ebb6828735487c3c2a373ae6f222a9
MD5 be33f86c7ab28bc0da16b0e846c52404
BLAKE2b-256 5f684dbb3339cc4208c041d261e57b08e6af3a970e03169c800c2cf3e5b7b37a

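You can check a downloaded file against the SHA256 digests listed above using Python's standard hashlib module. The demo file below is created on the fly just to show a known digest; substitute the path to your downloaded archive:

```python
import hashlib
from pathlib import Path

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Demonstrate on an empty temporary file, whose SHA-256 is well known.
Path("demo.bin").write_bytes(b"")
print(sha256_of("demo.bin"))
# e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
```

Compare the printed digest of your download against the table above; a mismatch means the file was corrupted or tampered with in transit.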

File details

Details for the file specsuite-1.1.1-py3-none-any.whl.

File metadata

  • Download URL: specsuite-1.1.1-py3-none-any.whl
  • Upload date:
  • Size: 38.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.7

File hashes

Hashes for specsuite-1.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 01c3fe8032e9fad03ab6b4680aaa8917bf757530e45fb67e14e4e72e0adb53e9
MD5 1bdfc51502743511e05b092c77ff724e
BLAKE2b-256 1e82e5fec257cb48e07f0cac0825b9d1c7374a3e102b5640f2f53a919bc75cff

