An interface layer for scripting the AMI-Reduce pipeline.
This package makes it trivial to script reduction of raw AMI data from Python. What's more, it provides tools to group the raw files into datasets, outputting the UVFITS for each dataset under a single folder. It does this by extracting the pointing information from the raw data, resulting in fairly reliable groupings (although you can edit these manually; see below).
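The grouping idea can be illustrated with a short standalone sketch. Note this is illustrative only, not the package's actual code, and the data shown are hypothetical: files whose pointing centres lie within some tolerance of each other are collected into one dataset.

```python
import math

def group_by_pointing(files, tol_deg=0.1):
    """Group (filename, ra_deg, dec_deg) tuples whose pointings agree within tol_deg."""
    groups = []  # each entry: {'pointing': (ra, dec), 'files': [...]}
    for name, ra, dec in files:
        for g in groups:
            gra, gdec = g['pointing']
            # Crude flat-sky angular separation; adequate for small tolerances.
            if math.hypot((ra - gra) * math.cos(math.radians(dec)), dec - gdec) < tol_deg:
                g['files'].append(name)
                break
        else:
            groups.append({'pointing': (ra, dec), 'files': [name]})
    return groups

# Hypothetical raw-file listing: (filename, RA, Dec) in degrees.
raw = [("obs1.raw", 120.001, 45.002),
       ("obs2.raw", 120.003, 45.001),
       ("obs3.raw", 200.500, -10.000)]
datasets = group_by_pointing(raw)
# obs1 and obs2 share a pointing; obs3 forms its own dataset.
```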
When processing the data, all output from reduce is saved to an accompanying log-file, retaining all information that would normally be available to the user from the interactive interface. Meanwhile, all emulated commands passed to reduce are recorded in a separate log for each file processed, so it’s easy to re-run the script manually and tinker with the reduction process.
Additionally, when running commands listed in a script, the interface quietly parses key information, such as flagging percentages, rain modulation, and estimated noise, from the reduce output. These values are then stored to disk alongside the UVFITS in an easily machine-readable JSON format. (They may also be added to the UVFITS header in future.)
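Stored this way, the metadata is trivial to load back in an analysis script. A minimal sketch follows; the file name and key names here are illustrative assumptions, not the package's actual schema:

```python
import json

# Hypothetical metadata of the kind drive-ami stores alongside a UVFITS file;
# these key names are illustrative, not the package's actual schema.
metadata = {
    "flagged_final_percent": 12.5,
    "rain_modulation": 0.97,
    "estimated_noise_jy": 0.0021,
}
path = "dataset1_info.json"
with open(path, "w") as f:
    json.dump(metadata, f, indent=2)

# Later, e.g. in an analysis script:
with open(path) as f:
    info = json.load(f)
print(info["flagged_final_percent"])
```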
From the command line (preferably within a virtualenv):
    git clone git://github.com/timstaley/drive-ami.git
    cd drive-ami
    pip install numpy  # Workaround for buggy scipy/numpy combined install.
    pip install .
Command-line scripts are installed along with the package. Their source files can be found at https://github.com/timstaley/drive-ami/tree/master/bin. For full details, run each with the -h (short for 'help') flag, e.g.:

    driveami_list_rawfiles.py -h
Typical usage is to run driveami_list_rawfiles.py to build a full listing of the available data, then driveami_filter_rawfile_listing.py to extract the entries for a desired target. Finally, driveami_calibrate_rawfiles.py performs the actual processing using AMI-REDUCE.