Python3 library for converting (and filtering) spectral data in various formats.
Project description
Supported spectral formats:
- ADAMS (read/write)
- ARFF row-wise (read/write)
- ASC (read/write)
- ASCII XY (read/write)
- CAL FOSS (read/write)
- CSV (read/write)
- DPT (read/write)
- MPS (read)
- NIR FOSS (read/write)
- Opus Bruker (read)
- Opus Ext Bruker (read)
- SPA Thermo Fisher (read)
Supported sample data formats:
- ADAMS report (read/write)
- CSV row-wise (read/write)
- JSON (read/write)
Examples can be found here:
https://github.com/waikato-datamining/spectral-data-converter-examples
Changelog
0.1.0 (2025-10-31)
- split-records filter now allows specifying the meta-data field in which to store the split name
- the tee meta-filter can now forward or drop the incoming data based on a meta-data evaluation
- the sub-process filter can be used for processing data with a sub-flow of filters; it can be made conditional based on a meta-data evaluation
- the metadata-from-name filter can now work on the path as well (which must be present)
- switched to the kasperl library for the base API and generic pipeline plugins
- requiring seppl>=0.3.0 now
- added @abc.abstractmethod decorator where appropriate
- the sdc-exec tool now uses all remaining parameters as the pipeline components rather than requiring them via the -p/--pipeline parameter, making it easy to simply prefix the sdc-exec command to an existing sdc-convert command-line
- added the text-file and csv-file generators that work off files to populate the variable(s)
- sdc-exec can now load pipelines from file as well, which is useful when dealing with large pipelines
- added --load_pipeline option to sdc-convert
- added from-text-file reader and to-text-file writer
- readers now locate files the first time the read() method gets called rather than in initialize(), to allow more dynamic placeholders
- added block and stop filters for controlling the flow of data (via meta-data conditions)
- added email support with the get-email reader and send-email writer
- added list-files reader for listing files in a directory
- added list-to-sequence stream filter that forwards list items one by one
- added console writer for outputting the data coming through on stdout
- added watch-dir meta-reader that uses the watchdog library to react to file-system events rather than using fixed-interval polling like poll-dir
- added delete-files writer
- added copy-files filter
- added support for caching plugins via the SDC_CLASS_CACHE environment variable
- added to-metadata writer that outputs the meta-data of a spectrum
- added attach-metadata filter that loads meta-data from a directory and attaches it to the data passing through
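The plugin names above come from the changelog; the library's actual classes and API are not shown here. The idea behind stream filters such as list-to-sequence (forwarding list items one by one) and the meta-data based block filter (dropping records that match a condition) can be sketched generically in plain Python. All names below are illustrative, not the library's real types:

```python
from typing import Any, Callable, Dict, Iterable, Iterator

# stand-in for a spectrum record with attached meta-data (illustrative only)
Record = Dict[str, Any]

def list_to_sequence(stream: Iterable[Any]) -> Iterator[Any]:
    """Forwards list items one by one; non-list items pass through unchanged."""
    for item in stream:
        if isinstance(item, list):
            yield from item
        else:
            yield item

def block(stream: Iterable[Record],
          condition: Callable[[Record], bool]) -> Iterator[Record]:
    """Drops records whose meta-data matches the condition (block-filter idea)."""
    for record in stream:
        if not condition(record):
            yield record

# a mix of a list of records and a single record entering the stream
records = [[{"id": "a", "split": "train"}, {"id": "b", "split": "test"}],
           {"id": "c", "split": "train"}]
flat = list(list_to_sequence(records))
kept = list(block(flat, lambda r: r["split"] == "test"))
```

Chaining generators this way mirrors how stream filters compose in a pipeline: each stage consumes the previous stage's output lazily, one record at a time.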
0.0.3 (2025-07-15)
- requiring seppl>=0.2.20 now for improved help requests in the sdc-convert tool
0.0.2 (2025-07-11)
- wai.spectralio-based readers now instantiate the wai.spectralio reader in the initialize method
- wai.spectralio-based writers now instantiate the wai.spectralio writer in the initialize method
- introduced SpectralIOBased, SpectralIOReader and SpectralIOWriter mixins for wai.spectralio-based readers/writers, giving a cleaner class hierarchy
- requiring wai-spectralio>=0.0.5 now
- requiring seppl>=0.2.19 now
- added experimental support for direct read/write operations using file-like objects
- fixed initialization of sample ID and sample data prefix in CSVSampleDataWriter
- fixed initialization of None values in OPUSExtReader, aligning it with the command-line args
- added from-zip meta-reader for reading spectra and sample data from zip files
- added to-zip meta-writer for writing spectra and sample data to zip files
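The from-zip/to-zip plugins above are part of the library itself; the underlying idea, bundling spectra together with their sample data in a zip archive and reading them back, can be sketched with the standard library alone. The member names and file contents below are made up for illustration:

```python
import io
import zipfile

# Write a hypothetical spectrum (ASCII XY style) and its sample data
# into an in-memory zip archive.
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("sample1.asc", "4000.0 0.123\n3999.5 0.124\n")
    zf.writestr("sample1.json", '{"sample_id": "sample1"}')

# Read the archive back, pairing spectra with sample data by member name.
buffer.seek(0)
with zipfile.ZipFile(buffer) as zf:
    members = sorted(zf.namelist())
    spectrum = zf.read("sample1.asc").decode()
```

A meta-reader/meta-writer of this kind wraps ordinary readers and writers, feeding them file-like objects per archive member instead of paths on disk.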
0.0.1 (2025-06-27)
- initial release
File details
Details for the file spectral_data_converter-0.1.0.tar.gz.
File metadata
- Download URL: spectral_data_converter-0.1.0.tar.gz
- Upload date:
- Size: 53.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 3887cf13deaccf8524c6fb775231acf6c16f0838e84fda2f8647d3e8fb5146af |
| MD5 | b057293501e3a34d04d4eada947a83ea |
| BLAKE2b-256 | 19fc8d532d49acb27d9d6528df1ee1bd3989cb5b07f80be81542fe92e824c5c7 |