# pyfocs

Processing of meteorological FODS data.

pyfocs was previously known as btmm_process (an obscure, non-pythonic name) and pyfox (which collided with an unmaintained package on PyPI), hence the new name for the library.

# Getting Started

## Installation

### Using a package manager

pyfocs can be installed with:

pip install pyfocs

which installs pyfocs plus all dependencies. This install method has caused problems on Windows. If you encounter errors when running pyfocs installed this way, we recommend the method below instead.

### From source

Alternatively, you can download the source code from this repository (green "Clone or Download" button), extract the package, navigate to the directory containing it, and run:

python setup.py install

Note that Windows users will need to use the Anaconda PowerShell Prompt or a similar Python environment.

Both methods should result in PyFOX.py being callable from the command line.

### Dependency issues

Installing from source may leave some dependency issues unresolved. These can be fixed with:

pip install -r requirements.txt

## Example

Download the data in the example directory. Within that directory is an example configuration file in yaml format. Adjust the dir_pre and external paths to point to the example folder. Then you should be able to run:

PyFOX.py path/to/example_configuration.yml

Alternatively, providing no path to the yaml file will open a file browser for selecting the configuration file.
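For orientation, the two paths to adjust might look something like the fragment below. This is an illustrative sketch only: the key names dir_pre and external come from this README, but their exact nesting and any surrounding keys in the real configuration file may differ.

```yaml
# Illustrative fragment of example_configuration.yml -- nesting and
# values are guesses; only dir_pre and external are documented here.
dir_pre: /home/user/pyfocs/example
external: /home/user/pyfocs/example/external
```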

# Overview

The Bayreuth Micrometeorology python library for processing Fiber Optic Distributed Sensing (FODS) data. The library consists of a family of simple functions and a master script (PyFOX) that can be used to process output from a Silixa Distributed Temperature Sensing (DTS) device, such as an Ultima or XT, from the original *.xml files to calibrated temperatures with physical labels. This library is built around the [xarray](http://xarray.pydata.org) package for handling n-dimensional data, especially in a netcdf format.
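As an illustration of the xarray-based data model (a sketch with made-up values, not pyfocs's actual output), a raw DTS section with time and LAF dimensions can be represented as:

```python
import numpy as np
import xarray as xr

# Toy Dataset mimicking the structure described above: stokes and
# anti-stokes intensities plus instrument-reported temperature,
# dimensioned by time and Length Along the Fiber (LAF).
time = np.array(["2019-07-01T00:00", "2019-07-01T00:10"], dtype="datetime64[ns]")
laf = np.linspace(0.0, 100.0, 5)  # meters along the fiber

ds = xr.Dataset(
    {
        "Ps": (("time", "LAF"), np.full((2, 5), 900.0)),   # stokes intensity
        "Pas": (("time", "LAF"), np.full((2, 5), 800.0)),  # anti-stokes intensity
        "temp": (("time", "LAF"), np.full((2, 5), 20.0)),  # instrument temperature
    },
    coords={"time": time, "LAF": laf},
)

# The log-power ratio used in calibration (step 3 below) follows directly:
ds["logPsPas"] = np.log(ds.Ps / ds.Pas)
print(float(ds.logPsPas.isel(time=0, LAF=0)))
```

Selecting by physical labels or slicing along LAF then reduces to ordinary xarray indexing, which is why the netcdf/xarray combination suits this kind of data.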

## Other libraries

Other similar libraries exist, such as the [one developed at Delft University](https://github.com/bdestombe/python-geotechnical-profile), which can be more useful for some applications, especially those with double-ended configurations.

# PyFOX Steps

Data and the surrounding directory structure are assumed to follow ![this outline](data_structure_scheme.jpg).

Each subdirectory corresponds to a particular step in the processing.

  1. Archives original .xml files into specified time interval.

  2. Creates netcdfs of the raw data, including the instrument-reported temperature, stokes intensity, and anti-stokes intensity. Dimensions are Length Along the Fiber (LAF) and time.

  3. Labels the data, integrates external data streams and other reference data, performs step-loss corrections, and performs single-ended calibration based on Hausner et al. (2011). Splits multicore data into individual cores. Reports the instrument-reported temperature, calibrated temperature, log-power ratio of stokes and anti-stokes intensities, stokes intensity, anti-stokes intensity, and all data labels. Dimensions are LAF and time. New coordinates, specified by location type in the location library, can be used to label the data, along with a number-of-labels by number-of-LAF coordinate.

  4. Converts data labels to physical coordinates. Drops the LAF label and only includes the physical location (xyz) and time. Each core is saved as a separate netcdf. Cores do not share the xyz dimension and must be aligned with each other; they do share the time dimension.
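The single-ended calibration in step 3 follows Hausner et al. (2011), whose explicit form is T(z) = γ / (ln(Ps/Pas) + C − Δα·z). The sketch below uses made-up intensities and calibration parameters; in practice γ, C, and Δα are solved for using reference sections of known temperature, and this is not pyfocs's internal API.

```python
import numpy as np

def single_ended_temperature(Ps, Pas, z, gamma, C, dalpha):
    """Single-ended DTS calibration after Hausner et al. (2011):

        T(z) = gamma / (ln(Ps/Pas) + C - dalpha * z)

    Returns temperature in Kelvin. gamma, C, and dalpha are
    illustrative here; they are normally fit to reference baths."""
    return gamma / (np.log(Ps / Pas) + C - dalpha * z)

# Made-up stokes/anti-stokes intensities along 5 points of fiber.
z = np.linspace(0.0, 100.0, 5)       # LAF in meters
Ps = np.full(5, 900.0)
Pas = np.full(5, 800.0)

T_kelvin = single_ended_temperature(Ps, Pas, z, gamma=482.6, C=1.46, dalpha=2e-5)
print(T_kelvin - 273.15)             # degrees Celsius
```

Because Δα multiplies distance along the fiber, neglecting it biases temperatures increasingly with LAF, which is why the calibration is applied before attaching physical labels in step 4.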

## Example jupyter notebook

For space reasons, the example notebook only includes the data needed to follow steps 2-4. It walks through the iterative approach for processing FODS data.

### References

Hausner, M. B., Suárez, F., Glander, K. E., & van de Giesen, N. (2011). Calibrating single-ended fiber-optic Raman spectra distributed temperature sensing data. Sensors, 11, 10859–10879. https://doi.org/10.3390/s111110859

### Muppet Archiver

Batch script for scheduled archiving of .xml files on the Silixa DTS devices. Why muppet? The University of Bayreuth Micrometeorology group names their Silixa devices after Muppet characters. Requires an Anaconda 3.x distribution of Python. The task scheduler must point to the .bat script, not the Python script.
