
AO Muse

AO Muse package

Axel Iván Reyes Orellana

This package provides a connection to a database of exposures for easy retrieval and processing.

Contents

Installation

Install using pip!

$ pip install aomuse
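
To quickly check that the installation worked, you can try importing the package entry point used later in this guide:

$ python -c "from AOMUSE.ao.muse_db import muse_db"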

Requirements

To use the package you need a database where your exposure data is stored, for example MariaDB.

AO Muse relies on packages such as the Pony ORM (to map the entity classes below to database tables) and Pandas (for the DataFrames returned by the query methods).

Throughout this guide, terms enclosed in curly brackets must be replaced with their corresponding values.

Before use

Before using the package, you need a database where your exposure data is stored. If you already have one, you can skip this section.

For this tutorial, we will use a MariaDB database.

You will need the database name, a username and a password for the connection.

To create the database you have to log in first (you can do it as root):

mysql -u {user_name} -p 

Then you have to create an empty database:

CREATE DATABASE {database_name}; 

Create a user for the database:

CREATE USER '{username}'@'localhost' IDENTIFIED BY '{password}';

And finally, grant privileges on the database to the new user:

GRANT ALL PRIVILEGES ON {database_name}.* TO '{username}'@'localhost';
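
If you want to verify the new account, you can log in to the database with it:

mysql -u {username} -p {database_name}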

To use this library, it is highly recommended to use the following AO Muse script to retrieve the data from MUSE files.

AO Muse Script

Data Structure

The following code cells define the structure of the database entities (Targets, Exposures and Processed Exposures) and how Pony maps the classes to the tables. This structure is mandatory in order to use the AO Muse package.

from datetime import datetime

from pony.orm import *

# Create a database object from Pony
db = Database()

# The classes inherit db.Entity from Pony
class Target(db.Entity):
    #   ----- Attributes -----

    target_name = Required(str, unique=True)  # Required: Cannot be None

    #   ----- Relations -----

    exposures = Set('Exposure')  # One target contains a set of exposures
    processed_exposure = Optional('Processed_Exposure')

# Exposure table class
class Exposure(db.Entity):
    #   ----- Attributes -----

    observation_time = Required(datetime, unique=True)
    obs_id = Required(int, size=32, unsigned=True)
    insMode = Required(str)
    datacube_header = Optional(Json)
    raw_exposure_header = Optional(Json)
    raw_exposure_data = Optional(Json)
    raw_exposure_filename = Optional(str, unique=True)
    prm_filename = Optional(str, unique=True)
    pampelmuse_params = Optional(Json)
    sources = Optional(Json)
    pampelmuse_catalog = Optional(Json)
    raman_image_header = Optional(Json)
    maoppy_data = Optional(Json)

    #   ----- Sky parameters -----
    sky_condition_start_time = Optional(float)
    sky_condition_start = Optional(LongStr)
    sky_comment_start = Optional(LongStr)
    sky_condition_end_time = Optional(float)
    sky_condition_end = Optional(LongStr)
    sky_comment_end = Optional(LongStr)

    #   ----- Relations -----

    target = Required('Target')  # One exposure belongs to a target
    processed_exposure = Optional('Processed_Exposure')

class Processed_Exposure(db.Entity):
    observation_time = Required(datetime, unique=True)
    obs_id = Required(int, size=32, unsigned=True)
    insMode = Required(str)
    raw_filename = Optional(str, unique=True)
    ngs_flux = Optional(float)
    ttfree = Optional(bool)
    degraded = Optional(bool)
    glf = Optional(float)
    seeing = Optional(float)
    seeing_los = Optional(float)
    airMass = Optional(float)
    tau0 = Optional(float)
    # --------------------------------------------------------------
    num_sources = Optional(int, unsigned=True)
    sgs_data = Optional(Json) # sgs_data extension
    ag_data = Optional(Json) # ag_data extension
    sparta_cn2 = Optional(Json) # sparta_cn2 extension
    sparta_atm = Optional(Json) # sparta_atm extension
    psf_params = Optional(Json)
    sparta_iq_data = Optional(Json)
    sparta_iq_los_500nm = Optional(float)
    sparta_iq_los_500nm_nogain = Optional(float)
    
    # Relations
    
    target = Required('Target')  # One processed exposure belongs to a target
    exposure = Required('Exposure')
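
For reference, here is a minimal sketch of how Pony would bind these entity classes to a MariaDB database and create the corresponding tables. In normal use the muse_db class described below performs this step for you, so the snippet is only illustrative and uses the usual curly-bracket placeholders:

# Illustrative sketch: bind the Pony database object to MariaDB and create the tables
db.bind(provider='mysql', host='127.0.0.1',
        user='{username}', passwd='{password}', db='{database_name}')
db.generate_mapping(create_tables=True)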

How to use

Imports

In order to use the package, you have to import the database object.

from AOMUSE.db.Database import database

Then, define the following parameters to establish the connection with the database.

db_data = {
    "provider":{provider},
    "host":{ip}, 
    "user":{db_admin_username}, 
    "passwd":{db_admin_password}, 
    "db":{db_name}
}

If you are using MariaDB and localhost, then the provider has to be "mysql" and the host "127.0.0.1".
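
For example, for a local MariaDB database with hypothetical names and credentials:

db_data = {
    "provider": "mysql",
    "host": "127.0.0.1",
    "user": "aomuse_user",    # hypothetical username
    "passwd": "aomuse_pass",  # hypothetical password
    "db": "aomuse_db"         # hypothetical database name
}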

Now you can establish the connection by creating an instance of the muse_db class provided by the AO Muse package, which receives the database object imported earlier and the connection parameters defined above.

from AOMUSE.ao.muse_db import muse_db
aomuse_db = muse_db(database, **db_data)

Methods

The package also offers the following methods, which process the exposure data or return a Pandas DataFrame with the corresponding data.

muse_db.process()

Process and store all the exposures in the database. If processed data existed previously, it is deleted first. The processed data is obtained from the Exposure table of the current database and stored in a new table (Processed_Exposure).

The 'psf_params' and 'sparta_iq_data' fields are dictionaries (Json) with several keys. The wavelength range is sliced into 10 windows, and the windows that fall within the laser range are skipped. For each exposure, each key holds an array of values, where each value is the mean of the corresponding measure (indicated by the key) over the corresponding wavelength window. Both fields also have a key called "wavelength" that holds the mean wavelength of each window that was not skipped.

aomuse_db.process()
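
As an illustration of the structure described above, one way to inspect these fields is through the DataFrame returned by get_processed_tables() (described below), assuming psf_params appears as one of its columns:

processed_tables = aomuse_db.get_processed_tables()
psf = processed_tables.iloc[0]["psf_params"]  # hypothetical access: dict mapping each key to its per-window means
print(psf["wavelength"])  # mean wavelength of each window that was not skipped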

muse_db.get_exposures()

Return a DataFrame containing the metadata of the exposures and their corresponding exposure and processed exposure objects.

exposures = aomuse_db.get_exposures()
exposures
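
The result can be handled with the usual Pandas operations. For example, assuming the DataFrame exposes an insMode column like the Exposure entity, with a hypothetical mode value:

nfm_exposures = exposures[exposures["insMode"] == "NFM-AO-N"]  # hypothetical instrument mode value
print(len(nfm_exposures), "exposures in this mode")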


muse_db.get_processed_exposures()

Return a DataFrame containing the metadata of the processed exposures and their corresponding exposure and processed exposure objects.

processed_exposures = aomuse_db.get_processed_exposures()
processed_exposures


muse_db.get_processed_data()

Return a DataFrame containing some metadata and the processed exposure data that is not stored as tables or Jsons (i.e., the scalar quantities).

processed_data = aomuse_db.get_processed_data()
processed_data
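
Since these are scalar quantities, they are convenient for quick summaries. For example, assuming the DataFrame exposes the seeing and sparta_iq_los_500nm columns of the Processed_Exposure entity:

print(processed_data[["seeing", "sparta_iq_los_500nm"]].describe())  # hypothetical column names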


muse_db.get_processed_tables()

Return a DataFrame containing some metadata and the processed exposure data that is stored as tables or Jsons.

processed_tables = aomuse_db.get_processed_tables()
processed_tables


If there is an error in the library or in this README, like a misspelling or something similar, do not be afraid to send me an email at axel.reyes@sansano.usm.cl and I will try to fix it as soon as possible. Thank you in advance.

