Data management, coupling and execution for MDO problems

Note: this release was yanked from PyPI (stated reason: "Yanked stub version").

mdo-engine

mdo-engine provides data management, coupling between arbitrary sources (such as files, databases and Python packages) and execution ordering.

It is the framework on which dtocean-core is built.

Installation

mdo-engine uses the Poetry dependency manager for installation and development. Poetry must be installed and available on the command line.

To install:

$ poetry install

Tests

A pytest-based test suite is provided with the source code.

Install the testing dependencies:

$ poetry install --with test

Run the tests:

$ poetry run pytest

Usage

Example

This example uses mdo-engine to read data from a Datawell SPT file via a file interface, store the data using Simulation and DataPool objects, and then retrieve the data using its specified data structure.

All the setup for this example is in the mdo_engine.test module of the source code. The example SPT file can be found in the mdo_engine/tests/data directory.

First, look for interfaces that are subclasses of FileInterface in the mdo_engine.test.interfaces module:

>>> from mdo_engine.control.sockets import NamedSocket
>>> import mdo_engine.test.interfaces as interfaces

>>> interfacer = NamedSocket("FileInterface")
>>> interfacer.discover_interfaces(interfaces)
>>> interfacer.get_interface_names()
{'Datawell SPT File': 'SPTInterface'}

Load the SPTInterface interface and see what file types it can load:

>>> file_interface = interfacer.get_interface_object('SPTInterface')
>>> file_interface.get_valid_extensions()
['.spt']

See which variables the interface can provide:

>>> output_variables = file_interface.get_outputs()
>>> output_variables
['site:wave:dir',
 'site:wave:spread',
 'site:wave:skewness',
 'site:wave:kurtosis',
 'site:wave:freqs',
 'site:wave:PSD1D',
 'site:wave:Hm0',
 'site:wave:Tz']
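The identifiers use a colon-separated naming scheme, so related variables can be grouped or shortened with plain string operations. A sketch using the names listed above (the splitting is illustrative, not part of the mdo-engine API):

```python
# Variable identifiers reported by the interface (as listed above)
output_variables = [
    "site:wave:dir",
    "site:wave:spread",
    "site:wave:skewness",
    "site:wave:kurtosis",
    "site:wave:freqs",
    "site:wave:PSD1D",
    "site:wave:Hm0",
    "site:wave:Tz",
]

# The identifiers are colon-separated, so the trailing variable name can
# be split off with ordinary string operations
names = [v.rsplit(":", 1)[-1] for v in output_variables]
print(names[:2])  # -> ['dir', 'spread']
```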

Get the data from the test SPT file:

>>> file_interface.set_file_path("test_spectrum_30min.spt")
>>> file_interface.connect()

Create a data catalogue and read the defined structures and meta data for each variable:

>>> from mdo_engine.control.data import DataValidation
>>> from mdo_engine.entity.data import DataCatalog
>>> import mdo_engine.test.data as data

>>> catalog = DataCatalog()
>>> validation = DataValidation(meta_cls=data.MyMetaData)
>>> validation.update_data_catalog_from_definitions(catalog,
...                                                 data)

Check which variables in the interface are defined in the data catalogue:

>>> valid_variables = validation.get_valid_variables(catalog, output_variables)
>>> valid_variables
['site:wave:dir', 'site:wave:PSD1D', 'site:wave:freqs']

Collect the raw data for the valid variables:

>>> raw_data = []

>>> for variable in valid_variables:
...     raw_data.append(file_interface.get_data(variable))
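The collected values stay aligned with `valid_variables` by position. Zipping them into a dict makes the pairing explicit; a sketch with placeholder values standing in for the interface output:

```python
# Placeholder values standing in for file_interface.get_data(variable)
valid_variables = ["site:wave:dir", "site:wave:PSD1D", "site:wave:freqs"]
raw_data = [[270.0], [[0.1, 0.2], [0.3, 0.4]], [0.025, 0.05]]

# Map each variable identifier to its collected raw value
variable_map = dict(zip(valid_variables, raw_data))
print(sorted(variable_map))  # -> ['site:wave:PSD1D', 'site:wave:dir', 'site:wave:freqs']
```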

Create DataPool, Simulation and Loader objects and store the collected data:

>>> from mdo_engine.control.data import DataStorage
>>> from mdo_engine.control.simulation import Loader
>>> from mdo_engine.entity import Simulation
>>> from mdo_engine.entity.data import DataPool

>>> pool = DataPool()
>>> simulation = Simulation("Hello World!")
>>> data_store = DataStorage(data)
>>> loader = Loader(data_store)

>>> loader.add_datastate(pool,
...                      simulation,
...                      None,
...                      catalog,
...                      valid_variables,
...                      raw_data)

Retrieved variables are now pandas Series objects, as defined in the data catalogue:

>>> freqs = loader.get_data_value(pool,
...                               simulation,
...                               'site:wave:freqs')
>>> type(freqs)
pandas.core.series.Series
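Because the value comes back as a plain pandas Series, the normal pandas API is available downstream. A sketch with made-up frequency values (the real contents come from the SPT file via the loader):

```python
import pandas as pd

# Hypothetical frequency bins; the real Series is built by the loader
# from the data catalogue definition
freqs = pd.Series([0.025, 0.05, 0.075, 0.1], name="site:wave:freqs")

# Standard pandas operations apply directly
print(float(freqs.max()))  # -> 0.1
```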

Command Line Tools

A utility is provided to convert DTOcean data description specification (DDS) files saved in MS Excel format to the native YAML format. To get help:

$ bootstrap-dds -h

A second utility is provided to merge two DDS files in Excel format. This can be useful when merging files in a version-control system. To get help:

$ xl_merge -h

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

See this blog post for information regarding development of the DTOcean ecosystem.

Please make sure to update tests as appropriate.

Credits

This package was initially created as part of the EU DTOcean project by Mathew Topper at TECNALIA.

It is now maintained by Mathew Topper at Data Only Greater.

License

MIT
