ANT-MOC Data

A package for ANT-MOC data manipulation.

Prerequisites

  • Python >= 3.8
  • baseopt
  • numpy
  • h5py

Install

$ pip install antmocdata

License

MIT

Subpackage: ANT-MOC Solution

Package antmocdata.solution provides tools for processing reaction rates and fluxes produced by ANT-MOC.

Load data to numpy arrays

Function antmocdata.solution.load_vtu reads reaction rates and fluxes from a .vtu file and stores them in a dictionary of numpy arrays.

from antmocdata.solution import load_vtu
# Read only the 'Avg Fission RX' dataset from file 'reaction_rates.vtu'
rx = load_vtu("reaction_rates.vtu", ["^Avg Fission RX$"])
fiss = rx["Avg Fission RX"]
print(fiss.shape)

For ANT-MOC v0.1.15, the coordinate axes of a .vtu file are written in a fixed order.

Loading the file with load_vtu reverses the y-axis. Returning to the previous example, the loaded data can be accessed as fiss[z, y, x].
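
Because the array is indexed as [z, y, x], axial and radial views can be taken directly with numpy slicing. The snippet below is only an illustration and continues from the fiss array loaded in the example above.

# Illustration only: fiss is the numpy array loaded in the previous example.
middle_z = fiss.shape[0] // 2
radial_slice = fiss[middle_z]          # 2D (y, x) slice at mid-height
axial_profile = fiss.sum(axis=(1, 2))  # total fission rate per z layer
print(radial_slice.shape, axial_profile.shape)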

Subpackage: ANT-MOC Log

Package antmocdata.log provides tools for exploring ANT-MOC logs.

Examples

Please check the directory examples/log for live examples.

Each of the sample scripts accepts command line arguments.

python ./examples/log/extract-records.py --help

Log file

An ANT-MOC log file contains many data fields. When a group of log files is provided to a LogDB object, each file is parsed and stored as a LogFile object.

from antmocdata.log import Options, LogDB
options = Options()
# ...
# set up options either through the CLI or direct assignments
# ...
logdb = LogDB()
logdb.setup(options)

Extractor

A log extractor is used to make LogDB queries and save the results to a .csv file. For example, to extract specific fields from the log files, one could write

options["output"].value = "antmoc-records.csv"
options["specs"].value = ["Azims", "Polars"]
extractor = TinyExtractor(logdb)
extractor.extract()

This would list the Azims and Polars fields of all the log files and save the results to antmoc-records.csv.

In this case, Azims and Polars are called FieldSpecs. A spec can also be used to filter the results. For example,

# List Azims and Polars fields of all the logs
options["specs"].value = ["Azims", "Polars"]
# List Azims and Polars fields of all the logs, and
# only show logs with Azims=64 
options["specs"].value = ["Azims=64", "Polars"]
# List Azims and Polars fields of all the logs, and
# only show logs with Azims=64 and Polars>2
options["specs"].value = ["Azims=64", "Polars>2"]
extractor.extract()

A FieldSpec consists of three parts: a field name, a binary operator, and a value. Perl-style regex is supported for the field name and, with the == operator, for the value. For example, ".*Time" matches every field whose name ends with "Time".

The binary operator can be ==, <, <=, >, or >=. Operator == performs a string comparison, and Perl-style regex is supported for its value. The inequality operators perform string or numerical comparisons depending on the field type.

Be careful when using inequality operators on string fields: values of these fields are compared as strings. Predefined field types are defined in antmocdata.log.default_fields.json.
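
For instance, a query that combines a regex field name with a numerical filter could look like the snippet below. This is only an illustration; it reuses the options and extractor objects from the examples above, and the chosen field names are assumptions.

# Illustration only: field names are assumptions about what a log may contain.
options["output"].value = "filtered-records.csv"
options["specs"].value = [
    ".*Time",     # regex field name: list every field whose name ends with "Time"
    "Azims>=32",  # numerical comparison on the Azims field
]
extractor.extract()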

Fields

Predefined log fields are defined in antmocdata.log.default_fields.json. If these fields become outdated due to ANT-MOC updates, update this JSON file or load a new one in your scripts.

from antmocdata.log import fields
fields.load("path/to/your/fields.json")

There is also an add method for appending a single field to the field dictionary.

from antmocdata.log import Field
fields.add(Field(name="NewField1", patterns=["NewField1.*"]))

# or adding the field directly to LogFields
from antmocdata.log import LogFields
LogFields().add(Field(name="NewField2", patterns=["NewField2.*"]))

Save log database

A LogDB object can be dumped as JSON files to a specified directory.

options["savedb"].value = "antmoc-logdb/"
# ...
# set up the LogDB object
# ...
logdb.save(options("savedb"))

Subpackage: ANT-MOC MGXS

Package antmocdata.mgxs provides tools for checking, manipulating, and generating MGXS files for ANT-MOC.

Examples

Please check the directory examples/mgxs for live examples.

Each of the sample scripts accepts command line arguments.

python ./examples/mgxs/h5/fix-materials.py --help
python ./examples/mgxs/xml/fix-materials-in-xml.py --help

HDF5 data layout

There are two layouts for material data in an H5 file. In both layouts, materials are stored as data groups.

In addition to the materials, the H5 file must contain a top-level attribute named '# groups' holding the number of energy groups.
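
As a quick check, this attribute can be read with h5py (already a prerequisite of this package); the file name below is a placeholder.

# Read the required '# groups' attribute; "mgxs.h5" is a placeholder name.
import h5py

with h5py.File("mgxs.h5", "r") as f:
    num_groups = int(f.attrs["# groups"])
    print("Number of energy groups:", num_groups)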

Layout: named

This is the default cross-section data layout for ANT-MOC.

Each cross-section array is stored as an individual H5 dataset. In this layout, the group for a material consists of several datasets:

  • absorption
  • fission
  • nu-fission
  • transport, or total
  • chi
  • scatter matrix, or nu-scatter matrix, or consistent scatter matrix

For example, a simple cross-section file could have the following hierarchy:

attribute  "# groups"
group      /
group      /material/MOX-8.7%
dataset    /material/MOX-8.7%/chi
dataset    /material/MOX-8.7%/fission
dataset    /material/MOX-8.7%/nu-fission
dataset    /material/MOX-8.7%/scatter matrix
dataset    /material/MOX-8.7%/total
group      /material/UO2
dataset    /material/UO2/chi
dataset    /material/UO2/fission
dataset    /material/UO2/nu-fission
dataset    /material/UO2/scatter matrix
dataset    /material/UO2/total
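
A minimal sketch of writing a file with this hierarchy using h5py is shown below. The material name, group count, and cross-section values are placeholders, not reference data.

# Minimal sketch: create an MGXS file in the "named" layout.
# All names and values below are placeholders for illustration.
import numpy as np
import h5py

num_groups = 2
with h5py.File("simple-mgxs.h5", "w") as f:
    f.attrs["# groups"] = num_groups
    mat = f.create_group("material/UO2")
    mat.create_dataset("total", data=np.full(num_groups, 0.5))
    mat.create_dataset("fission", data=np.full(num_groups, 0.1))
    mat.create_dataset("nu-fission", data=np.full(num_groups, 0.25))
    mat.create_dataset("chi", data=np.array([1.0, 0.0]))
    # Flattened n-by-n scatter matrix in source-major order (explained below)
    mat.create_dataset("scatter matrix", data=np.zeros(num_groups * num_groups))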

The scatter matrix is usually a flattened n-by-n matrix, where n is the number of energy groups. Elements in the scatter dataset are stored in source-major order, which is analogous to row-major order.

For example, a scatter matrix with 2 energy groups has 4 elements, which are stored as

1->1
1->2
2->1
2->2

The number before the -> symbol is the source group, and the number after it is the destination group.
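
The snippet below shows how a 2-by-2 scatter matrix indexed as scatter[source, destination] flattens into this order with numpy; the values are placeholders.

# Source-major flattening of a scatter matrix (placeholder values).
import numpy as np

scatter = np.array([
    [0.80, 0.05],  # 1->1, 1->2
    [0.00, 0.90],  # 2->1, 2->2
])
flat = scatter.flatten()  # row-major == source-major: 1->1, 1->2, 2->1, 2->2
print(flat)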

Layout: compact/compressed

TODO

Common modules

  • material: class Material representing cross-sections. A material object can be written to an HDF5 file as a dataset.
  • materialxml: representation of the XML material definition, used to handle materials.xml.
  • manip: data manipulation utilities.
  • options: representation of command line options.

Type A

Package antmocdata.mgxs.type_a defines a generator which accepts two files to create an MGXS input for ANT-MOC:

  • infilecross: cross-sections in plain text.
  • materials.xml: material definitions in XML, including nuclear densities.

Modules

  • material: definition of MaterialTypeA, which is a sub-class of Material.
  • nuclides: representations of nuclides and nuclide sets, which are basically defined in a plain text file called "infilecross".
  • infilecross: functions for parsing an "infilecross" file. The file must be well-formed.
  • generate: functions for mgxs generation.
  • options: definition of OptionsTypeA, which is a sub-class of Options.

File formats

materials.xml

This is an XML file consisting of material definitions.

<?xml version="1.0" encoding="utf-8"?>
<MATERIAL>
    <material name="1" set="1" density="0.0E+00" temperature="523.15K" label="Some material">
        <nuclide id="1102301" radio="1.6098e-2"/>
        <nuclide id="601201"  radio="7.3771e-5"/>
    </material>
</MATERIAL>

  • material: definition of a material object.
  • material.name: material name which will be written into the H5 file (string).
  • material.set: nuclide set ID for MGXS calculations (int, defaults to name).
  • material.label: a short description.
  • nuclide: nuclide information for MGXS calculations.
  • nuclide.id: nuclide ID containing its atomic number and mass (int).
  • nuclide.radio: density used by MGXS calculations (float).
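
A minimal parsing sketch using Python's standard xml.etree.ElementTree is shown below; the file path is a placeholder.

# Parse a materials.xml file with the structure shown above.
# "materials.xml" is a placeholder path.
import xml.etree.ElementTree as ET

root = ET.parse("materials.xml").getroot()
for material in root.iter("material"):
    name = material.get("name")
    temperature = material.get("temperature")
    nuclides = {n.get("id"): float(n.get("radio")) for n in material.iter("nuclide")}
    print(name, temperature, nuclides)
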
infilecross

This is a plain text file consisting of nuclide set definitions.

$SOMESTRING 1  7SETs 1SET
   2  30   6   5   0   0
  0.57564402E+00  0.29341474E+00  0.12731624E+00 ...
  ...
  2511102302   0   3   6SODIUM-23
  ...
  4311402802   0   3   6SILICON-28
  ...
$SOMESTRING 1  7SETs SET2
  ...

Type B

TODO

Modules

File formats
