
xarray engine to load ovf files from mumax simulations into xarray

Project description

mumaxXR contains an engine for xarray that reads mumax output directories directly and lazily into xarray. To use it, import the class OvfEngine from mumaxXR and pass either a directory or a single ovf file (e.g. m000000.ovf) as the filename_or_obj argument to xr.open_dataset. If only a directory is specified, the engine first tries to find magnetization ovf files; if none are found, it tries to load displacement ovf files; if neither is found, it simply loads the first type of ovf file it encounters. The engine is selected by passing engine=OvfEngine to xarray. Set chunks='auto' to parallelize the loading of the ovf files. If you want to set chunks to something else, keep in mind that each ovf file has to be accessed multiple times if the chunks for the dimensions x, y, z and comp are not kept at their default values.

import xarray as xr
from mumaxXR import OvfEngine

dataset = xr.open_dataset(example_mumax_output_dir, chunks='auto', engine=OvfEngine)

If multiple kinds of ovf files have been stored at the same time steps, an additional argument wavetype can be passed to open_dataset, containing a list of the short names of the other types, e.g.,

dataset = xr.open_dataset(path_to_example_mumax_output_dir, chunks='auto', engine=OvfEngine, wavetype=['m', 'u'])

Keep in mind that the names passed to wavetype have to match the filename prefixes of the ovf files exactly (e.g. m for m000000.ovf).

If multiple mumax simulation directories are supposed to be concatenated, they can all be passed to the engine. If the number of ovf files differs between the directories, the data is interpolated linearly onto the time axis of the main mumax directory. Please note that this only makes sense if the simulations did not crash or were otherwise interrupted. This feature is meant for concatenating simulations of parameter sweeps. For 1D concatenation, simply pass the parameter values inside a list to the engine.

dataset = xr.open_dataset(
    example_mumax_output_dir_0,
    chunks='auto',
    engine=OvfEngine,
    dirListToConcat=[path_to_example_mumax_output_dir_1, path_to_example_mumax_output_dir_2],
    sweepName='name_of_changed_parameter',
    sweepParam=[value_0_of_parameter, value_1_of_parameter, value_2_of_parameter],
)

For ND concatenation, pass a list of names to sweepName and a list of tuples to sweepParam:

dataset = xr.open_dataset(
    example_mumax_output_dir_1,
    chunks='auto',
    engine=OvfEngine,
    dirListToConcat=[path_to_example_mumax_output_dir_2, path_to_example_mumax_output_dir_3, path_to_example_mumax_output_dir_4],
    sweepName=['name_of_changed_parameter_1', 'name_of_changed_parameter_2'],
    sweepParam=[(value_1_1_of_parameter, value_2_1_of_parameter), (value_1_2_of_parameter, value_2_1_of_parameter), (value_1_1_of_parameter, value_2_2_of_parameter), (value_1_2_of_parameter, value_2_2_of_parameter)],
)

If you want the ovf files to be deleted after reading, set removeOvfFiles=True. This only works if the ovf files do not have to be accessed multiple times. If you want to use the ovf files periodically, set useEachNthOvfFile=N, where N has to be larger than one (otherwise it is ignored); only every Nth ovf file is then used for the dataset. The singleLoad option allows you to load only one specific ovf file instead of all of them.
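A sketch of these options combined in one call, assuming the directory path is a placeholder for a real mumax output directory (removeOvfFiles and useEachNthOvfFile are the keyword names described above):

import xarray as xr
from mumaxXR import OvfEngine

# Use only every 3rd ovf file; leave the files on disk after reading.
dataset = xr.open_dataset(
    example_mumax_output_dir,
    chunks='auto',
    engine=OvfEngine,
    useEachNthOvfFile=3,   # has to be larger than one, otherwise ignored
    removeOvfFiles=False,  # True would delete each ovf file after reading
)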

Of course the wavetype option and the mumax directory concatenation can be used simultaneously.
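For illustration, a sketch of such a combined call (the directory paths, sweep name, and parameter values are placeholders):

import xarray as xr
from mumaxXR import OvfEngine

# Load both magnetization ('m') and displacement ('u') ovf files
# while concatenating two directories along a 1D parameter sweep.
dataset = xr.open_dataset(
    example_mumax_output_dir_0,
    chunks='auto',
    engine=OvfEngine,
    wavetype=['m', 'u'],
    dirListToConcat=[path_to_example_mumax_output_dir_1],
    sweepName='name_of_changed_parameter',
    sweepParam=[value_0_of_parameter, value_1_of_parameter],
)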

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mumaxxr-0.0.18.tar.gz (12.5 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

mumaxxr-0.0.18-py3-none-any.whl (11.3 kB)

Uploaded Python 3

File details

Details for the file mumaxxr-0.0.18.tar.gz.

File metadata

  • Download URL: mumaxxr-0.0.18.tar.gz
  • Upload date:
  • Size: 12.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for mumaxxr-0.0.18.tar.gz
Algorithm Hash digest
SHA256 30d0dca7c17d1ee28593ab2af015e283308fdf7abc20213a8f9dd48e9b61fc1d
MD5 5a746c22e657f7c17648210ccba7b37e
BLAKE2b-256 6c9284d04ff6319340d64247dafffd20fe1a9b256dcf2a1756dc8ce3b8b84448


Provenance

The following attestation bundles were made for mumaxxr-0.0.18.tar.gz:

Publisher: publish.yml on timvgl/mumaxXR

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file mumaxxr-0.0.18-py3-none-any.whl.

File metadata

  • Download URL: mumaxxr-0.0.18-py3-none-any.whl
  • Upload date:
  • Size: 11.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for mumaxxr-0.0.18-py3-none-any.whl
Algorithm Hash digest
SHA256 8339468712ed2a53201ed92c79f777a558aba020ad64d325d122a215d11168f2
MD5 92c90960e21529a8037e8d5c730eea01
BLAKE2b-256 493c63aaa2238a405d65ce907d5844abdec5ec2c3d6fa197052235bc89ec38cf


Provenance

The following attestation bundles were made for mumaxxr-0.0.18-py3-none-any.whl:

Publisher: publish.yml on timvgl/mumaxXR

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
