
An xarray engine for lazily loading ovf files from mumax simulations into xarray

Project description

mumaxXR contains an engine that lets xarray read mumax output directories directly and lazily. To use it, import the class OvfEngine from mumaxXR and pass either the output directory or a single ovf file (e.g. m000000.ovf) as the filename_or_obj argument of xr.open_dataset, together with engine=OvfEngine. If only a directory is given, the engine first looks for magnetization ovf files; if none are found, it looks for displacement ovf files; if neither is found, it simply loads the first type of ovf file it encounters. Set chunks='auto' to parallelize loading of the ovf files. If you set chunks to something else, keep in mind that each ovf file has to be read multiple times whenever the chunks along the x, y, z and comp dimensions deviate from their defaults.

import xarray as xr
from mumaxXR import OvfEngine

dataset = xr.open_dataset(example_mumax_output_dir, chunks='auto', engine=OvfEngine)

If multiple kinds of ovf files have been stored at the same time steps, an additional wavetype argument can be passed to open_dataset, containing a list of their short names, e.g.,

dataset = xr.open_dataset(path_to_example_mumax_output_dir, chunks='auto', engine=OvfEngine, wavetype=['m', 'u'])

Keep in mind that the names passed to wavetype have to match the prefixes of the ovf files exactly.
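Concretely, each ovf file name consists of a short-name prefix followed by a frame counter, and the wavetype entries must equal that prefix exactly. The helper below is only an illustration of this naming convention, not part of mumaxXR:

```python
import re

def ovf_prefix(filename):
    """Extract the short-name prefix from an ovf file name like 'm000000.ovf'."""
    match = re.fullmatch(r"([A-Za-z_]+)\d+\.ovf", filename)
    return match.group(1) if match else None

# The names passed to wavetype must equal these prefixes exactly.
print(ovf_prefix("m000000.ovf"))  # m
print(ovf_prefix("u000123.ovf"))  # u
```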

To concatenate multiple simulations, several mumax output directories can be passed to the engine. If the directories do not contain the same number of ovf files, the data is interpolated linearly onto the time axis of the main mumax directory. Note that this only makes sense if none of the simulations crashed or were otherwise interrupted; the feature is intended for concatenating the simulations of a parameter sweep. For 1D concatenation, simply pass the parameter values in a list to the engine.
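The time alignment can be pictured with plain NumPy: snapshots from a secondary directory are interpolated linearly onto the time axis of the main directory. This is a conceptual sketch of that resampling, not the engine's actual implementation:

```python
import numpy as np

# Main directory: 5 snapshots; secondary directory: only 3 snapshots.
t_main = np.linspace(0.0, 1.0, 5)       # target time axis (main directory)
t_other = np.linspace(0.0, 1.0, 3)      # coarser time axis (secondary directory)
data_other = np.array([0.0, 1.0, 0.0])  # e.g. one scalar value per snapshot

# Interpolate the secondary data linearly onto the main time axis.
aligned = np.interp(t_main, t_other, data_other)
print(aligned)  # [0.  0.5 1.  0.5 0. ]
```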

dataset = xr.open_dataset(
    example_mumax_output_dir_0,
    chunks='auto',
    engine=OvfEngine,
    dirListToConcat=[path_to_example_mumax_output_dir_1, path_to_example_mumax_output_dir_2],
    sweepName='name_of_changed_parameter',
    sweepParam=[value_0_of_parameter, value_1_of_parameter, value_2_of_parameter],
)

For ND concatenation, pass a list of names to sweepName and a list of tuples to sweepParam:

dataset = xr.open_dataset(
    example_mumax_output_dir_1,
    chunks='auto',
    engine=OvfEngine,
    dirListToConcat=[path_to_example_mumax_output_dir_2, path_to_example_mumax_output_dir_3, path_to_example_mumax_output_dir_4],
    sweepName=['name_of_changed_parameter_1', 'name_of_changed_parameter_2'],
    sweepParam=[(value_1_1_of_parameter, value_2_1_of_parameter), (value_1_2_of_parameter, value_2_1_of_parameter), (value_1_1_of_parameter, value_2_2_of_parameter), (value_1_2_of_parameter, value_2_2_of_parameter)],
)

If you want the ovf files to be deleted after reading, set removeOvfFiles=True. This only works if the ovf files do not have to be read multiple times. To use only every Nth ovf file for the dataset, set useEachNthOvfFile=N; N has to be larger than one, otherwise it is ignored. The singleLoad option allows you to load a single specific ovf file instead of all of them.
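The effect of useEachNthOvfFile can be pictured as stride slicing over the sorted list of ovf files; with N=2, every second file enters the dataset. This is an illustrative sketch, not mumaxXR code:

```python
# Suppose a run produced six magnetization snapshots.
files = [f"m{i:06d}.ovf" for i in range(6)]

# useEachNthOvfFile=2 keeps every second file; N must be larger than one.
N = 2
kept = files[::N]
print(kept)  # ['m000000.ovf', 'm000002.ovf', 'm000004.ovf']
```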

Of course, the wavetype option and the mumax directory concatenation can be used simultaneously.



Download files


Source Distribution

mumaxxr-0.0.35.tar.gz (16.6 kB)


Built Distribution


mumaxxr-0.0.35-py3-none-any.whl (15.6 kB)


File details

Details for the file mumaxxr-0.0.35.tar.gz.

File metadata

  • Download URL: mumaxxr-0.0.35.tar.gz
  • Size: 16.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for mumaxxr-0.0.35.tar.gz:

  • SHA256: 748e27ccd4dedc13d7c5722bf5198130bf0fadb9a56c06ea9d45e2a6bc4d3dd3
  • MD5: 37e1b47acdb98596c2f2f4bdb282ffa3
  • BLAKE2b-256: 64a22fdbcaa4c9681bcc057593bf942bc4a375ae36c877f21d0092e08b8f67c2


Provenance

The following attestation bundles were made for mumaxxr-0.0.35.tar.gz:

Publisher: publish.yml on timvgl/mumaxXR


File details

Details for the file mumaxxr-0.0.35-py3-none-any.whl.

File metadata

  • Download URL: mumaxxr-0.0.35-py3-none-any.whl
  • Size: 15.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for mumaxxr-0.0.35-py3-none-any.whl:

  • SHA256: 8fa2557743803675a023fe7974b720fb0a39a246ba6668a1d2f1699673f22ae0
  • MD5: 66c934846e6717e0728796670e158c5c
  • BLAKE2b-256: 72a0c109f7bb059e8d98514dcee6ba9a46d07c1c8867fcbac328b9c58fd3a557


Provenance

The following attestation bundles were made for mumaxxr-0.0.35-py3-none-any.whl:

Publisher: publish.yml on timvgl/mumaxXR

