
An xarray engine to lazily load ovf files from mumax simulations into xarray

Project description

mumaxXR contains an engine that lets xarray read mumax output directories directly and lazily. To use it, import the class OvfEngine from mumaxXR and pass either a directory or a single ovf file (e.g. m000000.ovf) as the filename_or_obj argument to xr.open_dataset, together with engine=OvfEngine. If only a directory is specified, the engine first tries to find magnetization ovf files; if none are found, it tries to load displacement ovf files; if neither is found, it simply loads the first type of ovf files it encounters. Set chunks='auto' to parallelize loading of the ovf files. If you want to set chunks to something else, keep in mind that each ovf file has to be accessed multiple times if the chunks for the dimensions x, y, z and comp are not left at their default values.

import xarray as xr
from mumaxXR import OvfEngine

dataset = xr.open_dataset(example_mumax_output_dir, chunks='auto', engine=OvfEngine)

If multiple kinds of ovf files have been stored at the same time steps, an additional argument, wavetype, can be passed to open_dataset, containing a list of their short names, e.g.,

dataset = xr.open_dataset(path_to_example_mumax_output_dir, chunks='auto', engine=OvfEngine, wavetype=['m', 'u'])

Keep in mind that the names passed to wavetype have to match the prefixes of the ovf files exactly.

To concatenate multiple simulations, several mumax simulation directories can be passed to the engine. If the number of ovf files differs between directories, the data is interpolated linearly onto the time axis of the main mumax directory. Note that this only makes sense if the simulations did not crash or get interrupted in some other way; the feature is meant for concatenating simulations of parameter sweeps. For 1D concatenation, simply pass the parameter values inside a list to the engine.

dataset = xr.open_dataset(
    example_mumax_output_dir_0,
    chunks='auto',
    engine=OvfEngine,
    dirListToConcat=[path_to_example_mumax_output_dir_1, path_to_example_mumax_output_dir_2],
    sweepName='name_of_changed_parameter',
    sweepParam=[value_0_of_parameter, value_1_of_parameter, value_2_of_parameter],
)

For ND concatenation, pass a list of names to sweepName and a list of tuples to sweepParam:

dataset = xr.open_dataset(
    example_mumax_output_dir_1,
    chunks='auto',
    engine=OvfEngine,
    dirListToConcat=[
        path_to_example_mumax_output_dir_2,
        path_to_example_mumax_output_dir_3,
        path_to_example_mumax_output_dir_4,
    ],
    sweepName=['name_of_changed_parameter_1', 'name_of_changed_parameter_2'],
    sweepParam=[
        (value_1_1_of_parameter, value_2_1_of_parameter),
        (value_1_2_of_parameter, value_2_1_of_parameter),
        (value_1_1_of_parameter, value_2_2_of_parameter),
        (value_1_2_of_parameter, value_2_2_of_parameter),
    ],
)

If you want the ovf files to be deleted after reading, set removeOvfFiles=True. This only works if the ovf files do not have to be accessed multiple times. If you only want to use every Nth ovf file, set useEachNthOvfFile=N; N has to be larger than one, otherwise it is ignored. With this flag, only every Nth ovf file is included in the dataset. The singleLoad option allows you to load one specific ovf file instead of all of them.
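The useEachNthOvfFile behaviour amounts to a simple stride over the sorted list of ovf files. A minimal sketch of that selection logic, using hypothetical file names (an illustration, not the engine's internal code):

```python
# Illustration of the useEachNthOvfFile=N selection: keep every Nth ovf file.
# The file names below are hypothetical mumax output names.
ovf_files = [f"m{i:06d}.ovf" for i in range(8)]  # m000000.ovf ... m000007.ovf

N = 3
# N has to be larger than one; otherwise the flag is ignored and all files are used.
selected = ovf_files[::N] if N > 1 else ovf_files
print(selected)  # ['m000000.ovf', 'm000003.ovf', 'm000006.ovf']
```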

Of course, the wavetype option and the directory concatenation can be used simultaneously.
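The ND examples above suggest that directory i (the main directory first, then the entries of dirListToConcat in order) is paired with the i-th tuple in sweepParam. A small illustration of that pairing, with hypothetical directory names and parameter values (placeholders, not the mumaxXR API):

```python
# Hypothetical sweep directories and parameter values for a 2D sweep.
dirs = ["run_1", "run_2", "run_3", "run_4"]   # main dir followed by dirListToConcat
sweep_name = ["B_ext", "f_drive"]             # the two swept parameters
sweep_param = [(0.1, 1e9), (0.2, 1e9), (0.1, 2e9), (0.2, 2e9)]

# Directory i corresponds to parameter tuple sweep_param[i]; together the
# tuples should enumerate the full grid of swept values.
for d, p in zip(dirs, sweep_param):
    print(d, dict(zip(sweep_name, p)))
```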



Download files

Download the file for your platform.

Source Distribution

mumaxxr-0.0.37.tar.gz (16.7 kB)


Built Distribution


mumaxxr-0.0.37-py3-none-any.whl (15.7 kB)


File details

Details for the file mumaxxr-0.0.37.tar.gz.

File metadata

  • Download URL: mumaxxr-0.0.37.tar.gz
  • Upload date:
  • Size: 16.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for mumaxxr-0.0.37.tar.gz:

  • SHA256: fe0ed2260829cf760f82e59ea4b0d25a759dfcf9f070574c9b97a59768f85335
  • MD5: b168bfa17ba30dcad134f95df7111e9f
  • BLAKE2b-256: 3da9108c3d85317e99ea5f701f8534c32cea5317f1e7179278f6b0826c602394


Provenance

The following attestation bundles were made for mumaxxr-0.0.37.tar.gz:

Publisher: publish.yml on timvgl/mumaxXR

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file mumaxxr-0.0.37-py3-none-any.whl.

File metadata

  • Download URL: mumaxxr-0.0.37-py3-none-any.whl
  • Upload date:
  • Size: 15.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for mumaxxr-0.0.37-py3-none-any.whl:

  • SHA256: bf495776d26ffde09d44cef8ef5f065e38abbd34ba47816d337f9823652956ad
  • MD5: b4c38b2504fbb29dd13682f3a023e15f
  • BLAKE2b-256: 704cb2e74a88b6e166ac060e27b7185eba6a0ce176f06573713d665a99b8c4c5


Provenance

The following attestation bundles were made for mumaxxr-0.0.37-py3-none-any.whl:

Publisher: publish.yml on timvgl/mumaxXR

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
