Submit 4DN - Data Submitter Tools

Utility package for submitting data to the 4DN Data Portal

The Submit4DN package is written by the 4DN Data Coordination and Integration Center for data submitters from the 4DN Network. Please contact us to get access to the system, or if you have any questions or suggestions. Detailed documentation on data submission can be found in the 4DN submitter documentation.

Installing the package

pip install submit4dn

To upgrade to the latest version

pip install submit4dn --upgrade

Troubleshooting

This package is supported and tested on Python versions 3.8 through 3.11. It may work with other Python versions, but your mileage may vary.

It is recommended to install this package in a virtual environment to avoid dependency clashes.
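
For example, using the built-in venv module (the environment name here is arbitrary):

python3 -m venv submit4dn-env
source submit4dn-env/bin/activate  # on Windows: submit4dn-env\Scripts\activate
pip install submit4dn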

Problems have been reported on recent macOS and Windows versions having to do with the inability to find libmagic, a C library for checking file types that is used by the python-magic library, e.g.:

ImportError: failed to find libmagic. Check your installation

The first thing to try is:

pip uninstall python-magic
pip install python-magic

If that doesn't work, one solution that has worked for some users is:

pip uninstall python-magic
pip install python-magic-bin==0.4.14

Others have had success using homebrew to install libmagic:

brew install libmagic
brew link libmagic  (if the link already exists this command will fail; don't worry about that)

Additionally, problems have been reported on Windows when installing Submit4DN inside a virtual environment, due to the aws command trying to use the global Python instead of the Python inside the virtual environment.

Because it is actually fine for aws to use the global Python, the workaround is to install awscli in the global environment before entering the virtual environment. If you discover the problem while the virtual environment is active, deactivate it, install awscli, and then re-activate it:

deactivate
pip install awscli
VENV\scripts\activate  # replace VENV with your virtual environment name
aws --version  # this is to test that awscli is now installed correctly

Connecting to the Data Portal

To be able to use the provided tools, you need to generate an AccessKey on the data portal. If you do not yet have access, please contact 4DN Data Wranglers to get an account and learn how to generate and save a key.
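
The key information is stored in a JSON keypairs file (by default ~/keypairs.json; see the --keyfile parameter below). A sketch of the expected layout, where the key and secret values are placeholders:

{
  "default": {
    "key": "ABCDEFG",
    "secret": "abcdefabcd1ab",
    "server": "https://data.4dnucleome.org"
  }
}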

Generating data submission forms

To create the data submission Excel workbook, you can use get_field_info.

It will accept the following parameters:

    --keyfile        the path to the file where you have stored your access key info (default ~/keypairs.json)
    --key            the name of the key identifier for the access key and secret in your keys file (default=default)
    --type           use once for each sheet (item type) that you want to add to the excel workbook
    --nodesc         do not add the descriptions in the second line (by default they are added)
    --noenums        do not add the list of options for a field if they are specified (by default they are added)
    --comments       adds any (usually internal) comments together with enums (by default False)
    --outfile        change the default file name "fields.xlsx" to a specified one
    --debug          to add more debugging output
    --noadmin        if you have admin access to 4DN this option lets you generate the sheet as a non-admin user
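
For example, to use an access key named "mylab" stored in a non-default keys file (both the key name and file path here are hypothetical):

get_field_info --keyfile ~/mykeys.json --key mylab --type Biosample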

Examples of generating a single sheet:

get_field_info --type Biosample
get_field_info --type Biosample --comments
get_field_info --type Biosample --comments --outfile biosample.xlsx

Example workbook with all sheets:

get_field_info --outfile MetadataSheets.xlsx

Examples of workbooks using a preset option:

get_field_info --type HiC --comments --outfile exp_hic_generic.xlsx
get_field_info --type ChIP-seq --comments --outfile exp_chipseq_generic.xlsx
get_field_info --type FISH --comments --outfile exp_fish_generic.xlsx

Current presets include: Hi-C, ChIP-seq, Repli-seq, ATAC-seq, DamID, ChIA-PET, Capture-C, FISH, SPT

Data submission

Please refer to the submission guidelines and become familiar with the metadata structure prior to submission.

After you fill out the data submission forms, you can use import_data to submit the metadata. The method can be used both to create new metadata items and to patch fields of existing items.

	import_data filename.xlsx

Uploading vs Patching

Running import_data without one of the flags described below will perform a dry run submission that includes several validation checks. It is strongly recommended to do a dry run prior to actual submission and, if necessary, to work with a Data Wrangler to correct any errors.

If there are uuid, alias, @id, or accession fields in the excel form that match existing entries in the database, you will be asked if you want to PATCH each object. You can use the --patchall flag if you want to patch ALL objects in your document and skip that prompt.

If no object identifiers are found in the document, you need to use --update for POSTing to occur.
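
For example:

import_data filename.xlsx             # dry run with validation checks only
import_data filename.xlsx --patchall  # patch all matching existing items without prompting
import_data filename.xlsx --update    # post new items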

Other Helpful Advanced parameters

Normally you are asked to verify the Lab and Award that you are submitting for. In some cases it may be desirable to skip this prompt so a submission can be run by a scheduler or in the background:

--remote is an option that will skip any prompt before submission

However, if you submit for more than one lab, or if there is more than one award associated with your lab, you will need to specify these values as parameters using --lab and/or --award followed by the uuids of the appropriate items.
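
For example, an unattended submission that posts new items and skips all prompts (the uuid placeholders must be replaced with your own lab and award identifiers):

import_data filename.xlsx --update --remote --lab <lab-uuid> --award <award-uuid>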

Development

Note that if you are attempting to run the scripts in the wranglertools directory without installing the package, then in order to get the correct sys.path you need to run the scripts from the parent directory using the following command format:

python -m wranglertools.get_field_info --type Biosource
python -m wranglertools.import_data filename.xlsx

The pypi page is https://pypi.python.org/pypi/Submit4DN

Submit4DN is packaged with poetry. New versions can be released and published to pypi using poetry publish.
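
A typical release flow, assuming pypi credentials have already been configured for poetry, might look like:

poetry version patch   # bump the version number
poetry build           # build the sdist and wheel
poetry publish         # upload to pypi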

Pytest

Every function is covered by a pytest test. The tests can be run from a terminal in the submit4dn folder with:

py.test

Some tests need internet access and are labeled with the "webtest" mark.

Some tests perform file operations and are labeled with the "file_operation" mark.

To run only the marked tests, or to exclude them from a test run, you can use the following commands:

# Run all tests
py.test

# Run only webtest
py.test -m webtest

# Run only tests with file_operation
py.test -m file_operation

# skip tests that use ftp (do this when testing locally)
py.test -m "not ftp"
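
Marks can also be combined into a single expression; for example, to run the webtests while skipping any that use ftp:

py.test -m "webtest and not ftp"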
