
A Python package that splits large files into smaller chunks.

Project description

Tutorial using the pysplitter package

pysplitter is a Python package for splitting large files into smaller chunks.

Currently, the default chunk size is 100 MB. This size was chosen to work around GitHub's upload file size limit.
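
At a glance, the workflow covered in this tutorial looks like the minimal sketch below. 'large_array.npy' is just a placeholder filename; the same split and unsplit calls are walked through step by step in the rest of the notebook.

import pysplitter as pysp

# Split a file that is too large for GitHub into <= 100 MB chunks
# (written as large_array0000.npy.split, large_array0001.npy.split, ...).
pysp.split('large_array.npy')

# Later, reassemble the chunks; with validate=True the reconstructed file
# is checked against the original source file.
pysp.unsplit('./large_array*.split', '.', validate=True, orig_src='large_array.npy')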

Install the latest version of pysplitter

If needed, uncomment and run the cell below to ensure you have the latest version of pysplitter installed on your machine.

# !pip install --upgrade pysplitter

Import required packages

import pysplitter as pysp
import numpy as np

Import helpful packages

import sys
import os

Create a NumPy array that will exceed 100 MB when saved to disk: 250 × 250 × 250 float64 values occupy 250³ × 8 bytes ≈ 125 MB.

The numeric values of the data are not important; random values were used for convenience only.

dim = 250
num = int(dim * dim * dim)
x = np.random.normal(size=num).reshape(dim, dim, dim)
x.shape
(250, 250, 250)

Save numpy array to disk and list directory contents

np.save('x.npy', x)
os.listdir()
['.ipynb_checkpoints',
 '1-split-unsplit-tutorial.ipynb',
 'x.npy']

Display size of file on disk

size = os.path.getsize('x.npy')
print(f'{size / 1e6} MB')
125.000128 MB

GitHub will not allow files exceeding 100 MB to be uploaded, so the 125 MB file above cannot be pushed as-is.

Use the commands below to split the original (too large) file into multiple .split files.

Currently the default split size is <= 100 MB, but this may become a configurable parameter in future releases.

os.listdir()
['.ipynb_checkpoints',
 '1-split-unsplit-tutorial.ipynb',
 'x(unsplit).npy',
 'x.npy']
src = 'x.npy'
pysp.split(src)
2 file(s) written.

Check the file sizes of the two chunks that were just written.

os.listdir()
['.ipynb_checkpoints',
 '1-split-unsplit-tutorial.ipynb',
 'x(unsplit).npy',
 'x.npy',
 'x0000.npy.split',
 'x0001.npy.split']
print(os.path.getsize('x0000.npy.split') / 1e6, 'MB')
100.0 MB
print(os.path.getsize('x0001.npy.split') / 1e6, 'MB')
25.000128 MB

As the output of the cells above shows, both chunks are <= 100 MB, which means the data can now be pushed to GitHub like any other file.
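
If you want to double-check programmatically that every chunk is under the limit before pushing, a small sketch using only the standard library (glob and os) is shown below; the 100 MB figure mirrors the limit discussed above.

import glob
import os

LIMIT_BYTES = 100 * 10**6  # the 100 MB limit used throughout this tutorial

for chunk in sorted(glob.glob('./x*.split')):
    size_mb = os.path.getsize(chunk) / 1e6
    print(f'{chunk}: {size_mb} MB (within limit: {os.path.getsize(chunk) <= LIMIT_BYTES})')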

Recombine the data chunks back into a single file

search_pattern = './x*.split'
dst = '.'
pysp.unsplit(search_pattern, dst, validate=True, orig_src=src)
File reconstructed without loss: True
os.listdir()
['.ipynb_checkpoints',
 '1-split-unsplit-tutorial.ipynb',
 'x(unsplit).npy',
 'x.npy',
 'x0000.npy.split',
 'x0001.npy.split']
x_unsplit = np.load('x(unsplit).npy')
x_unsplit.shape
(250, 250, 250)

Show that the reconstructed data x_unsplit is identical to the original data x.

np.allclose(x, x_unsplit)
True
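
For an extra, byte-level sanity check that is independent of NumPy, the original and reconstructed files can also be compared by hashing them with the standard-library hashlib module. This is a manual check, separate from the validate=True option used above.

import hashlib

def sha256_of(path, block_size=2**20):
    # Hash the file in 1 MiB blocks so large files never have to fit in memory.
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        for block in iter(lambda: f.read(block_size), b''):
            digest.update(block)
    return digest.hexdigest()

print(sha256_of('x.npy') == sha256_of('x(unsplit).npy'))  # expect True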

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pysplitter-0.0.10.tar.gz (3.3 kB)


Built Distribution

pysplitter-0.0.10-py3-none-any.whl (5.7 kB)


File details

Details for the file pysplitter-0.0.10.tar.gz.

File metadata

  • Download URL: pysplitter-0.0.10.tar.gz
  • Upload date:
  • Size: 3.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.3.1.post20200622 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.7.7

File hashes

Hashes for pysplitter-0.0.10.tar.gz

  • SHA256: e1a75df8c128ec648a1bc9143312d1711bdd88a8e46374832b37026fc86c6c4c
  • MD5: 70c46b9099dd4960718064092baeacd1
  • BLAKE2b-256: 0c95d7263599733a66cf53fcbe3f1de4881a7d3f230de1e3c13e064d8d61ac5d


File details

Details for the file pysplitter-0.0.10-py3-none-any.whl.

File metadata

  • Download URL: pysplitter-0.0.10-py3-none-any.whl
  • Upload date:
  • Size: 5.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.3.1.post20200622 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.7.7

File hashes

Hashes for pysplitter-0.0.10-py3-none-any.whl

  • SHA256: f7a47ded38c258428b2b77d9b7b36bf26aae15239ac7ad74d077c5e7cc2175de
  • MD5: c1834c31a1802979facf5f9dd64479fc
  • BLAKE2b-256: ac127259b665d604de3c6474665bcf9294a4c0b969be318a1a91479fcdd7e8d7

