A small library for taking the transpose of arbitrarily large .csvs
Project description
bigcsv: A small Python library to manipulate large csv files that can't fit in memory.
Transposition
bigcsv allows for easy calculation of csv transposes, even when the csv is much too large to fit in memory.
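The usual out-of-core approach (a sketch of the general idea, not necessarily bigcsv's exact implementation) is to transpose the file one row chunk at a time, write each transposed chunk to disk, and then stitch the chunks back together column-wise, so only one chunk ever lives in memory:

```python
import csv
import pandas as pd

def transpose_in_chunks(infile, outfile, chunksize=400):
    """Illustrative chunked transpose; bigcsv's internals may differ."""
    # 1. Read the input in row chunks and write each chunk transposed to its own file.
    part_files = []
    for i, chunk in enumerate(pd.read_csv(infile, chunksize=chunksize)):
        part = f"{infile}.part{i}"          # temporary per-chunk file (hypothetical naming)
        chunk.T.to_csv(part, header=False)  # rows of the chunk become columns on disk
        part_files.append(part)

    # 2. Stitch the parts together line by line: output row j is the j-th original
    #    column, assembled from row j of every transposed part.
    readers = [open(p, newline="") for p in part_files]
    try:
        with open(outfile, "w", newline="") as out:
            writer = csv.writer(out)
            for rows in zip(*(csv.reader(f) for f in readers)):
                name = rows[0][0]                            # original column name
                values = [v for row in rows for v in row[1:]]
                writer.writerow([name] + values)
    finally:
        for f in readers:
            f.close()
```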
Converting to h5ad
If the data is purely numeric, it is much more efficient to store it in h5ad (readable by AnnData), which uses the excellent HDF5 format under the hood.
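Once converted, the file can be read back with AnnData in the usual way (assuming the anndata package is installed; converted.h5ad is just an example filename):

```python
import anndata as ad

# Load the converted file: .X holds the numeric matrix,
# .obs_names / .var_names hold the row and column labels.
adata = ad.read_h5ad("converted.h5ad")
print(adata.shape)
```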
Installation
To install, run `pip install bigcsv`.
How to use
All operations are methods of the BigCSV class, which holds the metadata used for all calculations.
from bigcsv import BigCSV
obj = BigCSV(
    file='massive_dataset.csv',
    chunksize=400,   # Number of rows to read in at each iteration
    # The remaining parameters can be left at their defaults:
    # insep=',',
    # outsep=',',
    # save_chunks=False,
    # quiet=False,
)
obj.to_h5ad(outfile='converted.h5ad')
# Or keep it as a csv but transpose it (e.g. for non-numerical data)
obj.transpose(outfile='dataset_T.csv')
Documentation
bigcsv.BigCSV
Class containing methods for manipulating csvs.
Parameters:
- file: Path to the input file
- outfile: Path to the output file (the transposed input file)
- insep=',': Input separator for the delimited file
- outsep=',': Output separator for the delimited file (for csv --> csv operations)
- chunksize=400: Number of lines to read per iteration
- save_chunks=False: Whether to save the intermediate chunks
- chunkfolder=None: Optional path to the folder for intermediate chunks
- quiet=False: Boolean controlling whether progress is printed
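For example, a BigCSV configured to keep its intermediate chunks (a sketch using only the parameters documented above; the paths are placeholders):

```python
from bigcsv import BigCSV

obj = BigCSV(
    file='massive_dataset.csv',   # placeholder path
    insep=',',
    outsep=',',
    chunksize=400,
    save_chunks=True,             # keep the intermediate chunk files
    chunkfolder='chunks/',        # placeholder folder for the saved chunks
    quiet=False,
)
```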
bigcsv.BigCSV.transpose_csv
Parameters:
- outfile=None: Output file to write to; if not given, writes to the outfile specified at initialization
bigcsv.BigCSV.to_h5ad
Parameters:
- outfile=None: Output file to write to; if not given, writes to the outfile specified at initialization
- sparsify: bool=False: Whether to sparsify rows of the h5 matrix
- compression: str='infer': Compression format of the input csv, if compressed. Usually best left as 'infer' unless the filename is unusual.
- lines: int=None: Number of lines in the file. If known a priori, this saves some time; it cannot be calculated for compressed files.
- dtype: Any=None: dtype of the entries of the input matrix
- index_col: str=None: Column of the input csv to use as the index, if any
- index: bool=True: Whether to save the index when converting to h5ad
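Putting these together, a call converting a gzipped numeric csv might look like the following (parameter names as documented above; 'sample_id' is a hypothetical index column):

```python
# Convert a gzipped, purely numeric csv to h5ad, storing rows sparsely.
obj.to_h5ad(
    outfile='converted.h5ad',
    sparsify=True,          # store the matrix rows sparsely
    compression='gzip',     # or leave as 'infer' to detect from the filename
    dtype='float32',
    index_col='sample_id',  # hypothetical index column name
    index=True,             # keep the index in the h5ad output
)
```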
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file bigcsv-1.0.2.tar.gz.
File metadata
- Download URL: bigcsv-1.0.2.tar.gz
- Upload date:
- Size: 1.8 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.7.1 importlib_metadata/4.8.1 pkginfo/1.8.2 requests/2.27.1 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | a9648dd0657591eb1c778effcff463e03b67da777ea66744c4323f214ca5ad4f
MD5 | 8c352de46b0f39f38fb506c2b9684bc9
BLAKE2b-256 | d7fd2c12cecd8890efa20bc1d52e0edfa881e6fa13cac4265ce89f1e2ac1274d
File details
Details for the file bigcsv-1.0.2-py2.py3-none-any.whl.
File metadata
- Download URL: bigcsv-1.0.2-py2.py3-none-any.whl
- Upload date:
- Size: 10.4 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.7.1 importlib_metadata/4.8.1 pkginfo/1.8.2 requests/2.27.1 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 39b7d45571e6d3df8dd669209eb7c2d356f64b21bd51c1a436c642da77061f9e
MD5 | a461804caeaaee6768e2703c8c21c358
BLAKE2b-256 | 446dfb8169f902f13ce0b1b9aa4e3a55fbe20867b774c764fa818c67ce282c84