Project description
D4Data
Data Engineered with Python
Proof-of-concept project for Python data engineering. Envisioned use cases:

- Data access and sharing with data defined as code.
- Data cataloging and discovery.
- Data transfer and partitioning for distributed computing.
- Go from remote data sources to model training with simple, expressive Python.
Installation

```
pip install d4data
```
Example API:
Define data as code

```python
import os

from d4data.storage_clients import FTPStorageClient
from d4data.sources import CSVDataSource


class NIHChromosomeSNPS38(CSVDataSource):
    def __init__(self, chromosome, output_path):
        # define data that is specific to your data source
        self.chromosome = chromosome
        # give your data source a name, a file name, local paths to save to, and a uri
        self.name = "NIH_Chromosome_{}_SNPS38".format(self.chromosome)
        self.file_name = "bed_chr_{}.bed.gz".format(self.chromosome)
        self.uri = "https://ftp.ncbi.nlm.nih.gov/snp/organisms/human_9606_b151_GRCh38p7/BED/" + self.file_name
        self.local_paths = [os.path.join(output_path, self.file_name)]
        # assign a storage client
        self.client = FTPStorageClient()
```
Download data programmatically

```python
data = NIHChromosomeSNPS38(chromosome=1, output_path="./datasources")
# calls client.download(uri=self.uri)
data.to_disk()
```
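The storage client is what actually moves the bytes: `to_disk()` delegates to `client.download(uri=self.uri)`. As a rough illustration of how a custom backend might plug in alongside `FTPStorageClient`, here is a minimal sketch; the `download(uri, local_path)` signature and the idea of a user-supplied client class are assumptions for illustration, not part of the documented API.

```python
import shutil
import urllib.request


class SimpleHTTPClient:
    """Hypothetical storage client: streams a URI to a local file over HTTP."""

    def download(self, uri, local_path):
        # stream the response body straight to disk
        with urllib.request.urlopen(uri) as response, open(local_path, "wb") as out:
            shutil.copyfileobj(response, out)
```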
Process data

```python
dataset = data.to_dataset()

for i in range(len(dataset)):
    some_func(dataset[i])
```
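Because the dataset object above is indexable and has a length, it is "map-style" in the PyTorch sense, so it could plausibly be fed to a `torch.utils.data.DataLoader` through a thin adapter. The PyTorch integration is still a TODO, so the sketch below is only an assumption about how that might look; the exact item type returned by `to_dataset()` is not documented and may require a custom `collate_fn`.

```python
from torch.utils.data import DataLoader, Dataset


class D4TorchDataset(Dataset):
    """Thin adapter exposing a d4data dataset through the PyTorch Dataset protocol."""

    def __init__(self, d4_dataset, transform=None):
        self.d4_dataset = d4_dataset
        self.transform = transform

    def __len__(self):
        return len(self.d4_dataset)

    def __getitem__(self, idx):
        item = self.d4_dataset[idx]
        return self.transform(item) if self.transform else item


# depending on what each item is, a custom collate_fn may be needed
loader = DataLoader(D4TorchDataset(dataset), batch_size=32, shuffle=True)
```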
Compose DataSources dynamically with a DataStrategy:

```python
import os

from d4data.storage_clients import HTTPStorageClient
from d4data.sources import CSVDataSource
from d4data.core import DataStrategy, CompositeDataSource


# Define the DataSource
class HaploRegSource(CSVDataSource):
    def __init__(self, population, local_path):
        self.name = "LD_{}".format(population.upper())
        self.file_name = self.name + ".tsv.gz"
        self.uri = "https://pubs.broadinstitute.org/mammals/haploreg/data/" + self.file_name
        self.local_paths = [os.path.join(local_path, self.file_name)]
        self.client = HTTPStorageClient()


# Define the DataStrategy
# Data strategies contain the logic for building data sources from some higher-level
# description of the data, e.g. a list of S3 URLs.
# Data strategies can also contain a partition strategy, where the logic for partitioning
# data sources is implemented - you may want to partition based on the compute resources
# available (see the sketch after this example).
class HaploRegStrategy(DataStrategy):
    def __init__(self, populations, local_path):
        self.populations = populations
        self.local_path = local_path
        self._sources = {
            "haplo_reg": HaploRegSource
        }

    def create_sources(self):
        comp_source = CompositeDataSource()
        source = self._sources["haplo_reg"]
        for population in self.populations:
            ds = source(population, self.local_path)
            comp_source.add(ds)
        return comp_source


pops = ["afr", "eur", "amr"]
haplo_strategy = HaploRegStrategy(pops, local_path="./data_sources")
comp_source = haplo_strategy.create_sources()

for source in comp_source:
    # Download sources to an in-memory file system
    d = source.to_memfs()
```
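The partition hook mentioned in the comments above could look something like the following, building on the `HaploRegStrategy` just defined. The method name `partition_sources` and the round-robin split by worker count are assumptions for illustration, not part of the documented `DataStrategy` interface; the only behavior taken from the example above is that a `CompositeDataSource` is iterable and that individual sources expose `to_disk()`.

```python
class PartitionedHaploRegStrategy(HaploRegStrategy):
    """Hypothetical extension: split the composite source into chunks, one per worker."""

    def partition_sources(self, n_workers):
        sources = list(self.create_sources())
        # round-robin the individual sources across the available workers
        return [sources[i::n_workers] for i in range(n_workers)]


# e.g. hand each partition to a separate process, node, or orchestration task
strategy = PartitionedHaploRegStrategy(pops, local_path="./data_sources")
for partition in strategy.partition_sources(n_workers=4):
    for source in partition:
        source.to_disk()
```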
Prefect Integration: TODO
PyTorch Integration: TODO
Free software: Apache Software License 2.0
Documentation: https://d4data.readthedocs.io.
Features
TODO
Project details
File details
Details for the file d4data-0.1.3.tar.gz.
File metadata
- Download URL: d4data-0.1.3.tar.gz
- Upload date:
- Size: 11.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.7.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 09d7edb7119aef43ae2bfc4bf078978bd916afcb9886eaee6e6ddff0140ff995
MD5 | cf999d8e3fb044c577adb99f460571da
BLAKE2b-256 | 44be272b0f4adfd56bd56cec82617808c392091da77c3a200f185bfd8462ef8d
File details
Details for the file d4data-0.1.3-py2.py3-none-any.whl.
File metadata
- Download URL: d4data-0.1.3-py2.py3-none-any.whl
- Upload date:
- Size: 8.2 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.7.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | c0b2702032b0deff94d180ed9f54b65e2e93bb969c1cbbccfe28e9c122fda812
MD5 | c5b15c965e150ab3eb2e13c18411bd35
BLAKE2b-256 | e6af883e58ff9ba8e8c849e35df4f4ff20e15053f99a2b26b299adf9bcf01eca