
Project description

A Python package for working with the BriteCore ETL.

PLEASE NOTE: brite_etl follows Semantic Versioning, and is currently in the initial development phase (0.x.x). Use with caution.

Use

This is all broken down on the introduction page.

import brite_etl
from brite_etl.core.io.frame_sources import CsvSource

# Create a "set" of frames to work with...
contoso = brite_etl.lib.FrameSet('contoso')

# Set the source of our CSVs (can also pass BriteDataFrame/PreparedDataFrame)...
# DF_ROOT and DF_PREP are paths to your csv directories
contoso.set_data_sources(source=CsvSource(DF_ROOT), prepared_source=CsvSource(DF_PREP))

# Easy handling of dataframes; works the same for both csv and britedataframe sources.
# Essentially a wrapper around the pandas DataFrame. Dates parsed automatically.
contoso.frames.get('property_items')
contoso.frames.get('agencies').df # original dataframe

# Import BriteCore reports. No need to open/change/save columns in Excel; hyperlinks and
# other formatting issues are handled. No need to rename the file to strip out the dates.
from brite_etl.core.io import import_report
adv_prem = import_report('/tmp/input', 'Advance Premium', sheet='Advance Premium List', skip_rows=2) # Pandas DataFrame
contoso.frames.set('ap', df=adv_prem) # Make custom frames in your frame set

# Define frame-specific operations...
contoso.frames.get('prepared.lines').endOfMonthSum()

# Or use universal operations, chain across multiple frames...
_contoso = contoso.chain
(_contoso
    .filter_dates('date filter for multiple frames actually isn\'t done yet (soon, though)')
    .hash_cols(['policyId']) # MD5 hashed dataframes
    .export_excel(
        path='/tmp/output',
        file_name='end_month_integrity_hash.xlsx'
    ) # Every frame is put into its own sheet during export
    .run()
)

# Computations make use of multiple frames within a frame set (also chainable)...
trans = _contoso.get_item_transactions().value()

# Create multiple, isolated sets of frames...
wrk = brite_etl.lib.FrameSet('working', from_set=contoso)
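The `hash_cols` step above can be sketched in plain pandas. This is a minimal, hypothetical stand-in to show the idea, not brite_etl's actual implementation (the function name and the sample data are assumptions):

```python
# Minimal sketch of MD5-hashing selected columns of a DataFrame,
# similar in spirit to the hash_cols chain step above.
import hashlib
import pandas as pd

def hash_cols(df, cols):
    """Return a copy of df with the given columns replaced by MD5 hex digests."""
    out = df.copy()
    for col in cols:
        out[col] = out[col].astype(str).map(
            lambda v: hashlib.md5(v.encode("utf-8")).hexdigest()
        )
    return out

# Hypothetical sample data
policies = pd.DataFrame({"policyId": ["A123", "B456"], "premium": [100, 250]})
hashed = hash_cols(policies, ["policyId"])
```

Untouched columns (here `premium`) pass through unchanged, which is the behavior you would want before exporting an integrity-hash workbook.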

Installation

pip install brite_etl
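Because the package is still in the 0.x development phase (see the note above), pinning a compatible range in your requirements file is prudent. The exact range below is an illustration, not a project recommendation:

```
brite_etl>=0.1,<0.2
```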

Development

To run all the tests, run:

tox

Test just your desired Python version with tox -e py27 or tox -e py35. Much faster than running all test environments.

Note about testing: some of the tests require real df_cache data to run. The locations of the df_cache directories are defined in the setup.cfg file. When running, the tests will check that the directories exist and contain files. If they don't, those tests will be skipped, and the rest of the tests should function as normal.
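The directory guard described above might look something like the following sketch, using the stdlib unittest module. The path, constant, and test names here are hypothetical; in the project the locations come from setup.cfg:

```python
# Sketch of a "skip if df_cache data is missing" guard for tests.
import os
import unittest

DF_CACHE_DIR = "/tmp/df_cache"  # placeholder; the real path comes from setup.cfg

def has_cache_data(path):
    """True if the directory exists and contains at least one entry."""
    return os.path.isdir(path) and len(os.listdir(path)) > 0

class PropertyItemsTests(unittest.TestCase):
    @unittest.skipUnless(has_cache_data(DF_CACHE_DIR),
                         "df_cache data not available")
    def test_property_items_load(self):
        # data-dependent assertions would go here
        pass
```

Tests decorated this way are reported as skipped rather than failed when the cache directories are absent, so the rest of the suite runs normally.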

Changelog

0.1.0 (2016-10-03)

  • Update docs

  • Remove pypy env

  • Use semantic versioning

0.0.1 (2016-10-02)

  • First release on PyPI.
