
Persistent Objectified Indexed Data

Project description


Persistent Objectification of Indexed Data

Trump is a framework for objectifying data, with the goal of centralizing the responsibility of managing feeds, munging, calculating, and validating data upstream of any application or user requirement.

With a focus on business processes, Trump’s long-run goal is to enable data feeds to be:

  • Prioritized, flexibly - a symbol can be associated with multiple data sources for a variety of reasons, including redundancy, calculations, or optionality.

  • Modified, reliably - a symbol’s data feeds can be changed out without requiring any changes to, or re-testing of, the downstream application or user workflow.

  • Verified, systematically - a variety of common data processing checks are performed as the symbol’s data is cached.

  • Audited, quickly - alerts and reports make it possible to assess integrity and to inspect where manual overrides have been applied.

  • Aggregated, intelligently - on a symbol-by-symbol basis, feeds can be combined and used in an extensible number of ways.

  • Customized, dynamically - extensibility is possible at the templating, munging, aggregation, and validity steps.

Planning

See docs/planning.md for the direction of the project.

Basic Usage

This example dramatically understates the utility of Trump’s long-term feature set.

Adding a Symbol

from trump.orm import SymbolManager
from trump.templating import QuandlFT, GoogleFinanceFT, YahooFinanceFT

sm = SymbolManager()

TSLA = sm.create(name="TSLA",
                 description="Tesla Closing Price USD")

TSLA.add_tags(["stocks", "US"])

# Try Google first.
# If Google's feed has a problem, fall back to Quandl.
# If all else fails, use Yahoo's data...

TSLA.add_feed(GoogleFinanceFT("TSLA"))
TSLA.add_feed(QuandlFT("GOOG/NASDAQ_TSLA", fieldname='Close'))
TSLA.add_feed(YahooFinanceFT("TSLA"))

# Optional munging, validity checks, and aggregation settings would be
# implemented here...

# All three feeds are cached...
TSLA.cache()

# ...but only a clean version of the data is served up.
print(TSLA.df.tail())

              TSLA
dateindex
2015-03-20  198.08
2015-03-23  199.63
2015-03-24  201.72
2015-03-25  194.30
2015-03-26  190.40

sm.finish()
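The feed order above is the priority order. To make that concrete, here is a small pandas sketch of the kind of priority-fill aggregation described: the highest-priority feed wins, and lower-priority feeds only fill its gaps. This is a conceptual illustration with made-up numbers, not Trump’s internal aggregation API, and priority fill is only one of the ways a symbol could be configured to combine its feeds.

import pandas as pd

# Three hypothetical cached feeds for one symbol, listed in priority order.
# NaN marks a date on which that source failed to deliver a value.
idx = pd.to_datetime(["2015-03-24", "2015-03-25", "2015-03-26"])
google = pd.Series([201.72, None, 190.40], index=idx)
quandl = pd.Series([201.70, 194.30, 190.38], index=idx)
yahoo = pd.Series([201.72, 194.30, 190.40], index=idx)

# Priority fill: use Google's value where it exists, otherwise fall back
# to Quandl, and finally to Yahoo.
clean = google.combine_first(quandl).combine_first(yahoo)
print(clean)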

Using a Symbol

from trump.orm import SymbolManager

sm = SymbolManager()

TSLA = sm.get("TSLA")

# Optional: re-cache to pull the latest data from each feed.
TSLA.cache()

print(TSLA.df.tail())

              TSLA
dateindex
2015-03-20  198.08
2015-03-23  199.63
2015-03-24  201.72
2015-03-25  194.30
2015-03-26  190.40

sm.finish()

Contributing

If you’re interested in contributing to Trump, we would love for you to do so! The best place to start is to clone the project and install the package from the latest commit on the master branch. After that, follow the configuration steps in the installation instructions linked below, and make notes about anything that is unclear or any errors you run into along the way. Please post an issue on GitHub with ANY notes, or, if you’re ambitious, feel free to submit a pull request yourself. Don’t hesitate to do either.

If something isn’t working, or isn’t clear, that’s our fault, and we really want it to be easy for people to get started. It’s hard for the creator of a project to assess their own instructions.

After installation, there are many paths to take, and every one of them can be addressed by posting an issue or a pull request. Exploring the docs, you’ll inevitably find areas that need improving. Explore the open issues; the ones tagged “Good First Pull Request” are the low-hanging fruit. Often, current issues won’t have a ton of information. If you want to work on one, just add a comment asking for more detail and mention that you’re going to try to tackle it. Simply posting an issue to say hi and ask for a recommended starting point is a great way to get involved, too.

Installation

See the latest Installation instructions on ReadTheDocs.org

Requirements

  • Python 2.7. Support for Python 3.3 or 3.4 is doable if there is demand.

  • A relational database. Anything supported by SQLAlchemy should work; however, only the following are tested:

      • PostgreSQL 9.4

      • Persistent SQLite (i.e., file-based; certain features of Trump wouldn’t make sense with an in-memory implementation).
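As a rough sketch of what those two back ends look like from the SQLAlchemy side, the connection URLs below use standard SQLAlchemy syntax; the credentials and file path are placeholders, and how the engine or URL is actually supplied to Trump depends on its configuration (see the installation instructions above).

from sqlalchemy import create_engine

# Tested back end: PostgreSQL 9.4 (credentials are placeholders).
pg_engine = create_engine("postgresql://user:password@localhost:5432/trump")

# Tested back end: persistent, file-based SQLite. An in-memory database
# ("sqlite://") would be lost when the process ends, defeating Trump's caching.
sqlite_engine = create_engine("sqlite:////absolute/path/to/trump.db")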

Dependencies

Data Source Dependencies

Documentation

Read the latest on ReadTheDocs.org

Communication

Join the chat on Gitter: https://gitter.im/Equitable/trump

License

BSD 3-Clause. See the actual License.

Background

The prototype for Trump was built at Equitable Life of Canada in 2014 by Jeffrey McLarty, CFA and Derek Vinke, CFA. Jeffrey McLarty currently leads the Open Source initiative.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

Trump-0.0.5.tar.gz (71.1 kB)

Uploaded: Source

Built Distribution

Trump-0.0.5-py2-none-any.whl (92.9 kB)

Uploaded: Python 2

File details

Details for the file Trump-0.0.5.tar.gz.

File metadata

  • Download URL: Trump-0.0.5.tar.gz
  • Upload date:
  • Size: 71.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for Trump-0.0.5.tar.gz
Algorithm Hash digest
SHA256 074555672aa300ef457cabbb761239f86f415e3b9adaacd9fb5dff38e5d4e516
MD5 56c18690c7935be816afcaf8395beb1e
BLAKE2b-256 4b0d0ab1a8471757284da5137cd3faf6c435481ada2976855b104279a37f75aa

See more details on using hashes here.
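If you want to verify a downloaded archive against the digests above, a plain hashlib comparison is sufficient; the file name below assumes the archive sits in the current directory.

import hashlib

EXPECTED_SHA256 = "074555672aa300ef457cabbb761239f86f415e3b9adaacd9fb5dff38e5d4e516"

# Hash the archive in chunks and compare against the published digest.
sha256 = hashlib.sha256()
with open("Trump-0.0.5.tar.gz", "rb") as f:
    for chunk in iter(lambda: f.read(8192), b""):
        sha256.update(chunk)

print(sha256.hexdigest() == EXPECTED_SHA256)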

File details

Details for the file Trump-0.0.5-py2-none-any.whl.

File metadata

File hashes

Hashes for Trump-0.0.5-py2-none-any.whl
Algorithm Hash digest
SHA256 d71b90949d41f1fc1f32cd09e4e3b8f20f2da9634d29e010b1e1adfe35db8e5e
MD5 fdde3b4719d88e38f8dd41a05c52f3aa
BLAKE2b-256 d143be7d436e72409360a01bddf6c1acd840c60ef037d30a2bc6d5e1fa2280b1

See more details on using hashes here.
