
Data analysis tool for quickly building charts, tables, reports, and dashboards

Project description


fireant is a data analysis tool used for quickly building charts, tables, reports, and dashboards. It defines a schema for configuring metrics and dimensions, which removes most of the legwork of writing queries and formatting charts. fireant even works great with Jupyter notebooks and in the Python shell, providing quick and easy access to your data.

Read more at http://fireant.readthedocs.io/en/latest/

Installation

To install fireant, run the following command in the terminal:

pip install fireant
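
To install the exact release described on this page, the version can be pinned explicitly (8.1.0 is the release listed under Download files below):

pip install fireant==8.1.0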

Introduction

fireant arose out of an environment where several different teams, each working with data sets that often overlapped, were individually building their own dashboard platforms. fireant was developed as a centralized way of building dashboards without the legwork.

fireant is used to create configurations of data sets using DataSet, which is backed by a database table containing analytics data and defines a set of Field instances. A Field can be used either to group data by properties, such as a timestamp, an account, or a device type, or to render quantifiers, such as clicks, ROI, or conversions, in a widget such as a chart or table.

A DataSet exposes a rich builder API that allows a wide range of queries to be constructed and rendered as one or more widgets. A DataSet can be used directly in a Jupyter notebook, eliminating the need to write repetitive custom queries and manually render the data in visualizations.

Data Sets

The DataSet is the core component of fireant. A DataSet is a representation of a data set and is used to execute queries and transform result sets into widgets such as charts or tables.

A DataSet requires only a few definitions in order to be used: a database connector, a database table, join tables, and dimension and metric fields. Metric and dimension definitions tell fireant how to query the data and how to use it in widgets. Once a data set is created, its query API can be used to build queries with just a few lines of code, selecting which dimensions and metrics to use and how to filter the data.

Instantiating a Data Set

from fireant.dataset import *
from fireant.database import VerticaDatabase
from pypika import Tables, functions as fn

vertica_database = VerticaDatabase(user='myuser', password='mypassword')
analytics, customers = Tables('analytics', 'customers')

my_dataset = DataSet(
    database=vertica_database,
    table=analytics,
    fields=[
        Field(
            # Non-aggregate definition
            alias='customer',
            definition=customers.id,
            label='Customer'
        ),
        Field(
            # Date/Time type, also non-aggregate
            alias='date',
            definition=analytics.timestamp,
            type=DataType.date,
            label='Date'
        ),
        Field(
            # Text type, also non-aggregate
            alias='device_type',
            definition=analytics.device_type,
            type=DataType.text,
            label='Device Type'
        ),
        Field(
            # Aggregate definition (The SUM function aggregates a group of values into a single value)
            alias='clicks',
            definition=fn.Sum(analytics.clicks),
            label='Clicks'
        ),
        Field(
            # Aggregate definition (The SUM function aggregates a group of values into a single value)
            alias='customer-spend-per-clicks',
            definition=fn.Sum(analytics.customer_spend / analytics.clicks),
            type=DataType.number,
            label='Spend / Clicks'
        )
    ],
    joins=[
        Join(customers, analytics.customer_id == customers.id),
    ],
)
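
Once the data set is instantiated, fields are referenced by their alias. Attribute access works for simple aliases, while dictionary-style lookup is needed when an alias is not a valid Python identifier; both forms appear in the query example below.

# Attribute access for simple aliases
clicks = my_dataset.fields.clicks

# Dictionary-style access for aliases containing dashes
spend_per_click = my_dataset.fields['customer-spend-per-clicks']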

Building queries with a Data Set

Use the query property of a data set instance to start building a data set query. A data set query allows method calls to be chained together to select what should be included in the result.

This example uses the data set defined above.

from datetime import date

from fireant import Matplotlib, Pandas, WeekOverWeek, day

matplotlib_chart, pandas_df = my_dataset.query \
    .dimension(
        # Select the date dimension with a daily interval to group the data by day.
        # Dimensions are referenced by `dataset.fields.{alias}`
        day(my_dataset.fields.date),

        # Select the device_type dimension to break the data down further by which device it applies to
        my_dataset.fields.device_type,
    ) \
    .filter(
        # Filter the result set to data from the year 2018
        my_dataset.fields.date.between(date(2018, 1, 1), date(2018, 12, 31))
    ) \
    .reference(
        # Add a week over week reference to compare data to values from the week prior
        WeekOverWeek(my_dataset.fields.date)
    ) \
    .widget(
        # Add a matplotlib chart widget
        Matplotlib()
            # Add axes with series to the chart
            .axis(Matplotlib.LineSeries(my_dataset.fields.clicks))

            # Aliases that are not valid identifiers use dictionary-style access
            .axis(Matplotlib.ColumnSeries(
                my_dataset.fields['customer-spend-per-clicks']
            ))
    ) \
    .widget(
        # Add a pandas data frame table widget
        Pandas(
            my_dataset.fields.clicks,
            my_dataset.fields['customer-spend-per-clicks']
        )
    ) \
    .fetch()

# Display the chart
matplotlib_chart.plot()

# Display the table as a data frame
print(pandas_df)
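
For quick checks in a notebook or the shell, the same builder pattern works with a single widget. The following is a minimal sketch that reuses only the calls shown above (the daily interval, the date filter, and the Pandas table widget); the one-month date range and variable names are illustrative.

# Fetch a single Pandas table of daily clicks for January 2018.
# fetch() returns one result per widget, so a one-element unpack is used here.
df, = my_dataset.query \
    .dimension(day(my_dataset.fields.date)) \
    .filter(my_dataset.fields.date.between(date(2018, 1, 1), date(2018, 1, 31))) \
    .widget(Pandas(my_dataset.fields.clicks)) \
    .fetch()

print(df)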

License

Copyright 2020 KAYAK Germany, GmbH

Licensed under the Apache License, Version 2.0 (the “License”); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an “AS IS” BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Crafted with ♥ in Berlin.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

fireant-8.1.0.tar.gz (341.0 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

fireant-8.1.0-py3-none-any.whl (191.9 kB)

Uploaded Python 3

File details

Details for the file fireant-8.1.0.tar.gz.

File metadata

  • Download URL: fireant-8.1.0.tar.gz
  • Upload date:
  • Size: 341.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for fireant-8.1.0.tar.gz

  • SHA256: f5358ad5a77769f9b406108a700162cc33f4e20ef19a86d6525c15ad3a85c0cb
  • MD5: 5e62a12e098c39addf6edbc261e45cd0
  • BLAKE2b-256: 6f16b371e6805257db22c4bce5f1cb316673c7318eaa1b9c458ac035aaac42dd
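
To verify a downloaded archive against the SHA256 digest above, a short standard-library snippet is enough; the local file path is an assumption and should point at wherever the archive was saved.

import hashlib

# Assumed local path to the downloaded source distribution
path = "fireant-8.1.0.tar.gz"

# SHA256 digest published above
expected = "f5358ad5a77769f9b406108a700162cc33f4e20ef19a86d6525c15ad3a85c0cb"

with open(path, "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

print("OK" if actual == expected else "Hash mismatch!")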


Provenance

The following attestation bundles were made for fireant-8.1.0.tar.gz:

Publisher: release.yml on kayak/fireant

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file fireant-8.1.0-py3-none-any.whl.

File metadata

  • Download URL: fireant-8.1.0-py3-none-any.whl
  • Upload date:
  • Size: 191.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for fireant-8.1.0-py3-none-any.whl

  • SHA256: 92a3e47039a6f8ca5337bd43e69e61072c490f2fe3cc31852c2a84f1b0a10e14
  • MD5: 05a63d36478948b799c4fbac309f3a7a
  • BLAKE2b-256: 0ce554378bda5cfd6eede45882d923b42e841f08b6cf1982781c55b009cd5e72


Provenance

The following attestation bundles were made for fireant-8.1.0-py3-none-any.whl:

Publisher: release.yml on kayak/fireant

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
