
A tool for extracting data from FERC XBRL Filings.

Project description


The Federal Energy Regulatory Commission (FERC) has moved to collecting and distributing data using XBRL. XBRL is primarily designed for financial reporting and has been adopted by regulators in the US and other countries. Much of the tooling in the XBRL ecosystem is aimed at filers and at rendering individual filings in a human-readable way, but very little is aimed at accessing and analyzing large collections of filings. This tool is designed to provide that functionality for FERC XBRL data. Specifically, it extracts data from a set of XBRL filings and writes that data to a SQLite database whose structure is generated from an XBRL taxonomy.

While each XBRL instance contains a reference to a taxonomy, this tool requires a path to a single taxonomy that will be used to interpret all instances being processed. Even if the instances were created from different versions of a taxonomy, the provided taxonomy is used when processing all of them, so the output database has a consistent structure. For more information on the technical details of the XBRL extraction, see the docs.

We are currently using this tool to extract and publish the following FERC data:

FERC Form   Archived XBRL                            SQLite DB
Form 1      https://doi.org/10.5281/zenodo.4127043   https://data.catalyst.coop/ferc1_xbrl
Form 2      https://doi.org/10.5281/zenodo.5879542   https://data.catalyst.coop/ferc2_xbrl
Form 6      https://doi.org/10.5281/zenodo.7126395   https://data.catalyst.coop/ferc6_xbrl
Form 60     https://doi.org/10.5281/zenodo.7126434   https://data.catalyst.coop/ferc60_xbrl
Form 714    https://doi.org/10.5281/zenodo.4127100   https://data.catalyst.coop/ferc714_xbrl

Usage

Installation

To install using conda, create the environment with the following command:

$ conda env create -f environment.yml

Then activate the environment:

$ conda activate ferc-xbrl-extractor
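
The package is also published on PyPI (this page), so it should be installable with pip into an existing Python environment; the project name below is inferred from the distribution files listed further down this page.

$ pip install catalystcoop.ferc-xbrl-extractor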

CLI

This tool can be used as a library, as it is in PUDL, but it also provides a CLI for interacting with XBRL data. The CLI requires only two arguments: a path to the filings to be extracted and a path to the output SQLite database. The filings path can point to a directory full of XBRL filings, a single XBRL filing, or a zipfile of XBRL filings. If the database path points to an existing database, the --clobber option can be used to drop all existing data before performing the extraction.

$ xbrl_extract {path_to_filings} {path_to_database}
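
For example, to re-run an extraction into a database that already exists, dropping the data it currently contains (a sketch reusing the placeholder paths above):

$ xbrl_extract {path_to_filings} {path_to_database} --clobber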

The examples directory of this repo contains a small selection of FERC Form 1 filings from 2021, along with an archive of taxonomies. To test the tool on these filings, use the command:

$ xbrl_extract examples/ferc1-2021-sample.zip ./ferc1-2021-sample.sqlite \
    --taxonomy examples/ferc1-xbrl-taxonomies.zip

The tool expects the --taxonomy option to point to a zipfile containing archived taxonomies produced by the pudl-archiver. The extractor will parse all taxonomies in the archive, then use the taxonomy referenced by each filing when parsing that filing.

Parsing XBRL filings can be a time-consuming and CPU-intensive task, so this tool uses a process pool to do some basic multiprocessing and speed things up. There are two options for configuring the process pool: --batch-size and --workers. The batch size sets how many filings each child process handles at a time, and --workers sets how many child processes the pool creates. It may take some experimentation to configure these options well. The following command uses 5 worker processes to process batches of 50 filings at a time.

$ xbrl_extract examples/ferc1-2021-sample.zip ./ferc1-2021-sample.sqlite \
    --taxonomy examples/ferc1-xbrl-taxonomies.zip \
    --workers 5 \
    --batch-size 50
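
A reasonable starting point for --workers is the number of CPUs on your machine. The following is just a generic Python one-liner for checking that count, not part of this package:

$ python -c "import os; print(os.cpu_count())"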

There are also several options for extracting metadata from the taxonomy. First is the --datapackage-path option, which saves a frictionless datapackage descriptor as JSON, annotating the generated SQLite database. There is also the --metadata-path option, which writes more extensive taxonomy metadata to a JSON file, grouped by table name. See the ferc_xbrl_extractor.arelle_interface module for more info on the extracted metadata. To create both of these files using the example filings and taxonomy, run the following command.

$ xbrl_extract examples/ferc1-2021-sample.zip ./ferc1-2021-sample.sqlite \
    --taxonomy examples/ferc1-xbrl-taxonomies.zip \
    --metadata-path metadata.json \
    --datapackage-path datapackage.json
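
Once that command finishes, the outputs are ordinary files and can be inspected with standard tools. For example, assuming the sqlite3 command-line shell and Python are available (neither is provided by this package), the following lists the generated tables and pretty-prints the beginning of the datapackage descriptor:

$ sqlite3 ./ferc1-2021-sample.sqlite ".tables"
$ python -m json.tool datapackage.json | head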


File details

Details for the source distribution catalystcoop_ferc_xbrl_extractor-1.5.0.tar.gz.

File hashes

Algorithm      Hash digest
SHA256         9735aa1bd325dd9d0dc809f8b18602a377c8076765c7d99a2360d906d0711f9c
MD5            52d08d337df1e7a4d4ab8099c3889cfa
BLAKE2b-256    2115a5bbe136e971c5dbd43f62afa751bdadf73db9a9cc84bba90843f9e29c4e


File details

Details for the built distribution (wheel) catalystcoop.ferc_xbrl_extractor-1.5.0-py3-none-any.whl.

File hashes

Algorithm      Hash digest
SHA256         6e1a27316b49536b8d93fa50d1dcb22719a30350bc6b7fd4bfb213010241be45
MD5            83c08f8fe44cbd1da68d749e1bb8dd62
BLAKE2b-256    2a51ebe6599d20a1e54541f478eb45a8346e131225d985ca01384f8847ff907f

