
Downloads various SEC EDGAR files, converting many of them to CSV files

Project description

EDGARquery

## Installation

pip install edgarquery

## License

edgarquery is distributed under the terms of the MIT license.

## Usage

edgarquery requires one environment variable:

EQEMAIL - required by the SEC to download some of the files with curl.
It is used as the User-Agent in the URL requests made by the scripts.
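
For example, in a POSIX shell (the address below is a placeholder):

    export EQEMAIL=myname@example.com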

These commands retrieve various data from SEC EDGAR. They use a
CIK, or Central Index Key, to identify entities such as companies or
insiders (company officers or large stockholders).
Use edgartickerstocsv and edgarcikperson to find CIKs by name
or ticker, then use that CIK to gather the data of interest.
To display facts for a company aggregated by the SEC, invoke
edgarcompanyfactshow.
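
For example, using Apple's CIK (0000320193) and an illustrative output directory:

    edgartickerstocsv --directory /tmp/edgar
    # look up the CIK for the ticker of interest in the resulting CSV, then
    edgarcompanyfactshow --cik 0000320193 --show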

## edgartickerstocsv


usage: edgartickerstocsv [-h] [--directory DIRECTORY]

collect EDGAR companyticker json files and convert them to csv

options:
-h, --help show this help message and exit
--directory DIRECTORY
where to deposit the files
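
For example (the output directory is illustrative):

    edgartickerstocsv --directory /tmp/edgar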

## edgarcikperson

usage: edgarcikperson [-h] [--cikpersondb CIKPERSONDB] [--file FILE]

extract CIK and person names from form345 zip files

options:
-h, --help show this help message and exit
--cikpersondb CIKPERSONDB
full path to the sqlite3 database - default in memory
--file FILE where to store the output - default stdout
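
For example, to keep a persistent lookup database and write the output to a CSV file (the paths are illustrative):

    edgarcikperson --cikpersondb /tmp/edgar/cikperson.db --file /tmp/edgar/cikperson.csv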

## edgarcompanyconcepttocsv

usage: edgarcompanyconcepttocsv [-h] --file FILE [--directory DIRECTORY]

Parse an SEC EDGAR companyconcepts json file after it has been altered to deal
with its multipart character and generate a csv file from its contents

options:
-h, --help show this help message and exit
--file FILE json file to process
--directory DIRECTORY
where to deposit the files
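
For example, assuming a companyconcept json file was previously retrieved with edgarquery --companyconcept (the file name is illustrative):

    edgarcompanyconcepttocsv --file CompanyConcept.json --directory /tmp/edgar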

## edgarcompanyfactshow

usage: edgarcompanyfactsshow [-h] --cik CIK [--directory DIRECTORY] [--show]

parse EDGAR company facts for a cik and display them in a browser

options:
-h, --help show this help message and exit
--cik CIK Central Index Key for the company
--directory DIRECTORY
where to store the html file to display
--show display the html in your browser
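
For example, using Apple's CIK (0000320193) and an illustrative directory:

    edgarcompanyfactshow --cik 0000320193 --directory /tmp/edgar --show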

## edgarcompanyfactstocsv

usage: edgarcompanyfactstocsv [-h] --file FILE [--directory DIRECTORY]

Parse an SEC EDGAR companyfacts json file after it has been altered to deal
with its multipart character and generate CSV files from its content

options:
-h, --help show this help message and exit
--file FILE json file to process
--directory DIRECTORY
where to deposit the csv files
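
For example, assuming a companyfacts json file was previously retrieved with edgarquery --companyfacts (the file name is illustrative):

    edgarcompanyfactstocsv --file CompanyFacts.json --directory /tmp/edgar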

## edgarcompanyfactsziptocsv

usage: edgarcompanyfactsziptocsv [-h] --zipfile ZIPFILE [--directory DIRECTORY]
[--files FILES]

Extract one or more json files from an SEC EDGAR companyfacts.zip file and
convert to CSV

options:
-h, --help show this help message and exit
--zipfile ZIPFILE companyfacts.zip file to process. It can be downloaded
with edgarquery --companyfactsarchivezip
--directory DIRECTORY
where to deposit the output
--files FILES comma-separated (no spaces) content file(s) to process;
a subset of the files in the zip file
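
For example, to convert a single company's entry from the archive (the archive path and member file name are illustrative):

    edgarcompanyfactsziptocsv --zipfile companyfacts.zip --directory /tmp/edgar --files CIK0000320193.json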

## edgarquery

usage: edgarquery [-h] [--cik CIK] [--cy CY] [--frame FRAME] [--units UNITS]
[--fact FACT] [--directory DIRECTORY] [--file FILE]
[--companyconcept] [--companyfacts] [--xbrlframes]
[--companyfactsarchivezip] [--submissionszip]
[--financialstatementandnotesdataset]

query the SEC EDGAR site. NOTE that the EQEMAIL env variable is required and
must contain a valid User-Agent such as your email address

options:
-h, --help show this help message and exit
--cik CIK 10-digit Central Index Key
--cy CY calendar year e.g. CY2023, CY2023Q1, CY2023Q4I
--frame FRAME reporting frame e.g. us-gaap, ifrs-full, dei, srt
--units UNITS USD or shares
--fact FACT fact to collect e.g. AccountsPayableCurrent, USD-per-
shares
--directory DIRECTORY
directory to store the output
--file FILE file in which to store the output; allowed for each
query type. If --directory is not provided, it should
be the full path
--companyconcept returns all the XBRL disclosures from a single company
--cik required --frame - default us-gaap --fact -
default USD-per-shares
--companyfacts returns all the company concepts data for a CIK --cik
required
--xbrlframes aggregates one fact for each reporting entity that is
last filed that most closely fits the calendrical
period requested --cy required
--companyfactsarchivezip
returns daily companyfacts index in a zip file
--submissionszip returns daily index of submissions in a zip file
--financialstatementandnotesdataset
returns zip file with financial statement and notes
summaries --cy required
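
For example, to download all facts for one company and an XBRL frame for one fact and period (the CIK, period, frame, and fact are illustrative):

    edgarquery --companyfacts --cik 0000320193 --directory /tmp/edgar
    edgarquery --xbrlframes --cy CY2023Q1 --frame us-gaap --fact AccountsPayableCurrent --units USD --directory /tmp/edgar
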
## edgarlatest10K

usage: edgarlatest10K [-h] --cik CIK [--link] [--directory DIRECTORY]

find the most recent 10-K for cik

options:
-h, --help show this help message and exit
--cik CIK 10-digit Central Index Key
--link return the url for the latest 10-K
--directory DIRECTORY
directory to store the output
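
For example (the CIK is illustrative):

    edgarlatest10K --cik 0000320193 --link
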
## edgarlatestsubmissions

usage: edgarlatestsubmissions [-h] --cik CIK [--directory DIRECTORY]
[--file FILE]

find the most recent submissions for cik

options:
-h, --help show this help message and exit
--cik CIK 10-digit Central Index Key
--directory DIRECTORY
directory to store the output
--file FILE json file to process
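
For example (the CIK and directory are illustrative):

    edgarlatestsubmissions --cik 0000320193 --directory /tmp/edgar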

## edgarsubmissions

usage: edgarsubmissions [-h] --cik CIK [--year YEAR] [--file FILE]
[--directory DIRECTORY]

find submissions for a cik for the current or a specified year

options:
-h, --help show this help message and exit
--cik CIK 10-digit Central Index Key
--year YEAR year to search for submissions if not current year
--file FILE store the output in this file
--directory DIRECTORY
store the output in this directory
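
For example, to list a company's 2022 filings (the values are illustrative):

    edgarsubmissions --cik 0000320193 --year 2022 --directory /tmp/edgar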

## edgarsubmissionsziptocsv

usage: edgarsubmissionsziptocsv [-h] [--zipfile ZIPFILE] [--directory DIRECTORY]
[--files FILES]

Extract one or more json files from an SEC EDGAR submissions.zip file and
convert to CSV

options:
-h, --help show this help message and exit
--zipfile ZIPFILE submissions.zip file to process - required
--directory DIRECTORY
where to deposit the output
--files FILES comma-separated (no spaces) content file(s) to process;
a subset of the files in the zip file
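
For example, assuming submissions.zip was previously downloaded with edgarquery --submissionszip (the paths are illustrative):

    edgarsubmissionsziptocsv --zipfile submissions.zip --directory /tmp/edgar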

## edgarxbrlframestocsv

usage: edgarxbrlframestocsv [-h] --file FILE [--directory DIRECTORY]

Parse an SEC EDGAR xbrlframes json file after it has been altered to deal with
its multipart character and generate a csv file from its contents

options:
-h, --help show this help message and exit
--file FILE xbrl frames json file to process
--directory DIRECTORY
where to deposit the output
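
For example, assuming an xbrlframes json file was previously retrieved with edgarquery --xbrlframes (the file name is illustrative):

    edgarxbrlframestocsv --file XbrlFrames.json --directory /tmp/edgar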
