
Sayou Stock: a package for financial data from SEC EDGAR, FnGuide, Naver, OpenDART, and Yahoo Finance


Sayou Stock


📦 Installation

sayou-stock is installed automatically with any Sayou library; it can also be installed on its own:

pip install sayou-stock

🔑 Key Components

  1. EDGARCrawler: Retrieves 10-K, 10-Q, 8-K, 13F, and DEF 14A filings via the SEC EDGAR API.
  2. FnGuideCrawler: Crawls company information and financial statements from FnGuide.
  3. NaverCrawler: Retrieves market news via the Naver API and crawls market data from Naver.
  4. OpenDartCrawler: Retrieves company information and financial statements via the OpenDART API.
  5. YahooCrawler: Retrieves company information and market data via the Yahoo Finance API.

🤝 Usage Example

Retrieve SEC EDGAR Filings

from sayou.stock.edgar import EDGARCrawler

crawler = EDGARCrawler(user_agent="Sayouzone sjkim@sayouzone.com")
ticker = "AAPL"

# Retrieve CIK by Ticker
cik = crawler.fetch_cik_by_ticker(ticker)

# EDGAR 10-K Annual Report
filings = crawler.fetch_filings(cik, doc_type="10-K", count=1)
data = crawler.extract_10k(cik, filings[0].document_url, filings[0].accession_number)

# EDGAR 10-Q Quarterly Report
filings = crawler.fetch_filings(cik, doc_type="10-Q", count=1)
data = crawler.extract_10q(cik, filings[0].document_url, filings[0].accession_number)

# EDGAR 8-K Current Report
filings = crawler.fetch_filings(cik, doc_type="8-K", count=1)
data = crawler.extract_8k(cik, filings[0].document_url, filings[0].accession_number)

# EDGAR 13F Institutional Holdings
filings = crawler.fetch_filings(cik, doc_type="13F", count=1)
data = crawler.extract_13f(cik, filings[0].document_url, filings[0].accession_number)

# EDGAR DEF 14A Proxy Statement 
filings = crawler.fetch_filings(cik, doc_type="DEF 14A", count=1)
data = crawler.extract_def14a(cik, filings[0].document_url, filings[0].accession_number)
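The five filing types above all follow the same fetch-then-extract pattern. A small helper can dispatch on the document type; `fetch_latest` and the `EXTRACTORS` table below are hypothetical conveniences, not part of the library, and reuse only the `fetch_filings` and `extract_*` methods shown above:

```python
# Maps each doc_type to the extractor method name used in the examples above.
EXTRACTORS = {
    "10-K": "extract_10k",
    "10-Q": "extract_10q",
    "8-K": "extract_8k",
    "13F": "extract_13f",
    "DEF 14A": "extract_def14a",
}

def fetch_latest(crawler, cik, doc_type):
    """Fetch the most recent filing of doc_type and extract it; None if absent."""
    filings = crawler.fetch_filings(cik, doc_type=doc_type, count=1)
    if not filings:
        return None
    filing = filings[0]
    extract = getattr(crawler, EXTRACTORS[doc_type])
    return extract(cik, filing.document_url, filing.accession_number)
```

With this, `data = fetch_latest(crawler, cik, "10-Q")` replaces each two-line block above.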

Retrieve FnGuide's Company Information

from sayou.stock.fnguide import FnGuideCrawler

stock = "005930"
crawler = FnGuideCrawler()

# Company Finance
data = crawler.finance(stock)
print(data)

# Company Information
data = crawler.company(stock)
print(data)

# Company Finance Ratio
data = crawler.finance_ratio(stock)
print(data)

# Company Investment
data = crawler.invest(stock)
print(data)

# Company Consensus
data = crawler.consensus(stock)
print(data)

Retrieve Naver's Company News

from sayou.stock.naver import NaverCrawler

client_id = "YOUR_CLIENT_ID"
client_secret = "YOUR_CLIENT_SECRET"
crawler = NaverCrawler(client_id, client_secret)

# Naver Category News
articles = crawler.category_news()
print(articles)

# Company news by query ("삼성전자" = Samsung Electronics)
articles = crawler.news(query="삼성전자", max_articles=10)
print(articles)

Retrieve OpenDART Company Information

from sayou.stock.opendart import OpenDartCrawler
import pandas as pd

DART_API_KEY = "YOUR_DART_API_KEY"

stock = "005930"
crawler = OpenDartCrawler(api_key=DART_API_KEY)

# Look up corp_code by company name or stock code
corp_code = crawler.fetch_corp_code(stock)
print(corp_code)

# Single company's main accounts
api_type = "단일회사 주요계정"
last_year = 2024
data = crawler.finance(corp_code, last_year, api_type=api_type)
rows = data.get("list", [])
if data.get("status", "") == "000" and rows:
    print(f"\n{api_type} {last_year} ({stock}, {corp_code})")
    print(pd.DataFrame(rows))

# Multiple companies' main accounts
api_type = "다중회사 주요계정"
data = crawler.finance(corp_code, last_year, api_type=api_type)
rows = data.get("list", [])
if data.get("status", "") == "000" and rows:
    print(f"\n{api_type} {last_year} ({stock}, {corp_code})")
    print(pd.DataFrame(rows))

# Single company's full financial statements
api_type = "단일회사 전체 재무제표"
data = crawler.finance(corp_code, last_year, api_type=api_type)
rows = data.get("list", [])
if data.get("status", "") == "000" and rows:
    print(f"\n{api_type} {last_year} ({stock}, {corp_code})")
    print(pd.DataFrame(rows))
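The three blocks above repeat the same status check and DataFrame conversion. A small helper can factor that out; `frame_from_response` is hypothetical and assumes only the response shape used above, a dict with "status" and "list" keys:

```python
import pandas as pd

def frame_from_response(data):
    """Turn an OpenDART finance response into a DataFrame, or None on error/empty.

    Assumes the shape shown above: {"status": "000", "list": [...]} on success.
    """
    rows = data.get("list", [])
    if data.get("status", "") != "000" or not rows:
        return None
    return pd.DataFrame(rows)
```

Each block then reduces to `df = frame_from_response(crawler.finance(corp_code, last_year, api_type=api_type))` followed by a None check.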

Retrieve Yahoo's Company Information

from sayou.stock.yahoo import YahooCrawler

ticker = "AAPL"
crawler = YahooCrawler()

# Company Calendar
data = crawler.calendar(ticker)
print(data)

# Earnings Estimate
data = crawler.earnings_estimate(ticker)
print(data)

# Revenue Estimate
data = crawler.revenue_estimate(ticker)
print(data)

# Earnings History
data = crawler.earnings_history(ticker)
print(data)
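The four lookups above can be gathered into one dictionary keyed by report name. `snapshot` below is a hypothetical convenience wrapper, not a YahooCrawler method; it calls only the methods shown in this example:

```python
def snapshot(crawler, ticker):
    """Collect the YahooCrawler lookups shown above into one dict."""
    lookups = {
        "calendar": crawler.calendar,
        "earnings_estimate": crawler.earnings_estimate,
        "revenue_estimate": crawler.revenue_estimate,
        "earnings_history": crawler.earnings_history,
    }
    return {name: fn(ticker) for name, fn in lookups.items()}
```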

📜 License

Apache 2.0 License © 2025 Sayouzone
