
Sayou Stock Package for Financial Data from SEC, FnGuide, Naver, OpenDART, Yahoo


Sayou Stock


📦 Installation

sayou-stock is installed automatically with any Sayou library, but it can also be installed on its own:

pip install sayou-stock

🔑 Key Components

  1. EDGARCrawler: Retrieves 10-K, 10-Q, 8-K, 13F, and DEF 14A filings using the SEC EDGAR API.
  2. FnGuideCrawler: Crawls Company Information & Financial Statements from FnGuide.
  3. NaverCrawler: Retrieves Market News using the Naver API and Crawls Market Data from Naver Finance.
  4. OpenDartCrawler: Retrieves Company Information & Financial Statements using the OpenDART API.
  5. YahooCrawler: Retrieves Company Information & Market Data using the Yahoo Finance API.

🤝 Usage Examples

Retrieve 10-K document from SEC EDGAR

from sayou.stock.edgar import EDGARCrawler

# The SEC requires a descriptive User-Agent with contact details
user_agent = "YOUR_NAME YOUR_EMAIL"
crawler = EDGARCrawler(user_agent=user_agent)
ticker = "AAPL"

# Retrieve CIK by Ticker
cik = crawler.fetch_cik_by_ticker(ticker)

# EDGAR 10-K Annual Report
filings = crawler.fetch_filings(cik, doc_type="10-K", count=1)
data = crawler.extract_10k(cik, filings[0].document_url, filings[0].accession_number)

# EDGAR 10-Q Quarterly Report
filings = crawler.fetch_filings(cik, doc_type="10-Q", count=1)
data = crawler.extract_10q(cik, filings[0].document_url, filings[0].accession_number)

# EDGAR 8-K Current Report
filings = crawler.fetch_filings(cik, doc_type="8-K", count=1)
data = crawler.extract_8k(cik, filings[0].document_url, filings[0].accession_number)

# EDGAR 13F Institutional Holdings
filings = crawler.fetch_filings(cik, doc_type="13F", count=1)
data = crawler.extract_13f(cik, filings[0].document_url, filings[0].accession_number)

# EDGAR DEF 14A Proxy Statement 
filings = crawler.fetch_filings(cik, doc_type="DEF 14A", count=1)
data = crawler.extract_def14a(cik, filings[0].document_url, filings[0].accession_number)
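EDGAR's JSON endpoints identify a company by its CIK zero-padded to 10 digits. If you ever need to build submission URLs yourself, a minimal helper might look like this (an illustrative sketch, not part of sayou-stock):

```python
def format_cik(cik):
    """Zero-pad a CIK to the 10-digit form EDGAR's JSON endpoints expect."""
    digits = str(cik).lstrip("0") or "0"
    if not digits.isdigit():
        raise ValueError(f"CIK must be numeric, got {cik!r}")
    return digits.zfill(10)

def submissions_url(cik):
    """Build the EDGAR submissions-index URL for a company."""
    return f"https://data.sec.gov/submissions/CIK{format_cik(cik)}.json"

# Apple's CIK is 320193
print(format_cik(320193))        # 0000320193
print(submissions_url(320193))
```

The same padded form works for the `companyfacts` and `companyconcept` endpoints on data.sec.gov.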

Retrieve Company Information from FnGuide

from sayou.stock.fnguide import FnGuideCrawler

stock = "005930"  # Samsung Electronics
crawler = FnGuideCrawler()

# Company Finance
data = crawler.finance(stock)
print(data)

# Company Information
data = crawler.company(stock)
print(data)

# Company Finance Ratio
data = crawler.finance_ratio(stock)
print(data)

# Company Investment
data = crawler.invest(stock)
print(data)

# Company Consensus
data = crawler.consensus(stock)
print(data)
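FnGuide is keyed by the six-digit KRX stock code (e.g. "005930" for Samsung Electronics). A quick format check before issuing a request can catch typos early (a hypothetical helper, not part of sayou-stock):

```python
def is_krx_code(code):
    """Return True if the string looks like a six-digit KRX stock code."""
    return isinstance(code, str) and len(code) == 6 and code.isdigit()

print(is_krx_code("005930"))  # True
print(is_krx_code("5930"))    # False (leading zeros are significant)
```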

Retrieve Company News from Naver News

from sayou.stock.naver import NaverCrawler

client_id = "YOUR_CLIENT_ID"
client_secret = "YOUR_CLIENT_SECRET"
crawler = NaverCrawler(client_id, client_secret)

# Naver Category News
articles = crawler.category_news()
print(articles)

# News for a specific company
articles = crawler.news(query="삼성전자", max_articles=10)  # "삼성전자" = Samsung Electronics
print(articles)
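News search results often repeat the same story across feeds and queries. Assuming each article is a dict with a "link" field (the shape may differ in your version), a small pass can keep only the first occurrence of each link:

```python
def dedupe_articles(articles):
    """Drop duplicate articles, keeping the first occurrence of each link."""
    seen = set()
    unique = []
    for article in articles:
        link = article.get("link")
        if link and link not in seen:
            seen.add(link)
            unique.append(article)
    return unique

sample = [
    {"title": "A", "link": "https://n.news.naver.com/1"},
    {"title": "A (repost)", "link": "https://n.news.naver.com/1"},
    {"title": "B", "link": "https://n.news.naver.com/2"},
]
print(len(dedupe_articles(sample)))  # 2
```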

Retrieve Company Information from OpenDart

from sayou.stock.opendart import OpenDartCrawler
import pandas as pd

DART_API_KEY = "YOUR_DART_API_KEY"

stock = "005930"  # Samsung Electronics
crawler = OpenDartCrawler(api_key=DART_API_KEY)

# Search corp_code from Company Name or Stock Code
corp_code = crawler.fetch_corp_code(stock)
print(corp_code)

# Single Company's Main Accounts
api_type = "단일회사 주요계정"  # "single company, main accounts"
last_year = 2024
data = crawler.finance(corp_code, last_year, api_type=api_type)
rows = data.get("list", [])
if data.get("status", "") == "000" and rows:  # "000" means success
    print(f"\n{api_type} {last_year} ({corp_code})")
    print(pd.DataFrame(rows))

# Multiple Companies' Main Accounts
api_type = "다중회사 주요계정"  # "multiple companies, main accounts"
data = crawler.finance(corp_code, last_year, api_type=api_type)
rows = data.get("list", [])
if data.get("status", "") == "000" and rows:
    print(f"\n{api_type} {last_year} ({corp_code})")
    print(pd.DataFrame(rows))

# Single Company's Full Financial Statements (consolidated)
api_type = "단일회사 전체 재무제표"  # "single company, full financial statements"
data = crawler.finance(corp_code, last_year, api_type=api_type)
rows = data.get("list", [])
if data.get("status", "") == "000" and rows:
    print(f"\n{api_type} {last_year} ({corp_code})")
    print(pd.DataFrame(rows))
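OpenDART returns each account as a flat row; the field names used below (`account_nm` for the account name, `thstrm_amount` for the current-term amount, reported as a comma-separated string) follow OpenDART's documented response format, but are worth verifying against the rows your version returns. A small reshape into an account → amount mapping (an illustrative helper, not part of sayou-stock):

```python
def accounts_to_amounts(rows):
    """Map each account name to its current-term amount as an int."""
    amounts = {}
    for row in rows:
        raw = row.get("thstrm_amount", "").replace(",", "")
        if raw.lstrip("-").isdigit():  # skip blank or non-numeric amounts
            amounts[row["account_nm"]] = int(raw)
    return amounts

sample = [
    {"account_nm": "자산총계", "thstrm_amount": "1,000,000"},  # total assets
    {"account_nm": "부채총계", "thstrm_amount": "400,000"},    # total liabilities
]
print(accounts_to_amounts(sample))
```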

Retrieve Company Information from Yahoo Finance

from sayou.stock.yahoo import YahooCrawler

ticker = "AAPL"
crawler = YahooCrawler()

# Company Calendar
data = crawler.calendar(ticker)
print(data)

# Earnings Estimate
data = crawler.earnings_estimate(ticker)
print(data)

# Revenue Estimate
data = crawler.revenue_estimate(ticker)
print(data)

# Earnings History
data = crawler.earnings_history(ticker)
print(data)
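Earnings history pairs an EPS estimate with the actual figure, and the usual derived number is the surprise as a percentage of the estimate. A sketch of that arithmetic (field names in the crawler's output may vary, so only the formula is shown):

```python
def surprise_pct(eps_estimate, eps_actual):
    """Earnings surprise as a percentage of the (absolute) estimate."""
    if eps_estimate == 0:
        raise ValueError("an estimate of zero makes the surprise undefined")
    return (eps_actual - eps_estimate) / abs(eps_estimate) * 100

# Beating a $1.50 estimate with $1.65 actual is a +10% surprise
print(round(surprise_pct(1.50, 1.65), 1))  # 10.0
```

Dividing by the absolute value keeps the sign meaningful when the estimate itself is negative.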

📜 License

Apache 2.0 License © 2025 Sayouzone
