
Sayou Stock: a package for financial data from SEC, FnGuide, Naver, OpenDART, and Yahoo


Sayou Stock


📦 Installation

sayou-stock is installed automatically when you install any Sayou library; it can also be installed directly:

pip install sayou-stock

🔑 Key Components

  1. EDGARCrawler: Retrieves 10-K, 10-Q, 8-K, 13F, and DEF 14A filings via the SEC EDGAR API.
  2. FnGuideCrawler: Crawls company information and financial statements from FnGuide.
  3. NaverCrawler: Retrieves market news via the Naver API and crawls market data from Naver.
  4. OpenDartCrawler: Retrieves company information and financial statements via the OpenDART API.
  5. YahooCrawler: Retrieves company information and market data via the Yahoo Finance API.

🤝 Usage Examples

Retrieve SEC EDGAR Filings (10-K, 10-Q, 8-K, 13F, DEF 14A)

from sayou.stock.edgar import EDGARCrawler

user_agent = "YOUR_NAME YOUR_EMAIL"  # SEC requires a descriptive User-Agent with contact info
crawler = EDGARCrawler(user_agent=user_agent)
ticker = "AAPL"

# Retrieve CIK by Ticker
cik = crawler.fetch_cik_by_ticker(ticker)

# EDGAR 10-K Annual Report
filings = crawler.fetch_filings(cik, doc_type="10-K", count=1)
data = crawler.extract_10k(cik, filings[0].document_url, filings[0].accession_number)

# EDGAR 10-Q Quarterly Report
filings = crawler.fetch_filings(cik, doc_type="10-Q", count=1)
data = crawler.extract_10q(cik, filings[0].document_url, filings[0].accession_number)

# EDGAR 8-K Current Report
filings = crawler.fetch_filings(cik, doc_type="8-K", count=1)
data = crawler.extract_8k(cik, filings[0].document_url, filings[0].accession_number)

# EDGAR 13F Institutional Holdings
filings = crawler.fetch_filings(cik, doc_type="13F", count=1)
data = crawler.extract_13f(cik, filings[0].document_url, filings[0].accession_number)

# EDGAR DEF 14A Proxy Statement 
filings = crawler.fetch_filings(cik, doc_type="DEF 14A", count=1)
data = crawler.extract_def14a(cik, filings[0].document_url, filings[0].accession_number)
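The extract_* calls above return parsed filing content. A minimal sketch for persisting results to disk, assuming the extracted data is JSON-serializable (the exact return shape depends on the library; `save_filing` and the placeholder data are illustrative, not part of sayou-stock):

```python
import json
from pathlib import Path

def save_filing(out_dir: str, ticker: str, doc_type: str, accession: str, data) -> Path:
    """Write one extracted filing to <out_dir>/<ticker>/<doc_type>_<accession>.json."""
    path = Path(out_dir) / ticker
    path.mkdir(parents=True, exist_ok=True)
    # Accession numbers contain only digits and dashes, so they are safe in file names
    out = path / f"{doc_type}_{accession}.json"
    out.write_text(json.dumps(data, ensure_ascii=False, indent=2))
    return out

# Example with placeholder data; a real run would pass an extract_* result
saved = save_filing("/tmp/filings", "AAPL", "10-K", "0000320193-24-000123",
                    {"item_1a": "Risk factors ..."})
print(saved.name)  # 10-K_0000320193-24-000123.json
```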

Retrieve Company Information from FnGuide

from sayou.stock.fnguide import FnGuideCrawler

stock = "005930"  # Samsung Electronics (KRX stock code)
crawler = FnGuideCrawler()

# Company Finance
data = crawler.finance(stock)
print(data)

# Company Information
data = crawler.company(stock)
print(data)

# Company Finance Ratio
data = crawler.finance_ratio(stock)
print(data)

# Company Investment
data = crawler.invest(stock)
print(data)

# Company Consensus
data = crawler.consensus(stock)
print(data)
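FnGuide (and OpenDART below) identify companies by six-digit KRX stock codes, which often lose their leading zeros when they pass through spreadsheets or numeric columns. A small normalization helper (not part of sayou-stock) that restores the canonical form:

```python
def normalize_krx_code(code) -> str:
    """Zero-pad a KRX stock code to the canonical six digits."""
    s = str(code).strip()
    if not s.isdigit() or len(s) > 6:
        raise ValueError(f"not a KRX stock code: {code!r}")
    return s.zfill(6)

print(normalize_krx_code(5930))      # "005930" (Samsung Electronics)
print(normalize_krx_code("035420"))  # "035420" (NAVER)
```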

Retrieve Market and Company News from Naver

from sayou.stock.naver import NaverCrawler

client_id = "YOUR_CLIENT_ID"
client_secret = "YOUR_CLIENT_SECRET"
crawler = NaverCrawler(client_id, client_secret)

# Naver Category News
articles = crawler.category_news()
print(articles)

# Company News by Query
articles = crawler.news(query="삼성전자", max_articles=10)  # query: "Samsung Electronics"
print(articles)
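Repeated news queries tend to return overlapping articles. Assuming each article is a dict with at least a link field (the actual schema may differ), a simple de-duplication pass — `dedupe_articles` and the sample data are illustrative, not part of sayou-stock:

```python
def dedupe_articles(articles):
    """Keep the first occurrence of each article, keyed by its link."""
    seen, unique = set(), []
    for article in articles:
        key = article.get("link")
        if key and key not in seen:
            seen.add(key)
            unique.append(article)
    return unique

sample = [
    {"title": "A", "link": "https://n.news.naver.com/1"},
    {"title": "A (dup)", "link": "https://n.news.naver.com/1"},
    {"title": "B", "link": "https://n.news.naver.com/2"},
]
print(len(dedupe_articles(sample)))  # 2
```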

Retrieve Company Information from OpenDart

from sayou.stock.opendart import OpenDartCrawler

import pandas as pd

DART_API_KEY = "YOUR_DART_API_KEY"

stock = "005930"  # Samsung Electronics (KRX stock code)
crawler = OpenDartCrawler(api_key=DART_API_KEY)

# Search corp_code by Company Name or Stock Code
corp_code = crawler.fetch_corp_code(stock)
print(corp_code)

def print_finance(crawler, corp_code, year, api_type):
    data = crawler.finance(corp_code, year, api_type=api_type)
    rows = data.get("list", [])  # avoid shadowing the built-in `list`
    if data.get("status", "") == "000" and rows:  # "000" means success
        print(f"\n{api_type} FY{year} ({corp_code})")
        print(pd.DataFrame(rows))

last_year = 2024

# Single Company's Main Accounts
print_finance(crawler, corp_code, last_year, api_type="단일회사 주요계정")

# Multiple Companies' Main Accounts
print_finance(crawler, corp_code, last_year, api_type="다중회사 주요계정")

# Single Company's Full Financial Statements
print_finance(crawler, corp_code, last_year, api_type="단일회사 전체 재무제표")
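OpenDART reports amounts as comma-separated strings. A sketch that turns rows like those in data["list"] into {account: amount} pairs; the field names (account_nm, thstrm_amount) and sample rows are assumptions based on the OpenDART response format, so check them against the actual payload:

```python
def amounts_by_account(rows, amount_field="thstrm_amount"):
    """Map account name -> current-term amount as an int, skipping blank values."""
    result = {}
    for row in rows:
        raw = str(row.get(amount_field, "")).replace(",", "").strip()
        # Accept optionally negative integers; skip blanks and non-numeric text
        if raw.lstrip("-").isdigit():
            result[row["account_nm"]] = int(raw)
    return result

sample_rows = [
    {"account_nm": "자산총계", "thstrm_amount": "512,157,538,000,000"},  # total assets
    {"account_nm": "부채총계", "thstrm_amount": "112,339,878,000,000"},  # total liabilities
    {"account_nm": "비고", "thstrm_amount": ""},  # blank amounts are skipped
]
print(amounts_by_account(sample_rows))
```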

Retrieve Company Information from Yahoo Finance

from sayou.stock.yahoo import YahooCrawler

ticker = "AAPL"
crawler = YahooCrawler()

# Company Calendar
data = crawler.calendar(ticker)
print(data)

# Earnings Estimate
data = crawler.earnings_estimate(ticker)
print(data)

# Revenue Estimate
data = crawler.revenue_estimate(ticker)
print(data)

# Earnings History
data = crawler.earnings_history(ticker)
print(data)
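A common follow-up to the earnings history is computing the EPS surprise per period. The sketch below assumes rows with epsEstimate and epsActual fields (the actual field names and container type returned by YahooCrawler may differ):

```python
def eps_surprises(history):
    """Compute the absolute and percent EPS surprise for each period."""
    out = []
    for row in history:
        est, act = row["epsEstimate"], row["epsActual"]
        surprise = act - est
        # Guard against a zero estimate before dividing
        pct = surprise / abs(est) * 100 if est else None
        out.append({"period": row["period"],
                    "surprise": round(surprise, 4),
                    "surprisePct": None if pct is None else round(pct, 2)})
    return out

sample = [
    {"period": "2024Q4", "epsEstimate": 2.35, "epsActual": 2.40},  # beat
    {"period": "2025Q1", "epsEstimate": 1.60, "epsActual": 1.52},  # miss
]
print(eps_surprises(sample))
```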

📜 License

Apache 2.0 License © 2025 Sayouzone
