PKBrokers
A Python-based stock broker module
Table of Contents
- What is PKBrokers?
- Installation
- Quick Start
- Architecture Overview
- Core Modules
- API Reference
- Environment Variables
- Contributing
- Related Projects
What is PKBrokers?
PKBrokers is a high-performance Python library for connecting to stock brokers (primarily Zerodha's Kite Connect) to fetch real-time market data, instruments, and ticks. Key features include:
- High-Performance Candle Store - O(1) access to OHLCV candles across 10 timeframes
- Real-Time Tick Processing - WebSocket-based tick aggregation
- Multi-Source Data Management - SQLite, Turso, pickle files, and Kite API
- Telegram Bot Integration - Distribute tick data via Telegram
- Automated Authentication - TOTP-based Kite login
- 24/7 Data Availability - GitHub-based data persistence
This library is part of the PKScreener ecosystem.
Installation
From PyPI
pip install pkbrokers
From Source
git clone https://github.com/pkjmesra/pkbrokers.git
cd pkbrokers
pip install -r requirements.txt
pip install -e .
Requirements
- Python 3.9+
- Zerodha Kite Connect account (for real-time data)
- See requirements.txt for the full dependency list
Quick Start
High-Performance Data Provider
from pkbrokers.kite import get_candle_store, HighPerformanceDataProvider
# Get singleton candle store
store = get_candle_store()
# Or use high-level data provider
provider = HighPerformanceDataProvider()
# Get 5-minute candles for any stock
df = provider.get_stock_data("RELIANCE", interval="5m", count=50)
# Get current day's OHLCV
ohlcv = provider.get_current_ohlcv("TCS")
print(f"Open: {ohlcv['open']}, High: {ohlcv['high']}, Low: {ohlcv['low']}, Close: {ohlcv['close']}")
Data Manager (Multi-Source)
import pandas as pd

from pkbrokers.kite.datamanager import InstrumentDataManager
# Initialize manager
manager = InstrumentDataManager()
# Execute data synchronization
success = manager.execute()
if success:
    # Access stock data
    reliance = manager.pickle_data["RELIANCE"]
    df = pd.DataFrame(
        data=reliance['data'],
        columns=reliance['columns'],
        index=reliance['index']
    )
    print(f"Shape: {df.shape}")
Kite Authentication
from pkbrokers.kite.examples.externals import kite_auth
# Authenticate and get access token
# Requires KUSER, KPWD, KTOTP environment variables
kite_auth()
# Token is now available as KTOKEN
from PKDevTools.classes.Environment import PKEnvironment
token = PKEnvironment().KTOKEN
Architecture Overview
PKBrokers Architecture

Application Layer:         PKScreener | Custom Applications
            ↓
Data Provider API:         HighPerformanceDataProvider | InstrumentDataManager
            ↓
Storage:                   InMemoryCandleStore | Local SQLite Database | Remote Data (GitHub/Turso)
            ↓
Tick Processing Layer:     KiteTokenWatcher | CandleAggregator | TickProcessor
            ↓
WebSocket Layer:           ZerodhaWebSocketClient | KiteTicker
            ↓
Kite Connect API / Auth:   Authenticator | KiteInstruments

Bot Layer (Telegram):      PKTickBot | Orchestrator | Consumer
See more details on Architecture
Core Modules
1. In-Memory Candle Store
High-performance, in-memory OHLCV storage with O(1) access to all timeframes.
from datetime import datetime

from pkbrokers.kite.inMemoryCandleStore import InMemoryCandleStore, get_candle_store
# Get singleton instance
store = get_candle_store()
# Process incoming tick
store.process_tick({
    'instrument_token': 256265,
    'last_price': 21500.50,
    'volume': 1000000,
    'timestamp': datetime.now()
})
# Get completed candles
candles = store.get_candles(
    instrument_token=256265,
    interval='5m',
    count=50
)
# Get current forming candle
current = store.get_current_candle(
    instrument_token=256265,
    interval='5m'
)
# Export to ticks.json
store.save_ticks_json("/path/to/ticks.json")
# Get statistics
stats = store.get_stats()
print(f"Instruments: {stats['instrument_count']}")
print(f"Ticks processed: {stats['ticks_processed']}")
Supported Timeframes
| Interval | Description | Max Candles Stored |
|---|---|---|
| `1m` | 1 minute | 375 (full day) |
| `2m` | 2 minutes | 188 |
| `3m` | 3 minutes | 125 |
| `4m` | 4 minutes | 94 |
| `5m` | 5 minutes | 75 |
| `10m` | 10 minutes | 38 |
| `15m` | 15 minutes | 25 |
| `30m` | 30 minutes | 13 |
| `60m` | 60 minutes | 7 |
| `day` | Daily | 1 |
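The per-interval capacities above follow directly from the NSE cash session length (09:15 to 15:30 IST, i.e. 375 minutes), rounded up to cover the partial candle at the close. A quick sketch of that arithmetic:

```python
import math

SESSION_MINUTES = 375  # NSE cash session: 09:15-15:30 IST

def max_candles(interval_minutes: int) -> int:
    # Candles needed to cover one full trading day, rounding up
    # for the partial candle at session close (e.g. 375/2 -> 188).
    return math.ceil(SESSION_MINUTES / interval_minutes)

for m in (1, 2, 3, 4, 5, 10, 15, 30, 60):
    print(f"{m}m -> {max_candles(m)} candles")
```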
Features
- O(1) Access: Instant lookup via hash-based indexing
- No Rate Limits: Unlike Yahoo Finance
- Auto-Persistence: Saves to disk every 5 minutes
- Memory Efficient: ~100MB for 2000 instruments
- Thread-Safe: Lock-protected operations
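A store with these properties can be pictured as a hash map of bounded ring buffers: two dict lookups reach any instrument/interval pair in O(1), a `deque(maxlen=...)` caps memory per timeframe, and a lock guards concurrent tick writers. The sketch below is illustrative only; the class and attribute names are not the library's actual internals.

```python
from collections import defaultdict, deque
from threading import Lock

# Per-interval capacities (subset of the table above, for illustration)
MAX_CANDLES = {"1m": 375, "5m": 75, "day": 1}

class TinyCandleStore:
    """Hypothetical sketch of a hash-plus-ring-buffer candle store."""

    def __init__(self):
        self._lock = Lock()
        # instrument_token -> interval -> bounded deque of candle dicts
        self._candles = defaultdict(
            lambda: {iv: deque(maxlen=n) for iv, n in MAX_CANDLES.items()}
        )

    def append(self, token, interval, candle):
        with self._lock:  # thread-safe writes
            self._candles[token][interval].append(candle)

    def last(self, token, interval, count=1):
        with self._lock:  # O(1) reach via two dict hops
            return list(self._candles[token][interval])[-count:]

store = TinyCandleStore()
for i in range(400):  # a full day of 1m ticks, plus overflow
    store.append(256265, "1m", {"close": 21500.0 + i})
print(len(store.last(256265, "1m", count=400)))  # 375 - old candles evicted
```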
2. Data Manager
Comprehensive data synchronization from multiple sources.
import pandas as pd

from pkbrokers.kite.datamanager import InstrumentDataManager
manager = InstrumentDataManager()
# Set specific stocks (optional)
manager.list_stock_codes = ["RELIANCE", "TCS", "INFY"]
# Execute synchronization
# Priority: SQLite → InMemoryCandleStore → Kite API → Pickle files
success = manager.execute()
# Access data
if success:
    for symbol, data in manager.pickle_data.items():
        df = pd.DataFrame(
            data=data['data'],
            columns=data['columns'],
            index=data['index']
        )
        print(f"{symbol}: {len(df)} rows")
Data Source Priority
During Market Hours:
1. Local SQLite database
2. InMemoryCandleStore (real-time ticks)
3. Kite API (authenticated)
4. GitHub ticks.json

After Market Hours:
1. Local pickle files
2. Remote GitHub pickle files
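The priority order amounts to a fallback chain: try each source in turn and stop at the first one that returns usable data. A minimal sketch of that pattern (the `fetch` callables here are placeholders, not real pkbrokers APIs):

```python
def first_available(sources):
    """Return (name, data) from the first source yielding non-empty results."""
    for name, fetch in sources:
        try:
            data = fetch()
        except Exception:
            continue  # e.g. Kite API not authenticated - fall through
        if data:
            return name, data
    return None, None

# Illustrative market-hours chain; only the second source has data.
sources = [
    ("local_sqlite", lambda: None),                     # nothing cached yet
    ("candle_store", lambda: {"RELIANCE": [2500.0]}),   # live ticks available
    ("kite_api", lambda: {"RELIANCE": [2500.0]}),       # never reached
]
name, data = first_available(sources)
print(name)  # candle_store
```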
3. Kite Instruments
Manage instrument data from Kite Connect API.
from pkbrokers.kite.instruments import KiteInstruments, Instrument
# Initialize with credentials
kite = KiteInstruments(
    api_key="your_api_key",
    access_token="your_access_token"
)
# Sync instruments from Kite API
kite.sync_instruments(force_fetch=True)
# Get instrument count
count = kite.get_instrument_count()
print(f"Total instruments: {count}")
# Get NSE stocks only
equities = kite.get_equities(only_nse_stocks=True)
# Get instrument tokens for subscription
tokens = kite.get_instrument_tokens(equities)
# Fetch instrument by token
instrument = kite.get_instrument(256265) # NIFTY 50
print(f"Symbol: {instrument.tradingsymbol}")
Instrument Model
@dataclass
class Instrument:
    instrument_token: int        # Unique identifier
    exchange_token: str          # Exchange-specific token
    tradingsymbol: str           # Trading symbol (e.g., 'RELIANCE')
    name: Optional[str]          # Full name
    last_price: Optional[float]
    expiry: Optional[str]        # For derivatives
    strike: Optional[float]      # For options
    tick_size: float
    lot_size: int
    instrument_type: str         # EQ, FUT, OPT, INDEX
    segment: str                 # NSE, BSE
    exchange: str
    last_updated: str
    nse_stock: bool
4. Tick Watcher
WebSocket-based real-time tick processing.
from pkbrokers.kite.kiteTokenWatcher import KiteTokenWatcher
# Initialize watcher
watcher = KiteTokenWatcher()
# Start watching (blocking)
try:
    watcher.watch(test_mode=False)
except KeyboardInterrupt:
    watcher.stop()
Command-Line Usage
# Start tick watcher
pkkite --ticks
# Test mode (3 minutes)
pkkite --ticks --test
# Authenticate first
pkkite --auth
# Fetch historical data
pkkite --history=5minute
5. Local Candle Database
SQLite-based candle storage for persistence.
from datetime import date, datetime

from pkbrokers.kite.localCandleDatabase import LocalCandleDatabase
# Initialize database
db = LocalCandleDatabase()
# Save daily candle
db.save_daily_candle(
    symbol="RELIANCE",
    date=date.today(),
    open_price=2500.0,
    high_price=2550.0,
    low_price=2480.0,
    close_price=2530.0,
    volume=1000000
)
# Load candles
candles = db.load_daily_candles("RELIANCE", days=30)
# Save intraday candles
db.save_intraday_candle(
    symbol="RELIANCE",
    timestamp=datetime.now(),
    interval="5m",
    open_price=2500.0,
    high_price=2510.0,
    low_price=2495.0,
    close_price=2505.0,
    volume=50000
)
6. Telegram Bots
PKTickBot
Telegram bot for distributing tick data.
from pkbrokers.bot.tickbot import PKTickBot
bot = PKTickBot(
    bot_token="your_bot_token",
    ticks_file_path="/path/to/ticks.json",
    chat_id="-1001234567890"
)
# Start bot (blocking)
bot.run()
Available Commands:
| Command | Description |
|---|---|
| `/ticks` | Get zipped ticks.json file |
| `/db` | Get local SQLite database |
| `/status` | Check bot and data status |
| `/top` | Get top 20 ticking symbols |
| `/token` | Get current KTOKEN |
| `/refresh_token` | Generate new KTOKEN |
| `/restart` | Refresh token and restart watcher |
| `/test_ticks` | Start 3-minute tick test |
| `/help` | Show help message |
Orchestrator
Multi-process orchestrator for bot and data management.
from pkbrokers.bot.orchestrator import Orchestrator
orchestrator = Orchestrator()
# Check if market is open
if orchestrator.should_run_kite_process():
orchestrator.start_kite_process()
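The orchestrator's exact gating logic isn't shown here, but a market-hours check like `should_run_kite_process()` typically reduces to a weekday-plus-session-window test. A stdlib-only sketch for the NSE cash session (09:15 to 15:30 IST; exchange holidays deliberately ignored):

```python
from datetime import datetime, time, timedelta, timezone

IST = timezone(timedelta(hours=5, minutes=30))
MARKET_OPEN, MARKET_CLOSE = time(9, 15), time(15, 30)

def is_market_open(now=None):
    """True on NSE trading weekdays between 09:15 and 15:30 IST.

    Illustrative only - a real gate would also consult an exchange
    holiday calendar.
    """
    now = now or datetime.now(IST)
    return now.weekday() < 5 and MARKET_OPEN <= now.time() <= MARKET_CLOSE

print(is_market_open(datetime(2024, 1, 3, 10, 0, tzinfo=IST)))  # True (Wednesday)
print(is_market_open(datetime(2024, 1, 6, 10, 0, tzinfo=IST)))  # False (Saturday)
```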
7. Authentication
Automated Kite Connect authentication using TOTP.
from pkbrokers.kite.authenticator import KiteAuthenticator
auth = KiteAuthenticator(
    user_id="your_user_id",
    password="your_password",
    totp_secret="your_totp_secret",
    api_key="your_api_key"
)
# Get access token
access_token = auth.authenticate()
# Token is automatically saved to environment
Environment Variables Required:
- `KUSER`: Kite user ID
- `KPWD`: Kite password
- `KTOTP`: TOTP secret key
- `KAPI`: Kite API key
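The `KTOTP` secret feeds the standard RFC 6238 TOTP scheme used by Kite's 2FA. As an illustration (not the library's actual implementation, which likely delegates to a helper such as pyotp), the one-time code can be derived with the standard library alone:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp_now(base32_secret: str, timestamp=None, step=30, digits=6):
    """RFC 6238 TOTP (HMAC-SHA1) from a base32 secret like KTOTP."""
    key = base64.b32decode(base32_secret.upper())
    counter = int((time.time() if timestamp is None else timestamp) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T=59 -> 94287082
print(totp_now("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", timestamp=59, digits=8))
```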
8. GitHub Actions Workflows
PKBrokers includes automated GitHub Actions workflows for OHLCV data collection.
History Data Workflow
The w1-workflow-history-data-child.yml workflow fetches historical data from Kite API and saves to PKScreener.
Triggering with --history=day:
# Via pkkite CLI
pkkite --history=day --pastoffset=0 --verbose
What happens:
- Fetches all NSE instrument tokens (~2000 stocks)
- Calls Kite Historical API for each instrument (rate-limited: 3 req/sec)
- Saves to local SQLite database (`instrument_history.db`)
- Exports to pkl files (`stock_data_DDMMYYYY.pkl`)
- Commits to the PKScreener actions-data-download branch
Data Flow:
Kite API → SQLite DB → PKL Export → Git Commit → PKScreener Branch
PKL Files Saved to PKScreener:
- `actions-data-download/stock_data_DDMMYYYY.pkl` - Daily candles
- `actions-data-download/daily_candles.pkl` - Latest daily data
- `results/Data/` - Secondary storage location
Programmatic Trigger:
from pkbrokers.bot.dataSharingManager import DataSharingManager
manager = DataSharingManager()
manager.trigger_history_download_workflow(past_offset=5) # Fetch last 5 days
See ARCHITECTURE.md for detailed workflow documentation.
9. PKL Generator Script
Unified script for generating pkl files from ticks.json OR SQLite database with historical data merge.
# From ticks.json (default - used by Ticks Runner)
python pkbrokers/scripts/generate_pkl_from_ticks.py --data-dir results/Data --verbose
# From SQLite database (used by History Data Child workflow)
python pkbrokers/scripts/generate_pkl_from_ticks.py --from-db --data-dir results/Data --verbose
# Programmatic usage
from pkbrokers.scripts.generate_pkl_from_ticks import (
    download_historical_pkl,
    download_ticks_json,
    load_from_sqlite,
    find_sqlite_database,
    convert_ticks_to_candles,
    merge_candles,
    save_pkl_files
)
# From ticks.json
historical = download_historical_pkl() # ~37MB from GitHub
ticks = download_ticks_json() # Today's ticks
candles = convert_ticks_to_candles(ticks)
merged = merge_candles(historical, candles)
save_pkl_files(merged, "results/Data")
# From SQLite database
db_path = find_sqlite_database()
db_candles = load_from_sqlite(db_path)
merged = merge_candles(historical, db_candles)
save_pkl_files(merged, "results/Data")
What it does:
- Loads new data from ticks.json OR SQLite database
- Downloads historical pkl (~37MB) from PKScreener actions-data-download
- Converts data to candle format
- Merges today's data with historical (~2000 stocks × 2+ years)
- Saves both intraday and daily pkl files (~37MB+)
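The merge step can be pictured on the pickle payload layout shown in the Data Manager example (`symbol -> {"columns", "index", "data"}`): today's rows are appended to the historical payload, skipping dates already present. This is a hedged sketch of the idea, not the script's actual `merge_candles` implementation:

```python
def merge_candles(historical: dict, today: dict) -> dict:
    """Append today's rows to historical payloads, skipping duplicate dates."""
    merged = {sym: dict(payload) for sym, payload in historical.items()}
    for sym, payload in today.items():
        if sym not in merged:
            merged[sym] = dict(payload)
            continue
        base = merged[sym]
        seen = set(base["index"])  # dates already covered by history
        for idx, row in zip(payload["index"], payload["data"]):
            if idx not in seen:
                # Reassign rather than mutate, so the inputs stay untouched.
                base["index"] = base["index"] + [idx]
                base["data"] = base["data"] + [row]
                seen.add(idx)
    return merged

historical = {"RELIANCE": {"columns": ["open", "close"],
                           "index": ["2024-01-01"],
                           "data": [[2500.0, 2530.0]]}}
today = {"RELIANCE": {"columns": ["open", "close"],
                      "index": ["2024-01-01", "2024-01-02"],
                      "data": [[2500.0, 2530.0], [2531.0, 2542.0]]}}
merged = merge_candles(historical, today)
print(merged["RELIANCE"]["index"])  # ['2024-01-01', '2024-01-02']
```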
Output Files:
| File | Description |
|---|---|
| `stock_data_DDMMYYYY.pkl` | Daily candles merged with historical |
| `daily_candles.pkl` | Same as above (generic name) |
| `intraday_stock_data_DDMMYYYY.pkl` | Today's intraday data only |
| `intraday_1m_candles.pkl` | Same as above (generic name) |
API Reference
Main Exports
from pkbrokers.kite import (
    # Candle Store
    InMemoryCandleStore,
    get_candle_store,
    # Data Providers
    HighPerformanceDataProvider,
    InstrumentDataManager,
    # Instruments
    KiteInstruments,
    Instrument,
    # Tick Processing
    KiteTokenWatcher,
    CandleAggregator,
    # Database
    LocalCandleDatabase,
    # Authentication
    KiteAuthenticator,
)

from pkbrokers.bot import (
    PKTickBot,
    Orchestrator,
)
Module Structure
pkbrokers/
├── __init__.py
├── bot/
│   ├── __init__.py
│   ├── consumer.py               # Data consumer
│   ├── orchestrator.py           # Multi-process orchestrator
│   └── tickbot.py                # Telegram tick bot
├── kite/
│   ├── __init__.py
│   ├── authenticator.py          # Kite authentication
│   ├── candleAggregator.py       # Tick → Candle aggregation
│   ├── datamanager.py            # Multi-source data manager
│   ├── databasewriter.py         # Database writer
│   ├── inMemoryCandleStore.py    # In-memory candle store
│   ├── instrumentHistory.py      # Historical data
│   ├── instruments.py            # Instrument management
│   ├── kiteTokenWatcher.py       # WebSocket tick watcher
│   ├── localCandleDatabase.py    # SQLite candle storage
│   ├── tickProcessor.py          # Tick processing
│   ├── ticks.py                  # Tick utilities
│   ├── trader.py                 # Trading operations
│   ├── zerodhaWebSocketClient.py # WebSocket client
│   └── examples/
│       ├── externals.py          # External helpers
│       └── pkkite.py             # CLI entry point
└── scripts/
    └── publish_candle_data.py    # Data publishing
Environment Variables
| Variable | Required | Description |
|---|---|---|
| `KUSER` | Yes* | Kite user ID |
| `KPWD` | Yes* | Kite password |
| `KTOTP` | Yes* | TOTP secret for 2FA |
| `KAPI` | Yes* | Kite API key |
| `KTOKEN` | Auto | Access token (auto-generated) |
| `TOKEN` | Yes** | Telegram bot token |
| `CHAT_ID` | Yes** | Default Telegram chat ID |
| `TURSO_DB_URL` | No | Turso database URL |
| `TURSO_DB_AUTH_TOKEN` | No | Turso auth token |
*Required for Kite Connect features
**Required for Telegram bot features
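Since different features need different variable groups, a startup check that names exactly what is missing saves debugging time. An illustrative helper (not a pkbrokers API):

```python
import os

# Variable groups from the table above
KITE_VARS = ("KUSER", "KPWD", "KTOTP", "KAPI")
TELEGRAM_VARS = ("TOKEN", "CHAT_ID")

def missing_vars(names, env=None):
    """Return the subset of names that are unset or empty."""
    env = env if env is not None else os.environ
    return [n for n in names if not env.get(n)]

# With only KUSER configured, the Kite group reports three gaps:
print(missing_vars(KITE_VARS, env={"KUSER": "AB1234"}))  # ['KPWD', 'KTOTP', 'KAPI']
```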
Contributing
Development Setup
git clone https://github.com/pkjmesra/pkbrokers.git
cd pkbrokers
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
pip install -e .
Running Tests
pytest test/
pytest --cov=pkbrokers test/
Code Style
ruff check pkbrokers/
ruff format pkbrokers/
Related Projects
- PKScreener - Stock screening application
- PKDevTools - Common development tools
- PKNSETools - NSE market data tools
License
MIT License - see LICENSE file.