
Nepse Scraper

A robust and feature-complete Python client for the Nepal Stock Exchange (NEPSE) API.

nepse-scraper provides a clean, high-level interface to access real-time and historical stock market data, enabling developers, analysts, and investors to build powerful financial applications and analysis tools.


Installation

Install the package directly from PyPI:

pip install nepse-scraper

Quick Start & Important SSL/TLS Note

Here's how to get started with just a few lines of code.

⚠️ Important Note on SSL/TLS Verification

The official NEPSE server has a known issue: it does not serve a complete SSL/TLS certificate chain. This causes SSLCertVerificationError connection failures in most standard Python environments.

Until the server's certificate chain is fixed, it is recommended to initialize the client with verify_ssl=False to ensure a successful connection. Note that this disables certificate verification entirely, so only do so if you accept that trade-off.

from nepse_scraper import NepseScraper

# Recommended initialization:
scraper = NepseScraper(verify_ssl=False)

Basic Usage

from nepse_scraper import NepseScraper

# 1. Initialize the client (with SSL verification disabled as recommended)
scraper = NepseScraper(verify_ssl=False)

# 2. Check if the market is open
is_open = scraper.is_market_open()
print(f"Is the NEPSE market open? {'Yes' if is_open else 'No'}")

# 3. Fetch today's price data for all companies
try:
    today_prices = scraper.get_today_price()
    if today_prices:
        print(f"\nFetched {len(today_prices)} records for today's price.")
        # Find and print the record for a specific symbol
        aclbsl_data = next((item for item in today_prices if item['symbol'] == 'ACLBSL'), None)
        if aclbsl_data:
            print("Example record for ACLBSL:")
            print(aclbsl_data)

except Exception as e:
    print(f"An error occurred: {e}")

# 4. Get detailed information for a specific ticker
nabil_info = scraper.get_ticker_info('NABIL')
print("\nFetched Ticker Info for NABIL:")
print(nabil_info.get('security', {}).get('securityName'))
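
Since get_today_price() returns a list of per-company records, you can rank or filter it with plain Python. The sketch below ranks a stubbed sample by day-over-day change; the field names 'lastTradedPrice' and 'previousClose' are assumptions for illustration, so inspect a real record to confirm the actual keys.

```python
# Illustrative only: field names are assumed; check a real record
# from get_today_price() for the actual keys.
sample_records = [
    {"symbol": "NABIL", "lastTradedPrice": 510.0, "previousClose": 500.0},
    {"symbol": "ACLBSL", "lastTradedPrice": 780.0, "previousClose": 800.0},
    {"symbol": "NICA", "lastTradedPrice": 366.0, "previousClose": 360.0},
]

def percent_change(record):
    """Day-over-day change relative to the previous close."""
    prev = record["previousClose"]
    return (record["lastTradedPrice"] - prev) / prev * 100

# Sort descending by percent change so gainers come first.
ranked = sorted(sample_records, key=percent_change, reverse=True)
for rec in ranked:
    print(f"{rec['symbol']}: {percent_change(rec):+.2f}%")
```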

Advanced Usage

Extensibility: Using Custom Endpoints

The NEPSE API may change or have undocumented endpoints. nepse-scraper allows you to dynamically register and call any endpoint at runtime.

# 1. Initialize the client
scraper = NepseScraper(verify_ssl=False)

# 2. Register a new or custom endpoint
#    (Using an existing endpoint as an example with a new name)
scraper.register_endpoint(
    name='custom_market_status', 
    path='/api/nots/nepse-data/market-open', 
    method='GET'
)

# 3. Call your custom endpoint using the generic `call_endpoint` method
custom_response = scraper.call_endpoint(name='custom_market_status')
print("\nResponse from custom endpoint 'custom_market_status':")
print(custom_response)
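
To make the register/call pattern concrete, here is a minimal sketch of how such a name-to-endpoint registry can be modelled. This is not the library's actual implementation, and the base URL is a placeholder, not the real NEPSE host.

```python
# Minimal sketch of a dynamic endpoint registry; the real client's
# internals may differ.
class EndpointRegistry:
    def __init__(self, base_url):
        self.base_url = base_url.rstrip("/")
        self._endpoints = {}

    def register(self, name, path, method="GET"):
        """Store an endpoint definition under a caller-chosen name."""
        self._endpoints[name] = {"path": path, "method": method}

    def build_request(self, name):
        """Resolve a registered name to (HTTP method, full URL)."""
        ep = self._endpoints[name]
        return ep["method"], self.base_url + ep["path"]

registry = EndpointRegistry("https://example-nepse-host")
registry.register("custom_market_status", "/api/nots/nepse-data/market-open")
print(registry.build_request("custom_market_status"))
# → ('GET', 'https://example-nepse-host/api/nots/nepse-data/market-open')
```

Keeping the registry as plain data means new endpoints can be added at runtime without subclassing or patching the client.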

Key Features

  • Complete API Coverage: Access to all major NEPSE endpoints.
  • Robust & Resilient: Built-in smart retries for handling transient network and server errors.
  • Secure by Default: Enforces secure SSL connections, with a clear, configurable option for known server/network issues.
  • Extensible: Dynamically add and call new or undocumented API endpoints at runtime.
  • Modern Architecture: Fully typed, decoupled, and built on a high-performance session-based core.
  • User-Friendly Errors: Catches common connection problems and provides clear, actionable error messages.
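
The "smart retries" feature can be pictured as a retry loop with exponential backoff. The sketch below is a simplified stand-in, not the library's actual retry code:

```python
import time

def call_with_retries(fn, attempts=3, backoff=1.0):
    """Retry a callable with exponential backoff on connection errors;
    a simplified sketch in the spirit of the built-in 'smart retries'."""
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error to the caller
            time.sleep(backoff * 2 ** attempt)

# Demo: a callable that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network error")
    return "ok"

print(call_with_retries(flaky, attempts=3, backoff=0))  # → ok
```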

Available Data

  • Market Status: Check if the market is open.
  • Live Data: Get live trades and real-time index graphs.
  • Daily Data: Fetch today's prices and market summaries.
  • Historical Data: Access historical prices for tickers and indices.
  • Company Info: Retrieve security details, contact information, and corporate disclosures/notices.
  • Top Stocks: Get lists of top gainers, losers, turnover, trade volume, transactions, and more.
  • And much more...
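
Historical price data lends itself to simple rolling calculations. The sketch below computes a trailing simple moving average over a stubbed list of closing prices; with real data you would extract the closes from the historical-price endpoint's records (field names not assumed here).

```python
# Stubbed closing prices for illustration.
closes = [500.0, 505.0, 498.0, 510.0, 515.0]

def simple_moving_average(values, window):
    """Trailing SMA: one value per full window of `window` prices."""
    return [
        sum(values[i - window:i]) / window
        for i in range(window, len(values) + 1)
    ]

print(simple_moving_average(closes, 3))
```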

Documentation

For a more detailed API reference and examples, please see the Documentation (docs/index.md).

The source code in nepse_scraper/client.py is also extensively documented with docstrings and type hints.

Contributing

Contributions are welcome! Whether it's adding new features, improving documentation, or reporting bugs, please feel free to open an issue or submit a pull request on our GitHub repository.

License

This project is licensed under the MIT License. See the LICENSE.txt file for details.

Happy Coding!

Download files

Download the file for your platform.

Source Distribution

nepse_scraper-1.0.0.tar.gz (14.7 kB)

Uploaded Source

Built Distribution


nepse_scraper-1.0.0-py3-none-any.whl (15.1 kB)

Uploaded Python 3

File details

Details for the file nepse_scraper-1.0.0.tar.gz.

File metadata

  • Download URL: nepse_scraper-1.0.0.tar.gz
  • Upload date:
  • Size: 14.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.2.1 CPython/3.13.7 Linux/6.17.8-arch1-1

File hashes

Hashes for nepse_scraper-1.0.0.tar.gz
  Algorithm    Hash digest
  SHA256       d3a89311d4e5628969d91aaa0f998c73d742d8ae4562fe7ca8826ac9901afccd
  MD5          bace94f41d8519ba2898ba5dc84663a4
  BLAKE2b-256  0058d9ce1ec6b31a5a767e6e4b2bb280dc3b3eee98d077cfcbea3049933839ba


File details

Details for the file nepse_scraper-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: nepse_scraper-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 15.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.2.1 CPython/3.13.7 Linux/6.17.8-arch1-1

File hashes

Hashes for nepse_scraper-1.0.0-py3-none-any.whl
  Algorithm    Hash digest
  SHA256       e9a87922f1809e1673555bd1f9452d2d7161e223e37bc5d6429a3c2d006d0c4f
  MD5          30c81229ef6895ebd7edbe2c80713c5e
  BLAKE2b-256  b70d88e531c4d281a9fadfbe170c86c640cacf79855acd80f6bf79c952b7bc7c

