
Python SDK for QuantDeFi Pool Rankings and Analytics API


QDF SDK - Quantitative DeFi Data SDK


Introduction

QDF SDK is a professional DeFi pool data tool that provides real-time access to over 7,000 DeFi liquidity pools. Through a clean Python interface, you can retrieve pool rankings, search for specific pools, and view macro market data.

Core Features

  • 🚀 Simple to Use - Get started with just 3 lines of code
  • 📊 Massive Dataset - Covers 60+ blockchains, 7,000+ liquidity pools
  • 🎯 Intelligent Ranking - Multi-dimensional scoring system based on IQR algorithm
  • ⚡ High Performance - Built-in caching and retry mechanisms for stability
  • 🔍 Flexible Queries - Supports various filtering and sorting methods

Why QuantDeFi Rankings?

Traditional DeFi analytics often focus on single metrics like TVL or APY, which can be misleading:

  • High TVL ≠ Good Investment: Large pools may have low yields and poor capital efficiency
  • High APY ≠ Safe Returns: Extreme yields often indicate high risk or unsustainable tokenomics
  • Volume Alone Misses Context: High volume might mean high impermanent loss

QuantDeFi Rankings solve this by:

  • 📊 Multi-dimensional Analysis: Evaluating 7+ factors simultaneously
  • 🎯 Risk-Adjusted Scoring: Balancing returns against various risk factors
  • 🔍 Statistical Outlier Detection: Identifying unusual opportunities and potential risks
  • ⏱️ Real-time Updates: Rankings refresh every 4 hours based on the latest data

Perfect for:

  • 💼 Quantitative Traders: Build strategies on comprehensive pool metrics
  • 🏦 DeFi Investors: Find the best risk-adjusted yield opportunities
  • 🔬 Researchers: Access clean, normalized data across all major chains
  • 🤖 Bot Developers: Integrate rankings into automated trading systems

Quick Start

Installation

pip install qdf-sdk

Basic Usage

from qdf import QDFClient

# Create client
client = QDFClient()

# Get Top 10 pools
top_pools = client.get_top_pools(n=10)
for pool in top_pools:
    print(f"{pool.symbol}: Score {pool.ranking_score:.2f}")

Usage Examples

1. Get High-Scoring Pools

# Get top 5 highest-scoring pools
top_pools = client.get_top_pools(n=5)

for pool in top_pools:
    print(f"""
    Pool: {pool.symbol}
    Chain: {pool.chain}
    Overall Score: {pool.ranking_score:.2f}
    TVL: ${pool.tvl_usd:,.0f}
    APY: {pool.apy:.2f}%
    """)

2. Sort by Different Metrics

# Sort by TVL
high_tvl = client.get_top_pools(n=10, metric="tvl")

# Sort by APY
high_apy = client.get_top_pools(n=10, metric="apy")

# Sort by score (default)
top_score = client.get_top_pools(n=10, metric="ranking_score")

3. Search for Specific Pools

# Search for USDC-related pools
usdc_pools = client.search_pools("USDC", limit=10)

for pool in usdc_pools:
    print(f"{pool.symbol} on {pool.chain}")

4. Get Paginated Data

# Get page 1 data
rankings = client.get_rankings(page=1, size=20)

print(f"Total pools: {rankings.pagination.total}")
print(f"Total pages: {rankings.pagination.pages}")
print(f"Current page: {rankings.pagination.page}")

for pool in rankings.data:
    print(f"{pool.rank_overall}. {pool.symbol}")
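
If you need every pool rather than a single page, the pagination fields shown above are enough to drive a simple loop. A sketch, assuming the response shape above (`.data` plus `.pagination.pages`); `iter_all_pools` is a hypothetical helper, not part of the SDK:

```python
def iter_all_pools(client, size=100, **filters):
    """Yield every ranked pool by walking get_rankings() page by page.

    Assumes each response exposes .data (list of pools) and
    .pagination.pages (total page count), as in the example above.
    """
    page = 1
    while True:
        resp = client.get_rankings(page=page, size=size, **filters)
        for pool in resp.data:
            yield pool
        if page >= resp.pagination.pages:
            break
        page += 1
```

Because it is a generator, you can stop early (e.g. `itertools.islice`) without fetching pages you do not need.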

5. Filter by Chain

# Get only Ethereum pools
eth_pools = client.get_rankings(chain="Ethereum", size=10)

# Get Arbitrum pools
arb_pools = client.get_rankings(chain="Arbitrum", size=10)

6. View Ranking Changes

# Get pools with biggest ranking changes in 24 hours
movers = client.get_ranking_movers(period="24h", limit=10)

for mover in movers:
    print(f"{mover.symbol}: Rank change {mover.rank_change}")

7. Get Market Macro Data

# Get overall market data
macro = client.get_macro_data()

print(f"BTC Price: ${macro.btc_price:,.0f}")
print(f"ETH Price: ${macro.eth_price:,.0f}")
print(f"Fear & Greed Index: {macro.crypto_fear_greed}")
print(f"Market Regime: {macro.market_regime_name}")
print(f"ETH Gas: {macro.eth_gas_price_gwei} Gwei")

Our Ranking Methodology

The QuantDeFi Score Formula

Our proprietary ranking system combines multiple factors to create a comprehensive score that reflects both opportunity and risk:

Overall Score = Weighted Sum of Component Scores

Scoring Components

Each pool is evaluated across 7 key dimensions:

  1. APY Score (Yield Performance)

    • Measures annualized yield potential
    • Adjusts for unsustainable tokenomics
    • Higher weight for consistent, proven yields
  2. TVL Score (Liquidity Depth)

    • Evaluates pool size and stability
    • Larger TVL generally means lower slippage
    • Considers TVL trends over time
  3. Volume Score (Trading Activity)

    • Assesses daily trading volume
    • Higher volume indicates better liquidity
    • Volume-to-TVL ratio analysis
  4. Fee Score (Revenue Efficiency)

    • Analyzes fee generation relative to TVL
    • Identifies capital-efficient pools
    • 24-hour and 7-day fee trends
  5. Risk Score (Safety Assessment)

    • Impermanent loss potential
    • Smart contract age and audit status
    • Historical volatility metrics
  6. Age Score (Pool Maturity)

    • Time since pool creation
    • Survival bias indicator
    • Mature pools score higher
  7. Concentration Score (Decentralization)

    • Token holder distribution
    • Whale concentration risk
    • LP token distribution analysis
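
The weighted-sum formula above can be sketched in a few lines. The weights below are invented placeholders for illustration only; the actual QuantDeFi component weights are proprietary and not published:

```python
# Hypothetical weights for the 7 components -- the real values are not public.
WEIGHTS = {
    "apy": 0.20, "tvl": 0.20, "volume": 0.15, "fee": 0.15,
    "risk": 0.15, "age": 0.05, "concentration": 0.10,
}

def overall_score(components: dict) -> float:
    """Weighted sum of component scores (each assumed to be on a 0-100 scale)."""
    return sum(WEIGHTS[name] * components[name] for name in WEIGHTS)
```

With weights summing to 1.0, a pool scoring 100 on every component gets an overall score of 100.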

IQR Statistical Analysis

We use Interquartile Range (IQR) analysis to:

  • Classify Pools: Categorize into quartiles (Q1-Q4) for each metric
  • Detect Outliers: Flag statistical anomalies that may indicate opportunities or risks
  • Label Pools: Generate descriptive labels like "GiantTVL-HighVol-HighYield"

Example pool classifications:

  • tvl_iqr_level: 1-5 (1=lowest quartile, 5=highest/outlier)
  • is_statistical_outlier: True for pools with unusual metric combinations
  • pool_label: Human-readable classification
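
The 1-5 levels and the outlier flag above can be illustrated with standard quartile and Tukey-fence logic. This is a sketch only, not the exact production algorithm:

```python
from statistics import quantiles

def iqr_level(values, x):
    """Classify x against a metric's distribution.

    Levels 1-4 correspond to quartiles Q1-Q4; level 5 marks an upper
    outlier beyond the Tukey fence (Q3 + 1.5 * IQR).
    """
    q1, q2, q3 = quantiles(values, n=4)  # quartile cut points
    iqr = q3 - q1
    if x > q3 + 1.5 * iqr:
        return 5  # would also imply is_statistical_outlier = True
    if x > q3:
        return 4
    if x > q2:
        return 3
    if x > q1:
        return 2
    return 1
```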

Momentum Scoring

In addition to static metrics, we track momentum:

  • 24-hour and 7-day performance changes
  • Trend detection algorithms
  • "Hot pool" identification for emerging opportunities
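
As an illustration of how short- and long-horizon changes might be blended into a single momentum figure (the weights and clipping range here are invented, not the published QDF formula):

```python
def momentum_score(change_24h: float, change_7d: float,
                   w_short: float = 0.6, w_long: float = 0.4) -> float:
    """Blend 24h and 7d percentage changes into one score, clipped to
    [-100, 100]. Weights are purely illustrative."""
    raw = w_short * change_24h + w_long * change_7d
    return max(-100.0, min(100.0, raw))
```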

Data Pipeline & Updates

Data Sources

We aggregate data from multiple sources for accuracy and completeness:

  • Primary Source: DeFi Llama - The industry-standard DeFi TVL aggregator
  • On-chain Data: Direct blockchain queries for real-time updates
  • Price Feeds: Multiple oracle sources for accurate pricing

Special thanks to DeFi Llama for providing comprehensive DeFi data that powers our analytics.

Update Frequency

Our system continuously refreshes data to ensure accuracy:

  • Price Data: Real-time updates
  • 📊 Pool Metrics: Hourly updates
  • 🎯 Ranking Calculations: Every 4 hours
  • 🔄 Health Checks: Continuous monitoring

Data Quality

  • Validation: Multi-source verification for critical metrics
  • Cleaning: Automatic outlier detection and correction
  • Normalization: Standardized formats across all chains
  • Audit Trail: Historical data preserved for backtesting

API Documentation

Main Methods

Method                  Description               Parameters
get_top_pools()         Get top pools             n: count, metric: sorting metric
get_rankings()          Get paginated rankings    page, size, chain, protocol
search_pools()          Search pools              query: search term, limit: result limit
get_pool_detail()       Get pool details          pool_id: pool ID
get_ranking_movers()    Get ranking changes       period: time period, limit: count
get_macro_data()        Get macro data            (none)

Sorting Metrics

  • ranking_score - Overall score (default)
  • tvl - Total Value Locked
  • apy - Annual Percentage Yield
  • momentum_score - Momentum and trend score
  • il_risk - Impermanent loss risk

Supported Blockchains

Major supported chains include:

  • Ethereum
  • Arbitrum
  • Optimism
  • Polygon
  • BSC
  • Avalanche
  • Base
  • And 50+ other chains...

Data Models

RankedPool

class RankedPool:
    pool_id: str           # Pool unique ID
    chain: str             # Blockchain
    project: str           # Project name
    symbol: str            # Pool symbol
    tvl_usd: float         # TVL (USD)
    apy: float             # Annual Percentage Yield
    ranking_score: float   # Overall score
    rank_overall: int      # Current overall rank
    momentum_score: float  # Momentum score
    # ... more fields

MacroLiveData

class MacroLiveData:
    btc_price: float              # BTC price
    eth_price: float              # ETH price
    crypto_fear_greed: float      # Fear & Greed Index
    market_regime_name: str       # Market regime
    eth_gas_price_gwei: float     # Gas price
    # ... more fields

Advanced Usage

Custom Configuration

from qdf import QDFClient

# Custom API endpoint (for private deployment)
client = QDFClient(
    base_url="https://your-api.com",
    timeout=30,
    max_retries=5
)

Error Handling

from qdf import QDFClient, APIError

client = QDFClient()

try:
    pools = client.get_top_pools(n=10)
except APIError as e:
    print(f"API error: {e}")
except Exception as e:
    print(f"Other error: {e}")

Performance Optimization

The SDK has multiple built-in optimizations:

  1. Connection Pooling - Reuses HTTP connections to reduce latency
  2. Auto Retry - Automatically handles temporary errors
  3. Smart Caching - Reduces duplicate requests
  4. Batch Requests - Supports batch data fetching
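
If you want to layer extra caching on top of the client in your own scripts (for example, memoizing repeated get_macro_data() calls), a minimal TTL cache could look like this. This is a generic sketch, not the SDK's internal cache:

```python
import time
from functools import wraps

def ttl_cache(seconds=300):
    """Cache a function's results for `seconds` (hashable positional args only)."""
    def deco(fn):
        store = {}  # args -> (timestamp, value)
        @wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit is not None and now - hit[0] < seconds:
                return hit[1]  # fresh cached value
            value = fn(*args)
            store[args] = (now, value)
            return value
        return wrapper
    return deco
```

Decorating a fetch function with `@ttl_cache(seconds=300)` makes repeated calls within five minutes hit the cache instead of the API.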

FAQ

Q: How often is the data updated?

A: Core data is updated hourly; ranking data is updated every 4 hours.

Q: What filtering options are supported?

A: Supports filtering by chain, protocol, TVL range, APY range, and more.

Q: How do I get historical data?

A: The current version provides real-time data only; historical data support is under development.

Q: Are there API call limits?

A: Default limit is 100 requests per minute. Contact us for higher limits.
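
To stay under the limit from the client side, a simple sliding-window throttle can be wrapped around each request. A sketch only; `RateLimiter` is hypothetical and not part of the SDK, and the server enforces its own limit regardless:

```python
import time

class RateLimiter:
    """Client-side throttle: allow at most `max_calls` per `period` seconds."""

    def __init__(self, max_calls=100, period=60.0):
        self.max_calls, self.period = max_calls, period
        self.calls = []  # monotonic timestamps of recent calls

    def acquire(self):
        """Block until another call is allowed, then record it."""
        now = time.monotonic()
        # drop timestamps older than the window
        self.calls = [t for t in self.calls if now - t < self.period]
        if len(self.calls) >= self.max_calls:
            # sleep until the oldest call falls out of the window
            time.sleep(self.period - (now - self.calls[0]))
        self.calls.append(time.monotonic())
```

Call `limiter.acquire()` before each SDK request to smooth bursts below the 100-requests-per-minute default.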

Roadmap

Completed:

  • Basic query functionality
  • Ranking system
  • Macro data integration

Planned:

  • WebSocket real-time push
  • Historical data queries
  • Strategy backtesting support
  • More chain support

Contributing

Issues and Pull Requests are welcome!

  1. Fork this repository
  2. Create a feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Create a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Links & Resources

Contact Us


QuantDeFi SDK - Professional DeFi Pool Analytics
Powered by data from DeFi Llama and on-chain sources
