# Baidu SERP API
A Python library to extract data from Baidu Search Engine Results Pages (SERP) and output it as JSON objects.
## Installation

```shell
pip install baidu-serp-api
```
## Usage

### Basic Usage

```python
from baidu_serp_api import BaiduPc, BaiduMobile

# Basic usage (defaults are optimized for proxy rotation)
pc_serp = BaiduPc()
results = pc_serp.search('keyword', date_range='20240501,20240531', pn='2', proxies={'http': 'http://your-proxy-server:port'})
print(results)

m_serp = BaiduMobile()
results = m_serp.search('keyword', date_range='day', pn='2', proxies={'http': 'http://your-proxy-server:port'})
print(results)

# Filter out specified fields. The result below will not contain
# 'recommend', 'last_page', or 'match_count'.
results = m_serp.search('keyword', exclude=['recommend', 'last_page', 'match_count'])
```
## Network Connection Optimization

### Connection Mode Configuration

```python
# Single connection mode (default, suited to proxy rotation and scraping)
pc = BaiduPc(connection_mode='single')

# Connection pool mode (suited to a fixed proxy or high-throughput scenarios)
pc = BaiduPc(connection_mode='pooled')

# Custom mode (all parameters are configurable)
pc = BaiduPc(
    connection_mode='custom',
    connect_timeout=5,
    read_timeout=15,
    pool_connections=5,
    pool_maxsize=20,
    keep_alive=True,
)
```
### Performance Monitoring

```python
# Request performance data along with the results
results = pc.search('keyword', include_performance=True)
if results['code'] == 200:
    performance = results['data']['performance']
    print(f"Response time: {performance['response_time']}s")
    print(f"Status code: {performance['status_code']}")
```
### Resource Management

```python
# Manual resource management
pc = BaiduPc()
try:
    results = pc.search('keyword')
finally:
    pc.close()  # Release resources explicitly

# Recommended: use a context manager
with BaiduPc() as pc:
    results = pc.search('keyword')
# Resources are released automatically on exit
```
## Parameters

### Search Parameters

- `keyword`: The search keyword.
- `date_range` (optional): Restrict results to a date range. Either a range string such as `'20240501,20240531'` (results between May 1, 2024 and May 31, 2024) or a relative keyword such as `'day'`.
- `pn` (optional): The results page to fetch.
- `proxies` (optional): A proxies dict to route the search through.
- `exclude` (optional): Fields to omit from the result, e.g. `['recommend', 'last_page']`.
- `include_performance` (optional): Whether to include performance data; defaults to `False`.
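The two `date_range` forms can be illustrated with a small parser. This is a sketch for clarity only; `parse_date_range` is not part of the library's API:

```python
from datetime import datetime

def parse_date_range(value):
    """Split a date_range value into either a relative keyword (e.g. 'day')
    or an absolute (start, end) pair of datetimes.

    Illustrative helper only, not baidu-serp-api code.
    """
    if ',' in value:
        start_s, end_s = value.split(',', 1)
        start = datetime.strptime(start_s, '%Y%m%d')
        end = datetime.strptime(end_s, '%Y%m%d')
        return ('absolute', start, end)
    return ('relative', value)

print(parse_date_range('20240501,20240531'))
print(parse_date_range('day'))
```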
### Connection Configuration Parameters

- `connection_mode`: Connection mode. Options:
  - `'single'` (default): single-connection mode, suited to proxy rotation
  - `'pooled'`: connection pool mode, suited to high-performance scenarios
  - `'custom'`: custom mode using the parameters below
- `connect_timeout`: Connection timeout in seconds; default 5
- `read_timeout`: Read timeout in seconds; default 10
- `max_retries`: Maximum retry count; default 0
- `pool_connections`: Number of connection pools; default 1
- `pool_maxsize`: Maximum connections per pool; default 1
- `keep_alive`: Whether to enable keep-alive; default `False`
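Assuming the library is built on `requests` (an assumption, not something the documentation states), these parameters would typically map onto a `requests.Session` with an `HTTPAdapter` and a per-request `(connect, read)` timeout tuple. A sketch of that mapping:

```python
import requests
from requests.adapters import HTTPAdapter

# Assumption: this shows how the connection parameters above would
# conventionally map onto requests; it is not the library's actual code.
def build_session(connect_timeout=5, read_timeout=10, max_retries=0,
                  pool_connections=1, pool_maxsize=1, keep_alive=False):
    session = requests.Session()
    adapter = HTTPAdapter(pool_connections=pool_connections,
                          pool_maxsize=pool_maxsize,
                          max_retries=max_retries)
    session.mount('http://', adapter)
    session.mount('https://', adapter)
    if not keep_alive:
        # Ask the server to close the connection after each response
        session.headers['Connection'] = 'close'
    # requests accepts timeouts per request as a (connect, read) tuple
    timeout = (connect_timeout, read_timeout)
    return session, timeout
```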
## Technical Details

### PC Version Request Headers & Cookies

Key request parameters:

- `rsv_pq`: Random query parameter (64-bit hex)
- `rsv_t`: Random timestamp hash
- `oq`: Original query (same as the search keyword)

Cookie parameters (automatically generated):

- `BAIDUID`: Unique browser identifier (32-char hex)
- `H_PS_645EC`: Synchronized with the `rsv_t` parameter
- `H_PS_PSSID`: Session ID made up of multiple numeric segments
- `BAIDUID_BFESS`: Same value as `BAIDUID`, used for security checks
- Plus 13 additional cookies for complete browser simulation
### Mobile Version Request Headers & Cookies

Key request parameters:

- `rsv_iqid`: Random identifier (19 digits)
- `rsv_t`: Random timestamp hash
- `sugid`: Suggestion ID (14 digits)
- `rqid`: Request ID (same as `rsv_iqid`)
- `inputT`: Input timestamp
- Plus 11 additional parameters for mobile simulation

Cookie parameters (automatically generated):

- `BAIDUID`: Synchronized with the internal parameters
- `H_WISE_SIDS`: Mobile-specific session ID with 80 numeric segments
- `rsv_i`: Complex encoded string (64 chars)
- `__bsi`: Special session ID format
- `FC_MODEL`: Feature model parameters
- Plus 14 additional cookies for mobile browser simulation
All parameters are automatically generated and synchronized to ensure realistic browser behavior.
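To make the shapes above concrete, here is a sketch that mimics two of the described identifiers using Python's standard library. These helpers only imitate the *format* of the values; they are not the library's actual generation code:

```python
import secrets
import random

# Illustrative only: mimics the shape of the identifiers described above.

def fake_baiduid() -> str:
    """32-character uppercase hex string, like the BAIDUID cookie."""
    return secrets.token_hex(16).upper()

def fake_rsv_pq() -> str:
    """A random 64-bit value rendered as hex, like the rsv_pq parameter."""
    return format(random.getrandbits(64), 'x')

print(fake_baiduid())  # 32 hex chars, e.g. 'A3F0...'
print(fake_rsv_pq())
```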
## Return Values

### Successful Response

`{'code': 200, 'msg': 'ok', 'data': {...}}`

The `data` object contains:

- `results`: List of search results
- `recommend`: Basic recommendation keywords (may be an empty array)
- `ext_recommend`: Extended recommendation keywords (mobile only; may be an empty array)
- `last_page`: Whether this is the last page of results
- `match_count`: Number of matching results
- `performance` (optional): Performance data containing `response_time` and `status_code`
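A sketch of consuming this shape. The sample dict below is hand-written to match the documented fields, and the `title`/`url` keys inside each result item are assumptions for illustration, not documented field names:

```python
# Hand-written sample matching the documented response shape.
sample = {
    'code': 200,
    'msg': 'ok',
    'data': {
        'results': [{'title': 'Example', 'url': 'https://example.com'}],  # item keys assumed
        'recommend': ['related keyword'],
        'last_page': False,
        'match_count': 1,
    },
}

if sample['code'] == 200:
    data = sample['data']
    for item in data['results']:
        print(item['title'], item['url'])
    print('last page:', data['last_page'], '| matches:', data['match_count'])
```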
### Error Response

#### Application Errors (400-499)

- `{'code': 404, 'msg': '未找到相关结果'}`: No relevant results found
- `{'code': 405, 'msg': '无搜索结果'}`: No search results

#### Server Errors (500-523)

- `{'code': 500, 'msg': '请求异常'}`: General network request exception
- `{'code': 501, 'msg': '百度安全验证'}`: Baidu security verification required
- `{'code': 502, 'msg': '响应提前结束'}`: Response ended prematurely (incomplete data)
- `{'code': 503, 'msg': '连接超时'}`: Connection timeout
- `{'code': 504, 'msg': '读取超时'}`: Read timeout
- `{'code': 505-510}`: Proxy-related errors (connection reset, auth failure, etc.)
- `{'code': 511-513}`: SSL-related errors (certificate verification, handshake failure, etc.)
- `{'code': 514-519}`: Connection errors (connection refused, DNS resolution failure, etc.)
- `{'code': 520-523}`: HTTP errors (403 forbidden, 429 rate limit, server error, etc.)
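The documented ranges lend themselves to a small classification helper. A sketch (`classify_error` is this document's illustration, not a library function; the grouping simply follows the ranges above):

```python
def classify_error(code: int) -> str:
    """Map a response code to a coarse category per the documented ranges."""
    if code == 200:
        return 'ok'
    if 400 <= code <= 499:
        return 'application'   # e.g. 404 no relevant results, 405 no results
    if code in (503, 504):
        return 'timeout'       # connection / read timeout
    if 505 <= code <= 510:
        return 'proxy'
    if 511 <= code <= 513:
        return 'ssl'
    if 514 <= code <= 519:
        return 'connection'
    if 520 <= code <= 523:
        return 'http'
    return 'request'           # 500-502 and any other request exception

print(classify_error(404))  # application
print(classify_error(516))  # connection
```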
## Connection Optimization Best Practices

### Proxy Rotation Scenarios

```python
# Recommended configuration: the default single mode is already optimized
with BaiduPc() as pc:  # a single connection avoids connection-reuse issues across proxies
    for proxy in proxy_list:
        results = pc.search('keyword', proxies=proxy)
        # Process results...
```
### High-Performance Fixed-Proxy Scenarios

```python
# Use pooled mode for better throughput with a fixed proxy
with BaiduPc(connection_mode='pooled') as pc:
    results = pc.search('keyword', proxies=fixed_proxy)
    # The connection pool manages connection reuse automatically
```
### Error Handling and Retry

```python
def robust_search(keyword, max_retries=3):
    results = None  # returned as-is if max_retries is 0
    for attempt in range(max_retries):
        with BaiduPc() as pc:
            results = pc.search(keyword, include_performance=True)
        if results['code'] == 200:
            return results
        elif results['code'] in (503, 504):            # timeout errors
            continue                                   # retry
        elif results['code'] in (505, 506, 514, 515):  # connection issues
            continue                                   # retry
        else:
            break                                      # don't retry other errors
    return results
```
## Mobile Extended Recommendations

The mobile version supports two types of recommendations:

- `recommend`: Basic recommendation keywords extracted directly from the search results page
- `ext_recommend`: Extended recommendation keywords obtained through an additional API call

How to get extended recommendations:

```python
# Get all recommendations (including extended recommendations)
results = m_serp.search('keyword', exclude=[])

# Get only basic recommendations (default behavior)
results = m_serp.search('keyword')  # equivalent to exclude=['ext_recommend']

# Get no recommendations
results = m_serp.search('keyword', exclude=['recommend'])  # ext_recommend is excluded automatically
```
Notes:

- Extended recommendations require an additional network request and are only fetched on the first page (`pn=1` or `None`)
- Extended recommendations depend on basic recommendations; if basic recommendations are excluded, extended recommendations are automatically excluded as well
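The two rules above can be sketched as a small normalization helper (illustrative only; `normalize_exclude` is not part of the library's API):

```python
def normalize_exclude(exclude, pn=None):
    """Apply the documented ext_recommend rules to an exclude list.

    Illustrative helper, not part of baidu-serp-api's public API.
    """
    exclude = set(exclude or [])
    # Rule 1: extended recommendations are only fetched on the first page.
    if pn not in (None, 1, '1'):
        exclude.add('ext_recommend')
    # Rule 2: extended recommendations depend on basic recommendations.
    if 'recommend' in exclude:
        exclude.add('ext_recommend')
    return sorted(exclude)

print(normalize_exclude(['recommend']))  # ['ext_recommend', 'recommend']
print(normalize_exclude([], pn='2'))     # ['ext_recommend']
print(normalize_exclude([]))             # []
```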
## Disclaimer

This project is intended for educational purposes only and must not be used commercially or for large-scale scraping of Baidu data. It is licensed under the GPLv3 open-source license: projects that use its content must themselves be open-sourced and must acknowledge the source. The author accepts no responsibility for legal risks resulting from misuse; violators bear the consequences at their own risk.
## File Details

### Source Distribution: baidu_serp_api-1.1.7.tar.gz

- Size: 30.9 kB
- Tags: Source
- Uploaded using Trusted Publishing: Yes
- Uploaded via: twine/5.0.0 CPython/3.12.11

| Algorithm | Hash digest |
|---|---|
| SHA256 | 23cdb66ea6ee488fd7e393dedc9e6eb6fdaf0be00bd9a5622d44dc8f029fa6ce |
| MD5 | 82b44bf111fd484a34c2d0ea9fd58e71 |
| BLAKE2b-256 | ac362b4c253ce752f2468dd34182258b721cef02d4ff1f98d8d9034c8ac351d5 |
### Built Distribution: baidu_serp_api-1.1.7-py3-none-any.whl

- Size: 30.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing: Yes
- Uploaded via: twine/5.0.0 CPython/3.12.11

| Algorithm | Hash digest |
|---|---|
| SHA256 | 1de51c04cca1fb504be880724e0bf2f61c807444d4cfa2a79999cc3ce42d7b9a |
| MD5 | 4c44954aa1abfcb556ee99d6514de916 |
| BLAKE2b-256 | 7b9e297f79760982ff68dcb77b46e699c0a230bd3998c768d2f102e184caa3ef |