
Cloudflare Workers proxy IP pool client

Project description

CFspider

A proxy IP pool built on Cloudflare Workers, using Cloudflare's global edge node IPs as proxy egress.

Features

  • Uses Cloudflare's 300+ global edge node IPs
  • Same syntax as the requests library, so there is no learning curve
  • Supports GET, POST, PUT, DELETE, and all other HTTP methods
  • Supports Session management
  • Returns Cloudflare node information (cf_colo, cf_ray)
  • Completely free; the Workers free plan allows 100,000 requests per day

Deploying the Worker

  1. Log in to the Cloudflare Dashboard
  2. Go to Workers & Pages
  3. Click Create application → Create Worker
  4. Paste the workers.js code into the editor
  5. Click Deploy

After deployment you will get a Workers URL such as https://xxx.username.workers.dev

To use a custom domain, add one under Worker → Settings → Triggers → Custom Domain.

Installation

pip install cfspider

Quick Start

import cfspider

cf_proxies = "https://your-workers.dev"

response = cfspider.get("https://httpbin.org/ip", cf_proxies=cf_proxies)

print(response.text)

API Reference

Request Methods

CFspider supports the following HTTP methods, with the same syntax as the requests library:

import cfspider

cf_proxies = "https://your-workers.dev"

cfspider.get(url, cf_proxies=cf_proxies)
cfspider.post(url, cf_proxies=cf_proxies, json=data)
cfspider.put(url, cf_proxies=cf_proxies, data=data)
cfspider.delete(url, cf_proxies=cf_proxies)
cfspider.head(url, cf_proxies=cf_proxies)
cfspider.options(url, cf_proxies=cf_proxies)
cfspider.patch(url, cf_proxies=cf_proxies, json=data)

Request Parameters

Parameter   Type       Description
url         str        Target URL
cf_proxies  str        Workers URL (required)
params      dict       URL query parameters
data        dict/str   Form data
json        dict       JSON data
headers     dict       Request headers
cookies     dict       Cookies
timeout     int/float  Timeout in seconds

Response Object

Attribute    Type   Description
text         str    Response body as text
content      bytes  Response body as bytes
json()       dict   Parse the response body as JSON
status_code  int    HTTP status code
headers      dict   Response headers
cf_colo      str    Cloudflare colo code (e.g. NRT)
cf_ray       str    Cloudflare Ray ID
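As an illustration of the attributes listed above, a small helper can turn any response into a one-line summary. This helper is not part of CFspider; it only assumes the attribute names documented in the table:

```python
def summarize_response(response):
    """Build a one-line summary from the documented response attributes.

    Works with any object exposing status_code, cf_colo, and cf_ray,
    as described in the table above.
    """
    return (
        f"HTTP {response.status_code} "
        f"via colo={response.cf_colo} ray={response.cf_ray}"
    )
```

For example, `summarize_response(cfspider.get(url, cf_proxies=cf_proxies))` would report the status code and the Cloudflare node that served the request.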

Usage Examples

GET Request

import cfspider

cf_proxies = "https://your-workers.dev"

response = cfspider.get(
    "https://httpbin.org/get",
    cf_proxies=cf_proxies,
    params={"key": "value"}
)

print(response.status_code)
print(response.json())

POST Request

import cfspider

cf_proxies = "https://your-workers.dev"

response = cfspider.post(
    "https://httpbin.org/post",
    cf_proxies=cf_proxies,
    json={"name": "cfspider", "version": "1.0"}
)

print(response.json())

Using a Session

A Session reuses the Workers URL, so you do not need to pass it on every request:

import cfspider

cf_proxies = "https://your-workers.dev"

session = cfspider.Session(cf_proxies=cf_proxies)

r1 = session.get("https://httpbin.org/ip")
r2 = session.post("https://httpbin.org/post", json={"test": 1})
r3 = session.get("https://example.com")

print(r1.text)
print(r2.json())

session.close()

Getting Cloudflare Node Information

import cfspider

cf_proxies = "https://your-workers.dev"

response = cfspider.get("https://httpbin.org/ip", cf_proxies=cf_proxies)

print(f"Egress IP: {response.json()['origin']}")
print(f"Colo code: {response.cf_colo}")
print(f"Ray ID: {response.cf_ray}")

Custom Request Headers

import cfspider

cf_proxies = "https://your-workers.dev"

response = cfspider.get(
    "https://httpbin.org/headers",
    cf_proxies=cf_proxies,
    headers={
        "User-Agent": "MyApp/1.0",
        "Accept-Language": "zh-CN"
    }
)

print(response.json())

Setting a Timeout

import cfspider

cf_proxies = "https://your-workers.dev"

response = cfspider.get(
    "https://httpbin.org/delay/5",
    cf_proxies=cf_proxies,
    timeout=10
)

Error Handling

import cfspider

cf_proxies = "https://your-workers.dev"

try:
    response = cfspider.get("https://httpbin.org/ip", cf_proxies=cf_proxies)
    response.raise_for_status()
    print(response.text)
except cfspider.CFSpiderError as e:
    print(f"Request failed: {e}")
except Exception as e:
    print(f"Other error: {e}")
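For transient failures, a generic retry wrapper can be layered on top of the calls above. This sketch is not part of CFspider; the backoff schedule and the decision to retry on any exception are assumptions you should adjust to your use case:

```python
import time


def with_retries(func, *args, max_attempts=3, base_delay=1.0, **kwargs):
    """Call func(*args, **kwargs), retrying with exponential backoff.

    Tries up to max_attempts times, sleeping base_delay * 2**attempt
    seconds between attempts. Re-raises the last exception on failure.
    """
    last_exc = None
    for attempt in range(max_attempts):
        try:
            return func(*args, **kwargs)
        except Exception as exc:
            last_exc = exc
            if attempt < max_attempts - 1:
                time.sleep(base_delay * (2 ** attempt))
    raise last_exc
```

For example, `with_retries(cfspider.get, url, cf_proxies=cf_proxies)` would attempt the GET up to three times before giving up.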

Notes

  1. Workers free plan limits: 100,000 requests per day and 10 ms of CPU time per request
  2. Request body size limit: 100 MB on the free plan; no limit on paid plans
  3. Timeout limit: 30 seconds on the free plan; no limit on paid plans
  4. WebSocket, gRPC, and other non-HTTP protocols are not supported
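To stay under the free-plan daily request limit mentioned above, a simple client-side counter can help. The 100,000/day figure comes from the notes above; the class itself is only an illustrative sketch, and it assumes the quota resets at the UTC day boundary:

```python
import datetime


class DailyQuota:
    """Track request count per UTC day against a fixed daily limit."""

    def __init__(self, limit=100_000):
        self.limit = limit
        self.count = 0
        self.day = datetime.datetime.now(datetime.timezone.utc).date()

    def allow(self):
        """Return True if another request fits within today's quota."""
        today = datetime.datetime.now(datetime.timezone.utc).date()
        if today != self.day:  # new UTC day: reset the counter
            self.day = today
            self.count = 0
        if self.count >= self.limit:
            return False
        self.count += 1
        return True
```

Calling `quota.allow()` before each `cfspider.get(...)` lets a long-running scraper skip or delay requests once the day's budget is spent.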

License

MIT License



Download files

Source Distribution

cfspider-1.0.2.tar.gz (6.3 kB)

  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.0
  • SHA256: ebf7634de25ea3b8406133cf3f9bf54699afbd0a92327de0f0bd3d9f3bcc7aa0
  • MD5: e4ea893b93f1046182ab9967f292d209
  • BLAKE2b-256: ff50cb3a6d4b777bca68bf4062d183ff2e3aff8019e538149afa8534be673265

Built Distribution

cfspider-1.0.2-py3-none-any.whl (6.0 kB)

  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.0
  • SHA256: 48dd0cb2e95b9912d5a555dbc486419356a97d990752ad1b7fa7f4bd2c60094d
  • MD5: 2d6ee900adc930c05ccc06cfa028fae1
  • BLAKE2b-256: 94754abfd75aeeca9902d2102ae6cdf3a5f608c97149361bfe8eaa602bce8d07
