
FreeProxy: Collecting free proxies from internet.

Project description



📚 Documentation

https://freeproxy.readthedocs.io/

⚡ Live Proxies (latest free high-quality proxies, updated hourly)

https://charlespikachu.github.io/freeproxy/


For more interesting content, follow the WeChat official account: Charles的皮卡丘

✨ What's New

  • 2026-01-10: Released pyfreeproxy v0.3.4: applied minor code optimizations and added an additional free proxy source.
  • 2025-12-25: Released pyfreeproxy v0.3.3: adopted a more robust proxy auto-configuration approach.
  • 2025-12-23: Released pyfreeproxy v0.3.2: added a new free proxy source, with automatic retrieval of the FineProxy nonce parameter.

📘 Introduction

FreeProxy continuously discovers and updates lists of free proxies. If you find value here, please star the project to keep it on your radar.

๐ŸŒ Supported Proxy Sources

Proxy Source (EN) Proxy Source (CN) HTTP HTTPS SOCKS4 SOCKS5 Code Snippet
DatabayProxiedSession Databay ✔ ✔ ❌ ✔ databay.py
FreeproxylistProxiedSession FreeProxyList ✔ ✔ ❌ ❌ freeproxylist.py
FineProxyProxiedSession FineProxy ✔ ✔ ✔ ✔ fineproxy.py
FreeProxyDBProxiedSession FreeProxyDB ✔ ❌ ✔ ✔ freeproxydb.py
GeonodeProxiedSession Geonode ✔ ✔ ✔ ✔ geonode.py
IhuanProxiedSession 小幻代理 ✔ ✔ ❌ ❌ ihuan.py
IPLocateProxiedSession IPLocate ✔ ✔ ✔ ✔ iplocate.py
IP3366ProxiedSession 云代理 ✔ ✔ ❌ ❌ ip3366.py
IP89ProxiedSession IP89 ✔ ❌ ❌ ❌ ip89.py
JiliuipProxiedSession 积流代理 ✔ ❌ ❌ ❌ jiliuip.py
KuaidailiProxiedSession 快代理 ✔ ✔ ❌ ❌ kuaidaili.py
KxdailiProxiedSession 开心代理 ✔ ✔ ❌ ❌ kxdaili.py
ProxyhubProxiedSession ProxyHub ✔ ✔ ✔ ✔ proxyhub.py
ProxydbProxiedSession ProxyDB ✔ ✔ ❌ ✔ proxydb.py
ProxylistProxiedSession ProxyList ✔ ✔ ✔ ✔ proxylist.py
ProxiflyProxiedSession Proxifly ✔ ✔ ✔ ✔ proxifly.py
ProxydailyProxiedSession ProxyDaily ✔ ✔ ✔ ✔ proxydaily.py
ProxyScrapeProxiedSession ProxyScrape ✔ ❌ ✔ ✔ proxyscrape.py
ProxyEliteProxiedSession ProxyElite ✔ ❌ ✔ ✔ proxyelite.py
QiyunipProxiedSession 齐云代理 ✔ ✔ ❌ ❌ qiyunip.py
SpysoneProxiedSession SPYS.ONE ✔ ❌ ❌ ✔ spysone.py
Tomcat1235ProxiedSession 北极光代理 ❌ ❌ ❌ ✔ tomcat1235.py
TheSpeedXProxiedSession TheSpeedX ✔ ❌ ✔ ✔ thespeedx.py

🎮 Playground

Here are some projects built on top of pyfreeproxy:

Project WeChat Article Project Location
ICU996 Using tens of thousands of data points to see who is really opposing 996 click

📦 Install

You have three installation methods to choose from:

# from pip
pip install pyfreeproxy
# from github repo method-1
pip install git+https://github.com/CharlesPikachu/freeproxy.git@master
# from github repo method-2
git clone https://github.com/CharlesPikachu/freeproxy.git
cd freeproxy
python setup.py install

Please note that some proxy sources need to be crawled using Playwright. Playwright automatically downloads and configures the browser drivers, so there is no need to worry; it is not malware. For more details, refer to the official Playwright documentation.

🚀 Quick Start

Scrape proxies from multiple sources

After installing freeproxy, you can run a script to:

  • scrape proxies from multiple sources,
  • print basic statistics for each source,
  • save all retrieved proxies into a JSON file.

Example code (scrape + summarize + save):

import json, random
from tqdm import tqdm
from freeproxy.modules import BaseProxiedSession, ProxyInfo, BuildProxiedSession, printtable, colorize

'''settings'''
SOURCES = ["ProxiflyProxiedSession", "KuaidailiProxiedSession", "QiyunipProxiedSession", "ProxylistProxiedSession"]
TITLES = ["Source", "Retrieved Example", "HTTP", "HTTPS", "SOCKS4", "SOCKS5", "Chinese IP", "Elite", "Total"]

'''scrape'''
def scrape(src: str) -> list[ProxyInfo]:
    try:
        sess: BaseProxiedSession = BuildProxiedSession({"max_pages": 1, "type": src, "disable_print": False})
        return sess.refreshproxies()
    except Exception:
        return []

'''stats'''
def stats(proxies: list[ProxyInfo]) -> dict:
    return {
        "http":   sum(p.protocol.lower() == "http"   for p in proxies),
        "https":  sum(p.protocol.lower() == "https"  for p in proxies),
        "socks4": sum(p.protocol.lower() == "socks4" for p in proxies),
        "socks5": sum(p.protocol.lower() == "socks5" for p in proxies),
        "cn":     sum(bool(p.in_chinese_mainland) for p in proxies),
        "elite":  sum(p.anonymity.lower() == "elite" for p in proxies),
        "total":  len(proxies),
        "ex":     (random.choice(proxies).proxy if proxies else "NULL"),
    }

'''row'''
def row(src: str, s: dict) -> list:
    ex = colorize(s["ex"], "green") if s["total"] else "NULL"
    return [
        src.removesuffix("ProxiedSession"),
        ex,
        colorize(s["http"], "number"),
        colorize(s["https"], "number"),
        colorize(s["socks4"], "number"),
        colorize(s["socks5"], "number"),
        colorize(s["cn"], "number"),
        colorize(s["elite"], "number"),
        colorize(s["total"], "number"),
    ]

'''main'''
def main():
    free_proxies, items = {}, []
    for src in tqdm(SOURCES):
        proxies = scrape(src)
        items.append(row(src, stats(proxies)))
        free_proxies[src] = [p.todict() for p in proxies]
    print("The proxy distribution for each source you specified is as follows:")
    printtable(titles=TITLES, items=items, terminal_right_space_len=1)
    with open("free_proxies.json", "w") as fp:
        json.dump(free_proxies, fp, indent=2)

'''tests'''
if __name__ == "__main__":
    main()

Example output (terminal):

C:\Users\Charles\Desktop>python test.py
KuaidailiProxiedSession >>> adding country_code: 37it [00:05,  6.57it/s]                 | 1/4 [00:18<00:56, 18.95s/it]
100%|████████████████████████████████████████████████████████████████████████████████████| 4/4 [00:28<00:00,  7.17s/it]
The proxy distribution for each source you specified is as follows:
+-----------+-------------------------------+------+-------+--------+--------+------------+-------+-------+
|   Source  |       Retrieved Example       | HTTP | HTTPS | SOCKS4 | SOCKS5 | Chinese IP | Elite | Total |
+-----------+-------------------------------+------+-------+--------+--------+------------+-------+-------+
|  Proxifly |   http://195.231.69.203:443   | 5112 |   0   |  1043  |  477   |     48     |  2157 |  6632 |
| Kuaidaili |   http://113.45.158.25:3128   |  20  |   13  |   0    |   0    |     19     |   33  |   33  |
|  Qiyunip  |   https://114.103.88.18:8089  |  6   |   9   |   0    |   0    |     15     |   14  |   15  |
| Proxylist | socks4://184.181.217.206:4145 | 420  |   59  |  182   |  156   |     54     |  699  |  817  |
+-----------+-------------------------------+------+-------+--------+--------+------------+-------+-------+

All proxies are saved to free_proxies.json in the current directory, e.g.:

{
  "KuaidailiProxiedSession": [
    {
      "source": "KuaidailiProxiedSession",
      "protocol": "http",
      "ip": "58.216.109.17",
      "port": "800",
      "country_code": "CN",
      "in_chinese_mainland": true,
      "anonymity": "elite",
      "delay": 124,
      "test_timeout": 5,
      "test_url": "http://www.baidu.com",
      "test_headers": {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36"
      },
      "failed_connection_default_timeout": 3600000,
      "created_at": "2025-12-03T12:43:25.018208",
      "extra": {}
    }
  ],
  "ProxiflyProxiedSession": [],
  "QiyunipProxiedSession": [],
  "ProxylistProxiedSession": []
}

Tip: Increase max_pages to fetch more proxies from each source.

List supported proxy sources

To list all proxy sources supported by your current freeproxy version:

python -c "from freeproxy.modules import ProxiedSessionBuilder; print(ProxiedSessionBuilder.REGISTERED_MODULES.keys())"

Example output:

dict_keys([
  'ProxiflyProxiedSession', 'FreeproxylistProxiedSession', 'IhuanProxiedSession', 'IP89ProxiedSession', 
  'IP3366ProxiedSession', 'KuaidailiProxiedSession', 'KxdailiProxiedSession', 'ProxydailyProxiedSession', 
  'ProxydbProxiedSession', 'ProxyhubProxiedSession', 'ProxylistProxiedSession', 'QiyunipProxiedSession', 
  'SpysoneProxiedSession', 'Tomcat1235ProxiedSession', 'DatabayProxiedSession', 'FineProxyProxiedSession', 
  'IPLocateProxiedSession', 'JiliuipProxiedSession', 'TheSpeedXProxiedSession', 'GeonodeProxiedSession', 
  'FreeProxyDBProxiedSession', 'ProxyScrapeProxiedSession'
])

Apply stricter filtering

By default, freeproxy:

  • validates proxy format,
  • de-duplicates results,
  • does not aggressively filter by geography/anonymity/speed unless you specify rules.

You can enforce stricter filtering by passing filter_rule.

Common fields in filter_rule:

  • country_code: e.g., ['CN'], ['US']
  • anonymity: elite, anonymous, transparent (string or list)
  • protocol: http, https, socks4, socks5 (string or list)
  • max_tcp_ms: maximum TCP connect latency (ms)
  • max_http_ms: maximum HTTP request latency to test_url (ms)

Example A: only mainland China proxies

from freeproxy.modules.proxies import IP3366ProxiedSession

sess = IP3366ProxiedSession(filter_rule={"country_code": ["CN"]})
sess.refreshproxies()
print(sess.getrandomproxy(proxy_format="freeproxy"))

Example B: US + elite anonymity

from freeproxy.modules.proxies import SpysoneProxiedSession

sess = SpysoneProxiedSession(filter_rule={"anonymity": ["elite"], "country_code": ["US"]})
sess.refreshproxies()
print(sess.getrandomproxy(proxy_format="freeproxy"))

Example C: constrain protocol + speed

from freeproxy.modules.proxies import FreeproxylistProxiedSession

sess = FreeproxylistProxiedSession(
    filter_rule={
        "protocol": ["http", "https"],
        "max_tcp_ms": 10000,
        "max_http_ms": 10000,
    }
)
sess.refreshproxies()
print(sess.getrandomproxy(proxy_format="freeproxy"))

Note (performance): max_tcp_ms / max_http_ms may significantly slow down crawling when too many proxies are scraped, because each proxy requires additional testing. In general, it's better to crawl first, then run a separate post-test script if you need strict speed constraints.

Unified client: ProxiedSessionClient

ProxiedSessionClient provides a unified interface across proxy sources and behaves like a requests.Session with an automatically maintained proxy pool.

  • It keeps a proxy pool where all proxies satisfy your filter_rule.
  • Each .get() / .post() consumes at least one proxy.
  • When the pool is low, it automatically replenishes proxies by scraping again.
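That replenish-on-demand behavior can be sketched in a few lines. This is a simplified illustration of the pattern, not the library's actual implementation; `scrape_fn` stands in for whatever refreshes the pool:

```python
class ProxyPool:
    """Simplified sketch of a self-replenishing proxy pool."""

    def __init__(self, scrape_fn, low_water: int = 2):
        self.scrape_fn = scrape_fn    # callable returning a list of fresh proxies
        self.low_water = low_water    # replenish when the pool drops to this size
        self.pool: list = []

    def take(self):
        """Return one proxy, scraping more first if the pool is low."""
        if len(self.pool) <= self.low_water:
            self.pool.extend(self.scrape_fn())
        return self.pool.pop()
```

Each request takes one proxy from the pool; on failure, the caller simply takes the next one.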

Minimal example:

from freeproxy.freeproxy import ProxiedSessionClient

proxy_sources = ["KuaidailiProxiedSession"]
init_proxied_session_cfg = {"filter_rule": {"country_code": ["CN", "US"]}}
client = ProxiedSessionClient(
    proxy_sources=proxy_sources, init_proxied_session_cfg=init_proxied_session_cfg,
)
headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36"
}
resp = client.get("https://space.bilibili.com/406756145", headers=headers)
print(resp.text)

Quiet mode (suppress logs):

from freeproxy import freeproxy

client = freeproxy.ProxiedSessionClient(
    proxy_sources=["ProxydbProxiedSession"], disable_print=True,
)

Init arguments:

  • proxy_sources (list[str]): proxy sources to use.
  • init_proxied_session_cfg (dict): session config; supports:
    • max_pages: pages to fetch per source
    • filter_rule: filtering rules described above
    • plus standard requests.Session options
  • disable_print (bool): suppress proxy usage logs.
  • max_tries (int): max attempts per .get() / .post() call.

Example: filter scraped proxies via the unified client

from freeproxy.freeproxy import ProxiedSessionClient

client = ProxiedSessionClient(
    proxy_sources=["ProxyScrapeProxiedSession", "ProxylistProxiedSession"],
    init_proxied_session_cfg={
        "max_pages": 2,
        "filter_rule": {
            "country_code": ["CN"],
            "anonymity": ["elite"],
            "protocol": ["http", "https"],
        },
    },
    disable_print=False,
    max_tries=20,
)
headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/143.0.0.0 Safari/537.36',
}
resp = client.get("https://www.baidu.com/", timeout=10, headers=headers)
print(resp.text)
resp = client.get("https://httpbin.org/ip", timeout=5)
print(resp.json())
resp = client.get("https://httpbin.org/anything", timeout=15)
print(resp.json())
print("origin:", resp.json().get("origin"))
print("X-Forwarded-For:", resp.json()["headers"].get("X-Forwarded-For"))
print("Forwarded:", resp.json()["headers"].get("Forwarded"))
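The httpbin fields printed above can also feed a rough anonymity check. A hedged sketch, using a common classification heuristic that is not part of freeproxy (`real_ip` must be your own public IP, obtained without a proxy):

```python
def classify_anonymity(real_ip: str, httpbin_anything: dict) -> str:
    """Rough anonymity heuristic from an httpbin.org/anything JSON response."""
    origin = httpbin_anything.get("origin", "")
    headers = httpbin_anything.get("headers", {})
    if real_ip in origin:
        return "transparent"  # the target still sees your real IP
    if headers.get("X-Forwarded-For") or headers.get("Forwarded"):
        return "anonymous"    # real IP hidden, but the proxy reveals itself
    return "elite"            # no trace of you or of the proxy
```

For example, `classify_anonymity(my_ip, resp.json())` with the `resp` from the last request above.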

Final note: you can refer to freeproxy's source code to unlock more features; the overall codebase is small and easy to navigate.

โญ Recommended Projects

  • 🎵 Musicdl: a lightweight lossless music downloader
  • 🎬 Videodl: a lightweight HD watermark-free video downloader
  • 🖼️ Imagedl: a lightweight bulk image search and download tool
  • 🌐 FreeProxy: a collector of massive high-quality free proxies worldwide
  • 🌐 MusicSquare: a simple web page for music search, download, and playback
  • 🌐 FreeGPTHub: a truly free unified GPT interface

📚 Citation

If you use this project in your research, please cite the repository.

@misc{freeproxy2022,
    author = {Zhenchao Jin},
    title = {FreeProxy: Collecting free proxies from internet},
    year = {2022},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://github.com/CharlesPikachu/freeproxy}},
}

🌟 Star History

Star History Chart

☕ Appreciation (赞赏 / 打赏)

WeChat Appreciation QR Code (微信赞赏码) Alipay Appreciation QR Code (支付宝赞赏码)

📱 WeChat Official Account (微信公众号):

Charles的皮卡丘 (Charles_pikachu)

Project details


Download files

Download the file for your platform.

Source Distribution

pyfreeproxy-0.3.4.tar.gz (40.9 kB)

Uploaded Source

Built Distribution


pyfreeproxy-0.3.4-py3-none-any.whl (60.1 kB)

Uploaded Python 3

File details

Details for the file pyfreeproxy-0.3.4.tar.gz.

File metadata

  • Download URL: pyfreeproxy-0.3.4.tar.gz
  • Upload date:
  • Size: 40.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.14

File hashes

Hashes for pyfreeproxy-0.3.4.tar.gz
Algorithm Hash digest
SHA256 62f4f6c5fdbafb930cafc506b04a79e1c90a0e4506819618215cac09526659e3
MD5 73335532790e8998bc278debb05c9d0c
BLAKE2b-256 b6a954ec8b4834e18d679a4384b3e4cd139d24c8d781429b62beed6aa306325d

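To verify a downloaded file against the SHA256 digest above, a minimal standard-library sketch:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fp:
        for chunk in iter(lambda: fp.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the digest published on this page, e.g.:
# sha256_of("pyfreeproxy-0.3.4.tar.gz") ==
#     "62f4f6c5fdbafb930cafc506b04a79e1c90a0e4506819618215cac09526659e3"
```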

File details

Details for the file pyfreeproxy-0.3.4-py3-none-any.whl.

File metadata

  • Download URL: pyfreeproxy-0.3.4-py3-none-any.whl
  • Upload date:
  • Size: 60.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.14

File hashes

Hashes for pyfreeproxy-0.3.4-py3-none-any.whl
Algorithm Hash digest
SHA256 59bf82767720afe9b378bf5aa0cc5c954875910d5f3e87e7716d418966840b74
MD5 4129bcd837da79350a461bb878f0d01e
BLAKE2b-256 1996466a31ec14f41e585f72d4cd5e1ebb046967cd85ad967795954acf74bf87

