Scrape tweets, profiles, followers and more from Twitter/X — no API key needed
Project description
Scweet — Twitter / X Scraper
Scrape tweets, profiles, followers and more from Twitter/X. No official API key needed — uses X's own web GraphQL API, authenticated with your browser cookies.
Last verified working: March 2026
What you can scrape:
- Tweets — by keyword, hashtag, user, date range, engagement filters, language, location
- Profile timelines — a user's full tweet history
- Followers / Following — full account lists at scale
- User profiles — bio, follower count, verification status, and more
Get started
Hosted — no setup needed
The quickest way to get Twitter/X data: run on Apify with no code, no cookies, and no account management. Free tier included.
Python library
1. Install
```shell
pip install -U Scweet
```
2. Get your auth_token
Log into x.com → open DevTools (F12) → Application → Cookies → https://x.com → copy the `auth_token` value.
Paste the auth_token alone and Scweet auto-bootstraps the ct0 CSRF token — or use the cookies.json format below for multiple accounts at once.
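For context on what these two cookies do: X's web GraphQL API expects the `auth_token` and `ct0` cookies on each request, with `ct0` mirrored in an `x-csrf-token` header (among other headers). Scweet assembles all of this internally; the sketch below only illustrates the header shape and is not Scweet's actual code.

```python
def build_headers(auth_token: str, ct0: str) -> dict:
    """Illustrative sketch of cookie-auth headers for X's GraphQL API.

    Scweet builds these internally; other required headers
    (e.g. a Bearer token) are omitted here.
    """
    return {
        # both cookies travel together in the Cookie header...
        "Cookie": f"auth_token={auth_token}; ct0={ct0}",
        # ...and ct0 is mirrored in x-csrf-token for X's CSRF check
        "x-csrf-token": ct0,
    }

headers = build_headers("YOUR_AUTH_TOKEN", "YOUR_CT0")
```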
3. Scrape
```python
from Scweet import Scweet

# First run: credentials are stored in scweet_state.db automatically
# Use a proxy to avoid rate limits and bans
# All methods have async variants: asearch(), aget_profile_tweets(), aget_followers(), ...
s = Scweet(auth_token="YOUR_AUTH_TOKEN", proxy="http://user:pass@host:port")

# Search and save to CSV (save_format="json" or "both" also works;
# use save_dir= and save_name= to control the output path)
tweets = s.search("bitcoin", since="2025-01-01", limit=500, save=True)

# Profile timeline
tweets = s.get_profile_tweets(["elonmusk"], limit=200)

# Followers
users = s.get_followers(["elonmusk"], limit=1000)

# Next run: reuse provisioned accounts — no credentials needed again
s = Scweet(db_path="scweet_state.db")
tweets = s.search("ethereum", limit=500, save=True)
```
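Scweet has built-in resume support (covered in the full documentation). Independent of that, a simple external checkpoint pattern around the `since=` parameter looks like the sketch below; the state file name and the tweet `date` field are assumptions for illustration, not part of Scweet's API.

```python
import json
import os

STATE = "checkpoint.json"  # hypothetical state file, not used by Scweet itself

def load_since(default="2025-01-01"):
    # resume from the last date we recorded, or the default on a first run
    if os.path.exists(STATE):
        with open(STATE) as f:
            return json.load(f)["since"]
    return default

def save_since(date: str):
    # record progress so an interrupted run can pick up where it left off
    with open(STATE, "w") as f:
        json.dump({"since": date}, f)

since = load_since()
# tweets = s.search("bitcoin", since=since, limit=500, save=True)
# save_since(max(t["date"][:10] for t in tweets))  # result field name assumed
```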
Multiple accounts with per-account proxies — for higher throughput and reduced ban risk. Save the account list as `cookies.json`:

```json
[
  { "username": "acct1", "cookies": { "auth_token": "..." }, "proxy": "http://user1:pass1@host1:port1" },
  { "username": "acct2", "cookies": { "auth_token": "..." }, "proxy": "http://user2:pass2@host2:port2" }
]
```

```python
s = Scweet(cookies_file="cookies.json")  # proxies are read from the file, one per account
```
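Scweet pools these accounts for you. If you ever rotate them yourself, a round-robin over the `cookies.json` entries (entry shape as shown above; the rotation helper itself is illustrative, not part of Scweet) could look like:

```python
import itertools
import json

# entries follow the cookies.json shape shown above
accounts = json.loads("""
[
  {"username": "acct1", "cookies": {"auth_token": "..."}, "proxy": "http://user1:pass1@host1:port1"},
  {"username": "acct2", "cookies": {"auth_token": "..."}, "proxy": "http://user2:pass2@host2:port2"}
]
""")

# round-robin: each request uses the next account and its paired proxy,
# so load is spread evenly and no single IP sends every request
pool = itertools.cycle(accounts)

def next_account():
    acct = next(pool)
    return acct["cookies"]["auth_token"], acct["proxy"]

token, proxy = next_account()  # acct1's credentials
token, proxy = next_account()  # acct2's
token, proxy = next_account()  # back to acct1
```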
Always set `limit` — without it, scraping continues until your account's daily cap is hit.
For the full list of supported search operators, see twitter-advanced-search.
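These operators are plain text prefixes combined into the query string. A hypothetical helper (the operator syntax `from:`, `lang:`, `min_faves:`, `min_retweets:` is standard X advanced search; the function itself is not part of Scweet) might assemble one like:

```python
def build_query(keywords="", from_user=None, lang=None,
                min_faves=None, min_retweets=None):
    """Assemble an X advanced-search query string.

    Illustrative helper only: Scweet takes the finished query
    (and handles dates via its own since= parameter).
    """
    parts = [keywords] if keywords else []
    if from_user:
        parts.append(f"from:{from_user}")
    if lang:
        parts.append(f"lang:{lang}")
    if min_faves:
        parts.append(f"min_faves:{min_faves}")
    if min_retweets:
        parts.append(f"min_retweets:{min_retweets}")
    return " ".join(parts)

q = build_query("bitcoin", lang="en", min_faves=100)
# q == "bitcoin lang:en min_faves:100"
```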
From the CLI — no Python code needed:

```shell
# Search with proxy, save to CSV
scweet --auth-token YOUR_AUTH_TOKEN --proxy http://user:pass@host:port search "bitcoin" --since 2025-01-01 --limit 500 --save

# Followers, saved as JSON
scweet --auth-token YOUR_AUTH_TOKEN followers elonmusk --limit 1000 --save --save-format json
```
For structured search filters, async patterns, resume, multiple accounts, and the full API reference — see Full Documentation.
Why Scweet?
| | twint | snscrape | twscrape | Scweet |
|---|---|---|---|---|
| Works in 2026 | ❌ unmaintained | ❌ broken | ✅ | ✅ |
| Cookie / token auth | ❌ | ❌ | ✅ | ✅ |
| Multi-account pooling | ❌ | ❌ | ✅ | ✅ |
| Proxy support | ❌ | ❌ | ✅ | ✅ |
| Resume interrupted scrapes | ❌ | ❌ | ❌ | ✅ |
| Built-in CSV / JSON output | ✅ | ✅ | ❌ | ✅ |
| Sync + async API | ❌ | ❌ | Async only | ✅ both |
| Hosted, no-setup option | ❌ | ❌ | ❌ | ✅ Apify |
| Active maintenance | ❌ | ❌ | ⚠️ | ✅ |
twint has been unmaintained since 2023. snscrape broke after X's backend changes. twscrape is the closest active alternative — worth knowing, but async-only, no built-in file output, and no resume support.
FAQ
Does it work without an official Twitter API key? Yes. Scweet calls X's internal GraphQL API — the same one the web app uses. No developer account or API key required.
Is it a replacement for twint or snscrape? Yes. Both are broken as of 2024–2025. Scweet uses a different, currently-working approach: cookies + GraphQL instead of legacy unauthenticated endpoints.
How many tweets can I scrape? A single account typically handles hundreds to a few thousand tweets per day before hitting rate limits. Multi-account pooling scales this proportionally. The hosted Apify actor manages accounts and rate limits automatically.
Will my account get banned? Never use your personal account — use dedicated accounts only. To further reduce risk: use multiple accounts (distributes the load across them) and pair each with a proxy (prevents all requests coming from a single IP). The Apify actor handles both automatically — managed accounts and proxies are included.
Does it work for private accounts? No. Only publicly visible content is accessible.
Does it still work in 2025 / 2026? Yes — last verified working in March 2026 against X's current GraphQL API.
Documentation
Full API reference, all config options, structured search filters, async patterns, resume, proxies, and troubleshooting are covered in the Full Documentation.
Community
Have a question or want to share what you built with Scweet? Open a thread in GitHub Discussions.
Found it useful? Star the repo ⭐
Contributing
Bug reports, feature suggestions, and PRs are welcome. See CONTRIBUTING.md.
MIT License
Project details
Release history
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file scweet-5.2.tar.gz.
File metadata
- Download URL: scweet-5.2.tar.gz
- Upload date:
- Size: 138.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `2a0d0971500a9a51f389995541eea725cf6e7054523c57180488b1c1268475ef` |
| MD5 | `1a0a417b8f854e4772377ab82ab6f8e6` |
| BLAKE2b-256 | `5598653cbfb103b48639eaa186aa139391e617293ed20a5fd38326540d48891c` |
File details
Details for the file scweet-5.2-py3-none-any.whl.
File metadata
- Download URL: scweet-5.2-py3-none-any.whl
- Upload date:
- Size: 96.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `76355ab6b4747afbe2bfc679d507a30b734f7214a2beeb7a621c75ea951ecc7b` |
| MD5 | `995f6649bd2edccaa487a7f531cc824d` |
| BLAKE2b-256 | `ae6f4ec4223df43566008e064020eaecdf54e73d33dc8ca80d90f26e67ff24e5` |