
Scrape the Twitter Frontend API without authentication.

Project description

Twitter Scraper


🇰🇷 Read Korean Version

Twitter's official API is annoying to work with and has lots of limitations — luckily, their frontend (JavaScript) has its own API, which I reverse-engineered. No API rate limits. No restrictions. Extremely fast.

You can use this library to get the text of any user's Tweets trivially.

Prerequisites

Before you begin, ensure you have met the following requirements:

  • Internet Connection
  • Python 3.6+

Installing twitter-scraper

If you want to use the latest version, install from source. To install twitter-scraper from source, follow these steps:

Linux and macOS:

git clone https://github.com/bisguzar/twitter-scraper.git
cd twitter-scraper
sudo python3 setup.py install 

Alternatively, you can install it from PyPI:

pip3 install twitter_scraper

Using twitter_scraper

Just import twitter_scraper and call functions!

→ function get_tweets(query: str [, pages: int]) -> generator of dictionaries

You can get the tweets of a profile or parse tweets from a hashtag. get_tweets takes a username or hashtag as its first parameter (a string) and the number of pages to scan as its second parameter (an integer).

Keep in mind:

  • The first parameter must start with # (the number sign) if you want to get tweets from a hashtag.
  • The pages parameter is optional.
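The hashtag rule above can be sketched with a small helper (hypothetical, not part of twitter_scraper) that shows how a query string would be classified:

```python
# Hypothetical helper, not part of twitter_scraper: illustrates the rule
# that a query starting with '#' is treated as a hashtag query, and
# anything else as a username.
def query_kind(query: str) -> str:
    return "hashtag" if query.startswith("#") else "profile"

print(query_kind("#python"))   # hashtag
print(query_kind("twitter"))   # profile
```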
Python 3.7.3 (default, Mar 26 2019, 21:43:19) 
[GCC 8.2.1 20181127] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from twitter_scraper import get_tweets
>>> 
>>> for tweet in get_tweets('twitter', pages=1):
...     print(tweet['text'])
... 
spooky vibe check

It yields a dictionary for each tweet. Keys of the dictionary:

| Key | Type | Description |
| --- | --- | --- |
| tweetId | string | Tweet's identifier; visit twitter.com/USERNAME/ID to view the tweet |
| isRetweet | boolean | True if it is a retweet, False otherwise |
| time | datetime | Published date of the tweet |
| text | string | Content of the tweet |
| replies | integer | Reply count of the tweet |
| retweets | integer | Retweet count of the tweet |
| likes | integer | Like count of the tweet |
| entries | dictionary | Has hashtags, videos, photos, urls keys; each key's value is a list |
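As a sketch of working with these keys — using hand-made sample dictionaries shaped like the output above, rather than live data from get_tweets — you could filter out retweets and rank tweets by engagement:

```python
# Sample tweet dictionaries shaped like get_tweets() output
# (hand-made for illustration; real values come from the scraper).
tweets = [
    {"tweetId": "1", "isRetweet": False, "text": "hello", "replies": 2, "retweets": 5, "likes": 10},
    {"tweetId": "2", "isRetweet": True, "text": "RT: hi", "replies": 0, "retweets": 50, "likes": 3},
    {"tweetId": "3", "isRetweet": False, "text": "news", "replies": 7, "retweets": 1, "likes": 4},
]

# Keep original tweets only, then sort by a simple engagement score.
originals = [t for t in tweets if not t["isRetweet"]]
ranked = sorted(originals,
                key=lambda t: t["replies"] + t["retweets"] + t["likes"],
                reverse=True)
print([t["tweetId"] for t in ranked])  # ['1', '3']
```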

→ function get_trends() -> list

You can get the Trends of your area simply by calling get_trends(). It will return a list of strings.

Python 3.7.3 (default, Mar 26 2019, 21:43:19) 
[GCC 8.2.1 20181127] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from twitter_scraper import get_trends
>>> get_trends()
['#WHUTOT', '#ARSSOU', 'West Ham', '#AtalantaJuve', '#バビロニア', '#おっさんずラブinthasky', 'Southampton', 'Valverde', '#MMKGabAndMax', '#23NParoNacional']
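Since get_trends() returns plain strings, post-processing is ordinary list work. A small sketch (the sample list is taken from the output above) that separates hashtag trends from plain-text topics:

```python
def split_trends(trends):
    """Split a get_trends()-style list of strings into hashtags and plain topics."""
    hashtags = [t for t in trends if t.startswith("#")]
    topics = [t for t in trends if not t.startswith("#")]
    return hashtags, topics

sample = ['#WHUTOT', '#ARSSOU', 'West Ham', 'Southampton', 'Valverde']
tags, topics = split_trends(sample)
print(tags)    # ['#WHUTOT', '#ARSSOU']
print(topics)  # ['West Ham', 'Southampton', 'Valverde']
```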

→ class Profile(username: str) -> class instance

You can get the personal information of a profile, such as its birthday and biography, if they exist and are public. This class takes a username parameter and returns a class instance; access the information through instance attributes.

Python 3.7.3 (default, Mar 26 2019, 21:43:19) 
[GCC 8.2.1 20181127] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from twitter_scraper import Profile
>>> profile = Profile('bugraisguzar')
>>> profile.location
'Istanbul'
>>> profile.name
'Buğra İşgüzar'
>>> profile.username
'bugraisguzar'

.to_dict() -> dict

to_dict is a method of the Profile class. It returns the profile data as a Python dictionary.

Python 3.7.3 (default, Mar 26 2019, 21:43:19) 
[GCC 8.2.1 20181127] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from twitter_scraper import Profile
>>> profile = Profile("bugraisguzar")
>>> profile.to_dict()
{'name': 'Buğra İşgüzar', 'username': 'bugraisguzar', 'birthday': None, 'biography': 'geliştirici@peptr', 'website': 'bisguzar.com', 'profile_photo': 'https://pbs.twimg.com/profile_images/1199305322474745861/nByxOcDZ_400x400.jpg', 'likes_count': 2512, 'tweets_count': 756, 'followers_count': 483, 'following_count': 255}
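Because to_dict() returns a plain dictionary, persisting it is straightforward. A minimal sketch — the sample data is a subset of the to_dict() output shown above — that drops empty fields before serializing to JSON:

```python
import json

# Sample data: a subset of the to_dict() output shown above.
profile = {"name": "Buğra İşgüzar", "username": "bugraisguzar",
           "birthday": None, "biography": "geliştirici@peptr",
           "website": "bisguzar.com", "followers_count": 483}

# Drop None-valued fields, then serialize (ensure_ascii=False keeps
# non-ASCII names readable in the JSON output).
cleaned = {k: v for k, v in profile.items() if v is not None}
print(json.dumps(cleaned, ensure_ascii=False, sort_keys=True))
```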

Contributing to twitter-scraper

To contribute to twitter-scraper, follow these steps:

  1. Fork this repository.
  2. Create a branch with a clear name: git checkout -b <branch_name>.
  3. Make your changes and commit them: git commit -m '<commit_message>'.
  4. Push to the original branch: git push origin <project_name>/<location>.
  5. Create the pull request.

Alternatively see the GitHub documentation on creating a pull request.

Contributors

Thanks to the following people who have contributed to this project:

  • @kennethreitz (author)
  • @bisguzar (maintainer)
  • @lionking6792

Contact

If you want to contact me you can reach me at @bugraisguzar.

License

This project uses the following license: MIT.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

twitter-scraper-0.4.1.tar.gz (10.3 kB)


Built Distribution

twitter_scraper-0.4.1-py2.py3-none-any.whl (9.0 kB)


File details

Details for the file twitter-scraper-0.4.1.tar.gz.

File metadata

  • Download URL: twitter-scraper-0.4.1.tar.gz
  • Upload date:
  • Size: 10.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/44.0.0 requests-toolbelt/0.9.1 tqdm/4.41.1 CPython/3.8.1

File hashes

Hashes for twitter-scraper-0.4.1.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | ee1cb13cabcd202c41765043f2a7d3b26ba269ad3d11aedb6518268ef77d8542 |
| MD5 | 594b1d220acd639de793580bcc2cbe76 |
| BLAKE2b-256 | de23ac87f90417ca02f36dccbd39249ccd397118be93668ec1e45847ec169b4d |

See more details on using hashes here.

File details

Details for the file twitter_scraper-0.4.1-py2.py3-none-any.whl.

File metadata

  • Download URL: twitter_scraper-0.4.1-py2.py3-none-any.whl
  • Upload date:
  • Size: 9.0 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/44.0.0 requests-toolbelt/0.9.1 tqdm/4.41.1 CPython/3.8.1

File hashes

Hashes for twitter_scraper-0.4.1-py2.py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 7d97f2f8b8970996511963dc4c55dae1b68f07387c3500ad67fe0ab1f47fa508 |
| MD5 | 075a88e20df531cd3f1cc85bbf682950 |
| BLAKE2b-256 | 69d2080ad55919d547dba653936eb62f1cb25512b9715873a1e69337fb1d5b78 |

See more details on using hashes here.
