# av-tweet-ingestion

Ingestion of tweets using Twitter's `RecentAPI` and upload to S3.
## Installation

```sh
pip install av-tweet-ingestion
```
## Usage example

### Set environment variables

```sh
BEARER_TOKEN="<twitter_bearer_token>"
S3_ACESS_KEY="<access_key>"
S3_SECRET_KEY="<secret_key>"
S3_BUCKET_NAME="<bucket_name>"
S3_LANDING_LAYER="<landing_zone>"
```
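Before running an ingestion, it can help to fail fast if any of these variables is missing. A minimal sketch using only the standard library (the variable names come from the list above, including the project's spelling `S3_ACESS_KEY`; the check itself is not part of the package):

```python
import os

# Variable names as listed in this README; not part of av-tweet-ingestion.
REQUIRED_VARS = [
    "BEARER_TOKEN",
    "S3_ACESS_KEY",
    "S3_SECRET_KEY",
    "S3_BUCKET_NAME",
    "S3_LANDING_LAYER",
]

def check_environment(required=None):
    """Return the names of required variables that are not set."""
    names = REQUIRED_VARS if required is None else required
    return [name for name in names if not os.getenv(name)]

missing = check_environment()
if missing:
    print("Missing environment variables:", ", ".join(missing))
```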
### Define the query

```python
query_params = {
    'query': 'from:elonmusk',
    'user.fields': 'id,location,name,public_metrics,created_at',
    'tweet.fields': 'author_id',
    # 'expansions': 'geo.place_id,author_id,entities.mentions.username,in_reply_to_user_id,referenced_tweets.id.author_id',
    'max_results': '10'
}
```
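The `query` value uses Twitter's recent-search operator syntax (`from:`, `lang:`, `-is:retweet`, and so on), where space-separated operators are ANDed together. A small hypothetical helper for composing such queries (not part of the package):

```python
def build_query(*operators):
    """Join recent-search operators with spaces (implicit AND)."""
    return " ".join(operators)

# Hypothetical example: original English-language tweets from one account.
query_params = {
    "query": build_query("from:elonmusk", "lang:en", "-is:retweet"),
    "tweet.fields": "author_id",
    "max_results": "10",
}
```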
### Ingest

```python
# Import path assumed from the package name; adjust if the module differs.
from av_tweet_ingestion import BatchIngestor, RecentAPI, S3Writer

BatchIngestor(query_params,       # Query
              1,                  # Number of pages to ingest
              RecentAPI,          # Twitter's API to call
              'test/RecentAPI',   # Address to save data under
              S3Writer).ingest()  # Writer to be used
```
## Contributing

- Fork it (https://github.com/andreveit/av-tweet-ingestion/fork)
- Create your feature branch (`git checkout -b feature/fooBar`)
- Commit your changes (`git commit -am 'Add some fooBar'`)
- Push to the branch (`git push origin feature/fooBar`)
- Create a new Pull Request