Package for analyzing Twitter data

Project description

Purpose

Every Twitter user has a customized, personal experience and sees only a tiny portion of the conversations, content, and opinions posted on the platform. The purpose of this project is to provide tools that let people use Twitter data to build a more holistic view of the social network landscape. The TwitAnalysis modules support live processing of Tweet streams as well as bulk processing of posted content related to particular topics, trends, or users. Analyzing a much larger sample of Twitter data makes it possible to estimate the impact and reach of Twitter content; in short, we can go beyond what our friends are thinking and saying on the platform and see the opinions of Twitter at large.

Scope

The scope of the project is limited by a number of factors, which we attempt to document here for the sake of transparency. While not exhaustive, this project can hopefully serve as a foundation for Twitter analysis and a starting point for more targeted projects in the future.

Functionality

Currently the project is split into two main modules: TwitLive, for streaming and processing live Twitter data, and TwitProcess, for processing bulk Twitter data.

TwitLive Example

from TwitAnalysis import *
from time import sleep

live = TwitLive()

# Process and display trend analysis
live.TopTrendAnalysis("United States", 2, False, 30)
live.trends_summary()

# Stream tweets based on search
stream = live.SearchAnalysis("healthcare", False)
sleep(10)
stream.disconnect()

live.search_summary(stream)
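
Because the stream keeps running until it is explicitly disconnected, it can be useful to guarantee cleanup even if something goes wrong while collecting. A minimal sketch using only the calls shown above (the 10-second window is illustrative):

from TwitAnalysis import *
from time import sleep

live = TwitLive()

# Collect tweets for a fixed window, disconnecting even if an error occurs
stream = live.SearchAnalysis("healthcare", False)
try:
    sleep(10)
finally:
    stream.disconnect()

live.search_summary(stream)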

TwitProcess Example

from TwitAnalysis import *

# Initialize new process object with a specific query
p = TwitProcess("Python Programming")
# Search/Process Tweets
p.analyze()

# Display stats for search results
print("Stats for query: 'Python Programming'")
print(f"Sentiment: {p.overall_sentiment()}")
print(f"Retweets: {p.retweets}")
print(f"Tweets: {p.reg_tweets}")
print(f"Impact: {p.impact}")
#==== OUTPUT ====#
# Stats for query: 'Python Programming'
# Sentiment: -0.366
# Retweets: 485
# Tweets: 282
# Impact: 6,228,425
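
The same attributes make it easy to compare several queries side by side. A small sketch built only from the calls shown above, assuming overall_sentiment() returns a polarity score as in the example output (the query strings are illustrative):

from TwitAnalysis import *

# Run the same analysis over several queries and print one summary line each
for query in ["Python Programming", "Java Programming"]:
    p = TwitProcess(query)
    p.analyze()
    print(f"{query}: sentiment={p.overall_sentiment()}, "
          f"tweets={p.reg_tweets}, retweets={p.retweets}, impact={p.impact}")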

Twitter Documentation

https://developer.twitter.com/en/docs/tutorials/building-high-quality-filters

Research

https://www.sciencedirect.com/science/article/pii/S0268401218306005
https://epjdatascience.springeropen.com/articles/10.1140/epjds/s13688-018-0178-0

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
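
Alternatively, the package can typically be installed directly from PyPI:

pip install TwitAnalysis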

Source Distribution

TwitAnalysis-1.1.tar.gz (10.6 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

TwitAnalysis-1.1-py3-none-any.whl (11.1 kB view details)

Uploaded Python 3

File details

Details for the file TwitAnalysis-1.1.tar.gz.

File metadata

  • Download URL: TwitAnalysis-1.1.tar.gz
  • Upload date:
  • Size: 10.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for TwitAnalysis-1.1.tar.gz
Algorithm Hash digest
SHA256 1dbe37234f5361ae8b26da65b3f57b6b5cf5842609313e7b9b26267519aec15b
MD5 c6a20817b7b59aee6575620687fb0fca
BLAKE2b-256 61309c06ef0b6e406e5d079684648fe5415d4c84a68a998b5ca0c40865a681a2

See more details on using hashes here.
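
To verify a downloaded file against the published digests, you can compute the hash locally with Python's standard hashlib module (the file path below assumes the archive was saved to the current directory):

import hashlib

# Compute the SHA256 digest of the downloaded archive
with open("TwitAnalysis-1.1.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Compare against the SHA256 value listed above
print(digest)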

File details

Details for the file TwitAnalysis-1.1-py3-none-any.whl.

File metadata

  • Download URL: TwitAnalysis-1.1-py3-none-any.whl
  • Upload date:
  • Size: 11.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for TwitAnalysis-1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 071f3c163e195229bfaff4d43ac46bc4d4c9d450c81767198870fa40179f08d1
MD5 bcc407ed937782d18970d8a505beaaec
BLAKE2b-256 cff508757be4419db7cac0d9d27a17d772aa46b8195ac3e86927d372865b8038

See more details on using hashes here.
