
A web crawler for x.com

Project description

🚀 x-web-crawler

x-web-crawler is a Python package for automating interactions on Twitter (X) and related pages such as GitHub.

Installation

You can install x-web-crawler using pip:

pip install x-web-crawler
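
To confirm the install, you can check that the main class used throughout this README imports cleanly:

python -c "from xagent import XAgent"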

Usage

Here’s an example of how to use x-web-crawler to automate actions on Twitter (X) and GitHub:

from xagent import XAgent

def main():
    driver_path = "YOUR_DRIVER_PATH"
    profile_path = "YOUR_PROFILE_PATH"
    twitter_username = "YOUR_TWITTER_USERNAME"  # use your Twitter handle rather than your email
    twitter_password = "YOUR_TWITTER_PASSWORD"
    url = "https://github.com/orgs/Azure/people"  # example GitHub page; not used in this snippet

    agent = XAgent(
        driver_path=driver_path,
        profile_path=profile_path,
        x_username=twitter_username,
        x_password=twitter_password,
    )

    try:
        # Like posts on your feed for 5 minutes
        agent.like_x_posts(duration=300)
        # Follow specific users
        agent.follow_x_users(["https://x.com/jacob_somer_"], duration=300)
    finally:
        agent.close()

if __name__ == "__main__":
    main()

Examples

Automating Twitter (X) Actions

Here's how to automate actions on Twitter (X) using the Chrome browser:

from xagent import XAgent

driver_path = "/Users/jacobsomer/Documents/side_prod/salesBook/chromedriver-mac-arm64/chromedriver"
profile_path = "/Users/jacobsomer/Library/Application Support/Google/Chrome/chromeProfile"
x_username = "YOUR_TWITTER_USERNAME"
x_password = "YOUR_TWITTER_PASSWORD"

agent = XAgent(
    driver_path=driver_path,
    profile_path=profile_path,
    x_username=x_username,
    x_password=x_password,
)

try:
    # Like posts on your feed for 5 minutes
    agent.like_x_posts(duration=300)
    
    # Follow specific users
    agent.follow_x_users(["https://x.com/jacob_somer_"], duration=300)
finally:
    agent.close()

Using Microsoft Edge Browser

To automate actions using the Edge browser, modify the driver and profile paths:

from xagent.agent import XAgent

edge_driver_path = "/Users/jacobsomer/Documents/side_prod/salesBook/edgedriver_mac64_m1 (1)/msedgedriver"
edge_profile_path = "/Users/jacobsomer/Library/Application Support/Microsoft Edge/User Data"

agent = XAgent(
    driver_path=edge_driver_path, 
    profile_path=edge_profile_path, 
    browser="edge"
)

try:
    # Like posts on your feed for 5 minutes
    agent.like_x_posts(duration=300)
    
    # Follow specific users
    agent.follow_x_users(["https://x.com/jacob_somer_"], duration=300)
finally:
    agent.close()

XAgent Methods

__init__(self, driver_path, profile_path, x_username=None, x_password=None, browser="chrome")

Initializes the XAgent.

  • driver_path: Path to the WebDriver executable (ChromeDriver for Chrome, msedgedriver for Edge). ChromeDriver can be downloaded from the official ChromeDriver downloads page.
  • profile_path: Path to the user profile directory for Chrome. To find this, type "chrome://version" into your Chrome browser's address bar, and look for the "Profile Path" variable.
  • x_username (optional): Twitter (X) username for authentication.
  • x_password (optional): Twitter (X) password for authentication.
  • browser: The browser to use, either "chrome" or "edge".
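
Since x_username and x_password default to None and browser defaults to "chrome", a minimal construction can omit the credentials. This is only a sketch; whether a given action works without logging in depends on X's requirements:

from xagent import XAgent

# Minimal sketch: no credentials, browser defaults to "chrome".
agent = XAgent(
    driver_path="YOUR_DRIVER_PATH",
    profile_path="YOUR_PROFILE_PATH",
)
agent.close()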

like_x_posts(duration=300)

👍 Likes posts on the user's feed for the specified duration (in seconds).

follow_x_users(users, duration=300)

👥 Follows the specified Twitter (X) users within the specified duration (in seconds).

get_x_followers(username)

📈 Gets the followers of a specified Twitter (X) user.

get_x_following(username)

📊 Gets the users that a specified Twitter (X) user is following.
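
A sketch of fetching both lists for one account; the handle below comes from the earlier examples, and the return format (assumed here to be an iterable of users) is not documented on this page:

from xagent import XAgent

agent = XAgent(
    driver_path="YOUR_DRIVER_PATH",
    profile_path="YOUR_PROFILE_PATH",
    x_username="YOUR_TWITTER_USERNAME",
    x_password="YOUR_TWITTER_PASSWORD",
)

try:
    # Assumed: both methods take a bare handle and return an iterable of users.
    followers = agent.get_x_followers("jacob_somer_")
    following = agent.get_x_following("jacob_somer_")
    print(f"{len(list(followers))} followers, {len(list(following))} following")
finally:
    agent.close()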

unfollow_x_users(users)

🚫 Unfollows the specified Twitter (X) users.
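
A sketch of cleaning up follows, assuming unfollow_x_users accepts the same list-of-profile-URLs format that follow_x_users uses above:

from xagent import XAgent

agent = XAgent(
    driver_path="YOUR_DRIVER_PATH",
    profile_path="YOUR_PROFILE_PATH",
    x_username="YOUR_TWITTER_USERNAME",
    x_password="YOUR_TWITTER_PASSWORD",
)

try:
    # Assumed: same URL-list format as follow_x_users.
    agent.unfollow_x_users(["https://x.com/jacob_somer_"])
finally:
    agent.close()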

unfollow_x_users_alternative(user, users)

🔄 Unfollows the specified Twitter (X) users using an alternative method.

get_x_handles(query, num_handles=10)

🔍 Gets the handles of users based on a query.
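
A sketch chaining a handle search into a follow pass; the query string is illustrative, and it assumes get_x_handles returns plain handles that can be turned into https://x.com/<handle> URLs:

from xagent import XAgent

agent = XAgent(
    driver_path="YOUR_DRIVER_PATH",
    profile_path="YOUR_PROFILE_PATH",
    x_username="YOUR_TWITTER_USERNAME",
    x_password="YOUR_TWITTER_PASSWORD",
)

try:
    # Assumed: handles come back as plain strings without the leading "@".
    handles = agent.get_x_handles("python developers", num_handles=5)
    agent.follow_x_users([f"https://x.com/{handle}" for handle in handles], duration=300)
finally:
    agent.close()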

Running Tests

To run the tests for this package, use the unittest framework:

python -m unittest discover tests

License

This project is licensed under the MIT License - see the LICENSE file for details.

Download files

Download the file for your platform.

Source Distribution

  • x_web_crawler-0.1.6.tar.gz (7.0 kB, source)

Built Distribution

  • x_web_crawler-0.1.6-py3-none-any.whl (7.8 kB, Python 3 wheel)

File details

Details for the file x_web_crawler-0.1.6.tar.gz.

File metadata

  • Download URL: x_web_crawler-0.1.6.tar.gz
  • Upload date:
  • Size: 7.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.12.3

File hashes

Hashes for x_web_crawler-0.1.6.tar.gz:

  • SHA256: cbd945f5f764f47f0795420bad94293395c19cc0f3735bb486d3dca6d6974f37
  • MD5: 97d6f39757fd041325ddca54db61db14
  • BLAKE2b-256: 788c02b37d47cc39698821c5939e9af108222ed9c545d059501039471d7593a9


File details

Details for the file x_web_crawler-0.1.6-py3-none-any.whl.

File metadata

File hashes

Hashes for x_web_crawler-0.1.6-py3-none-any.whl:

  • SHA256: 0c8846c333d71ba0e03fac27c54ae868843c74e43195520799a00a3e1f4f8b38
  • MD5: 0e1fca8f9944d9b4d9e5488aec2bf10c
  • BLAKE2b-256: 739005324c36dae3ad64d2aec8b8648bfb8cae6e91e507eaf592c27d3149ca6c

