
Twitter API wrapper for Python with **no API key required**.



[日本語] [中文]

Twikit

A Simple Twitter API Scraper

With this library you can post tweets, search tweets, and perform other actions without an API key.

Discord

Important Notice: Twikit Sync Support Ending in Version 2

We're planning to discontinue support for synchronous operations starting from version 2 of Twikit. As our codebase has grown, maintaining both synchronous and asynchronous versions has become challenging, impacting our ability to uphold code quality effectively.

The release date for version 2 is not yet determined. We recommend transitioning your code to the asynchronous version (twikit_async) as soon as possible. You can find documentation for twikit_async here.
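Migrating to the asynchronous version mostly means moving your twikit calls inside a coroutine and driving it with `asyncio.run`. The sketch below shows only that wrapping pattern; `login_stub` is a stand-in coroutine, not a real twikit call.

```python
import asyncio

# Stand-in for an awaitable twikit call such as client.login(...)
async def login_stub():
    return "logged in"

async def main():
    # In real code this would be: await client.login(...),
    # await client.search_tweet(...), and so on.
    return await login_stub()

status = asyncio.run(main())
print(status)
```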

Features

No API Key Required

This library uses scraping and does not require an API key.

Free

This library is free to use.

Both Synchronous and Asynchronous Support

Twikit supports both synchronous and asynchronous usage.

Functionality

By using Twikit, you can access functionalities such as the following:

  • Create tweets

  • Search tweets

  • Retrieve trending topics

  • and more

Installing

pip install twikit

Quick Example

Define a client and log in to the account.

import asyncio
from twikit import Client

USERNAME = 'example_user'
EMAIL = 'email@example.com'
PASSWORD = 'password0000'

# Initialize client
client = Client('en-US')

async def main():
    await client.login(
        auth_info_1=USERNAME,
        auth_info_2=EMAIL,
        password=PASSWORD
    )

asyncio.run(main())
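After a successful login it is common to persist the session so later runs can skip authentication; twikit exposes `client.save_cookies(path)` and `client.load_cookies(path)` for this. The stand-in below illustrates the same save/restore round-trip with plain `json`, using placeholder cookie values rather than a real session.

```python
import json

# Placeholder cookie values standing in for a real twikit session
session = {"auth_token": "<token>", "ct0": "<csrf-token>"}

# Save the session to disk (twikit: client.save_cookies('cookies.json'))
with open("cookies.json", "w") as f:
    json.dump(session, f)

# Restore it on a later run (twikit: client.load_cookies('cookies.json'))
with open("cookies.json") as f:
    restored = json.load(f)

print(restored == session)
```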

Create a tweet with media attached.

# Upload media files and obtain media_ids
media_ids = [
    await client.upload_media('media1.jpg'),
    await client.upload_media('media2.jpg')
]

# Create a tweet with the provided text and attached media
await client.create_tweet(
    text='Example Tweet',
    media_ids=media_ids
)

Search the latest tweets based on a keyword.

tweets = await client.search_tweet('python', 'Latest')

for tweet in tweets:
    print(
        tweet.user.name,
        tweet.text,
        tweet.created_at
    )
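`search_tweet` returns a paginated result rather than a plain list; in twikit you fetch the following page with `await tweets.next()`. The mock `Result` class below is a hypothetical stand-in that shows the cursor-style loop without touching the network.

```python
import asyncio

class Result:
    """Minimal mock of a paginated result with an async next() method."""

    def __init__(self, items, pages_left):
        self.items = items
        self._pages_left = pages_left

    def __iter__(self):
        return iter(self.items)

    async def next(self):
        # An empty page signals that there is nothing more to fetch
        if self._pages_left == 0:
            return Result([], 0)
        return Result([f"tweet page {self._pages_left}"], self._pages_left - 1)

async def main():
    tweets = Result(["tweet page 3"], 2)
    seen = []
    while True:
        page = list(tweets)
        if not page:
            break
        seen.extend(page)
        tweets = await tweets.next()  # fetch the next page, as in twikit
    return seen

pages = asyncio.run(main())
print(pages)
```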

Retrieve user tweets.

tweets = await client.get_user_tweets('123456', 'Tweet')

for tweet in tweets:
    print(tweet.text)

More Examples: examples

Contributing

If you encounter any bugs or issues, please report them on issues.

If you find this library useful, consider starring this repository⭐️
