
Twitter API wrapper for Python with **no API key required**.

Project description


[日本語] [中文]

Twikit

A Simple Twitter API Scraper

With this library you can post and search for tweets without an API key.

🔵 Discord

[!IMPORTANT] With the release of version 2.0.0 on July 11, there have been some specification changes, including the discontinuation of the synchronous version. Existing code will no longer work with v2.0.0 or later, so please refer to the documentation or the code in the examples folder for adjustments. We apologize for any inconvenience this may cause.
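In practice, the change mostly means wrapping calls in an async function and awaiting them. A minimal before/after sketch (this assumes an already authenticated client, as set up in the Quick Example below):

import asyncio
from twikit import Client

client = Client('en-US')

# v1.x (synchronous, removed in v2.0.0):
# tweets = client.search_tweet('python', 'Latest')

# v2.0.0 and later: the same methods are coroutines and must be awaited
async def main():
    tweets = await client.search_tweet('python', 'Latest')
    for tweet in tweets:
        print(tweet.text)

asyncio.run(main())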

Features

No API Key Required

This library uses scraping and does not require an API key.

Free

This library is free to use.

Functionality

Twikit provides functionality such as the following:

  • Create tweets

  • Search tweets

  • Retrieve trending topics

  • etc...

Installing

pip install twikit

Quick Example

Define a client and log in to the account.

import asyncio
from twikit import Client

USERNAME = 'example_user'
EMAIL = 'email@example.com'
PASSWORD = 'password0000'

# Initialize client
client = Client('en-US')

async def main():
    await client.login(
        auth_info_1=USERNAME,
        auth_info_2=EMAIL,
        password=PASSWORD
    )

asyncio.run(main())
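To avoid logging in on every run, the session can be persisted to disk. A short sketch assuming twikit's save_cookies/load_cookies helpers (check the documentation for your version):

# After a successful login, store the session cookies on disk
client.save_cookies('cookies.json')

# On later runs, restore the session instead of logging in again
client.load_cookies('cookies.json')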

Create a tweet with media attached. Like the remaining snippets, this uses await and must run inside an async function such as main() above.

# Upload media files and obtain media_ids
media_ids = [
    await client.upload_media('media1.jpg'),
    await client.upload_media('media2.jpg')
]

# Create a tweet with the provided text and attached media
await client.create_tweet(
    text='Example Tweet',
    media_ids=media_ids
)

Search for the latest tweets matching a keyword.

tweets = await client.search_tweet('python', 'Latest')

for tweet in tweets:
    print(
        tweet.user.name,
        tweet.text,
        tweet.created_at
    )
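search_tweet returns a paginated result object. Assuming its next() coroutine (see the documentation), further pages can be fetched like this:

# Fetch the next page of search results (also awaitable)
more_tweets = await tweets.next()

for tweet in more_tweets:
    print(tweet.text)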

Retrieve user tweets

tweets = await client.get_user_tweets('123456', 'Tweets')

for tweet in tweets:
    print(tweet.text)

Send a DM

await client.send_dm('123456789', 'Hello')

Get trends

await client.get_trends('trending')
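get_trends returns trend objects rather than plain strings. A small sketch, assuming each trend exposes a name attribute:

trends = await client.get_trends('trending')

for trend in trends:
    print(trend.name)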

More examples can be found in the examples folder.

Contributing

If you encounter any bugs or issues, please report them on the GitHub issues page.

If you find this library useful, consider starring this repository⭐️

Project details



Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide on installing packages.

Source Distribution

twikit-2.1.2.tar.gz (62.4 kB, Source)

Built Distribution

twikit-2.1.2-py3-none-any.whl (71.7 kB, Python 3)

File details

Details for the file twikit-2.1.2.tar.gz.

File metadata

  • Download URL: twikit-2.1.2.tar.gz
  • Upload date:
  • Size: 62.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.5

File hashes

Hashes for twikit-2.1.2.tar.gz
| Algorithm   | Hash digest                                                      |
| ----------- | ---------------------------------------------------------------- |
| SHA256      | 982d756581b0eb053a4948ff8f9069afaaae79b9e295a0540a95fd4878b6d648 |
| MD5         | c1ff511321c34b8c398c31e59b5e0890                                 |
| BLAKE2b-256 | 2a048383d23406710a0e848fd11a1b8937cb787e1dafb98898879f468955c50e |
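To verify a downloaded file against the digests above, you can hash it locally. A minimal sketch using only the standard library (the local filename is an assumption about where the file was saved):

import hashlib

# SHA256 digest for twikit-2.1.2.tar.gz, as published above
expected = '982d756581b0eb053a4948ff8f9069afaaae79b9e295a0540a95fd4878b6d648'

with open('twikit-2.1.2.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print('OK' if digest == expected else 'MISMATCH')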


File details

Details for the file twikit-2.1.2-py3-none-any.whl.

File metadata

  • Download URL: twikit-2.1.2-py3-none-any.whl
  • Upload date:
  • Size: 71.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.5

File hashes

Hashes for twikit-2.1.2-py3-none-any.whl
| Algorithm   | Hash digest                                                      |
| ----------- | ---------------------------------------------------------------- |
| SHA256      | c76b9b677b4936bbe13608d063d0f82cb8ae64f72e2873070bd7f6193dcc66e5 |
| MD5         | 87145f5ec641006f0b2ddcc1ccb845fc                                 |
| BLAKE2b-256 | 15bbe8e55ac971629204040a4648eb2799ee1da2758fbfb58dd0ca4c6d2470d9 |

