
License: MIT

My Reddit Downloader

Download upvoted and saved media from Reddit

 


Requirements

  • Python 3.6 or above
  • requests
  • praw

Pre-Installation

Create a Reddit developer application (choose the "script" type at https://www.reddit.com/prefs/apps) if you don't already have one
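A script-type app gives you a client id and a client secret; together with your Reddit username and password, these are the credentials `myredditdl --add-client` asks for. For orientation only, the same fields in PRAW's standard `praw.ini` format look like this (all values are placeholders; myredditdl stores credentials itself via `--add-client`, so this file is just an illustration of which fields exist):

```ini
[DEFAULT]
client_id=your_14_char_client_id
client_secret=your_app_secret
username=your_reddit_username
password=your_reddit_password
user_agent=myredditdl (by u/your_reddit_username)
```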

Installation

pip install myredditdl

 

Manual Installation

1. Clone this repository

$ git clone https://github.com/emanuel2718/myredditdl
$ cd myredditdl

2. Install requirements

$ pip install -r requirements.txt

3. Install myredditdl

# you might need to install setuptools first (pip install setuptools)
# on recent setuptools, `pip install .` from the repository root is the equivalent, non-deprecated route
$ python3 setup.py install

4. Fill in your Reddit developer app credentials

$ myredditdl --add-client

How to use

$ myredditdl [REQUIRED] [OPTIONS]
REQUIRED
-U, --upvote            Download upvoted media
-S, --saved             Download saved media
OPTIONS

 

Optional arguments:
-h, --help                show this help message and exit
-v, --version             display the current version of myredditdl

--sub [SUBREDDIT ...]     only download media that belongs to the given subreddit(s)
--limit [LIMIT]           limit the amount of media to download (default: None)
--max-depth [MAX_DEPTH]   maximum amount of posts to iterate through

--no-video                don't download video files (.mp4, .gif, .gifv, etc.)
--only-video              only download video files
--no-nsfw                 disable NSFW content download
Configuration:
--add-client              add a new Reddit account
--change-client           change to another valid existing reddit client (account)
--prefix OPT              set filename prefix (post author username and/or post subreddit name)

                          Options:
                              '--prefix username'           --> username_id.ext
                              '--prefix username subreddit' --> username_subreddit_id.ext
                              '--prefix subreddit username' --> subreddit_username_id.ext
                              '--prefix subreddit'          --> subreddit_id.ext

                          Default: subreddit --> subreddit_id.ext

--path PATH               path to the folder where media will be downloaded
--get-config              prints the configuration file information to the terminal
Metadata:
--no-metadata             don't save metadata for the downloaded media
--get-metadata FILE       print all the reddit metadata of the given FILE
--get-link FILE           print reddit link of given FILE
--get-title FILE          print post title of given FILE
--delete-database         delete the database of the current active reddit client user

Configuration

Set the Reddit client credentials; this is required before myredditdl can do anything else

$ myredditdl --add-client

Set the path to the destination folder for the downloaded media

$ myredditdl --path ~/Path/to/destination

Set the filename prefix scheme of the downloaded media

# This will save all the files with the scheme: `postAuthorUsername_uniqueId.extension`
$ myredditdl --prefix username
# This will save all the files with the scheme: `subredditName_postAuthorUsername_uniqueId.extension`
$ myredditdl --prefix subreddit username
# This will save all the files with the scheme: `postAuthorUsername_subredditName_uniqueId.extension`
$ myredditdl --prefix username subreddit
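The prefix schemes above amount to joining the chosen parts with underscores in front of the post's unique id. A minimal Python sketch of that naming rule (the `build_filename` helper is hypothetical, not part of myredditdl's code):

```python
def build_filename(prefix_parts, post_id, ext):
    """Join prefix parts (e.g. the subreddit name and/or post author)
    with the post's unique id, then append the file extension."""
    return "_".join(list(prefix_parts) + [post_id]) + "." + ext

# '--prefix username' scheme: postAuthorUsername_uniqueId.extension
print(build_filename(["someuser"], "abc123", "png"))          # someuser_abc123.png
# '--prefix subreddit username' scheme
print(build_filename(["pics", "someuser"], "abc123", "png"))  # pics_someuser_abc123.png
```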

Show the current configuration

$ myredditdl --get-config

Example usage:

Download all the user's upvoted media (capped at roughly 1000 posts: Reddit's listing API hard limit, which PRAW inherits)

$ myredditdl -U

Download all user saved media and don't save metadata of posts

$ myredditdl -S --no-metadata

Download all user upvoted and saved media except NSFW posts

$ myredditdl -U -S --no-nsfw

Download all the user upvoted posts from the r/MechanicalKeyboards subreddit

$ myredditdl -U --sub MechanicalKeyboards

Download all the user upvoted posts from the r/MechanicalKeyboards and r/Battlestations subreddits

# There's no limit to how many subreddits can be chained together
$ myredditdl -U --sub MechanicalKeyboards Battlestations

Download media from at most 10 posts, images only (skip videos)

$ myredditdl -U --limit 10 --no-video
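The `--no-video` / `--only-video` switches filter by file extension. A minimal sketch of that kind of filter, assuming the extension list from the flag descriptions (the `keep` helper and the exact extension set are illustrative, not myredditdl's actual code):

```python
import os

# Extensions treated as video; the flag text lists .mp4/.gif/.gifv "etc.",
# so this set is an assumption, not the tool's exact list.
VIDEO_EXTS = {".mp4", ".gif", ".gifv", ".webm"}

def keep(url, no_video=False, only_video=False):
    """Return True if `url` should be downloaded under the given filters."""
    is_video = os.path.splitext(url)[1].lower() in VIDEO_EXTS
    if no_video:
        return not is_video
    if only_video:
        return is_video
    return True

print(keep("https://i.redd.it/a.mp4", no_video=True))  # False: video is skipped
print(keep("https://i.redd.it/a.jpg", no_video=True))  # True: image is kept
```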

Get the Reddit post link of a downloaded file

# This will print the reddit post link of that image
$ myredditdl --get-link random_image.png

Get the post title of a downloaded file

# This will print the reddit post title of that video
$ myredditdl --get-title random_video.mp4

Get the metadata of a downloaded file

# This will print the metadata of the image
$ myredditdl --get-metadata random_image.jpg

Project details

Latest release: myredditdl 0.0.1, available on PyPI as a source distribution (myredditdl-0.0.1.tar.gz, 18.1 kB) and a Python 3 wheel (myredditdl-0.0.1-py3-none-any.whl, 20.4 kB).