
MTG image dataset with automatic image scraping and conversion.

Project description

🧙‍♂️ Magic The Gathering ✨
🧚‍♀️ Dataset 🧝‍♀️

With automated multithreaded image downloading,
caching, and optional dataset conversion.


Example reconstructions of dataset elements using a simple Beta-VAE.


⚡️  Quickstart

  1. Install mtgdata with pip install mtgdata

  2. Prepare or convert the MTG data using the command line with python -m mtgdata --help


📋  Features

MTG Card Face Dataset

  • Automatically scrape and download card images from Scryfall
  • Multithreaded download through a randomized proxy list
  • Return only valid images, skipping placeholders
  • Return all the faces of a card
  • Normalise image sizes, since some source images are incorrectly sized
  • Cache downloaded images

Convert to HDF5

  • Convert the data to an HDF5 dataset
  • Much faster to access than raw JPG or PNG images
  • A metadata JSON file links entries back to the original Scryfall information

Picklable HDF5 Dataset Class

  • Load the converted HDF5 dataset from disk across multiple threads or processes
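Multi-worker loading of a file-backed dataset typically relies on a lazy-open pattern: only the file path is pickled, and the underlying file handle is reopened on first access in each worker. The following is a minimal sketch of that pattern using a plain binary file, not mtgdata's actual implementation:

```python
import os
import pickle
import tempfile

class LazyFileDataset:
    """Sketch of the lazy-open pattern that makes a file-backed dataset
    safe to pickle: only the path is serialized, and the file handle is
    (re)opened on first access in each process."""

    def __init__(self, path):
        self._path = path
        self._file = None  # opened lazily, never pickled

    def _ensure_open(self):
        if self._file is None:
            self._file = open(self._path, 'rb')
        return self._file

    def __getitem__(self, index):
        # trivial "record" lookup: one byte per index
        f = self._ensure_open()
        f.seek(index)
        return f.read(1)

    def __getstate__(self):
        # drop the unpicklable handle; keep only the path
        state = self.__dict__.copy()
        state['_file'] = None
        return state

# demo: the dataset survives a pickle round-trip
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b'abc')
ds = LazyFileDataset(tmp.name)
assert ds[1] == b'b'
clone = pickle.loads(pickle.dumps(ds))
assert clone[2] == b'c'
os.unlink(tmp.name)
```

An HDF5-backed dataset follows the same shape, with an `h5py.File` in place of the plain file handle.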

⬇️  Download Images

Command Line

You can prepare (download) all the normal-quality images from the default Scryfall bulk data by running mtgdata/__main__.py:

python3 mtgdata prepare --help

Afterwards, you can convert the downloaded images into an HDF5 dataset by running:

python3 mtgdata convert --help

Programmatically

Alternatively, you can download the images from within Python by instantiating the mtgdata.ScryfallDataset object. It accepts arguments similar to those of the command-line approach.

from mtgdata import ScryfallDataset 

data = ScryfallDataset(
    img_type='border_crop',
    bulk_type='default_cards',
    transform=None,
)

# you can access the dataset elements like usual

Proxy Issues?

The scrape logic used to obtain the proxy list for mtgdata.util.proxy.ProxyDownloader may go out of date. You can override the default scrape logic used by the dataset's download logic by registering a new scrape function:

from mtgdata.util.proxy import register_proxy_scraper
from typing import List, Dict

@register_proxy_scraper(name='my_proxy_source', is_default=True)
def custom_proxy_scraper(proxy_type: str) -> List[Dict[str, str]]:
    # you should respect this setting, but we will just ignore it
    assert proxy_type in ('all', 'http', 'https')
    # proxies is a list of dictionaries, where each dictionary only has one entry:
    # - the key is the protocol
    # - the value is the matching full url
    return [
        {'HTTP': 'http://<my-http-proxy>.com'},
        {'HTTPS': 'https://<my-https-proxy>.com'},
    ]
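Each scraped entry maps a single protocol to a full proxy URL. If you ever need to hand such entries to an HTTP client like requests, which expects one proxies mapping with lowercase scheme keys, a small normalisation step helps. This is a hedged sketch with placeholder URLs; the helper name is ours, not part of mtgdata:

```python
# scraped proxy list: one {PROTOCOL: url} entry per proxy,
# mirroring the format returned by the scrape function above
scraped = [
    {'HTTP': 'http://proxy-a.example.com:8080'},
    {'HTTPS': 'https://proxy-b.example.com:8443'},
]

def to_requests_proxies(entries):
    """Merge scraped entries into a single requests-style mapping,
    lowercasing the scheme keys (e.g. 'HTTP' -> 'http')."""
    proxies = {}
    for entry in entries:
        for protocol, url in entry.items():
            proxies[protocol.lower()] = url
    return proxies

proxies = to_requests_proxies(scraped)
# e.g. requests.get(url, proxies=proxies) would route through these
print(proxies)
```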

🔄  Convert Images to an HDF5 Dataset

Command Line

The images can be converted to HDF5 format by running the module mtgdata.scryfall_convert. Various arguments can be specified; see the argparse arguments at the bottom of the file for more information.

python3 mtgdata/scryfall_convert.py

The resulting HDF5 file stores the image data under the data key.

Programmatically

Alternatively, you can generate the HDF5 dataset from within Python by calling the mtgdata.scryfall_convert.generate_converted_dataset function. It accepts arguments similar to those of the command-line approach.

from mtgdata import generate_converted_dataset 

generate_converted_dataset(
    out_img_type='border_crop',
    out_bulk_type='default_cards',
    save_root='./data/converted/',
    out_obs_size=(224, 160),
    convert_speed_test=True,
)

Loading The Data

We provide a helper dataset class for loading this generated file.

from torch.utils.data import DataLoader
from mtgdata import Hdf5Dataset


# this h5py dataset supports pickling, and can be wrapped with a pytorch dataset.
data = Hdf5Dataset(
    h5_path='data/converted/mtg-default_cards-normal-60459x224x160x3.h5',  # name will differ
    h5_dataset_name='data',
    transform=None,
)

# you can wrap the dataset with a pytorch dataloader like usual, and specify more than one worker
dataloader = DataLoader(data, shuffle=True, num_workers=2, batch_size=64)

# to load the data into memory as a numpy array, you can call `data = data.numpy()`
# this will take a long time depending on your disk speed and use a lot of memory.
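As a rough illustration of working with the fully loaded array, here is a sketch computing per-channel statistics, a common preprocessing step. The (N, H, W, C) shape and uint8 dtype are assumptions based on the converted file's name above, and random stand-in data is used instead of the real dataset:

```python
import numpy as np

# stand-in for the array returned by `data.numpy()`:
# assumed shape (N, H, W, C) with uint8 pixel values
images = np.random.randint(0, 256, size=(16, 224, 160, 3), dtype=np.uint8)

# scale to [0, 1] and compute per-channel mean/std over all
# images and pixels, leaving one value per colour channel
scaled = images.astype(np.float32) / 255.0
mean = scaled.mean(axis=(0, 1, 2))
std = scaled.std(axis=(0, 1, 2))
print(mean.shape, std.shape)  # (3,) (3,)
```

Such statistics are typically fed into a normalisation transform before training.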



Download files


Source Distribution

mtgdata-0.1.0.tar.gz (22.2 kB)


Built Distribution

mtgdata-0.1.0-py3-none-any.whl (25.4 kB)


File details

Details for the file mtgdata-0.1.0.tar.gz.

File metadata

  • Download URL: mtgdata-0.1.0.tar.gz
  • Size: 22.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.6.0 importlib_metadata/4.8.2 pkginfo/1.8.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.10.0

File hashes

Hashes for mtgdata-0.1.0.tar.gz

  • SHA256: fdcf798deba0c916f59b37a5837315410e70aac451cebc79437a9da108eb6a40
  • MD5: 3967d4a35e10d7fdca3a02a886a72232
  • BLAKE2b-256: cfad1950e166725b41fb4a21e923f9c1f6a1011bfebfea0197ed49d34482bf08


File details

Details for the file mtgdata-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: mtgdata-0.1.0-py3-none-any.whl
  • Size: 25.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.6.0 importlib_metadata/4.8.2 pkginfo/1.8.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.10.0

File hashes

Hashes for mtgdata-0.1.0-py3-none-any.whl

  • SHA256: b7e56262e0f268cd68fc18aa62a56551307d5c511e28e975d8a7fe766cfa277b
  • MD5: 8c62faf7f394156eede914349d93c2f8
  • BLAKE2b-256: 4a52fb3f3350c6e9b1f5f627d15a3eedf3a00040037d7e9de79fcdb974eb30b8

