
A file utility for accessing both local and remote files through a unified interface

Project description

cached-path

A file utility library that provides a unified, simple interface for accessing both local and remote files. This can be used behind other APIs that need to access files agnostic to where they are located.


Installation

cached-path requires Python 3.7 or later.

Installing with pip

cached-path is available on PyPI. Just run

pip install cached-path

Installing from source

To install cached-path from source, first clone the repository:

git clone https://github.com/allenai/cached_path.git
cd cached_path

Then run

pip install -e .

Usage

from cached_path import cached_path

Given something that might be a URL or a local path, cached_path() determines which it is. If it's a remote resource, it downloads the file, stores it in the cache directory, and returns the path to the cached copy. If it's already a local path, it verifies that the file exists and returns the path.
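
For example (the URL and local path below are just placeholders):

from cached_path import cached_path

# Remote resource: downloaded on first use, then served from the cache
path = cached_path("https://example.com/some-file.txt")

# Local file: its existence is checked and the same path is returned
path = cached_path("/path/to/some/local-file.txt")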

For URLs, the http://, https://, s3:// (AWS S3), gs:// (Google Cloud Storage), and hf:// (HuggingFace Hub) schemes are all supported out of the box. Optionally, beaker:// URLs of the form beaker://{user_name}/{dataset_name}/{file_path} are also supported, provided beaker-py is installed.

For example, to download the PyTorch weights for the model epwalsh/bert-xsmall-dummy on HuggingFace, you could do:

cached_path("hf://epwalsh/bert-xsmall-dummy/pytorch_model.bin")

For paths or URLs that point to a tarfile or zipfile, you can also append a "!" followed by the path of a specific file within the archive to the url_or_filename argument. The archive will then be automatically extracted (provided you set extract_archive to True) and the local path to that specific file is returned. For example:

cached_path("model.tar.gz!weights.th", extract_archive=True)

Cache directory

By default, the cache directory is ~/.cache/cached_path/. There are several ways to override this setting, as sketched after this list:

  • set the environment variable CACHED_PATH_CACHE_ROOT,
  • call set_cache_dir(), or
  • set the cache_dir argument each time you call cached_path().
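
A minimal sketch of the three options (the /tmp paths and the URL below are placeholders):

# Option 1: set the environment variable before cached_path is imported,
# e.g. in the shell: export CACHED_PATH_CACHE_ROOT=/tmp/my-cache

from cached_path import cached_path, set_cache_dir

# Option 2: set the cache directory programmatically for the whole process
set_cache_dir("/tmp/my-cache")

# Option 3: override the cache directory for a single call
cached_path("https://example.com/some-file.txt", cache_dir="/tmp/my-cache")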

Team

cached-path is developed and maintained by the AllenNLP team, backed by the Allen Institute for Artificial Intelligence (AI2). AI2 is a non-profit institute with the mission to contribute to humanity through high-impact AI research and engineering. To learn more about who specifically contributed to this codebase, see our contributors page.

License

cached-path is licensed under Apache 2.0. A full copy of the license can be found on GitHub.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

cached_path-1.5.1.tar.gz (30.9 kB)

Uploaded Source

Built Distribution

cached_path-1.5.1-py3-none-any.whl (34.2 kB)

Uploaded Python 3

File details

Details for the file cached_path-1.5.1.tar.gz.

File metadata

  • Download URL: cached_path-1.5.1.tar.gz
  • Size: 30.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.13

File hashes

Hashes for cached_path-1.5.1.tar.gz:

  • SHA256: a934f535ffa7a4c55fd9e1073ff9b628e11a737f21369a71cace23a12c87dfbe
  • MD5: 4d92b7397a3577f7f4fd32c190f92d5f
  • BLAKE2b-256: 59c6784299bf0e48fc773ca84a63f7c4d716c737851e9f9280022e8a3f09ec00

File details

Details for the file cached_path-1.5.1-py3-none-any.whl.

File metadata

  • Download URL: cached_path-1.5.1-py3-none-any.whl
  • Size: 34.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.13

File hashes

Hashes for cached_path-1.5.1-py3-none-any.whl:

  • SHA256: a94df4d97f96d809e68f364ca8140186815f747c288163b8da630020fb83cfb7
  • MD5: 88d0b57b6926fb51e14116f4b0135720
  • BLAKE2b-256: 309056ee25e9b1fdd68f587cc497ac373f121b3cb27e0c938604d75075b3793a

