

Project description

cached-path

A file utility library that provides a unified, simple interface for accessing both local and remote files. It can be used behind other APIs that need to access files regardless of where those files are located.



Installation

cached-path requires Python 3.7 or later.

Installing with pip

cached-path is available on PyPI. Just run

pip install cached-path

Installing from source

To install cached-path from source, first clone the repository:

git clone https://github.com/allenai/cached_path.git
cd cached_path

Then run

pip install -e .

Usage

from cached_path import cached_path

Given something that might be a URL or a local path, cached_path() determines which it is. If it's a remote resource, it downloads the file, caches it in the cache directory, and returns the path to the cached copy. If it's already a local path, it verifies that the file exists and returns the path.

For URLs, the schemes http://, https://, s3:// (AWS S3), gs:// (Google Cloud Storage), and hf:// (HuggingFace Hub) are all supported out of the box. Optionally, beaker:// URLs of the form beaker://{user_name}/{dataset_name}/{file_path} are also supported; these require beaker-py to be installed.

For example, to download the PyTorch weights for the model epwalsh/bert-xsmall-dummy on HuggingFace, you could do:

cached_path("hf://epwalsh/bert-xsmall-dummy/pytorch_model.bin")

For paths or URLs that point to a tarfile or zipfile, you can also append the path to a specific file within the archive to url_or_filename, preceded by a "!", and the archive will be automatically extracted (provided you set extract_archive to True), returning the local path to that specific file. For example:

cached_path("model.tar.gz!weights.th", extract_archive=True)

Cache directory

By default the cache directory is ~/.cache/cached_path/, but there are several ways to override this setting:

  • set the environment variable CACHED_PATH_CACHE_ROOT,
  • call set_cache_dir(), or
  • set the cache_dir argument each time you call cached_path().

Team

cached-path is developed and maintained by the AllenNLP team, backed by the Allen Institute for Artificial Intelligence (AI2). AI2 is a non-profit institute with the mission to contribute to humanity through high-impact AI research and engineering. To learn more about who specifically contributed to this codebase, see our contributors page.

License

cached-path is licensed under Apache 2.0. A full copy of the license can be found on GitHub.



Download files

Download the file for your platform.

Source Distribution

cached_path-1.5.0.tar.gz (30.8 kB)


Built Distribution

cached_path-1.5.0-py3-none-any.whl (34.2 kB)


File details

Details for the file cached_path-1.5.0.tar.gz.

File metadata

  • Download URL: cached_path-1.5.0.tar.gz
  • Size: 30.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.13

File hashes

Hashes for cached_path-1.5.0.tar.gz:

  • SHA256: 8f03afe7e5d0ac27e7fa54e8561966d92f9393eb35f3d6e738b46a36a995b668
  • MD5: b87fa1885525f12907b16a5654654932
  • BLAKE2b-256: f9e30f77065e7f1d496e9957f7ed5467f71487796e265eef434b766e86f0264a


File details

Details for the file cached_path-1.5.0-py3-none-any.whl.

File metadata

  • Download URL: cached_path-1.5.0-py3-none-any.whl
  • Size: 34.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.13

File hashes

Hashes for cached_path-1.5.0-py3-none-any.whl:

  • SHA256: 00b844979aaa4ba0fa4c89dc20bc6be1478fdf8156395bf790d91f026604f764
  • MD5: ff0c6caf9e858eeee57cfe03600359bd
  • BLAKE2b-256: 2a02714285bbb6ed76e366a8ce429da7b88e7b188ceb2e6edda617131312b0bb

