.. __START__ Source defined in docs/github_docs.py
.. This document was procedurally generated by docs/github_docs.py on Thursday, December 17, 2015
.. __END__ Source defined in docs/github_docs.py
.. __START__ Source defined in docs/github_docs.py
.. role:: mod(literal)
.. role:: func(literal)
.. role:: data(literal)
.. role:: const(literal)
.. role:: class(literal)
.. role:: meth(literal)
.. role:: attr(literal)
.. role:: exc(literal)
.. role:: obj(literal)
.. role:: envvar(literal)
.. __END__ Source defined in docs/github_docs.py
.. __START__ Source defined in docs/source/readme_title.rst
==============
cache_requests
==============
.. image:: https://img.shields.io/github/downloads/bionikspoon/cache_requests/total.svg
:target: https://github.com/bionikspoon/cache_requests
:alt: Github Downloads
.. image:: https://badge.fury.io/py/cache_requests.svg
:target: https://pypi.python.org/pypi/cache_requests/
:alt: Latest Version
.. image:: https://img.shields.io/pypi/status/cache_requests.svg
:target: https://pypi.python.org/pypi/cache_requests/
:alt: Development Status
.. image:: https://travis-ci.org/bionikspoon/cache_requests.svg?branch=develop
:target: https://travis-ci.org/bionikspoon/cache_requests?branch=develop
:alt: Build Status
.. image:: https://coveralls.io/repos/bionikspoon/cache_requests/badge.svg?branch=develop
:target: https://coveralls.io/github/bionikspoon/cache_requests?branch=develop&service=github
:alt: Coverage Status
.. image:: https://readthedocs.org/projects/cache_requests/badge/?version=develop
:target: https://cache_requests.readthedocs.org/en/develop/?badge=develop
:alt: Documentation Status
------------
.. image:: https://img.shields.io/badge/Python-2.7,_3.3,_3.4,_3.5,_pypy-brightgreen.svg
:target: https://pypi.python.org/pypi/cache_requests/
:alt: Supported Python versions
.. image:: https://img.shields.io/pypi/l/cache_requests.svg
:target: https://pypi.python.org/pypi/cache_requests/
:alt: License
**Simple. Powerful. Persistent LRU caching for the requests library.**
.. __END__ Source defined in docs/source/readme_title.rst
.. __START__ Source defined in docs/source/readme_features.rst
Features
--------
* Free software: MIT license
* Documentation: https://cache_requests.readthedocs.org.
* Python version agnostic: tested against Python 2.7, 3.3, 3.4, 3.5, and PyPy
..
* Drop in decorator for the requests library.
* Automatic timer based expiration on stored items (optional).
* Backed by Yahoo's powerful ``redislite``.
* Scalable with redis. Optionally accepts a ``redis`` connection.
* Exposes the powerful underlying ``Memoize`` decorator to decorate any function.
* Tested with high coverage.
* Lightweight. Simple logic.
* Lightning fast.
..
* Jump start your development cycle.
* Collect and reuse entire response objects.
.. __END__ Source defined in docs/source/readme_features.rst
.. __START__ Source defined in docs/source/installation.rst
============
Installation
============
At the command line, via either pip or easy_install:
.. code-block:: shell
$ pip install cache_requests
.. code-block:: shell
$ easy_install cache_requests
Or, if you have virtualenvwrapper installed:
.. code-block:: shell
$ mkvirtualenv cache_requests
$ pip install cache_requests
**Uninstall**
.. code-block:: shell
$ pip uninstall cache_requests
.. __END__ Source defined in docs/source/installation.rst
.. __START__ Source defined in docs/source/usage.rst
=====
Usage
=====
To use ``cache_requests`` in a project:
.. code-block:: python
import cache_requests
Quick Start
-----------
To use ``cache_requests`` in a project:
.. code-block:: python
>>> from cache_requests import Session
>>> requests = Session()
# from python-requests.org
>>> r = requests.get('https://api.github.com/user', auth=('user', 'pass'))
>>> r.status_code
200
>>> r.headers['content-type']
'application/json; charset=utf8'
>>> r.encoding
'utf-8'
>>> r.text
u'{"type":"User"...'
>>> r.json()
{u'private_gists': 419, u'total_private_repos': 77, ...}
Config Options
--------------
:mod:`cache_requests.config`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
:data:`config.ex`
sets the default expiration (in seconds) for new cache entries. Can be configured with the :envvar:`REDIS_EX` environment variable.
:data:`config.dbfilename`
sets the default location for the database; by default, a file in your OS's temp directory. Can be configured with the :envvar:`REDIS_DBFILENAME` environment variable.
:data:`config.connection`
creates the connection to the :mod:`redis` or :mod:`redislite` database. By default this is a :mod:`redislite` connection, but a ``redis`` connection can be dropped in for an easy upgrade. Can be configured with the :envvar:`REDIS_CONNECTION` environment variable.
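The three settings above can be read as plain environment-variable lookups with fallbacks. The sketch below is an illustrative reading, not the library's actual parsing code, and the fallback filename is hypothetical:

```python
import os
import tempfile

# Hypothetical sketch of how env-based defaults could be resolved;
# the library's actual logic may differ.
def resolve_config():
    # REDIS_EX: expiration in seconds; assume a 1-hour default.
    ex = int(os.environ.get('REDIS_EX', 60 * 60))
    # REDIS_DBFILENAME: database path; fall back to the OS temp directory.
    default_db = os.path.join(tempfile.gettempdir(), 'cache_requests.redislite')
    dbfilename = os.environ.get('REDIS_DBFILENAME', default_db)
    return {'ex': ex, 'dbfilename': dbfilename}
```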
:mod:`cache_requests.Session`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Caching is toggled on and off independently for each session method. These toggles are accessed through the ``Session`` object's ``cache.[method name]`` attributes, and they can all be overridden at once with the ``cache.all`` setting.
For example:
.. code-block:: python
from cache_requests import Session
requests = Session()
requests.cache.delete = True
# cached, only called once.
requests.delete('http://google.com')
requests.delete('http://google.com')
requests.cache.delete = False
# not cached, called twice.
requests.delete('http://google.com')
requests.delete('http://google.com')
# cache ALL methods
requests.cache.all = True
# don't cache any methods
requests.cache.all = False
# Use individual method cache options.
requests.cache.all = None
Default settings
****************
=========== ========
Method Cached
=========== ========
``get`` ``True``
``head`` ``True``
``options`` ``True``
``post`` ``False``
``put`` ``False``
``patch`` ``False``
``delete`` ``False``
``all`` ``None``
=========== ========
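The table above, combined with the ``cache.all`` override, amounts to a small piece of logic that can be sketched as follows (the class is illustrative, not the library's implementation):

```python
# Sketch of the override logic: each HTTP method has its own flag,
# and ``all`` (when not None) overrides every per-method flag.
class CacheOptions(object):
    def __init__(self):
        self.get = True
        self.head = True
        self.options = True
        self.post = False
        self.put = False
        self.patch = False
        self.delete = False
        self.all = None  # None defers to the per-method flags

    def is_cached(self, method):
        if self.all is not None:
            return self.all
        return getattr(self, method)
```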
Use Case Scenarios
------------------
Development: 3rd Party APIs
~~~~~~~~~~~~~~~~~~~~~~~~~~~
Scenario:
Working on a project that uses a 3rd party API or service.
Things you want:
* A cache that persists between sessions and is lightning fast.
* Ability to rapidly explore the API and its parameters.
* Ability to inspect and debug response content.
* Ability to focus on progress.
* Perfect transition to a production environment.
Things you don't want:
* Dependency on network and server stability for development.
* Spamming the API. Especially APIs with limits.
* Responses that change in non-meaningful ways.
* Burning energy with copypasta or fake data to run pieces of your program.
* Slow. Responses.
Make a request one time. Cache the results for the rest of your work session.
.. code-block:: python
import os

if os.environ.get('ENV') == 'DEVELOP':
    from cache_requests import Session, config

    config.ex = 60 * 60  # 60 min
    requests = Session()
else:
    import requests
# strange, complicated request you might make
headers = {"accept-encoding": "gzip, deflate, sdch", "accept-language": "en-US,en;q=0.8"}
payload = dict(sourceid="chrome-instant", ion="1", espv="2", ie="UTF-8", client="ubuntu",
q="hash%20a%20dictionary%20python")
response = requests.get('http://google.com/search', headers=headers, params=payload)
# spam to prove a point
response = requests.get('http://google.com/search', headers=headers, params=payload)
response = requests.get('http://google.com/search', headers=headers, params=payload)
response = requests.get('http://google.com/search', headers=headers, params=payload)
response = requests.get('http://google.com/search', headers=headers, params=payload)
response = requests.get('http://google.com/search', headers=headers, params=payload)
response = requests.get('http://google.com/search', headers=headers, params=payload)
response = requests.get('http://google.com/search', headers=headers, params=payload)
# tweak your query, we're exploring here
payload = dict(sourceid="chrome-instant", ion="1", espv="2", ie="UTF-8", client="ubuntu",
q="hash%20a%20dictionary%20python2")
# do you see what changed? the caching tool did.
response = requests.get('http://google.com/search', headers=headers, params=payload)
response = requests.get('http://google.com/search', headers=headers, params=payload)
response = requests.get('http://google.com/search', headers=headers, params=payload)
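The caching tool "sees" the change because identical call arguments reduce to an identical cache key. Below is a minimal sketch of how such a key could be derived; the library's actual key scheme may differ, and ``make_key`` is a hypothetical helper:

```python
import hashlib
import pickle

# Illustrative only: one way a memoizer can reduce a call signature to a
# stable cache key. Sorting kwargs makes the key order-insensitive.
def make_key(url, **kwargs):
    blob = pickle.dumps((url, sorted(kwargs.items())))
    return hashlib.sha256(blob).hexdigest()

a = make_key('http://google.com/search', q='hash a dictionary python')
b = make_key('http://google.com/search', q='hash a dictionary python')
c = make_key('http://google.com/search', q='hash a dictionary python2')
assert a == b and a != c  # same call, same key; a changed query, a new key
```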
Optionally, set up with environment variables:
.. code-block:: shell
$ export ENV=DEVELOP
$ export REDIS_DBFILENAME='redis/requests.redislite' # make sure directory exists
$ export REDIS_EX=3600 # 1 hour; default
Production: Web Scraping
~~~~~~~~~~~~~~~~~~~~~~~~
Automatically expire old content.
* How often? After a day? A week? A month? All of this logic is built in via the ``config.ex`` setting.
* Effectively, it can manage all of the time-based rotation.
* Perfect if there's more data than your API caps allow.
One line of code to use a full ``redis`` database.
* Try ``redislite``; it can handle quite a bit. The ``redislite`` API used by this module is 1:1 with the ``redis`` package. Just replace the connection parameter/config value.
* ``redis`` is a drop in:
.. code-block:: python
import redis

config.connection = redis.StrictRedis(host='localhost', port=6379, db=0)
* Everything else just works. There's no magic required.
.. code-block:: python
import redis

from cache_requests import Session, config

config.connection = redis.StrictRedis(host='localhost', port=6379, db=0)
config.ex = 7 * 24 * 60 * 60  # 1 week

requests = Session()

for i in range(1000):
    payload = dict(q=i)
    response = requests.get('http://google.com/search', params=payload)
    print(response.text)
Usage: memoize
~~~~~~~~~~~~~~
.. code-block:: python
from cache_requests import memoize, config

config.ex = 15 * 60  # 15 min (the default is 1 hour)

@memoize
def amazing_but_expensive_function(*args, **kwargs):
    print("You're going to like this")
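Under the hood, the idea is plain timer-based memoization. The sketch below is an in-memory stand-in for illustration only; the real ``Memoize`` persists results to ``redislite``:

```python
import functools
import time

# Conceptual sketch of timer-based memoization: cache a result alongside a
# timestamp, and reuse it while it is younger than ``ex`` seconds.
def memoize_with_ttl(ex):
    def decorator(fn):
        cache = {}

        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            key = (args, tuple(sorted(kwargs.items())))
            hit = cache.get(key)
            if hit and time.monotonic() - hit[1] < ex:
                return hit[0]  # fresh enough: serve the cached result
            result = fn(*args, **kwargs)
            cache[key] = (result, time.monotonic())
            return result

        return wrapper
    return decorator

calls = []

@memoize_with_ttl(ex=15 * 60)
def expensive(x):
    calls.append(x)
    return x * 2

expensive(21)
expensive(21)  # served from the cache; ``expensive`` only ran once
```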
.. __END__ Source defined in docs/source/usage.rst
.. __START__ Source defined in docs/source/readme_credits.rst
Credits
-------
Tools used in rendering this package:
* Cookiecutter_
* `bionikspoon/cookiecutter-pypackage`_ forked from `audreyr/cookiecutter-pypackage`_
.. _Cookiecutter: https://github.com/audreyr/cookiecutter
.. _`bionikspoon/cookiecutter-pypackage`: https://github.com/bionikspoon/cookiecutter-pypackage
.. _`audreyr/cookiecutter-pypackage`: https://github.com/audreyr/cookiecutter-pypackage
.. __END__ Source defined in docs/source/readme_credits.rst
=======
History
=======
Next Release
------------
* Coming Soon
2.0.0 (2015-12-12)
------------------
* API completely rewritten
* New API extends ``requests`` internals as opposed to monkeypatching.
* Entire package is redesigned to be more maintainable, more modular, and more usable.
* Dependencies are pinned.
* Tests are expanded.
* PY26 and PY32 support is dropped because of dependency constraints.
* PY35 support is added.
* Docs are rewritten.
* Move towards idiomatic code.
1.0.0 (2015-04-23)
------------------
* First real release.
* Feature/ Unit test suite, very high coverage.
* Feature/ ``redislite`` integration.
* Feature/ Documentation. https://cache-requests.readthedocs.org.
* Feature/ Exposed the beefed up ``Memoize`` decorator.
* Feature/ Upgraded compatibility to:
* PY26
* PY27
* PY33
* PY34
* PYPY
* Added examples and case studies.
0.1.0 (2015-04-19)
------------------
* First release on PyPI.