
Geocaching.com site crawler. Provides tools for searching, fetching caches and geocoding.

Project description

Features

  • log in to Geocaching.com

  • search for caches

    • normal search (unlimited number of caches from any point)

    • quick search (all caches inside a given area)

  • load a cache and its details

    • normal loading (loads all details)

    • quick loading (loads just basic info, very quickly)

    • lazy loading (creates a cache object and loads details on demand)

    • load the logbook for a given cache

  • post a log to a cache logbook

  • load trackable details by tracking code

  • geocode a given location

Installation

Stable version - using pip:

pip install pycaching

Development version - manually from Git:

git clone https://github.com/tomasbedrich/pycaching.git
pip install ./pycaching

Pycaching has the following requirements:

Python>=3.4
MechanicalSoup>=0.3.0
geopy>=1.0.0

Example usage

Login

import pycaching
geocaching = pycaching.login("user", "pass")

Load cache details

cache = geocaching.get_cache("GC1PAR2")
print(cache.name)  # cache.load() is automatically called
print(cache.location)  # stored in cache, printed immediately

This uses lazy loading, so the Cache object is created immediately and the page is loaded when needed (accessing the name).

You can use a different method of loading cache details. It is much faster, but it loads fewer details:

cache = geocaching.get_cache("GC1PAR2")
cache.load_quick()  # takes a short while
print(cache.name)  # stored in cache, printed immediately
print(cache.location)  # NOT stored in cache, will trigger full loading

You can also load the logbook for a cache:

for log in cache.load_logbook(limit=200):
    print(log.visited, log.type, log.author, log.text)

Or its trackables:

for trackable in cache.load_trackables(limit=5):
    print(trackable.name)

Post a log to a cache

geocaching.post_log("GC1PAR2", "Found cache in the rain. Nice Place, TFTC!")

It is also possible to call post_log on a Cache object, but you would have to create a Log object manually and pass it to that method.

Search for all traditional caches around a point

from pycaching import Point
from pycaching.cache import Type

point = Point(56.25263, 15.26738)

for cache in geocaching.search(point, limit=50):
    if cache.type == Type.traditional:
        print(cache.name)

Notice the limit in the search function. It is needed because search() returns a generator object, which would otherwise fetch caches forever in a simple loop.
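Since search() yields results lazily, passing limit behaves like slicing the generator yourself with itertools.islice. A standalone sketch, using a hypothetical endless_results() generator as a stand-in for geocaching.search(point) so it runs without network access:

```python
import itertools

def endless_results():
    """Stand-in for geocaching.search(point): yields fake cache codes forever."""
    n = 0
    while True:
        yield f"GC{n:05d}"
        n += 1

# Take a fixed number of items from the endless generator,
# just as search(point, limit=3) would stop after 3 caches.
first_three = list(itertools.islice(endless_results(), 3))
print(first_three)  # ['GC00000', 'GC00001', 'GC00002']
```

The same pattern works on the real generator, e.g. itertools.islice(geocaching.search(point), 50).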

Geocode an address and search around it

point = geocaching.geocode("Prague")

for cache in geocaching.search(point, limit=10):
    print(cache.name)

Find caches and their approximate locations in an area

from pycaching import Point, Rectangle

rect = Rectangle(Point(60.15, 24.95), Point(60.17, 25.00))

for cache in geocaching.search_quick(rect, strict=True):
    print(cache.name, cache.location.precision)

Load trackable details

trackable = geocaching.get_trackable("TB3ZGT2")
print(trackable.name, trackable.goal, trackable.description, trackable.location)

Appendix

Inspiration

The original version was inspired by other packages; although the new version was massively rewritten, I'd like to thank their authors.

Author

Tomáš Bedřich

Thanks to all contributors!



Download files


Source Distribution

pycaching-3.3.tar.gz (30.1 kB)


File details

Details for the file pycaching-3.3.tar.gz.

File metadata

  • Download URL: pycaching-3.3.tar.gz
  • Upload date:
  • Size: 30.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for pycaching-3.3.tar.gz:

  • SHA256: 3015eedd7ef5d15dd7ff146a0fed26d34d99598bee1632332b0feedf504e02df
  • MD5: bab694fe586788e04190049f47e9f901
  • BLAKE2b-256: d349ed6fa6c6d11e92212019676dce9b9c35645a8872fa06c95b980cddee0999

