
a module for polling urls and stats from homepages

Project description


Install

pip install pageone

Test

Requires nose

nosetests

Usage

pageone does two things: it extracts article urls from a site’s homepage, and it uses selenium and phantomjs to find the relative positions of those urls on the page.

To get stats about the positions of links, use link_stats:

from pageone import PageOne

p = PageOne(url='http://www.propublica.org/')

# get stats about link positions
for link in p.link_stats():
    print(link)

This will return a list of dictionaries that look like this:

{
 'bucket': 4,
 'datetime': datetime.datetime(2014, 6, 7, 16, 6, 3, 533818),
 'font_size': 13,
 'has_img': 1,
 'headline': u'',
 'homepage': 'http://www.propublica.org/',
 'img_area': 3969,
 'img_height': 63,
 'img_src': u'http://www.propublica.org/images/ngen/gypsy_image_medium/mpmh_victory_drive_140x140_130514_1.jpg',
 'img_width': 63,
 'url': u'http://www.propublica.org/article/protect-service-members-defense-department-plans-broad-ban-high-cost-loans',
 'x': 61,
 'x_bucket': 1,
 'y': 730,
 'y_bucket': 4
}

Here the bucket variables represent where a link falls in a 200x200 pixel grid. x_bucket counts from left to right, y_bucket counts from top to bottom, and bucket moves from the top left to the bottom right. You can customize the size of this grid by passing bucket_pixels to link_stats, e.g.:

from pageone import PageOne

p = PageOne(url='http://www.propublica.org/')

# get stats about link positions
for link in p.link_stats(bucket_pixels=100):
    print(link)
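The bucket arithmetic itself is simple. The helper below is a hypothetical sketch (not pageone's actual implementation), assuming buckets are 1-indexed and bucket_pixels square; it reproduces the sample values shown above (x=61, y=730 on the default 200px grid):

```python
# Hypothetical sketch of how a pixel offset could map to a 1-indexed
# grid bucket; this is an assumption, not pageone's actual code.

def to_bucket(pixels, bucket_pixels=200):
    """Map a pixel offset to a 1-indexed bucket number."""
    return pixels // bucket_pixels + 1

# Using the sample link above with the default 200px grid:
x_bucket = to_bucket(61)   # 1 (leftmost column)
y_bucket = to_bucket(730)  # 4 (fourth row from the top)
```

With bucket_pixels=100 the same link would land in finer buckets (x_bucket=1, y_bucket=8).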

To simply get all of the article urls on a homepage, use articles:

from pageone import PageOne
p = PageOne(url='http://www.propublica.org/')

for article in p.articles():
    print(article)

If you also want article urls that point to other sites, pass incl_external:

from pageone import PageOne
p = PageOne(url='http://www.propublica.org/')

for article in p.articles(incl_external=True):
    print(article)
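For intuition, the internal/external distinction comes down to comparing hostnames. The snippet below is illustrative only (pageone itself delegates url handling to siegfried), using nothing but the standard library:

```python
from urllib.parse import urlparse

# Illustrative only: treat a url as "external" when its hostname differs
# from the homepage's hostname.
home_host = urlparse('http://www.propublica.org/').netloc

def is_external(url):
    return urlparse(url).netloc != home_host

print(is_external('http://www.propublica.org/article/some-slug'))  # False
print(is_external('http://www.nytimes.com/some-story'))            # True
```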

How do I know which urls are articles?

pageone uses siegfried for url parsing and validation. If you want to apply a custom regex for article url validation, you can pass a pattern to either link_stats or articles, e.g.:

from pageone import PageOne
import re

pattern = re.compile(r'.*propublica.org/[a-z]+/[a-z0-9/-]+')

p = PageOne(url='http://www.propublica.org/')

for article in p.articles(pattern=pattern):
    print(article)
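You can sanity-check how such a pattern classifies candidate urls without touching pageone at all; the urls below are made-up examples:

```python
import re

# Same pattern as above: matches site-internal paths like /article/some-slug.
pattern = re.compile(r'.*propublica.org/[a-z]+/[a-z0-9/-]+')

urls = [
    'http://www.propublica.org/article/some-story-slug',  # article-like
    'http://www.propublica.org/',                         # homepage
]

for url in urls:
    print(url, bool(pattern.match(url)))
```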

PhantomJS

pageone requires phantomjs to run link_stats. By default, pageone looks for phantomjs at /usr/bin/local/phantomjs; if you want to specify another path, pass phantom_path to link_stats:

from pageone import PageOne

p = PageOne(url='http://www.propublica.org/')
for link in p.link_stats(phantom_path="/usr/bin/phantomjs"):
    print(link)
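If you are not sure where phantomjs lives on your system, the standard library can locate it for you. A small sketch (shutil.which is Python 3.3+; the fallback path is just the default mentioned above):

```python
import shutil

# Look for phantomjs on the PATH; shutil.which returns None if it is
# absent, in which case we fall back to the documented default location.
phantom_path = shutil.which('phantomjs') or '/usr/bin/local/phantomjs'
print(phantom_path)

# Then hand it to pageone:
# p = PageOne(url='http://www.propublica.org/')
# for link in p.link_stats(phantom_path=phantom_path):
#     print(link)
```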

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pageone-0.1.5.tar.gz (5.3 kB)

Built Distribution

pageone-0.1.5.macosx-10.9-intel.exe (71.5 kB)

File details

Details for the file pageone-0.1.5.tar.gz.

File metadata

  • Download URL: pageone-0.1.5.tar.gz
  • Upload date:
  • Size: 5.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for pageone-0.1.5.tar.gz
Algorithm Hash digest
SHA256 d4f19d93a0f09fb3dd9dce4bfa0495d2c056ca980bdb6d64f7b5d50542181036
MD5 7ca95cb950f976b49760e19947086b8f
BLAKE2b-256 73a66f373659d887cc498f70c2561a6bf6927b74844f39e8031cc591a41bf394


File details

Details for the file pageone-0.1.5.macosx-10.9-intel.exe.

File metadata

File hashes

Hashes for pageone-0.1.5.macosx-10.9-intel.exe
Algorithm Hash digest
SHA256 c29c73024ad6e7a83d918ed5a21e342ecf96c6b0bffc65ff425efdf52b5129f6
MD5 6aad81ab5e26e02f0f6ef27e43ee3cdb
BLAKE2b-256 664ba0aff173dfbc219c2a5ef4bee93a23abb0b9cc2bd7958ce83d5d58bbdb67

