Find the feed URLs for a website.
This is an asynchronous Python library for finding links to feeds on a website.
It is based on the synchronous (requests-based) feedfinder2, written by Dan Foreman-Mackey, which in turn is based on feedfinder, originally written by Mark Pilgrim and subsequently maintained by Aaron Swartz until his untimely death.
aio-feedfinder2 offers a single public function, find_feeds. You would use it as follows:
```python
import asyncio

from aio_feedfinder2 import find_feeds

loop = asyncio.get_event_loop()
task = asyncio.ensure_future(find_feeds("xkcd.com"))
feeds = loop.run_until_complete(task)
```
Now, feeds is the list ['http://xkcd.com/atom.xml', 'http://xkcd.com/rss.xml']. Some attempt is made to rank feeds from best candidate to worst but… well… you never know.
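Feedfinder-style discovery typically starts by looking for `<link rel="alternate">` tags in the page's HTML. As a rough illustration only (this is not the library's actual code), a minimal stdlib-only sketch of that first step might look like:

```python
from html.parser import HTMLParser

# MIME types that commonly identify syndication feeds.
FEED_TYPES = {"application/rss+xml", "application/atom+xml"}


class FeedLinkParser(HTMLParser):
    """Collect hrefs of <link rel="alternate"> tags with a feed MIME type."""

    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        attr = dict(attrs)
        if attr.get("rel") == "alternate" and attr.get("type") in FEED_TYPES:
            self.feeds.append(attr.get("href"))


html = """<html><head>
<link rel="alternate" type="application/atom+xml" href="/atom.xml">
<link rel="alternate" type="application/rss+xml" href="/rss.xml">
</head><body></body></html>"""

parser = FeedLinkParser()
parser.feed(html)
print(parser.feeds)  # ['/atom.xml', '/rss.xml']
```

The real library also falls back to probing common paths (and ranks the candidates), but the `<link>` scan above is the usual first pass.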
This asyncio variant is ideally suited to finding feeds on multiple domains/sites concurrently:
```python
import asyncio

from aio_feedfinder2 import find_feeds

loop = asyncio.get_event_loop()
tasks = [find_feeds(url) for url in ["xkcd.com", "abstrusegoose.com"]]
feeds = loop.run_until_complete(asyncio.gather(*tasks))
```

Now, feeds is a list of lists, one per input URL:

```python
[['http://xkcd.com/atom.xml', 'http://xkcd.com/rss.xml'],
 ['http://abstrusegoose.com/feed.xml', 'http://abstrusegoose.com/atomfeed.xml']]
```
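The speed-up comes from asyncio.gather running all the coroutines concurrently while preserving input order in the result list. A self-contained sketch of that pattern, with a hypothetical fake_find_feeds standing in for the real network-bound find_feeds:

```python
import asyncio


async def fake_find_feeds(url):
    # Stand-in for find_feeds: simulate network I/O, then return a
    # list of "discovered" feed URLs for that site.
    await asyncio.sleep(0.01)
    return [f"http://{url}/atom.xml", f"http://{url}/rss.xml"]


async def main():
    tasks = [fake_find_feeds(u) for u in ["xkcd.com", "abstrusegoose.com"]]
    # gather awaits all coroutines concurrently; the total wait is
    # roughly the slowest site, not the sum of all sites, and results
    # come back in the same order as the input list.
    return await asyncio.gather(*tasks)


feeds = asyncio.run(main())
print(feeds)
```

Because gather preserves order, you can zip the results back onto the input URLs without any extra bookkeeping.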
aio-feedfinder2 is licensed under the MIT license (see LICENSE).
| Filename (size) | File type | Python version |
|---|---|---|
| aio_feedfinder2-0.3.0-py3-none-any.whl (6.3 kB) | Wheel | 3.4 |
| aio-feedfinder2-0.3.0.tar.gz (2.0 kB) | Source | None |