Project Description

Django robots.txt generator. It works by decorating django.conf.urls.defaults.url: it walks your urlpatterns and replaces the ambiguous parts of each pattern with *.
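The conversion can be sketched roughly like this (an illustrative Python sketch only, not the library's actual implementation): regex groups, the ambiguous parts of a pattern, become wildcards.

```python
import re

def regex_to_robots_pattern(pattern):
    """Illustrative sketch: turn a Django URL regex into a robots.txt
    pattern by replacing regex groups with '*'."""
    # Strip the anchors Django URL regexes usually carry.
    pattern = pattern.lstrip('^')
    ends_exact = pattern.endswith('$')
    pattern = pattern.rstrip('$')
    # Replace each regex group (the ambiguous part) with a wildcard.
    pattern = re.sub(r'\((?:[^()\\]|\\.)*\)', '*', pattern)
    # Collapse runs of wildcards.
    pattern = re.sub(r'\*+', '*', pattern)
    # An exact match keeps '$'; an open-ended one gets a trailing '*'.
    return pattern + ('$' if ends_exact else '*')

print(regex_to_robots_pattern(r'^/(?P<nick>\w+)/private'))  # /*/private*
print(regex_to_robots_pattern(r'^s$'))                      # s$
```

These two outputs correspond to the /profile/*/private* and /profiles$ rules shown in the example below.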

Installation & Usage

The recommended way to install django-url-robots is with pip.

  1. Install from PyPI with pip:

    pip install django-url-robots
    
  2. Add 'url_robots' to your INSTALLED_APPS:

    INSTALLED_APPS = (
        ...
        'url_robots',
        ...
        )
    
  3. Add the url_robots view to your root URLconf:

    urlpatterns += patterns('',
        url(r'^robots\.txt$', 'url_robots.views.robots_txt'),
        )
    
  4. Describe rules with the boolean keyword argument robots_allow, using url_robots.utils.url instead of django.conf.urls.defaults.url:

    from url_robots.utils import url
    
    urlpatterns += patterns('',
       url('^profile/private$', 'view', robots_allow=False),
       )
    

django-url-robots has been tested with Django 1.3. Unicode characters in URLs are percent-encoded.
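For example, a non-ASCII path segment is written to robots.txt in UTF-8 percent-encoded form. A Python 3 sketch of the idea, using the standard library rather than the library's own code:

```python
from urllib.parse import quote

# A Cyrillic path segment becomes its UTF-8 percent-encoded form,
# which is what ends up in the generated robots.txt rule.
print(quote('/профиль/'))  # /%D0%BF%D1%80%D0%BE%D1%84%D0%B8%D0%BB%D1%8C/
```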

Settings

At the moment there is only one setting: the template used to render the robots.txt file:

urlpatterns += patterns('',
    url(r'^robots\.txt$', 'url_robots.views.robots_txt', {'template': 'my_awesome_robots_template.txt'}),
    )

Example

robots_template.txt:

User-agent: *
Disallow: /*  # disallow all
{{ rules|safe }}

urls.py:

from django.conf.urls.defaults import patterns, include, url

urlpatterns = patterns('',
    url(r'^profile', include('url_robots.tests.urls_profile')),
)

urls_profile.py:

from django.conf.urls.defaults import patterns
from url_robots.utils import url

urlpatterns = patterns('',
    url(r'^s$', 'view', name='profiles', robots_allow=True),
    url(r'^/(?P<nick>\w+)$', 'view'),
    url(r'^/(?P<nick>\w+)/private', 'view', name='profile_private', robots_allow=False),
    url(r'^/(?P<nick>\w+)/public', 'view', name='profile_public', robots_allow=True),
    )

Resulting robots.txt:

User-agent: *
Disallow: /*  # disallow all
Allow: /profiles$
Disallow: /profile/*/private*
Allow: /profile/*/public*
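The rules variable rendered into the template can be thought of as the converted patterns joined with their Allow/Disallow directives. A minimal sketch (build_rules is a hypothetical helper for illustration, not part of the library's API):

```python
def build_rules(patterns):
    """patterns: list of (robots_pattern, allow) pairs, already
    converted from URL regexes. Returns the text substituted
    for {{ rules }} in the template."""
    lines = []
    for pattern, allow in patterns:
        directive = 'Allow' if allow else 'Disallow'
        lines.append('%s: %s' % (directive, pattern))
    return '\n'.join(lines)

print(build_rules([
    ('/profiles$', True),
    ('/profile/*/private*', False),
    ('/profile/*/public*', True),
]))
# Allow: /profiles$
# Disallow: /profile/*/private*
# Allow: /profile/*/public*
```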
Release History

1.1 (this version)
1.0.4
1.0.3
1.0.2
1.0.1
1.0

Download Files

File                                  File Type    Upload Date
django-url-robots-1.1.zip (7.8 kB)    Source       Dec 29, 2012
