Control robots.txt files from environment variables and templates.

Reason this release was yanked: clumsy implementation

Project description

Django Env Robots (.txt)

Serve different robots.txt files from your production / stage / etc. servers by setting environment variables. Rules are managed via templates.

Installation

Install from PyPI:

pip install django-env-robots

Then add the following to your project's INSTALLED_APPS:

'django_env_robots',

Usage

settings.py

# robots
# Note: "Env" below is assumed to be your environment-variable reader
# (e.g. django-environ); os must be imported at the top of settings.py.
import os

SERVER_ENV = Env.get('SERVER_ENV', 'production')
ROBOTS_ROOT = os.path.join(BASE_DIR, 'robots')
ROBOTS_SITEMAP_URLS = Env.list('ROBOTS_SITEMAP_URLS', '/sitemap.xml')
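Since rules are managed via templates keyed off SERVER_ENV, one plausible layout under ROBOTS_ROOT would keep one template per environment (these filenames are an illustrative assumption, not documented behaviour of the package):

```
robots/
├── production.txt
├── stage.txt
└── development.txt
```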

urls.py

from django.urls import include, path

from django_env_robots import urls as robots_urls
...
urlpatterns = [
    path("robots.txt", include(robots_urls)),
]
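Conceptually, the view behind that URL just has to map the SERVER_ENV variable to a template name. A minimal sketch of that selection logic (an illustration only, not the app's actual code; the function name is hypothetical):

```python
import os


def robots_template_name(default="production"):
    """Return the robots.txt template stem for the current server environment.

    Reads SERVER_ENV from the environment, falling back to `default`,
    mirroring the SERVER_ENV setting shown above.
    """
    env = os.environ.get("SERVER_ENV", default)
    return f"{env}.txt"
```

For example, with SERVER_ENV=stage this returns "stage.txt"; with the variable unset it falls back to "production.txt".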

Other considerations

A robots.txt served from a Whitenoise public directory will take precedence over this app. This is because Whitenoise's middleware serves static files before Django's URL routing runs - correct behaviour, but watch out for it.
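To see why, consider a typical middleware stack (a sketch, assuming you use Whitenoise; the surrounding entries are illustrative):

```python
# settings.py fragment: WhiteNoiseMiddleware sits near the top of the stack,
# so any robots.txt found in its static root is returned before the request
# ever reaches Django's URL resolver (and therefore before this app's view).
MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    "whitenoise.middleware.WhiteNoiseMiddleware",  # serves static files first
    # ... the rest of your middleware
]
```

If you want this app to answer /robots.txt, make sure no robots.txt file sits in the directory Whitenoise serves.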

Download files

Source Distribution: django_env_robots-0.0.1.tar.gz (3.7 kB)

Built Distribution: django_env_robots-0.0.1-py3-none-any.whl (5.2 kB)
