

Django Env Robots (.txt)

Serve different robots.txt files from your production, stage, etc. servers by setting an environment variable. Rules are managed via templates. By default it excludes robots entirely.

Installation

Install from PyPI:

pip install django-env-robots

Then add the following to your project's INSTALLED_APPS.

INSTALLED_APPS = [
    ...
    'django_env_robots',
]

Usage

settings.py

Set the following:

  • SERVER_ENV identifies the nature of the server and thus the robots.txt template that will be used.

E.g.:

SERVER_ENV = 'production'
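
Typically SERVER_ENV will come from the process environment. A minimal sketch, assuming you expose a SERVER_ENV environment variable to Django (the 'production' fallback is illustrative for this sketch, not behaviour the package provides):

import os

# Pick the robots.txt template from the environment; the fallback
# value here is an assumption made for this example.
SERVER_ENV = os.environ.get('SERVER_ENV', 'production')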

urls.py

from django.urls import include, path
from django_env_robots import urls as robots_urls
...
urlpatterns = [
    path("robots.txt", include(robots_urls)),
]

robots templates

Create a corresponding template file for each SERVER_ENV you will be using. These live in your project's templates directory, in a robots subfolder.

For example, if SERVER_ENV can be production or stage, then create:

  • templates/robots/production.txt
  • templates/robots/stage.txt

e.g. templates/robots/production.txt:

User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www2.example.com/sitemap.xml
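
A stage server, by contrast, would typically exclude robots entirely, e.g. templates/robots/stage.txt:

User-agent: *
Disallow: /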

Other considerations

A robots.txt served from a WhiteNoise public directory will win over this app. That is because of WhiteNoise's middleware behaviour; the behaviour is correct, but watch out for it.
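
A quick smoke test with Django's test client can confirm that the app, and not a static file, is answering. This is a minimal sketch; it assumes the urls.py wiring above and a template containing a Disallow rule, as in the examples:

from django.test import TestCase

class RobotsTxtTests(TestCase):
    def test_robots_txt_is_served(self):
        response = self.client.get("/robots.txt")
        self.assertEqual(response.status_code, 200)
        # Inspect the body to be sure it matches the template for
        # your SERVER_ENV rather than a shadowing static file.
        self.assertIn("Disallow", response.content.decode())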
