How to make a robots.txt on django?

by jacey.lubowitz, in category: SEO, 8 months ago



1 answer

by domenico.weimann, 8 months ago


To create a robots.txt file on Django, follow these steps:

  1. Open the settings.py file in your project's configuration directory.
  2. In the MIDDLEWARE list, make sure 'django.middleware.common.CommonMiddleware' is present (it is included in Django's default settings).
  3. In the same file, locate the TEMPLATES setting.
  4. Under DIRS, add the following entry so Django looks for project-level templates: os.path.join(BASE_DIR, 'templates'),
  5. Create a new directory called templates in your project's root directory (the same directory as manage.py).
  6. Inside the templates directory, create another directory with the name of your Django app (e.g., myapp).
  7. Inside the app directory (templates/myapp), create a new plain text file called robots.txt.
  8. Open the robots.txt file and specify the rules you want to include. Here's an example:

     User-agent: *
     Disallow: /admin/

     This example disallows all user agents from crawling URLs under /admin/.
  9. Save the robots.txt file.
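The steps above create the template, but Django still needs a URL pattern to serve it. A minimal sketch of the project-level urls.py, assuming the myapp directory name from the steps above (TemplateView is Django's built-in generic view):

```python
# urls.py (project level) — a sketch assuming the templates/myapp layout above
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    # Serve templates/myapp/robots.txt at /robots.txt as plain text,
    # so crawlers receive the correct Content-Type header.
    path(
        "robots.txt",
        TemplateView.as_view(template_name="myapp/robots.txt",
                             content_type="text/plain"),
    ),
]
```

With this in place, the development server (python manage.py runserver) will answer requests to /robots.txt with the contents of the template.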

That's it! Once the template is mapped to a URL in your project's urls.py, visiting /robots.txt will display the contents of the file. Make sure to adjust the example rules to suit your needs.
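Before deploying adjusted rules, you can sanity-check them locally with Python's standard-library urllib.robotparser, which evaluates a robots.txt body the way a well-behaved crawler would (a quick check, independent of Django itself):

```python
from urllib.robotparser import RobotFileParser

# The same example rules as above, as a plain string.
rules = """User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check which paths a generic crawler ("*") may fetch.
print(parser.can_fetch("*", "/admin/"))  # False — blocked by the rule
print(parser.can_fetch("*", "/blog/"))   # True — not matched by any Disallow
```

This catches typos in directives (e.g., a misspelled Disallow) before crawlers ever see them.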