@jacey.lubowitz
To create and serve a robots.txt file in a Django project, follow these steps:
- Open the settings.py file, which is located in your project's package directory (e.g., myproject/settings.py, next to urls.py) rather than next to manage.py.
- You don't need to touch MIDDLEWARE for this: 'django.middleware.common.CommonMiddleware' is already part of Django's default MIDDLEWARE setting and isn't involved in serving robots.txt.
- In the same settings.py file, locate the TEMPLATES setting.
- In its DIRS list, add a project-level templates directory (full context in the settings sketch right after this list):
os.path.join(BASE_DIR, 'templates'),
- Create a new directory called templates in your project's root directory (the same directory as manage.py).
- Inside the templates directory, create a new file called robots.txt. Since robots.txt is site-wide rather than app-specific, it doesn't need to live in an app subdirectory like templates/myapp/.
- Open the robots.txt file and specify the rules you want to include. Here's an example:
User-agent: *
Disallow: /admin/
The above example asks all user agents not to crawl anything under /admin/ (robots.txt is advisory, so it requests rather than blocks access).
- Save the robots.txt file.
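For context, here's roughly what the TEMPLATES setting ends up looking like with the project-level directory added. This is a sketch based on Django's default settings; add import os at the top of settings.py if it isn't already there, and note that on Django 3.1+ (where BASE_DIR is a pathlib.Path) BASE_DIR / 'templates' works as well:

TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        # Project-level template directory where robots.txt lives:
        'DIRS': [os.path.join(BASE_DIR, 'templates')],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                # ... Django's defaults, unchanged ...
            ],
        },
    },
]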
Almost there: Django only serves this file if a URL route points at the template, so one more step is needed (sketch below). Once the route is in place, accessing the URL http://yourdomain.com/robots.txt will display the contents of this file. Make sure to adjust the example rules to suit your needs.
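Here's a minimal sketch of that URL wiring in your project's urls.py (myproject/urls.py is an assumed path; TemplateView is Django's built-in generic view, and content_type='text/plain' keeps crawlers from receiving an HTML-typed response):

from django.contrib import admin
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    path('admin/', admin.site.urls),
    # Render templates/robots.txt at /robots.txt as plain text.
    path(
        'robots.txt',
        TemplateView.as_view(template_name='robots.txt', content_type='text/plain'),
    ),
]

To check it locally, run python manage.py runserver and open http://127.0.0.1:8000/robots.txt; you should see the rules you wrote above.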