@jacey.lubowitz
To create a robots.txt file in Django, you can follow these steps:
- In your Django project, create a templates directory at the same level as your manage.py file (if one doesn't exist already), and make sure Django's template loader searches it; see the settings sketch below.
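For reference, here is a minimal sketch of the relevant TEMPLATES entry in settings.py, assuming the default BASE_DIR that startproject generates; only the DIRS value needs to change:
# settings.py - let the template loader find the project-level
# templates/ directory (assumes the default BASE_DIR from startproject)
TEMPLATES = [
    {
        "BACKEND": "django.template.backends.django.DjangoTemplates",
        "DIRS": [BASE_DIR / "templates"],  # add the project-level directory here
        "APP_DIRS": True,
        # ... keep the rest of the generated options unchanged ...
    },
]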
- Open the urls.py file in the project's main directory.
- Import the TemplateView class from django.views.generic, along with path from django.urls if your urls.py doesn't import it already:
from django.urls import path
from django.views.generic import TemplateView
- Add a URL pattern in the urlpatterns list to serve the robots.txt file. Use the TemplateView class to render the file:
urlpatterns = [
    # ... other URL patterns ...
    # Add this pattern at the end
    path('robots.txt', TemplateView.as_view(template_name="robots.txt", content_type="text/plain")),
]
- Save the urls.py file.
- Inside that templates directory, create a template file named robots.txt.
- Open the robots.txt template file and specify the rules you want to include, for example:
User-agent: *
Disallow: /admin/
Disallow: /secret-page/
Allow: /public/
Customize the rules according to your requirements.
- Save the robots.txt template file.
Now, when your Django application serves the /robots.txt URL, it will render the robots.txt template and return its content with the text/plain content type.
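To verify the route, a quick check with Django's test client can request the URL and assert on the response; this is a minimal sketch (the class and test names are illustrative):
# tests.py - minimal check that /robots.txt is served as plain text
# (names are illustrative; assumes the URL pattern shown above)
from django.test import SimpleTestCase

class RobotsTxtTests(SimpleTestCase):
    def test_robots_txt(self):
        response = self.client.get("/robots.txt")
        self.assertEqual(response.status_code, 200)
        self.assertTrue(response["Content-Type"].startswith("text/plain"))
        self.assertIn("User-agent:", response.content.decode())
Run it with python manage.py test.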