How to make a robots.txt file in Django?

by jacey.lubowitz, in category: SEO, a year ago



1 answer

by cameron_walter, 9 months ago

@jacey.lubowitz 

To create a robots.txt file in Django, you can follow these steps:

  1. In your Django project, make sure there is a templates directory at the same level as your manage.py file, and that this directory is listed in the DIRS option of the TEMPLATES setting in settings.py (see the sketch after this list). The robots.txt file will live there as a template.
  2. Open the urls.py file in the project's main directory.
  3. Import the TemplateView class from django.views.generic (path is normally already imported from django.urls in the default urls.py):

     from django.views.generic import TemplateView

  4. Add a URL pattern to the urlpatterns list that serves robots.txt, using the TemplateView class to render the file:

     urlpatterns = [
         # ... other URL patterns ...
         # Add this pattern at the end
         path('robots.txt', TemplateView.as_view(template_name="robots.txt", content_type="text/plain")),
     ]

  5. Save the urls.py file.
  6. Create a template file named robots.txt inside that templates directory.
  7. Open the robots.txt template file and specify the rules you want to include, for example:

     User-agent: *
     Disallow: /admin/
     Disallow: /secret-page/
     Allow: /public/

     Customize the rules according to your requirements.
  8. Save the robots.txt template file.
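
With Django's default startproject settings, the DIRS option in TEMPLATES is empty, so TemplateView will not find a project-level robots.txt template until that directory is added. Here is a minimal sketch of the relevant settings.py entry, assuming a templates directory next to manage.py; only the TEMPLATES setting is shown, and everything else stays as generated for your project:

    # settings.py -- only the TEMPLATES setting is shown
    from pathlib import Path

    BASE_DIR = Path(__file__).resolve().parent.parent  # already defined by startproject

    TEMPLATES = [
        {
            'BACKEND': 'django.template.backends.django.DjangoTemplates',
            # Point DIRS at the project-level templates directory that holds robots.txt
            'DIRS': [BASE_DIR / 'templates'],
            'APP_DIRS': True,
            'OPTIONS': {
                # Keep whatever context_processors your generated settings already list
                'context_processors': [
                    'django.template.context_processors.request',
                    'django.contrib.auth.context_processors.auth',
                    'django.contrib.messages.context_processors.messages',
                ],
            },
        },
    ]

Only DIRS changes here; the rest should match what startproject generated for your version of Django.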


Now, when your Django application receives a request for /robots.txt, it will render the robots.txt template and serve its contents with the text/plain content type.
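
If you want to verify the route, a small test with Django's built-in test client is enough. This is a sketch on top of the setup above (it assumes the URL pattern and the example robots.txt template from the steps are in place), not something required by the answer:

    # tests.py in any installed app -- checks that /robots.txt is served as plain text
    from django.test import TestCase


    class RobotsTxtTests(TestCase):
        def test_robots_txt_served_as_plain_text(self):
            response = self.client.get('/robots.txt')

            self.assertEqual(response.status_code, 200)
            self.assertEqual(response['Content-Type'], 'text/plain')
            # Matches the first rule from the example template above
            self.assertIn('User-agent: *', response.content.decode())

Run it with python manage.py test; a 404 usually means the URL pattern is missing, while a TemplateDoesNotExist error points at the DIRS configuration.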