How to add `nofollow, noindex` to all pages in robots.txt?

by declan_ritchie , in category: SEO , 6 months ago



1 answer

by harrison.goodwin , 6 months ago

@declan_ritchie 

robots.txt itself has no nofollow or noindex directives, but you can get a similar effect by blocking search engine bots from every page. To do that, follow these steps:

  1. Open the robots.txt file for your website; it is typically located in the site's root directory.
  2. Add the following lines to the robots.txt file:
User-agent: *
Disallow: /


The `User-agent: *` line says the rule applies to every search engine bot, and the `Disallow: /` line tells those bots not to crawl any page on your website (a more targeted variant is sketched after the steps).

  3. Save the robots.txt file and upload it to the root directory of your website if you haven't done so already.
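
If you ever want to be more selective, the same two directives can be scoped per path or per bot. A minimal sketch, assuming a hypothetical /private/ directory and a made-up crawler called BadBot:

# Keep all bots out of one hypothetical directory
User-agent: *
Disallow: /private/

# Block one made-up crawler from the whole site
User-agent: BadBot
Disallow: /

A crawler obeys only the group that matches it most specifically, so in this sketch BadBot is blocked everywhere while every other bot is only kept out of /private/.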


These directives tell search engine bots not to crawl any page on your website, but that is not the same as adding nofollow, noindex to every page. Blocking crawling does not guarantee deindexing: a URL that is disallowed in robots.txt can still appear in search results (without a description) if other sites link to it, and Google dropped support for noindex rules inside robots.txt in 2019. If you want to make sure certain pages are not indexed, use the robots meta tag in the HTML of those pages (or the X-Robots-Tag HTTP header for non-HTML files such as PDFs), and leave those pages crawlable in robots.txt so bots can actually see the tag.
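
Here is a minimal sketch of that page-level approach; the meta tag goes in the <head> of each page you want excluded, and the title is just a placeholder:

<head>
  <!-- Tells compliant crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
  <title>Placeholder page title</title>
</head>

For files that are not HTML, the equivalent is sending an X-Robots-Tag: noindex, nofollow response header from your web server.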