@declan_ritchie
To block crawling of all pages sitewide (the closest robots.txt gets to a sitewide nofollow, noindex), add these lines to your robots.txt file:
```
User-agent: *
Disallow: /
```
The `User-agent: *` line specifies that these directives apply to all search engine bots, and the `Disallow: /` line tells those bots not to access any page on your website.
These directives instruct well-behaved crawlers not to fetch any pages on your site. Keep in mind, though, that blocking crawling is not the same as noindex: a disallowed URL can still appear in search results if other sites link to it, and because the crawler never fetches the page, it cannot see a noindex meta tag on it either. If you want to reliably keep pages out of the index, use the robots meta tag in the HTML of the pages you want to exclude (and do not block those pages in robots.txt, so crawlers can actually read the tag), or send an equivalent X-Robots-Tag HTTP header.
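As a sketch of that meta-tag approach, the snippet below shows the standard robots meta tag placed in a page's head; the surrounding markup is just illustrative scaffolding:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Tells crawlers: do not index this page, and do not follow its links -->
  <meta name="robots" content="noindex, nofollow">
  <title>Example page to exclude</title>
</head>
<body>
  <p>This page will be crawled but kept out of the search index.</p>
</body>
</html>
```

For non-HTML resources (PDFs, images), the same directive can be sent as an HTTP response header instead, e.g. `X-Robots-Tag: noindex, nofollow`, configured in your web server.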