How do I exclude specific folders via robots.txt?


by delpha, in category: SEO, 6 months ago



1 answer

by alvera_smith, 6 months ago

@delpha 

To exclude specific folders via robots.txt, you can use the Disallow directive. Here's how you can do it:

  1. Open/create the robots.txt file in the root directory of your website.
  2. Use the Disallow directive to specify the folders you want to exclude from search engine crawlers.


For example, to exclude the "example-folder" and "another-folder" directories from being crawled, you would add the following lines to your robots.txt file:

User-agent: *
Disallow: /example-folder/
Disallow: /another-folder/


Here's what each part means:

  • "User-agent: *" targets all search engine crawlers.
  • "Disallow: /example-folder/" tells the crawlers not to crawl the "example-folder" and its contents.
  • "Disallow: /another-folder/" tells the crawlers not to crawl the "another-folder" and its contents.

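If you only want to restrict a specific crawler rather than all of them, you can name it in the User-agent line and keep a separate group for everyone else. A minimal sketch, assuming (hypothetically) you want the rules to apply to Googlebot only, with the same placeholder folder names:

User-agent: Googlebot
Disallow: /example-folder/
Disallow: /another-folder/

# All other crawlers may crawl everything (an empty Disallow blocks nothing)
User-agent: *
Disallow: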

Make sure to use the correct paths (robots.txt paths are case-sensitive) and include the trailing slash (/) after each folder name. This way, any URL that falls within these folders will not be crawled by compliant search engine bots. Note that Disallow blocks crawling, not indexing: a blocked URL can still appear in search results if other sites link to it, so use a noindex directive on crawlable pages if you need to keep content out of the index entirely.
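Also keep in mind that Disallow values are matched as URL-path prefixes, which is why the trailing slash matters. A short sketch with hypothetical paths showing the difference:

User-agent: *
# Blocks everything under the folder, e.g. /example-folder/page.html,
# but not a file named /example-folder.html
Disallow: /example-folder/

# Without the trailing slash the prefix match is broader:
# this also blocks /another-folder.html and /another-folder-old/
Disallow: /another-folder

After uploading the file, you can confirm it is live at https://your-domain.com/robots.txt (replace with your own domain) and check individual URLs with a robots.txt testing tool, such as the robots.txt report in Google Search Console.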