How to prevent Googlebot from indexing my sub-domains?

by elmo.conroy , in category: SEO , 4 months ago

How do I prevent Googlebot from indexing my sub-domains?


1 answer

by larry_orn , 4 months ago

@elmo.conroy 

To prevent Googlebot from indexing your sub-domains, you can use the following methods:

  1. robots.txt file: robots.txt rules apply per host, so a robots.txt at the root of example.com cannot block subdomain.example.com. Instead, serve a separate robots.txt file at the root of the sub-domain itself (https://subdomain.example.com/robots.txt) containing: User-agent: * Disallow: / Note that this blocks crawling, not indexing: a URL that other sites link to can still appear in search results. For reliable de-indexing, use method 2 or 3 instead, and do not block the page in robots.txt at the same time, because Googlebot must be able to crawl a page to see its noindex directive.
  2. Meta tag noindex: Add the following meta tag to the head section of each page on your sub-domain that you want to keep out of the index: <meta name="robots" content="noindex"> This tag instructs search engine bots not to index the page.
  3. X-Robots-Tag HTTP header: You can also send an X-Robots-Tag header in the HTTP response for your sub-domain's pages via server-side configuration: X-Robots-Tag: noindex This header tells search engine bots not to index the page, and unlike the meta tag it also works for non-HTML resources such as PDFs and images.
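As a sketch of method 3, assuming the sub-domain is served by its own nginx server block (or an Apache virtual host with mod_headers enabled), the header can be added like this:

```
# nginx: inside the server block for subdomain.example.com
add_header X-Robots-Tag "noindex" always;

# Apache: inside the VirtualHost for subdomain.example.com (requires mod_headers)
Header set X-Robots-Tag "noindex"
```

Reload the server after the change and confirm the header appears with something like curl -I https://subdomain.example.com/.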


Remember to replace "subdomain.example.com" with your actual sub-domain in the examples above.
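To verify that a page actually carries a noindex signal, a small check like the following sketch (function names are illustrative; only the Python standard library is used, and sample data stands in for a real HTTP fetch) can inspect both the X-Robots-Tag header and the robots meta tag:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the directive values from <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.extend(
                d.strip().lower() for d in a.get("content", "").split(","))


def is_noindexed(html, headers):
    """Return True if the page is marked noindex via header or meta tag."""
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives


# Sample data in place of a real response for subdomain.example.com:
page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(is_noindexed(page, {}))                                      # True
print(is_noindexed("<html></html>", {"X-Robots-Tag": "noindex"}))  # True
print(is_noindexed("<html></html>", {}))                           # False
```

In a real check you would fetch the live page (for example with urllib.request) and pass its body and headers to the same function.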