@elmo.conroy
To prevent Googlebot from indexing your sub-domains, you can use the following methods:
- robots.txt file: robots.txt is read per host, so a robots.txt at the root of your main domain cannot block a sub-domain. Instead, place a robots.txt file at the root of the sub-domain itself (e.g. https://subdomain.example.com/robots.txt) and disallow everything:
User-agent: *
Disallow: /
This tells Googlebot not to crawl any URL on that sub-domain. Keep in mind that robots.txt blocks crawling, not indexing: a blocked URL can still appear in search results (without a snippet) if other pages link to it, so for guaranteed removal from the index use one of the methods below instead. A hedged Nginx sketch for serving a sub-domain-only robots.txt follows.
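If each sub-domain is served from its own virtual host, one way to return a blocking robots.txt for just the sub-domain is to generate it in the server configuration. This is a minimal sketch assuming Nginx; the server_name, listen, and path values are placeholders for your setup:
```nginx
server {
    listen 80;
    # Placeholder: replace with your actual sub-domain
    server_name subdomain.example.com;

    # Serve a sub-domain-specific robots.txt that blocks all crawling;
    # the main domain's robots.txt is unaffected.
    location = /robots.txt {
        default_type text/plain;
        return 200 "User-agent: *\nDisallow: /\n";
    }
}
```
Defining it in the server block also means you don't have to maintain a separate robots.txt file in each sub-domain's document root.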
- Meta tag noindex: Add the following meta tag to the <head> section of each page on your sub-domain that you want kept out of the index:
<meta name="robots" content="noindex">
This tag instructs search engine bots not to index the page. For it to work, Googlebot must be able to crawl the page, so do not also block the page with robots.txt or the tag will never be seen.
- X-Robots-Tag HTTP header: You can also add an X-Robots-Tag header to the HTTP responses of your sub-domain through your server configuration. This also covers non-HTML resources such as PDFs and images, which cannot carry a meta tag. Include the following line in the response headers:
X-Robots-Tag: noindex
This header informs search engine bots not to index the resource; like the meta tag, it is only seen if the URL can be crawled.
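How you set the header depends on your server. As a minimal sketch, assuming the sub-domain is served by Nginx (the host name and document root are placeholders), you could attach the header to every response from the sub-domain's server block:
```nginx
server {
    listen 80;
    # Placeholder: replace with your actual sub-domain
    server_name subdomain.example.com;

    # Placeholder document root for the sub-domain
    root /var/www/subdomain;

    # "always" ensures the header is also sent on non-2xx responses
    add_header X-Robots-Tag "noindex" always;
}
```
On Apache with mod_headers enabled, the rough equivalent in the sub-domain's virtual host or .htaccess is Header set X-Robots-Tag "noindex".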
Remember to replace "subdomain.example.com" with your actual sub-domain host name in the examples above. Also note that de-indexing is not immediate: Googlebot has to recrawl the affected URLs before they drop out of the index.