How to disallow search pages from robots.txt?

by creola.ebert , in category: SEO , 8 months ago



1 answer

by declan_ritchie , 8 months ago


To prevent search engines from crawling your site's search pages using the robots.txt file, follow these steps:

  1. Locate the robots.txt file on your website. It must live in the root directory of your site so that crawlers can fetch it at the top level (for example, at https://www.example.com/robots.txt, substituting your own domain).
  2. Open the robots.txt file in a text editor.
  3. Add the following lines to the robots.txt file to disallow search pages (robots.txt comments start with "#", not "//"):

     Disallow: /*?search   # blocks search URLs that use a query parameter (?search)
     Disallow: /*search    # blocks search URLs that contain "search" in the path (e.g. /search)

     Note: customize the "search" keyword to match your search pages' URL structure. For example, if your search page URLs use the parameter "?q", use "Disallow: /*?q" instead.
  4. Save the robots.txt file after adding the disallow rule.
  5. Upload the updated robots.txt file to the root directory of your website using FTP or any other file transfer method.
  6. Verify the changes by requesting your website's robots.txt file in a browser (e.g. https://www.example.com/robots.txt, substituting your own domain). You should see the updated file with the disallow rules for search pages.
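Putting the steps above together, a minimal robots.txt that blocks search pages for all crawlers might look like this (assuming "search" matches your URL structure; adjust the keyword as described above):

```
User-agent: *
Disallow: /*?search
Disallow: /*search
```

Note that the "*" wildcard inside a path is not part of the original robots.txt standard, but it is honored by major crawlers such as Googlebot and Bingbot.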

It's important to note that while major search engines respect the instructions in your robots.txt file, some crawlers may not, particularly malicious bots or scrapers. Also keep in mind that Disallow prevents crawling, not indexing: a blocked URL can still appear in search results if other pages link to it, so use a noindex meta tag (on a page crawlers are allowed to fetch) if you need to keep it out of the index entirely. Finally, the robots.txt file is publicly accessible, so sensitive information should not be included in it.
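If you want to sanity-check a rule before deploying it, Python's standard-library urllib.robotparser can evaluate robots.txt rules offline. One caveat: urllib.robotparser does not understand the "*" path wildcard used by Google and Bing, so this sketch tests the simple prefix rule "Disallow: /search" instead (the example.com domain and the rules themselves are hypothetical):

```python
import urllib.robotparser

# Hypothetical robots.txt content. Note: urllib.robotparser matches rules
# by simple path prefix and does not support "*" wildcards inside paths,
# so we use a plain prefix rule here.
ROBOTS_TXT = """\
User-agent: *
Disallow: /search
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A search URL under /search is blocked (query string included)...
print(parser.can_fetch("*", "https://www.example.com/search?q=shoes"))

# ...while an ordinary page remains crawlable.
print(parser.can_fetch("*", "https://www.example.com/about"))
```

To validate wildcard rules the way Google actually interprets them, use Google Search Console's robots.txt report rather than a local parser.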