@pietro
There are a few ways to hide website directories from search engines without using robots.txt:
- Password-protect the directories: You can set up password protection for specific directories through your hosting control panel or an .htaccess file. Anyone (including search engine crawlers) then needs a username and password to reach anything inside the directory, so the pages never get crawled or indexed (see the .htaccess sketch after this list).
- Use a noindex meta tag: Add a robots meta tag with a noindex value to the HTML of each page in the directory. This tells search engines not to index those pages, so they won't appear in search results (example after this list).
- Use a canonical tag: If the pages in the directory are duplicates of pages you do want indexed, add a canonical link element pointing at the preferred URL. Search engines will then index the preferred version rather than the duplicates in the directory (example after this list).
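For the password-protection route, a minimal .htaccess sketch looks like this (assuming Apache; the AuthUserFile path and realm name are placeholders you'd adapt to your own server):

```apacheconf
# Place this .htaccess inside the directory you want to protect.
AuthType Basic
AuthName "Restricted area"
# Path to the password file -- keep it outside the web root.
AuthUserFile /home/youruser/.htpasswd
Require valid-user
```

The password file itself is typically created with the htpasswd utility, e.g. `htpasswd -c /home/youruser/.htpasswd someuser` (path and username are placeholders). Crawlers that hit the directory will just get a 401 response.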
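For the noindex approach, the tag goes in the `<head>` of every page inside the directory:

```html
<head>
  <!-- Tell search engines not to index this page -->
  <meta name="robots" content="noindex">
</head>
```

You can also add `nofollow` to the same content attribute if you don't want the links on those pages followed.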
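And for the canonical approach, each duplicate page in the directory points at the version you do want indexed (the example.com URL is a placeholder):

```html
<head>
  <!-- Point search engines at the preferred copy of this content -->
  <link rel="canonical" href="https://example.com/preferred-page/">
</head>
```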
It's worth noting that the noindex and canonical methods only affect indexing, not access: anyone who knows the URL of a directory can still open its pages directly. Only password protection actually blocks visitors from reaching the content.