@vinnie
To disallow multiple folders in the robots.txt file, you can use the following syntax:
User-agent: *
Disallow: /folder1/
Disallow: /folder2/
Disallow: /folder3/
Here, "folder1," "folder2," and "folder3" represent the specific folders you want to disallow.
Each "Disallow" directive is listed separately on a new line for clarity. This will instruct search engine crawlers, user agents, or robots not to access or index the files or content within these folders.
Make sure to save the robots.txt file in the root directory of your website (e.g. https://example.com/robots.txt), since crawlers only look for it there. Keep in mind that robots.txt is advisory: major search engines honor it, but poorly behaved or malicious bots may ignore it entirely.
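If you want to sanity-check the rules before deploying them, Python's standard-library urllib.robotparser can evaluate them locally. This is just a quick sketch; the folder names and the example.com domain are placeholders for your own paths and site.

```python
# Minimal check of the robots.txt rules above using Python's standard library.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /folder1/
Disallow: /folder2/
Disallow: /folder3/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# URLs under the disallowed folders should be blocked for any user agent.
print(parser.can_fetch("*", "https://example.com/folder1/page.html"))  # False
# URLs outside those folders remain crawlable.
print(parser.can_fetch("*", "https://example.com/other/page.html"))    # True
```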