There are several ways to verify robots.txt rules:
Use a robots.txt testing tool: Several free online tools let you check that your robots.txt file behaves as intended. Examples include Google's robots.txt Tester and the SEO Book Robots.txt Tester.
Use Google Search Console: If you have a Google Search Console account set up for your website, you can test your rules with the robots.txt testing tool in the "Crawl" section.
Check your website's access logs: Your access logs show whether search engine bots are actually obeying your robots.txt rules. If well-behaved crawlers are still requesting pages your robots.txt disallows, your rules may be malformed and need updating.
Test your website in a sandbox environment: To try out robots.txt changes without affecting your live website, set up a sandbox (staging) copy of the site and verify the rules there first.
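Besides online testers, you can verify rules locally. A minimal sketch using Python's standard-library `urllib.robotparser` (the rules and URLs below are hypothetical examples, not from any real site):

```python
import urllib.robotparser

# Hypothetical robots.txt content to verify.
rules = """\
User-agent: *
Disallow: /private/
Allow: /public/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Ask whether a given user agent may fetch a given URL.
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
```

Swapping `rp.parse(...)` for `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` tests the live file instead of an inline copy.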
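The access-log check can also be scripted. A minimal sketch, assuming Common Log Format entries and the same hypothetical rules; it flags requests a well-behaved crawler should not have made:

```python
import re
import urllib.robotparser

# Hypothetical robots.txt rules to check the log against.
rules = """\
User-agent: *
Disallow: /private/
"""
rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Hypothetical access-log lines in Common Log Format.
log_lines = [
    '66.249.66.1 - - [10/Oct/2024:13:55:36 +0000] "GET /private/report.html HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Oct/2024:13:55:40 +0000] "GET /index.html HTTP/1.1" 200 1024 "-" "Googlebot/2.1"',
]

# Extract the requested path from each log line.
request_re = re.compile(r'"GET (\S+) HTTP')

# Collect paths that Googlebot fetched despite being disallowed.
violations = [
    m.group(1)
    for line in log_lines
    if "Googlebot" in line
    and (m := request_re.search(line))
    and not rp.can_fetch("Googlebot", m.group(1))
]
print(violations)  # paths crawled despite being disallowed
```

In practice you would read the lines from your real log file and parse your real robots.txt; any hits reported here mean either a misbehaving bot or, more commonly, a rule that does not say what you think it says.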