User-agent: *
Disallow: /admin/
Disallow: /login/
Allow: /
Sitemap: https://www.yoursite.com/sitemap.xml
Explanation:
User-agent: * applies the rules to all web crawlers.
Disallow asks compliant crawlers not to crawl sensitive or private directories like /admin/ and /login/. Keep in mind that robots.txt is only a crawling directive, not access control, so those areas still need real authentication.
Allow: / lets crawlers access all other parts of the site; this is also the default when no rule matches, so the line mainly makes the intent explicit.
The Sitemap directive tells search engines where to find your sitemap, which helps them discover and crawl your URLs more reliably.
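A quick way to sanity-check that these rules behave as described is Python's standard-library urllib.robotparser. The sketch below parses the example file directly (the URLs are placeholders for your own site):

import urllib.robotparser

# The example rules from above, parsed as a string so no network request is needed.
rules = """User-agent: *
Disallow: /admin/
Disallow: /login/
Allow: /
Sitemap: https://www.yoursite.com/sitemap.xml"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# "*" stands for any user agent, matching the User-agent: * group above.
print(rp.can_fetch("*", "https://www.yoursite.com/admin/settings"))  # False: under /admin/
print(rp.can_fetch("*", "https://www.yoursite.com/blog/post-1"))     # True: covered by Allow: /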
Tips for the best robots.txt:
Keep it simple—avoid blocking important content unintentionally.
Use specific disallow rules to protect sensitive pages (see the example after this list).
Always test your robots.txt before deploying; Google Search Console's robots.txt report (which replaced the older tester tool) shows how Googlebot reads your file, and the parser sketch above lets you check individual paths.
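For reference, "specific" rules can target individual paths or patterns rather than whole sections of the site. The paths below are made-up examples, and wildcard matching (*) is supported by major crawlers such as Googlebot and Bingbot but is not guaranteed for every bot:

User-agent: *
Disallow: /admin/            # an entire private directory
Disallow: /checkout/confirm  # a single sensitive page
Disallow: /*?sessionid=      # URLs carrying a session parameter (wildcard support varies by crawler)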
If you want, I can help create a customized robots.txt for your website!