# Code Review — PR #57: Add robots.txt crawl rules
## Overview

Adds a `robots.txt` for the site with crawl rules.

## What's Good

Low-risk, focused change.

## Issues & Suggestions

1. **Redundant `Allow` rules.** Crawlers allow everything by default, so the per-path `Allow` lines add nothing once `Allow: /` is present (and `Allow: /` itself is optional). Suggested diff:

   ```diff
    User-agent: *
    Allow: /
   -Allow: /posts/
   -Allow: /rss.xml
    Disallow: /og-preview
    Disallow: /posts/*-og.png
   ```

2. **Wildcard support.** The original robots.txt spec does not support wildcards. Major crawlers (Google, Bing) do support `*` patterns, so `Disallow: /posts/*-og.png` works for them, but smaller crawlers may treat it as a literal path prefix and ignore it.

3. **Verify the OG image paths.** The Astro project uses Satori for OG image generation. Please confirm that the generated images are actually served under the paths being disallowed here.

4. **Sitemap filename.** Verify Astro's sitemap output filename before referencing it; `@astrojs/sitemap` emits `sitemap-index.xml` by default.
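Suggestions 1 and 2 can be checked mechanically. Python's stdlib `urllib.robotparser` implements the original spec *without* wildcard support, which makes it a handy way to see both the default-allow behavior and why the wildcard rule only helps in crawlers that extend the spec (the post slugs below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Rules with the redundant Allow lines removed (suggestion 1).
robots = """\
User-agent: *
Disallow: /og-preview
Disallow: /posts/*-og.png
"""

rp = RobotFileParser()
rp.parse(robots.splitlines())

# Unlisted paths are allowed by default -- no Allow lines needed.
print(rp.can_fetch("Googlebot", "/posts/hello-world"))   # True
print(rp.can_fetch("Googlebot", "/og-preview"))          # False

# The stdlib parser follows the original spec: "*" is a literal
# character, so the wildcard rule does NOT block this URL here,
# even though Google and Bing would honor it (suggestion 2).
print(rp.can_fetch("Googlebot", "/posts/hello-og.png"))  # True
```

The same asymmetry is what suggestion 2 warns about: spec-only parsers fall back to literal prefix matching.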
## Summary

Overall this is a low-risk, useful addition. The main actionable follow-up is verifying that the disallowed OG image paths and the sitemap filename match what the site actually generates.
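Putting suggestions 1 and 4 together, the cleaned-up file might look like the sketch below. The `sitemap-index.xml` filename is `@astrojs/sitemap`'s default and should still be checked against the actual build output; the domain is a placeholder:

```
User-agent: *
Disallow: /og-preview
Disallow: /posts/*-og.png

Sitemap: https://example.com/sitemap-index.xml
```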