This repository was archived by the owner on Jun 17, 2025. It is now read-only.
It would be ideal if, given a domain, the crawler could detect the sitemap automatically or accept a sitemap as a parameter.
I have the following workflow in mind:
This should work for both main URLs and secondary URLs (sitemaps can reference sub-sitemaps, especially if you are using routes behind a reverse proxy server).
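As a rough illustration of what this detection could look like (the function names and structure here are hypothetical, not part of the crawler's existing API): sitemaps are conventionally declared via `Sitemap:` lines in `robots.txt` or found at `/sitemap.xml`, and a sitemap document is either a `<urlset>` of page URLs or a `<sitemapindex>` pointing to sub-sitemaps.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemaps_from_robots(robots_txt: str, domain: str) -> list[str]:
    """Collect 'Sitemap:' entries from a robots.txt body; fall back to the
    conventional /sitemap.xml location when none are declared."""
    found = [line.split(":", 1)[1].strip()
             for line in robots_txt.splitlines()
             if line.lower().startswith("sitemap:")]
    return found or [f"https://{domain}/sitemap.xml"]

def parse_sitemap(xml_text: str) -> tuple[list[str], list[str]]:
    """Return (page_urls, sub_sitemap_urls) for one sitemap document.
    A <sitemapindex> yields sub-sitemaps to crawl recursively;
    a <urlset> yields the actual page URLs."""
    root = ET.fromstring(xml_text)
    locs = [el.text.strip() for el in root.iter(SITEMAP_NS + "loc") if el.text]
    if root.tag == SITEMAP_NS + "sitemapindex":
        return [], locs
    return locs, []
```

The crawler would fetch each discovered sitemap, call `parse_sitemap`, and queue any returned sub-sitemaps, which covers the nested case mentioned above.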