Enter a website URL. We'll fetch its robots.txt file and show which bots, including search engine and AI crawlers, are allowed or blocked, and on which paths.
This tool fetches the robots.txt file from any website you enter and analyzes it to show you which search engines and AI crawlers are allowed or blocked. It breaks down the rules by bot type and displays restrictions, sitemaps, and crawl delays in an easy-to-read format.
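The kind of analysis described above can be sketched with Python's standard-library `urllib.robotparser`, which parses Disallow/Allow rules per user agent and exposes sitemaps and crawl delays. The sample robots.txt content, the bot names, and the `analyze` helper below are illustrative assumptions, not this tool's actual implementation:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice this would be fetched
# from https://<site>/robots.txt before parsing.
SAMPLE = """\
User-agent: *
Disallow: /admin/
Crawl-delay: 10

User-agent: GPTBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
"""

def analyze(robots_txt: str, bots: list[str], path: str = "/") -> dict:
    """Summarize which bots may fetch `path`, plus sitemaps and crawl delay."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {
        # Per-bot verdict for the given path
        "allowed": {bot: parser.can_fetch(bot, path) for bot in bots},
        # Sitemap URLs declared anywhere in the file (Python 3.8+)
        "sitemaps": parser.site_maps() or [],
        # Crawl-delay for the wildcard group, if any
        "crawl_delay": parser.crawl_delay("*"),
    }

report = analyze(SAMPLE, ["Googlebot", "GPTBot"])
print(report)  # GPTBot is blocked site-wide; Googlebot may fetch "/"
```

A real analyzer would also group rules by bot type and handle fetch errors (missing robots.txt, redirects), which `RobotFileParser` leaves to the caller when fed raw text like this.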