Turn messy URLs with tracking parameters into clean robots.txt rules. Optimize crawl budget and prevent duplicate content issues instantly.
Paste up to 50 URLs and we'll detect recurring query parameters and generate matching robots.txt rules.
Disallow: Block search engines from crawling URLs with these parameters
Allow: Explicitly permit crawling URLs with these parameters
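For example, rules built from a few of the preset tracking parameters below might come out like this (a sketch only; the exact patterns depend on the URLs you paste):

User-agent: *
# Block any URL whose query string contains a tracking parameter
Disallow: /*?*utm_source=
Disallow: /*?*fbclid=
Disallow: /*?*gclid=
# Carve out a parameter you do want crawled, such as pagination
Allow: /*?page=

Note that * wildcards in Disallow and Allow paths are honored by Google and Bing but were not part of the original robots.txt standard, so test generated rules in Search Console before deploying them.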
URL parameters can create thousands of duplicate pages, waste crawl budget, and dilute your SEO value.
Search engines waste time crawling URLs with tracking parameters instead of your important content.
The same page with different parameters creates duplicate content issues that hurt your rankings.
Clean URLs help search engines understand and index your important content more effectively.
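To make that concrete: every one of these addresses returns the same page, yet a crawler treats each as a separate URL to fetch and index (example.com and the parameter values are illustrative):

https://example.com/shoes
https://example.com/shoes?utm_source=newsletter
https://example.com/shoes?utm_source=newsletter&fbclid=IwAR0abc123

A single wildcard rule per parameter, like the sketch above, collapses all of these back to the one canonical URL.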
Quick Presets:
Tracking (UTM): utm_source, utm_medium, utm_campaign, utm_term
Click IDs: fbclid, gclid, msclkid, twclid
Google Analytics: ga_source, ga_medium, ga_campaign, _ga
Referrers: ref, referrer, ref_src, source
Session IDs: sessionid, sid, phpsessid, jsessionid
Pagination: page, p, pag, offset
Sorting & filtering: sort, order, filter, view
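Under the hood, this kind of pattern detection can be approximated in a few lines: parse each pasted URL, count how often each query parameter recurs, and emit one wildcard Disallow rule per repeated parameter. Here is a minimal sketch in Python (the generate_rules helper, the min_occurrences threshold, and the example URLs are all illustrative, not the tool's actual implementation):

from collections import Counter
from urllib.parse import parse_qs, urlparse

def generate_rules(urls, min_occurrences=2):
    """Count query parameters across URLs and emit wildcard Disallow rules."""
    counts = Counter()
    for url in urls:
        # parse_qs turns "a=1&b=2" into {"a": ["1"], "b": ["2"]}; we only need the keys
        counts.update(parse_qs(urlparse(url).query).keys())
    rules = ["User-agent: *"]
    for param, seen in counts.most_common():
        if seen >= min_occurrences:  # only block parameters that repeat across URLs
            rules.append(f"Disallow: /*?*{param}=")
    return "\n".join(rules)

urls = [
    "https://example.com/shoes?utm_source=newsletter&utm_medium=email",
    "https://example.com/shoes?utm_source=twitter",
    "https://example.com/hats?sort=price",
]
print(generate_rules(urls))
# User-agent: *
# Disallow: /*?*utm_source=

In practice you would also keep an allow-list (the pagination and sorting presets above) so parameters your site relies on stay crawlable.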
Now that you've generated your robots.txt rules, monitor your website's performance, affiliate links, status codes, and more!