1. Configure Your Rules
2. Your Custom Robots.txt
Copy the code below and paste it into your Blogger settings.
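For reference, the output will look something like the sketch below. The blog address here is a placeholder (https://yourblog.blogspot.com); the generator fills in your real URL, and the exact Disallow rules depend on the options you chose in step 1.

    # Give AdSense's crawler full access (an empty Disallow allows everything)
    User-agent: Mediapartners-Google
    Disallow:

    # Rules for all other crawlers
    User-agent: *
    # Keep internal search and label pages out of the index
    Disallow: /search
    # Block legacy mobile URLs (see the FAQ below)
    Disallow: /*?m=1
    # Everything else is crawlable
    Allow: /

    # Tell crawlers where the sitemap lives
    Sitemap: https://yourblog.blogspot.com/sitemap.xml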
3. How to Add to Blogger
Step 1: Log in to your Blogger Dashboard.
Step 2: In the left-hand menu, go to Settings.
Step 3: Scroll down to the "Crawlers and indexing" section.
Step 4: Turn on the "Enable custom robots.txt" toggle.
Step 5: Click on "Custom robots.txt". A text box will appear.
Step 6: Delete any existing content in the box and paste the code generated above.
Step 7: Click Save. You're all set!
Frequently Asked Questions
What is a robots.txt file?
A robots.txt file is a simple text file that tells search engine crawlers (like Googlebot) which pages or files on your site they can or cannot request. It's a fundamental part of technical SEO.
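At its simplest, the file pairs a User-agent line (naming which crawler a rule block applies to) with Disallow and Allow rules. The hypothetical example below blocks every crawler from a /private/ directory while leaving the rest of the site open:

    # Applies to all crawlers
    User-agent: *
    # Nothing under /private/ may be requested
    Disallow: /private/
    # Everything else remains crawlable
    Allow: /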
Why do I need a custom one for Blogger?
While Blogger has a default setup, a custom robots.txt gives you more control. You can ensure low-value pages like search results or specific labels aren't indexed, which concentrates your "crawl budget" on your important posts and pages, potentially improving your SEO.
Is it safe to disallow /search and ?m=1?
Yes, it's highly recommended. Disallowing /search prevents Google from indexing your internal search result pages, which are considered low-quality "duplicate content". Disallowing ?m=1 (the old mobile URL parameter) is good practice because modern responsive themes don't need it, and it prevents duplicate content issues.
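In robots.txt terms, those two rules look like this; the * in the second rule is a wildcard (supported by Googlebot) that matches any path before the parameter:

    User-agent: *
    # Block internal search result pages (this also covers /search/label/... pages)
    Disallow: /search
    # Block legacy mobile URLs such as /2024/01/my-post.html?m=1
    Disallow: /*?m=1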
What about the Sitemap?
The sitemap is a map of your blog for search engines. Our generator automatically adds the correct sitemap path (/sitemap.xml) to your robots.txt file based on the URL you provide. This is crucial for helping Google find all your content quickly.
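The sitemap directive is a single line appended to the end of the file. For a blog at the placeholder address https://yourblog.blogspot.com, it would read:

    # Full index of your posts for crawlers to discover
    Sitemap: https://yourblog.blogspot.com/sitemap.xml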