How to Add Custom Robots.txt in Blogger 2024

Blogger/Blogspot users, are you looking to grow your website and attract more views? One simple “trick” is a custom robots.txt file, which helps search engines focus on ranking your best posts and pages.

In fact, this approach helped me grow my website from 10,000 to over 45,000 monthly views in just two weeks.

But first, what is a robots.txt file? Basically, it’s a plain-text file that tells web crawlers (like Googlebot) which pages they may crawl and which they should skip. On Blogger this matters because auto-generated search and label pages can create duplicate-content issues, so a well-crafted robots.txt keeps crawlers focused on your most relevant pages.

Here’s how you can add a custom robots.txt file to your Blogger/Blogspot blog:

Step #1: Go to the Blogger Website

First, head over to the Blogger/Blogspot website and log in with your credentials. Once logged in, you’ll be taken to your dashboard.

Next, select the blog you want to edit from your list. This will take you to the blog’s overview page, where you can manage posts, settings, and other features.

Step #2: Access the Settings

Once you’re on the blog’s overview page, look for the “Settings” option in the left sidebar. Click on it to open a set of options for configuring your blog.

Scroll down to the “Crawlers and indexing” section. Here, you’ll find an option labeled “Enable custom robots.txt.” Toggle it to “Yes.”

Step #3: Create Your Custom Robots.txt File

Now that you’ve enabled custom robots.txt, it’s time to create your own file. Here’s an example of what it could look like:

User-agent: *
Disallow: /search*
Allow: /

Sitemap: https://rawandev.top/sitemap.xml

  • User-agent: * means the rules apply to all web crawlers.
  • Disallow: /search* blocks crawlers from your blog’s search-result pages (URLs beginning with /search).
  • Allow: / lets crawlers access every other page.
  • Sitemap: https://rawandev.top/sitemap.xml points crawlers to your sitemap so they can discover and index your content more efficiently.

Input your custom robots.txt content into the provided field and click “Save” to apply the changes.
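
Before clicking “Save,” it can be worth sanity-checking that the rules behave as you expect. Below is a minimal sketch using Python’s built-in urllib.robotparser; note that this standard-library parser does plain prefix matching and does not implement Google’s wildcard extension, so the test rule uses /search rather than /search*, and the example.blogspot.com URLs are placeholders.

from urllib import robotparser

# Simplified version of the rules above; the stdlib parser matches plain
# prefixes, so /search stands in for the wildcard /search*.
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Search-result pages should be blocked for a generic crawler...
print(parser.can_fetch("*", "https://example.blogspot.com/search?q=seo"))          # False
# ...while ordinary posts and pages remain crawlable.
print(parser.can_fetch("*", "https://example.blogspot.com/2024/05/my-post.html"))  # True

If both lines print the expected values, the basic allow/disallow logic is sound; Google Search Console also offers a robots.txt report for checking the live file once it is published.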

Add a robots.txt File in Blogger (Quick Recap)

  1. Go to your blog’s Settings page.
  2. Scroll down to the “Crawlers and indexing” section.
  3. There you’ll find the crawling and indexing options, including the custom robots.txt file.
  4. Turn on “Enable custom robots.txt.”
  5. Click “Custom robots.txt” to open the input field.
  6. Copy the file below, replacing the URLs with your own blog’s address, then save. (A quick check that the file went live follows the robots.txt below.)
User-agent: Mediapartners-Google
Disallow: 
User-agent: Googlebot
Disallow: 
User-agent: googlebot-image
Disallow: 
User-agent: googlebot-mobile
Disallow: 
User-agent: MSNBot
Disallow: 
User-agent: Slurp
Disallow: 
User-agent: Teoma
Disallow: 
User-agent: Gigabot
Disallow: 
User-agent: Robozilla
Disallow: 
User-agent: Nutch
Disallow: 
User-agent: ia_archiver
Disallow: 
User-agent: baiduspider
Disallow: 
User-agent: naverbot
Disallow: 
User-agent: yeti
Disallow: 
User-agent: yahoo-mmcrawler
Disallow: 
User-agent: psbot
Disallow: 
User-agent: yahoo-blogs/v3.9
Disallow: 

User-agent: *
Allow: /
Disallow: /search?q=
Disallow: /search?updated-min=
Disallow: /search?updated-max=
Disallow: /search/label/*?updated-min=
Disallow: /search/label/*?updated-max=

Sitemap: https://www.rawandev.top/sitemap.xml
Sitemap: https://www.rawandev.top/sitemap-pages.xml
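
Once saved, Blogger serves the file at the root of your blog domain. A quick way to confirm it went live is to fetch it directly. The sketch below uses only Python’s standard library; the www.rawandev.top domain is just the example from the file above, so swap in your own blog’s address.

from urllib.request import urlopen

# Placeholder domain taken from the example above; replace with your own blog.
blog = "https://www.rawandev.top"

# Blogger exposes the custom robots.txt at the site root.
with urlopen(f"{blog}/robots.txt") as resp:
    print(resp.read().decode("utf-8"))

# The sitemap referenced in robots.txt should resolve as well.
with urlopen(f"{blog}/sitemap.xml") as resp:
    print("sitemap HTTP status:", resp.status)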

Final Thoughts

Adding a custom robots.txt file to your Blogger/Blogspot blog is a simple way to boost your site’s SEO. By strategically allowing and disallowing certain pages, you can improve how your site is indexed by search engines.

In my case, this strategy helped grow monthly views from 10,000 to over 45,000 in just two weeks, and the improved SEO brought around 30,000 additional views from other platforms.

So, take control of your SEO today by adding a custom robots.txt file to your blog, and watch your traffic soar!
