If you are a webmaster or blog owner, you have probably heard about /robots.txt. It is a text file that tells search engines what to index and what not to, helping them skip 404 error pages and other pages you don't want appearing in search results. Google officially defines robots.txt as:
A robots.txt file restricts access to your site by search engine robots that crawl the web. These bots are automated, and before they access pages of a site, they check to see if a robots.txt file exists that prevents them from accessing certain pages. (All respectable robots will respect the directives in a robots.txt file, although some may interpret them differently. However, a robots.txt is not enforceable, and some spammers and other troublemakers may ignore it. For this reason, we recommend password protecting confidential information.)
A robots.txt file is a must for SEO, and all WordPress blogs ship with a strong robots.txt for the search engines. Now Blogger also has the same feature, which you can use to improve your SEO and tell search engines to skip your unwanted or 404 pages.
The process is quite simple. Blogger has made it easy, as it always does. Thanks to Google.
- Login to your blogger account
- Go to settings of your blog
- Navigate to Search Preferences
- Select Custom robots.txt or Custom robots header tags.
If you use Custom robots.txt, you will have to write the robots file manually, which can introduce errors if you don't know the syntax. So I recommend using the Custom robots header tags option instead, where you can control indexing simply by ticking the appropriate boxes. Be careful either way, as errors in the robots file can seriously hurt your search engine rankings.
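For reference, if you do go with the Custom robots.txt option, Blogger's default file looks roughly like the sketch below. The `Disallow: /search` line keeps label and search result pages out of the index, while `Allow: /` permits everything else; the blog URL in the Sitemap line is a placeholder for your own address:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml
```

The `Mediapartners-Google` block lets AdSense crawl all pages so ads can be matched to content, even on pages hidden from regular search crawlers.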