Beginner guide to use SEO robots.txt file with example


This article is for beginners who have only recently begun exploring website creation and promotion. If you have used the WordPress CMS before, you probably already know something about robots.txt. A robots.txt file can be used with any kind of website or blog. Here you will learn how to create a robots.txt file and some of its basic commands and directives.

What is Robots.txt


Robots.txt is an important element of nearly every website on the Internet. Used correctly, it can support a site's SEO by controlling what search engines crawl. Robots.txt is a plain text file, placed in the root directory of a site, that contains the crawling and indexing rules for search engines.





The file must always be named exactly "robots.txt". Its main tasks are to block certain pages from being indexed, to point crawlers to the sitemap file, and to specify the location of the site's primary mirror. Robots.txt also helps prevent crawlers from exposing sensitive areas of your website.

Also read: 6 Quick Tips for Improving Your WordPress Security

Why do we need Robots.txt


Incorrect use of the robots.txt file may lead to a website not being indexed by search engines at all. If you don't use it properly, you may end up with junk pages in the index, and sensitive data may be exposed to a wide range of people. To check your robots.txt file, you can use services such as the Yandex.Webmaster robots.txt analysis tool.

Basic commands in Robots.txt 


You don't need any specialized program to create a robots.txt file; a plain text editor and a few directives to enter into the file are enough. After creating the robots.txt file, place it in the root directory of the website. Search engines will then read it when they crawl the site and begin indexing.

Example 1:

User-agent: *
Disallow: /


Disallow and User-agent commands

The Disallow directive specifies that the page or path mentioned should not be crawled, while the User-agent line specifies which search engine crawlers the rules apply to. A robots.txt file is rarely complete without the Allow directive, which is the opposite of Disallow. Together they define which parts or pages of a site should be crawled and which shouldn't.

In the example above, User-agent: * refers to all crawlers and bots indexing the site. The next line, Disallow: /, means no crawler should index any part of the website.
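You can verify how crawlers interpret these rules with Python's standard urllib.robotparser module. This is a minimal sketch of the Example 1 rules; the domain example.com is just a placeholder.

```python
# Check the Example 1 rules with Python's built-in robots.txt parser.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# With Disallow: /, no crawler may fetch any page on the site.
print(rp.can_fetch("*", "https://example.com/about.html"))  # False
```

can_fetch() answers the same question a polite crawler asks before requesting a URL: do the rules for this user agent permit this path?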

Example 2:

User-agent: *
Disallow: /wp-admin/
Disallow: /trackback/

The above rules mean crawlers shouldn't index the wp-admin and trackback folders of the website, while the rest of the site remains open.
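The same standard-library parser confirms that the Example 2 rules block only those two folders. Again, example.com and the sample paths are placeholders.

```python
# Check the Example 2 rules: only /wp-admin/ and /trackback/ are blocked.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /trackback/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/wp-admin/options.php"))  # False
print(rp.can_fetch("*", "https://example.com/my-first-post/"))        # True
```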



Additional commands in Robots.txt


There are several additional directives that can be used in robots.txt; they are as follows.


  1. Host - specifies the primary mirror of your site, if several mirrors exist.
  2. Sitemap - tells the search engine where your sitemap file is located.
  3. Crawl-delay - asks crawlers to wait between page loads; it is mainly useful for sites with many pages. For example, Crawl-delay: 10 requests a delay of 10 seconds between page loads.
  4. Request-rate - lets you set the frequency of page loads. For example, Request-rate: 1/2 means load one page every two seconds.
  5. Visit-time - allows the crawler to index only during a strictly allotted time window, specified in UTC.
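Python's urllib.robotparser also understands Crawl-delay and Request-rate, so the directives above can be inspected programmatically. A sketch with example values (requires Python 3.6+ for these two methods):

```python
# Read Crawl-delay and Request-rate values from a robots.txt snippet.
from urllib import robotparser

rules = """\
User-agent: *
Crawl-delay: 10
Request-rate: 1/2
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.crawl_delay("*"))  # 10 -> wait 10 seconds between requests
rate = rp.request_rate("*")
print(rate.requests, rate.seconds)  # 1 2 -> one page every two seconds
```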
So, in brief, the robots.txt file is mainly used to stop crawlers from visiting certain parts of your blog. Some people think this file is used for noindexing, but that is not its purpose: a page blocked in robots.txt can still appear in search results if other sites link to it; removing a page from the index is done with a robots meta tag instead.


That's all for this article. Here are some useful example robots.txt files.

Example 3: Disallowing Google Images

User-agent: Googlebot-Image
Disallow: /

Example 4: WordPress robots.txt

User-agent: *
Crawl-delay: 2
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /cgi-bin
Disallow: /category
Disallow: /tag
Disallow: /author
Disallow: /*.html$
Disallow: /*feed*
Disallow: /*trackback*
Disallow: /*?*

Thank you for reading this post.
