You probably already know that Google uses bots to crawl web pages across the internet, including the pages on your website. By default, search engine bots will crawl every page on a website, which is not very efficient. To make crawling more efficient, you need a robots.txt file.

Still unfamiliar with robots.txt? Relax, we will explain it in full in this article. We’ll cover what robots.txt is, why it’s important, and of course how to set up robots.txt on your website.

What is Robots.txt?

Robots.txt is a file that contains instructions telling search engine bots which pages to crawl. Left to themselves, search engine bots will crawl all of your web pages, even unimportant ones such as plugin and theme directories.

Crawling these pages serves no purpose. Worse, it slows down the crawling process on your website, because it increases the number of pages search engine bots need to work through.

That is why you need to set up robots.txt on your website. Robots.txt directs search engine bots to crawl only the important pages, so the crawling process on your website finishes faster.

Where is the Robots.txt File Stored and What Are the Rules?

Before learning how to set up a robots.txt file, you need to know where it is stored. When your website is created, a robots.txt file is automatically generated and placed in the root folder of your server.
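To picture where that is, here is a rough sketch of a typical WordPress installation. The folder name public_html is just a common example and varies by host; the point is simply that robots.txt sits at the top level, next to WordPress’s own core folders:

public_html/        <- root folder of your website (name varies by host)
  wp-admin/
  wp-content/
  wp-includes/
  index.php
  robots.txt        <- the file this article is about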

For example, if your website is www.mywebsite.com, you can find the robots.txt file by opening www.mywebsite.com/robots.txt in your browser.

There are three common commands found in a robots.txt file:

User-agent: specifies which search engine bots are affected by the rules in robots.txt. You can list specific bots, but to keep things simple you can use an asterisk (*), which means the rules apply to all bots.

Disallow: a command that prohibits search engine bots from crawling certain pages.

Allow: a command that allows search engine bots to crawl certain pages. In general, you do not need the Allow command, because search engine bots crawl all web pages by default. So you will use the Disallow command far more often.
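To see how these commands fit together, here is a minimal sketch of a robots.txt file. The paths are only examples; adjust them to the pages you actually want to block or allow on your own site.

# apply the rules below to all search engine bots
User-agent: *
# block the WordPress admin area from being crawled
Disallow: /wp-admin/
# but still allow the admin-ajax endpoint that themes and plugins use
Allow: /wp-admin/admin-ajax.php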

How to Set Up Robots.txt in WordPress Using Plugins

To set up robots.txt, you can use a plugin. There are two plugins you can use: Yoast SEO and All in One SEO Pack. Both serve the same purpose, so just choose whichever suits your website better.

Oh, and before starting the guide below, you should already have a list of the pages that search engine bots do not need to crawl. That way, your work will go faster.

Let’s start with the first method!

Yoast SEO Plugin

Yoast SEO is a free SEO plugin with a full set of features. Besides analyzing the SEO quality of a website, Yoast SEO also has a feature for creating and managing robots.txt.

Here are the steps to enable robots.txt on your website using Yoast SEO:

First, make sure you have Yoast SEO installed. If you haven’t already installed it on your website, you can install it via the WordPress plugin directory or download it here.

Second, open the WordPress dashboard and go to SEO > Tools > File editor. After that, you can create a robots.txt file by clicking Create robots.txt file.

Third, create rules that suit the needs of your website. Enter the pages that search engine bots should and should not crawl, then click Save changes to robots.txt.
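As an illustration, here is a minimal sketch of rules you might type into the editor, assuming you want to keep bots out of the plugin and theme directories mentioned earlier. Replace the paths with the pages from your own list.

User-agent: *
# keep bots out of directories that visitors never need to see
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/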

Done! You have successfully created a robots.txt file using Yoast SEO.

All in One SEO Pack Plugin

Apart from Yoast SEO, you can also use other plugins. One of them is All in One SEO Pack. The steps are a bit different from Yoast SEO, but the function is the same: managing the robots.txt file.

Follow the steps below to enable a robots.txt file in WordPress using All in One SEO Pack:

First, make sure you have the All in One SEO Pack plugin installed. If not, you can install it directly through the plugin directory in your WordPress dashboard or download it here.

Second, open the WordPress dashboard and click All in One SEO > Feature Manager. Next, find the Robots.txt feature and click Activate.

Third, after the Robots.txt feature is activated, you will find a new menu in the All in One SEO panel on the left. Click the Robots.txt menu and you will be taken to a page where you can create the robots.txt file.

Fourth, determine the rules you want to apply to the website. Select the Block rule to block a page from being crawled, or the Allow rule to allow a page to be crawled, then enter the URL of the page you want to block or allow.
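For example, adding a Block rule for all user agents (*) with the path /wp-content/plugins/ would produce lines along these lines in the generated robots.txt file. The path is only an example carried over from earlier in this article:

User-agent: *
# result of the Block rule for the plugin directory
Disallow: /wp-content/plugins/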

Done! You have successfully activated robots.txt in WordPress using the All in One SEO Pack plugin.

How to Check the Robots.txt File Using Google Search Console

After creating a robots.txt file using the steps above, you need to check it. If you’ve followed the guide, the chance of an error is very small. However, just in case, you should still run a check every time you create or update a robots.txt file.

To check whether there are errors in the robots.txt file you created, you can use Google Search Console. It’s easy: open Google Search Console, and on the dashboard you will find the Crawl menu > robots.txt Tester.

After that, you will see the robots.txt file you created earlier; click Submit. If there is an error, Google Search Console will notify you. As mentioned earlier, though, the chance of an error is very small, so you don’t have to worry.

Conclusion

By enabling a robots.txt file, you direct search engine bots to the most important pages on your website, so they don’t need to crawl pages that visitors don’t need to see.

As a result, the crawling process will be faster. The faster the crawling process, the sooner your website can appear in search results. In addition, people won’t stumble on irrelevant pages from your website.